My old pals Will & Bryan and their teams have been hard at work on the brushing-savvy iPad app Fresco (see previous thoughts). Gizmodo offers a quick look at its current state, and Bryan has shared some perspective on its development.
Does anyone else remember when Adobe demoed automatic sky-swapping ~3 years ago, but then never shipped it… because, big companies? (No, just me?)
Anyway, Xiaomi is now offering a similar feature. Here’s a quick peek:
And here’s a more in-depth demo:
Coincidentally, “Skylum Announces Luminar 4 with AI-Powered Automatic Sky Replacement”:
It removes issues like halos and artifacts at the edges and horizon; lets you adjust depth of field, tone, exposure, and color after the new sky has been dropped in; correctly detects the horizon line and the orientation of the replacement sky; and intelligently “relights” the rest of your photo to match the new sky “so they appear they were taken during the same conditions.”
Check out the article link to see some pretty compelling-looking examples.
People have been trying to combine the power of vector & raster drawing/editing for decades. (Anybody else remember Creature House Expression, published by Fractal & then acquired by Microsoft? Congrats on also being old! 🙃) It’s a tough line to walk, and the forthcoming Adobe Fresco app is far from Adobe’s first bite at the apple (I remember you, Fireworks).
Back in 2010, I transitioned off of Photoshop proper & laid out a plan by which different mobile apps/modules (painting, drawing, photo library) would come together to populate a shared, object-centric canvas. Rather than build the monolithic (and now forgotten) Photoshop Touch that we eventually shipped, I’d advocated for letting Adobe Ideas form the drawing module, Lightroom Mobile form the library, and a new Photoshop-derived painting/bitmap editor form the imaging module. We could do the whole thing on a new imaging stack optimized around mobile GPUs.
Obviously that went about as well as conceptually related 90’s-era attempts at OpenDoc et al.—not because it’s hard to combine disparate code modules (though it is!), but because it’s really hard to herd cats across teams, and I am not Steve Fucking Jobs.
Sadly, I’ve learned, org charts do matter, insofar as they represent alignment of incentives & rewards—or lack thereof. “If you want to go fast, go alone; if you want to go far, go together.” And everyone prefers to “innovate” rather than “integrate,” and then for bonus points they can stay busy for years paying down the resulting technical debt. “…Profit!”
But who knows—maybe this time crossing the streams will work. Or, see you again in 5-10 years the next time I write this post. 😌
I’m intrigued by the wealth of enhancements arriving in Procreate for iPad, including new tapered strokes & “QuickShapes.” These remind me of shape-recognition tech in Adobe apps that dates back 20+ years to early Flash, but which is cleverly executed here (enabling quick movement & manipulation of what’s drawn):
This is a watershed moment for me: After 11+ years of shooting on iPhones & Canon DSLRs, this is the first time I’ve shot on an Android device that plainly outshines them both at something. Night Sight on Pixel 3 blows me away.
First, some important disclaimers:
Having said all that, I think my results reasonably represent what a normal-to-semi-savvy person would get from the various devices. Here’s what I saw:
What do you think?
By the way, Happy New Year! Here’s an animation created last night by shooting a series of Night Sight images, then combining them in Google Photos & finally cropping the output in Photoshop.
PS—I love the Queen-powered “Flash!” ad showing Night Sight:
It’s easier than ever to do more with your photos with a new, redesigned Google Lens experience on Android and iOS, now available in English, Spanish, French, German, Italian, Portuguese, and Korean.
“Water, fire, metal and light,” writes Apple, “were used to create these mesmerizing scenes using 4K, Slo-mo, and Time-lapse. #ShotoniPhone by Donghoon J. and Sean S.” Enjoy: