Check out all the new goods!
It’s not new to this release, but I’d somehow missed it: support for perspective lines looks very cool.

I know only what I’ve seen here, but this combination wireless charger & DSLR-style camera grip seems very thoughtfully designed. Its ability to function as a phone stand (e.g. for use while videoconferencing) while charging puts it over the top.
90 seconds well spent with the sensei:
And here’s how Camera Raw can feed into SO’s:
Nine years ago, Google spent a tremendous amount of money buying Nik Software, in part to get a mobile raw converter—which, as they were repeatedly told, didn’t actually exist. (“Still, a man hears what he wants to hear and disregards the rest…”)
If all that hadn’t happened, I likely never would have gone there, and had the acquisition not been so ill-advised & ill-fitting, I probably wouldn’t have come back to Adobe. Ah, life’s rich pageant… ¯\_(ツ)_/¯
Anyway, back in 2021, take ‘er away, Ryan Dumlao:
“People tend to overestimate what can be done in one year and to underestimate what can be done in five or ten years,” as the old saying goes. Similarly, it can be hard to notice one’s own kid’s progress until confronted with an example of that kid from a few years back.
My son Henry has recently taken a shine to photography & has been shooting with my iPhone 7 Plus. While passing through Albuquerque a few weeks back, we ended up shooting side by side—him with the 7, and me with an iPhone 12 Pro Max (four years newer). We share a camera roll, and as I scrolled through I was really struck seeing the output of the two devices placed side by side.
I don’t hold up any of these photos (all unedited besides cropping) as art, but it’s fun to compare them & to appreciate just how far mobile photography has advanced in a few short years. See gallery for more.
It’s cool to see these mobile creativity apps Voltron-ing together via the new Adobe Design Mobile Bundle, which includes the company’s best design apps for the iPad at 50% off when purchased together. Per the site:
More good stuff is coming to Fresco soon, too:
Then, there are live oil brushes in Fresco that you just don’t get in any other app. In Fresco, today, you can replicate the look of natural media like oils, watercolors and charcoal — soon you’ll be able to add motion as well! We showed a sneak peek at the workshop, and it blew people’s minds.
Suffice it to say, I can’t say enough good things about the team. 🙂 If this sounds like your kind of jam, check out the listing.
YouTube Music, the Google app, and Google Photos now offer options to show widgets via the new iOS 14:
To install a Google Widget, first make sure you have the Google Photos app, YouTube Music app or Google app downloaded from the App Store. Then follow these steps:
- Press and hold on the home screen of your iPhone or iPad
- Tap the plus icon on the upper left corner to open the widget gallery
- Search for & tap on the Google app, YouTube Music or the Google Photos app
- Swipe right/left to select the widget size
- Tap “Add Widget”
- Place the widget and tap “Done” at the upper right corner
Teamwork makes the dream work, baby:
Improvements to password filling on iOS
We recently launched Touch-to-fill for passwords on Android to prevent phishing attacks. To improve security on iOS too, we’re introducing a biometric authentication step before autofilling passwords. On iOS, you’ll now be able to authenticate using Face ID, Touch ID, or your phone passcode. Additionally, Chrome Password Manager allows you to autofill saved passwords into iOS apps or browsers if you enable Chrome autofill in Settings.
Looom is all about simple, approachable cel animation:
Meanwhile, per DesignTaxi,
Shapr3D is an iPad drawing app that lets you create 3D drawings without having to use a desktop computer or CAD software. Designs created in this “pro-level” tool are compatible with major CAD file formats and support instant exports for 3D printing.
This is kinda inside-baseball, but I’m really happy that friends from my previous team will now have their work distributed on hundreds of millions, if not billions, of devices:
[A] face contours model — which can detect over 100 points in and around a user’s face and overlay masks and beautification elements atop them — has been added to the list of APIs shipped through Google Play Services…
Lastly, two new APIs are now available as part of the ML Kit early access program: entity extraction and pose detection… Pose detection supports 33 skeletal points like hands and feet tracking.
Let’s see what rad stuff the world can build with these foundational components. Here’s an example of folks putting an earlier version to use, and you can find a ton more in my Body Tracking category:
[Via]
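Speaking of pose detection: if you want to poke at a 33-landmark pose model from Python, MediaPipe ships one. Here's a minimal sketch; it's a stand-in to illustrate the idea, not the ML Kit API itself, and the photo filename is hypothetical:

```python
# A 33-landmark pose model from Python, via MediaPipe. A stand-in to
# illustrate the idea, not the ML Kit API; "dancer.jpg" is hypothetical.
import cv2
import mediapipe as mp

image = cv2.imread("dancer.jpg")
with mp.solutions.pose.Pose(static_image_mode=True) as pose:
    # MediaPipe expects RGB; OpenCV loads BGR.
    results = pose.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.pose_landmarks:
    # 33 skeletal points, each with normalized x/y and a visibility score.
    for i, lm in enumerate(results.pose_landmarks.landmark):
        print(f"landmark {i}: x={lm.x:.3f} y={lm.y:.3f} vis={lm.visibility:.2f}")
```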
Announced last fall, it's now free to download for iOS & Android.
TBH I’m a little underwhelmed by the specific effects shown here, but I remain intrigued by the idea of a highly accessible, results-oriented app that could also generate layered imagery for further tweaking in Photoshop and other more flexible tools.
As for the rationale, PetaPixel notes,
The main goal of apps like this might simply be to introduce more people to the Adobe ecosystem. Adobe CTO Abhay Parasnis said as much in an interview with The Verge, in which he calls Photoshop Camera “the next one in that journey for us.” Photoshop Camera could act as the “gateway drug” to a Creative Cloud subscription for anybody who discovers a dormant love of photo editing.
I’ve long joked-not-joked that I want better parental controls on devices, not so that I can control my kids but so that I can help my parents. How great would it be to be able to configure something like this, then push it to the devices of those who need it (parents, kids, etc.)?
I’ve always been part of that weird little slice of the Adobe user population that gets really hyped about offbeat painting tools—from stretching vectors along splines & spraying out fish in Illustrator (yes, they’re both in your copy right now; no, you’ve never used them), to painting with slick features that got pulled from Photoshop before release & somehow have never returned. I still wish we’d been able to shoehorn GPU-powered watercolor into Photoshop’s, er, venerable compositing engine, but so it goes. (A 15-year-old demo still lives at one of my best URLs ever, jnack.com/BlowingYourMindClearOutYourAss )
In any event, the Adobe Fresco team has just unveiled a raft of new features, including some trippy multicolor painting capabilities. Check it out:
[Please note: I don’t work on the Pixel team, and these opinions are just those of a guy with a couple of phones in hand, literally shooting in the dark.]
In Yosemite Valley on Friday night, I did some quick & unscientific but illuminating (oh jeez) tests shooting with a Pixel 4 & iPhone 11 Pro Max. I’d had fleeting notions of trying some proper astrophotography (side note: see these great tips from Pixel engineer & ILM vet Florian Kainz), but between the moon & the clouds, I couldn’t see a ton of stars. Therefore I mostly held up both phones, pressed the shutter button, and held my breath.
Check out the results in this album. You can see which camera produced which images by tapping each image, then tapping the little comment icon. I haven’t applied any adjustments.
Overall I’m amazed at what both devices can produce, but I preferred the Pixel’s interpretations. They were darker but truer to what my eyes perceived, and very unlike the otherworldly, day-for-night iPhone renderings (which persisted despite a few attempts I made to set focus, then drag down the exposure before shooting).
Judge for yourself & let me know what you think.
Oh, and for a much more eye-popping Pixel 4 result, check out this post from Adobe’s Russell Brown:
Man, how great is it that after 35 years in the game (!!), Russell hasn’t lost a bit of his madcap energy. He’s bringing back The Russell Brown Show & starting out in nighttime Tokyo:
The clip above is fun, but the really meaty bits are in the associated tutorials he’s posting to his site; enjoy!
[YouTube]
Boy, what I wouldn’t have given to have had this tech in Photoshop Touch, where Scribble Selection was the hotness du jour. Pam Clark writes,
This feature on the iPad works exactly the same as on Photoshop on the desktop and produces the same results, vastly enhancing selection capabilities and speed available on the iPad. With cloud documents, you can make a selection on the desktop or the iPad and continue your work seamlessly using Photoshop on another device with no loss of fidelity; no imports or exports required.
We originally released Select Subject in Photoshop on the desktop in 2018. The 2019 version now runs on both the desktop and the iPad and produces cleaner selection edges on the mask and delivers massively faster performance (almost instantaneous), even on the iPad.
Days Of Miracles & Wonder, Vol. ∞…
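Adobe hasn't published how Select Subject works under the hood, but if you want a feel for one-tap subject masking, here's a minimal sketch using the open-source rembg library as a stand-in (not Adobe's model; filenames are hypothetical):

```python
# One-tap subject isolation, sketched with the open-source rembg library.
# A stand-in for illustration; Adobe's Select Subject model isn't public,
# and the filenames are hypothetical.
from PIL import Image
from rembg import remove

cutout = remove(Image.open("portrait.jpg"))  # RGBA, background removed
cutout.save("portrait_cutout.png")
```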
The feature is rolling out today; I was able to try it on my Pixel 4 without a hitch. It works across 44 languages and is available on both Android and iOS. Google Assistant is built into Android phones, so no separate app is required; for iOS, simply download the Google Assistant app to try it out.
Now, you can turn a photo into a portrait on Pixel by blurring the background post-snap. So whether you took the photo years ago, or you forgot to turn on portrait mode, you can easily give each picture an artistic look with Portrait Blur in Google Photos.
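The recipe behind post-snap background blur is pleasingly simple at its core. Here's a rough sketch of the idea, assuming you've already obtained a subject mask from some segmentation model (the production feature is far more refined than this; filenames are hypothetical):

```python
# Post-capture "portrait blur," sketched: given a photo and a subject mask
# (however obtained, e.g. from an ML person-segmentation model), blur
# everything except the subject. Filenames are hypothetical.
import cv2
import numpy as np

photo = cv2.imread("snapshot.jpg")
mask = cv2.imread("person_mask.png", cv2.IMREAD_GRAYSCALE)  # white = subject

blurred = cv2.GaussianBlur(photo, (51, 51), 0)  # heavy background blur
alpha = cv2.GaussianBlur(mask, (21, 21), 0)     # feather the mask edge
alpha = (alpha / 255.0)[..., None]              # HxWx1, float in [0, 1]

composite = (photo * alpha + blurred * (1 - alpha)).astype(np.uint8)
cv2.imwrite("portrait_blur.jpg", composite)
```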
I’m also pleased to see that the realtime portrait-blurring tech my team built has now come to Google Duo for use during video calls:
“Hey, y’all got a water desalination plant, ‘cause I’m salty as hell.” 🙃
First, some good news: Lightroom is planning to improve the workflow of importing images from an SD card:
I know that this is something photographers have desperately wanted since 2010. I just wonder whether—nearly 10 years after the launch of the iPad—it matters anymore.
My failure, year in & year out, to solve the problem at Adobe is part of what drove me to join Google in 2014. But even back then I wrote,
I remain in sad amazement that 4.5 years after the iPad made tablets mainstream, no one—not Apple, not Adobe, not Google—has, to the best of my knowledge, implemented a way to let photographers do what they spent years beating me over the head requesting:
- Let me leave my computer at home & carry just my tablet & camera
- Let me import my raw files (ideally converted to vastly smaller DNGs), swipe through them to mark good/bad/meh, and non-destructively edit them, singly or in batches, with full raw quality.
- When I get home, automatically sync all images + edits to/via the cloud and let me keep editing there or on my Mac/PC.
This remains a bizarre failure of our industry.
Of course this wasn’t lost on the Lightroom team, but for a whole bunch of reasons, it’s taken this long to smooth out the flow, and during that time capture & editing have moved heavily to phones. Tablets represent a single-digit percentage of Snapseed session time, and I’ve heard the same from the makers of other popular editing apps. As phones improve & dedicated-cam sales keep dropping, I wonder how many people will now care.
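For what it's worth, the non-destructive piece of that wishlist is conceptually simple: never touch the raw file, store edits as metadata, and re-apply them at render time. A minimal sketch using rawpy, with a JSON sidecar as a homegrown stand-in for XMP (filenames and edit keys are hypothetical):

```python
# The non-destructive core of that wishlist, sketched: the raw file is never
# touched; edits live in a sidecar (JSON here, as a stand-in for XMP) and
# get re-applied at render time. Filenames and edit keys are hypothetical.
import json
import numpy as np
import rawpy
import imageio

RAW = "IMG_0001.CR2"
SIDECAR = RAW + ".edits.json"

# "Editing" = writing metadata, not pixels.
with open(SIDECAR, "w") as f:
    json.dump({"rating": 3, "exposure_ev": 0.7}, f)

# Rendering = raw decode + re-applying the sidecar's edits.
with open(SIDECAR) as f:
    edits = json.load(f)
with rawpy.imread(RAW) as raw:
    rgb = raw.postprocess(output_bps=8)           # demosaiced 8-bit RGB
gain = 2.0 ** edits["exposure_ev"]                # +1 EV doubles brightness
preview = np.clip(rgb.astype(np.float32) * gain, 0, 255).astype(np.uint8)
imageio.imwrite("IMG_0001_preview.jpg", preview)
```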
On we go.
[YouTube]
This new iOS & Android app (not yet available, though you can sign up for prerelease access) promises to analyze images, suggest effects, and keep the edits adjustable (though it’s not yet clear whether they’ll be editable as layers in “big” Photoshop).
I’m reminded of really promising Photoshop Elements mobile concepts from 2011 that went nowhere; of the Fabby app some of my teammates created before being acquired by Google; and of all I failed to enable in Google Photos. “Poo-tee-weet?” ¯\_(ツ)_/¯ Anyway, I’m eager to take it for a spin.
[YouTube]
“We are, by nature, explorers…” You tell ’em, Dr. Hawking:
[YouTube]
This looks so rad. Back in the day, I really wanted a solution that would record the “bizarre, freewheeling bedtime stories” my sons & I made up every night, then let us put them into an illustrated journal. The new Recorder app solves the most critical piece of that puzzle.
The new Recorder app on Pixel 4 brings the power of search and AI to audio recording. You can record meetings, lectures, jam sessions — anything you want to save and listen to later. Recorder automatically transcribes speech and tags sounds like music, applause, and more, so you can search your recordings to quickly find the part you’re looking for. All Recorder functionality happens on-device, so your audio never leaves your phone. We’re starting with English for transcription and search, with more languages coming soon.
[YouTube]
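Offline transcription with word-level timestamps, the trick that makes recordings searchable, is something you can play with yourself. Here's a sketch using the open-source Vosk engine as a stand-in (Google hasn't said what Recorder uses internally); it assumes a 16 kHz mono WAV and a downloaded Vosk model directory:

```python
# Offline transcription with per-word timestamps, sketched with the
# open-source Vosk engine, a stand-in for illustration. Assumes a 16 kHz
# mono WAV and a Vosk model unpacked into "./model".
import json
import wave
from vosk import Model, KaldiRecognizer

wf = wave.open("lecture.wav", "rb")               # hypothetical recording
rec = KaldiRecognizer(Model("model"), wf.getframerate())
rec.SetWords(True)                                # keep per-word timestamps

words = []
while True:
    data = wf.readframes(4000)
    if not data:
        break
    if rec.AcceptWaveform(data):
        words += json.loads(rec.Result()).get("result", [])
words += json.loads(rec.FinalResult()).get("result", [])

# "Search your recordings": jump to every moment a word was spoken.
query = "gradient"
print([w["start"] for w in words if w["word"] == query])
```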
“So simple, even a product manager can do it…” 😌 (Courtesy of my colleague Navin.) Click through for a higher-res version.
All through October. Looks like fun—and great to see how far the world has come from the “Thoughts On Flash”/“Sympathy for the Devil” days.
We've partnered up with @Apple for The Big Draw!
Join free art sessions October 1–31 at Apple Stores to get hands-on experience with #AdobeFresco on iPad and instruction from top talent: https://t.co/ArOZTwF6wD #TodayatApple #TheBigDraw pic.twitter.com/eXlyqX7ghB
— Adobe Drawing (@AdobeDrawing) September 17, 2019
My old pals Will & Bryan and their teams have been hard at work on the brushing-savvy iPad app Fresco (see previous thoughts). Gizmodo offers a quick look at its current state, and Bryan has shared some perspective on its development.
[YouTube]
Does anyone else remember when Adobe demoed automatic sky-swapping ~3 years ago, but then never shipped it… because, big companies? (No, just me?)
Anyway, Xiaomi is now offering a similar feature. Here’s a quick peek:
And here’s a more in-depth demo:
Coincidentally, “Skylum Announces Luminar 4 with AI-Powered Automatic Sky Replacement”:
It removes issues like halos and artifacts at the edges and horizon, allows you to adjust depth of field, tone, exposure and color after the new sky has been dropped in, correctly detects the horizon line and the orientation of the sky to replace, and intelligently “relights” the rest of your photo to match the new sky you just dropped in “so they appear they were taken during the same conditions.”
Check out the article link to see some pretty compelling-looking examples.
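That "relight the photo to match the new sky" step is the interesting bit. A classic way to approximate it is Reinhard-style color transfer in Lab space. Here's a sketch (not Skylum's method) that assumes you already have a sky mask and a replacement sky of the same size:

```python
# "Relight the foreground to match the new sky," approximated with
# Reinhard-style color transfer in Lab space. A sketch, not Skylum's
# method; assumes a precomputed sky mask and hypothetical filenames.
import cv2
import numpy as np

photo = cv2.imread("original.jpg")
new_sky = cv2.imread("new_sky.jpg")                    # same size as photo
sky_mask = cv2.imread("sky_mask.png", cv2.IMREAD_GRAYSCALE) > 127

lab = cv2.cvtColor(photo, cv2.COLOR_BGR2LAB).astype(np.float32)
sky_lab = cv2.cvtColor(new_sky, cv2.COLOR_BGR2LAB).astype(np.float32)

fg = ~sky_mask  # foreground = everything that isn't sky
# Shift the foreground's color statistics partway toward the new sky's,
# blending only 30% so the foreground keeps its identity.
for c in range(3):
    mu_fg, sd_fg = lab[..., c][fg].mean(), lab[..., c][fg].std() + 1e-6
    mu_sky, sd_sky = sky_lab[..., c].mean(), sky_lab[..., c].std()
    lab[..., c][fg] = (lab[..., c][fg] - mu_fg) * (
        1 + 0.3 * (sd_sky / sd_fg - 1)) + mu_fg + 0.3 * (mu_sky - mu_fg)

relit = cv2.cvtColor(np.clip(lab, 0, 255).astype(np.uint8), cv2.COLOR_LAB2BGR)
relit[sky_mask] = new_sky[sky_mask]                    # drop in the new sky
cv2.imwrite("sky_swapped.jpg", relit)
```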
Sadly (lulz-wise) it’s not an Apple “Nimitz” keyboard superglued to someone’s chest, but this little dingus is… interesting?
The world is your oyster! Er, we mean keyboard pic.twitter.com/LOrGQCAJyo
— Mashable (@mashable) July 10, 2019
People have been trying to combine the power of vector & raster drawing/editing for decades. (Anybody else remember Creature House Expression, published by Fractal & then acquired by Microsoft? Congrats on also being old! 🙃) It’s a tough line to walk, and the forthcoming Adobe Fresco app is far from Adobe’s first bite at the apple (I remember you, Fireworks).
Back in 2010, I transitioned off of Photoshop proper & laid out a plan by which different mobile apps/modules (painting, drawing, photo library) would come together to populate a shared, object-centric canvas. Rather than build the monolithic (and now forgotten) Photoshop Touch that we eventually shipped, I’d advocated for letting Adobe Ideas form the drawing module, Lightroom Mobile form the library, and a new Photoshop-derived painting/bitmap editor form the imaging module. We could do the whole thing on a new imaging stack optimized around mobile GPUs.
Obviously that went about as well as conceptually related ’90s-era attempts at OpenDoc et al.—not because it’s hard to combine disparate code modules (though it is!), but because it’s really hard to herd cats across teams, and I am not Steve Fucking Jobs.
Sadly, I’ve learned, org charts do matter, insofar as they represent alignment of incentives & rewards—or lack thereof. “If you want to walk fast, walk alone; if you want to walk far, walk together.” And everyone prefers “innovate” vs. “integrate,” and then for bonus points they can stay busy for years paying down the resulting technical debt. “…Profit!”
But who knows—maybe this time crossing the streams will work. Or I’ll see you again in 5-10 years, the next time I write this post. 😌
[YouTube]
I’m intrigued by the wealth of enhancements arriving in Procreate for iPad, including new tapered strokes & “QuickShapes.” These remind me of shape-recognition tech in Adobe apps that dates back 20+ years to early Flash, but which is cleverly executed here (enabling quick movement & manipulation of what’s drawn):
[YouTube]
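Shape recognition like this often starts simpler than you'd think: fit a few candidate primitives to the raw stroke points and snap to whichever fits best. A toy sketch in that spirit (pure numpy, synthetic stroke data; Procreate's actual approach isn't public):

```python
# QuickShape-style snapping, sketched: fit both a line and a circle to a
# raw stroke, keep whichever fits its points better. The stroke below is
# synthetic sample data.
import numpy as np

def fit_circle(pts):
    """Least-squares circle: solve x^2+y^2 = 2ax + 2by + c for (a, b, c)."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
    (a, b, c), *_ = np.linalg.lstsq(A, x**2 + y**2, rcond=None)
    r = np.sqrt(c + a**2 + b**2)
    residual = np.abs(np.hypot(x - a, y - b) - r).mean()
    return (a, b, r), residual

def fit_line(pts):
    """Total-least-squares line via SVD; residual = mean distance to it."""
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]                      # direction of least variance
    return vt[0], np.abs(centered @ normal).mean()

# A noisy hand-drawn "circle" of 60 points.
stroke = np.array([[np.cos(t) * 100 + np.random.randn() * 3,
                    np.sin(t) * 100 + np.random.randn() * 3]
                   for t in np.linspace(0, 2 * np.pi, 60)])

circle, circle_err = fit_circle(stroke)
line, line_err = fit_line(stroke)
print("snap to:", "circle" if circle_err < line_err else "line")
```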
This is a watershed moment for me: After 11+ years of shooting on iPhones & Canon DSLRs, this is the first time I’ve shot on an Android device that plainly outshines them both at something. Night Sight on Pixel 3 blows me away.
First, some important disclaimers:
Having said all that, I think my results reasonably represent what a normal-to-semi-savvy person would get from the various devices. Here’s what I saw:
What do you think?
By the way, Happy New Year! Here’s an animation created last night by shooting a series of Night Sight images, then combining them in Google Photos & finally cropping the output in Photoshop.
PS—I love the Queen-powered “Flash!” ad showing Night Sight:
[YouTube]
My team makes tech that Lens uses to do things like track text in the screenshot below, so I’m pleased that Google Lens is getting integrated into Google search on iOS and upgraded in Google Photos:
It’s easier than ever to do more with your photos with a new, redesigned Google Lens experience on Android and iOS—now available in English, Spanish, French, German, Italian, Portuguese, and Korean.
Enjoy!
I could attempt to explain this gizmo from Teenage Engineering—or you could spend 60 seconds being charmed/baffled (chaffled?) by it:
[YouTube] [Via Zalman Stern]
“Water, fire, metal and light,” writes Apple, “were used to create these mesmerizing scenes using 4K, Slo-mo, and Time-lapse. #ShotoniPhone by Donghoon J. and Sean S.” Enjoy:
[YouTube]
Google has launched “Mini” stickers for iOS and Android, which use machine learning to craft personalized emoji from your photo. More precisely, the feature uses a combination of machine learning, neural networks and artist illustrations to conjure up the best representation of you, taking into account various characteristics like your skin tone, hair color and style, eye color, face shape and facial hair. Just access Mini from within Gboard and start the creation process by taking a selfie. It will then automatically create your avatar and generate packs of stickers you can use.
I’m a sucker for a purposeful, well-crafted bit of kit, and this fun iPhone app (download) is just that:
[YouTube]
Fun stuff from the Shanghai office:
In order to give everyone the opportunity to experience just how natural AI-powered interactions can now be, we’re launching 猜画小歌 (“Guess My Sketch”) from Google AI, a fun, social WeChat Mini Program in which players team up with our AI to sketch everyday items in a race against the clock. In each round, players sketch the given word (like “dog”, “clock”, or “shoe”) for their AI teammate to guess correctly before time runs out.
When the AI successfully guesses your sketch, you’ll move on to the next round and increase your sketching streak. You can invite friends and family to compete for the longest streak, share interesting sketches with each other, and collect new words and drawings as you continue playing.
The one-man (I believe) band behind the Focus app for iOS continues to apply the awesome sauce—now adding the ability to create & modify light sources in portrait-mode images (which it treats as 3D). Check it out:
[YouTube]
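If you're curious how "treating a photo as 3D" enables relighting: with a depth map in hand, you can place a virtual light and attenuate its contribution by each pixel's 3D distance from it. A crude sketch of that idea (not the Focus app's method; the filenames and light position are hypothetical):

```python
# "Relighting" a portrait-mode shot, sketched: treat the depth map as
# rough 3D and add a point light whose contribution falls off with each
# pixel's distance from it. Filenames and light placement are hypothetical.
import cv2
import numpy as np

photo = cv2.imread("portrait.jpg").astype(np.float32)
depth = cv2.imread("depth.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
depth /= 255.0                                # assume 0 = near, 1 = far

h, w = depth.shape
ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)

# A hypothetical light up & left of the subject, slightly in front of it.
lx, ly, lz, strength = w * 0.3, h * 0.2, 0.1, 0.9

# Squared distance from each pixel (x, y, depth) to the light, with x/y
# normalized so they live on a scale comparable to depth.
d2 = ((xs - lx) / w) ** 2 + ((ys - ly) / h) ** 2 + (depth - lz) ** 2
gain = 1.0 + strength / (1.0 + 25.0 * d2)     # inverse-square-ish falloff

relit = np.clip(photo * gain[..., None], 0, 255).astype(np.uint8)
cv2.imwrite("relit.jpg", relit)
```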
My team has just added some fun new characters to Motion Stills for Android. 9to5Google writes,
A dog (clear favorite), UFO, heart, basketball, and spider join the dinosaur, chicken, alien, gingerbread man, planet, and robot. The latter six stickers have been slightly rearranged, while the new ones are at the beginning of the carousel.
Enjoy! And let us know what else you’d like to see.
As I mentioned the other day, Moment is Kickstarting efforts to create an anamorphic lens for phones like Pixel & iPhone. In the quick vid below, they explain its charms—cool lens flares, oval bokeh, and more:
[YouTube]
Well, that escalated quickly: For this new set of mobile filmmaking tools (lens, battery, gimbal), Moment hit their $50k funding goal in just over half an hour, and as of this writing they’ve easily cleared the $750k mark. Check ’em out:
Now available on both iOS & Android, and offering a few neat tricks:
Lens works on photos of business cards, books, landmarks and buildings, paintings in a museum, plants or animals, and flyers and event billboards. When you use Lens on a photo that has phone numbers or an address, you can automatically save this information as a contact on your phone, while events will be added to your calendar.
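The phone-number trick is, at its core, OCR plus pattern matching. A toy sketch using Tesseract as a stand-in (Lens's actual pipeline is far more involved; the filename is hypothetical):

```python
# "Save that phone number," sketched as OCR + pattern matching: Tesseract
# pulls the text, a regex pulls anything phone-shaped. A stand-in for
# illustration; Lens's real pipeline is far more involved.
import re
from PIL import Image
import pytesseract

text = pytesseract.image_to_string(Image.open("business_card.jpg"))

# Loose US-style phone pattern; real products handle many more formats.
phones = re.findall(r"\(?\d{3}\)?[-.\s]\d{3}[-.\s]\d{4}", text)
print("found:", phones)
```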
Get ready for a whole new wave of AR gaming:
Unity integration will also allow developers to customize maps with what appears to be a great deal of flexibility and control. Things like buildings and roads are turned into objects, which developers can then tweak in the game engine. During a demonstration, Google showed off real-world maps that were transformed into sci-fi landscapes and fantasy realms, complete with dragons and treasure chests.
Jacoby says that one of the goals of the project was to help developers build detailed worlds using Maps data as a base to paint over. Developers can do things like choose particular kinds of buildings or locations — say, all stores or restaurants — and transform each one. A fantasy realm could turn all hotels into restorative inns, for instance, or anything else.
[YouTube]
Continuing our series of Research Blog posts (see realtime segmentation, motion tracking), my teammates have provided an inside look at the tech they’ve developed—this time covering how motion photos get stabilized on the fly:
By combining software-based visual tracking with the motion metadata from the hardware sensors, we built a new hybrid motion estimation for motion photos on the Pixel 2.
Check out the blog post for details, or just enjoy lots of good before/after examples of stabilization in action.
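To make the hybrid idea concrete, here's a heavily simplified sketch: estimate inter-frame translation visually with OpenCV feature tracking, blend it with a gyro-derived estimate, then smooth the blended camera path (the gyro array below is a placeholder, and the actual Pixel pipeline estimates much richer motion models):

```python
# The hybrid-stabilization idea, heavily simplified: estimate inter-frame
# translation visually (feature tracking), blend it with a gyro-derived
# estimate, and smooth the blended camera path. The gyro array is a
# placeholder; the real pipeline is far more sophisticated.
import cv2
import numpy as np

def visual_shift(prev_gray, gray):
    """Median translation of tracked corners between two frames."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=8)
    if pts is None:
        return np.zeros(2)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2)
    return np.median((nxt[good] - pts[good]).reshape(-1, 2), axis=0)

cap = cv2.VideoCapture("motion_photo.mp4")       # hypothetical clip
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Placeholder for per-frame shifts integrated from real gyro metadata.
gyro_shift = np.zeros((100000, 2))

trajectory, pos, i = [], np.zeros(2), 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pos = pos + 0.7 * visual_shift(prev_gray, gray) + 0.3 * gyro_shift[i]
    trajectory.append(pos)
    prev_gray, i = gray, i + 1

# Smooth the path; warping each frame by (smoothed - raw) stabilizes it.
kernel = np.ones(15) / 15.0
smoothed = np.column_stack(
    [np.convolve(t, kernel, mode="same") for t in np.array(trajectory).T])
print("frames tracked:", len(trajectory), "smoothed path:", smoothed.shape)
```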
Check it out:
As Ars Technica explains,
Flutter apps don’t directly compile to native Android and iOS apps; they run on the Flutter rendering engine (written in C++) and Flutter Framework (written in Dart, just like Flutter apps), both of which get bundled up with every app, and then the SDK spits out a package that’s ready to go on each platform. You get your app, a new engine to run the Flutter code on, and enough native code to get the Flutter platform running on Android and iOS.
Also, I’m totally creating a band called Stateful Hot Reload. 🙂
[YouTube]