When the fam & I headed out on Friday for two weeks (twoooo weeeeks!) of road-tripping adventures, I didn’t expect to have *zero* connectivity with which to share updates, but so it goes; hence the radio silence here.
The disconnection (something I can rarely grant myself) has been mostly a blessing, and I’ll try to be good about staying off the keyboard until I return. Still, I’ll share good stuff when time permits. I hope that you, too, get a little downtime & get to go outside—where I hear the graphics are amazing. 😌🤘
As part of the new search tab, you’ll see an interactive map view of your photos and videos, which has been one of our most-requested features since we launched Google Photos. You can pinch and zoom around the globe to explore photos of your travels…
In addition, the “Stories”-style strip up top is getting upgrades:
We’re adding more types of Memories, like the best pics of you and your closest friends and family over the years, trips, and even just the highlights from last week… We’ve also moved our automatic creations (like movies, collages, animations, stylized photos, and more) from the “For you” tab (which is now gone) into Memories.
Today, we’re taking a major step forward and announcing that the Depth API is available in ARCore 1.18 for Android and Unity, including AR Foundation, across hundreds of millions of compatible Android devices.
As we highlighted last year, a key capability of the Depth API is occlusion: the ability for digital objects to accurately appear behind real world objects. This makes objects feel as if they’re actually in your space, creating a more realistic AR experience.
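At its core, occlusion comes down to a per-pixel depth test: draw a virtual pixel only where it’s closer to the camera than the real-world surface behind it. Here’s a minimal sketch of that idea in NumPy (the function name and toy depth maps are my own illustration, not ARCore’s actual API):

```python
import numpy as np

def composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel depth test: the virtual object shows through only where
    it is closer to the camera than the real scene (smaller depth wins)."""
    virtual_wins = virtual_depth < real_depth  # (H, W) boolean mask
    out = camera_rgb.copy()
    out[virtual_wins] = virtual_rgb[virtual_wins]
    return out

# Toy 2x2 scene: a real wall at 2.0 m everywhere...
real_depth = np.full((2, 2), 2.0)
# ...and a virtual object at 1.5 m in the left column, 3.0 m in the right.
virtual_depth = np.array([[1.5, 3.0],
                          [1.5, 3.0]])
camera_rgb = np.zeros((2, 2, 3), dtype=np.uint8)       # black camera feed
virtual_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)  # white virtual object

result = composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth)
# Left column shows the object; right column stays hidden behind the wall.
```

The real API hands you a depth image per camera frame; the compositing itself happens in a shader rather than on the CPU, but the comparison is the same.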
Check out the new Depth Lab app (also available as an open-source Unity project) to try it for yourself. You can play hide-the-hot-dog with Snap, as well as check out an Android-exclusive undersea lens:
Wanna feel like you’re walking directly into the ocean?

Ever tried painstakingly isolating an object in frame after frame of video? Learning how to do this in the ’90s (using stone knives & bear skins, naturally), I just as quickly learned that I never wanted to do it again.
Happily, the AE crew has kept improving automated tools, and they’ve just rolled out Roto Brush 2 in beta form. Ian Sansevera shows (below) how it compares & how to use it, and John Columbo provides a nice written overview.
In this After Effects tutorial I will explore and show you how to use Rotobrush 2 (which is insane by the way). Powered by Sensei, Roto Brush 2 will select and track the object, frame by frame, isolating the subject automatically.
This is kinda inside-baseball, but I’m really happy that friends from my previous team will now have their work distributed on hundreds of millions, if not billions, of devices:
[A] face contours model — which can detect over 100 points in and around a user’s face and overlay masks and beautification elements atop them — has been added to the list of APIs shipped through Google Play Services…
Lastly, two new APIs are now available as part of the ML Kit early access program: entity extraction and pose detection… Pose detection supports tracking of 33 skeletal points, including hands and feet.
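To get a feel for what 33 skeletal points enable, here’s a tiny sketch (my own illustration, not ML Kit code) that computes the angle at a joint from three 2D landmarks—the kind of math a rep-counting fitness app would run on shoulder/elbow/wrist points:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at vertex b, formed by points a-b-c (each an (x, y) pair)."""
    v1 = (a[0] - b[0], a[1] - b[1])  # vector from elbow toward shoulder
    v2 = (c[0] - b[0], c[1] - b[1])  # vector from elbow toward wrist
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / norm))

# Hypothetical landmark positions for a bent arm: shoulder, elbow, wrist.
shoulder, elbow, wrist = (0.0, 0.0), (1.0, 0.0), (1.0, 1.0)
angle = joint_angle(shoulder, elbow, wrist)  # → 90.0 (a right-angle bend)
```

In practice you’d feed in whichever of the 33 detected landmarks you care about and threshold the angle to detect a squat, curl, etc.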
Let’s see what rad stuff the world can build with these foundational components. Here’s an example of folks putting an earlier version to use, and you can find a ton more in my Body Tracking category:
To my slight chagrin—having been a naysayer about turning Camera Raw into a filter one can use in Photoshop, on the grounds that doing so would be a crutch at a time when Adobe should do the hard work of revamping a motley set of disparate, 30-year-old adjustment dialogs—I find myself hitting Shift-Cmd-A all the damn time. Thus I’m glad to see the UI freshened up & tools made easier to access:
As for getting the rest of the adjustments-house in order, I wasn’t wrong, but ACR-in-PS gives me fewer reasons to care. On we go!
They’ve achieved this by treating each facial feature locally first, and then the face as a whole, essentially assigning a probability to each feature. That way you don’t need a professional sketch to generate a realistic-looking image, but the better the sketch, the more accurate the results become. What’s more, the software can work in near-real-time.
The Land Rover app for iOS & Android, leveraging Unity + ARKit & ARCore, goes beyond ye olde “spin the car in space & maybe change the wheels” approach we’ve long seen. Somehow I can’t embed the video, but it’s worth a look, and you can get a taste of the immersive environments it enables here: