Many years ago, I trekked with fellow Photoshop PM Bryan O’Neil Hughes to explore Death Valley’s Racetrack Playa. It was just as amazing as one would hope, even if it cost us a tire and a couple of blown-out shocks.
I think back to this fondly upon seeing Russell Brown, Eric Paré and Jeremy Verinsky venture out to the ‘track to capture some beautiful light paintings:
When Google debuted Night Sight mode on the Pixel 3, I was blown away by how well it worked compared to my iPhone X & even my DSLR. In the time since, Apple has greatly stepped up its game, but I still find Night Sight (now on the Pixel 4) to be unmatched for low-light imaging.
Having test-driven the Pixel 3 at light artist Bruce Munro’s installation in Saratoga (gallery), I was excited to visit his new work, Sensorio, in Paso Robles. Happily, both the installation and the Pixel 4 dazzled. You can check out some results here:
Shapr3D is an iPad app that lets you create 3D designs without having to use a desktop computer or traditional CAD software. Designs created in this “pro-level” tool are compatible with major CAD file formats and support instant export for 3D printing.
Google started adding augmented reality animals to searches last year at Google I/O and has since introduced a veritable menagerie, covering cats, scorpions, bears, tigers, and many more. Now, a herd of dinosaurs has also been added to this list, each of which uses graphics developed for the Jurassic World Alive augmented reality mobile game.
The full list of available dinosaurs includes the Tyrannosaurus rex, Velociraptor, Triceratops, Spinosaurus, Stegosaurus, Brachiosaurus, Ankylosaurus, Dilophosaurus, Pteranodon, and Parasaurolophus.
When the fam & I headed out on Friday for two weeks (twoooo weeeeks!) of road-tripping adventures, I didn’t expect to have *zero* connectivity with which to share updates, but so it goes; hence the radio silence here.
The disconnection (something I can rarely grant myself) has been mostly a blessing, and I’ll try to be good about staying off the keyboard until I return. Still, I’ll try to share good stuff when time permits. I hope that you, too, get a little downtime & get to go outside—where I hear that the graphics are amazing. 😌🤘
As part of the new search tab, you’ll see an interactive map view of your photos and videos, which has been one of our most-requested features since we launched Google Photos. You can pinch and zoom around the globe to explore photos of your travels…
In addition, the “Stories”-style strip up top is getting upgrades:
We’re adding more types of Memories, like the best pics of you and your closest friends and family over the years, trips, and even just the highlights from last week… We’ve also moved our automatic creations–like movies, collages, animations, stylized photos and more–from the “For you” tab (which is now gone) and into Memories.
Today, we’re taking a major step forward and announcing that the Depth API is available in ARCore 1.18 for Android and Unity, including AR Foundation, across hundreds of millions of compatible Android devices.
As we highlighted last year, a key capability of the Depth API is occlusion: the ability for digital objects to accurately appear behind real world objects. This makes objects feel as if they’re actually in your space, creating a more realistic AR experience.
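The per-pixel idea behind occlusion is simple: draw the virtual object only where it’s closer to the camera than whatever the depth map says is really there. Here’s a minimal sketch of that test in Python (illustrative only — not the actual ARCore API, which works on device depth images):

```python
import numpy as np

def occlusion_mask(real_depth_m, virtual_depth_m):
    """Return a boolean mask: True where the virtual object should be drawn,
    i.e., where it sits closer to the camera than the real-world surface."""
    return virtual_depth_m < real_depth_m

# Toy 2x2 "depth map" of the real scene, in meters.
real = np.array([[2.0, 2.0],
                 [0.5, 2.0]])

# A virtual object rendered 1 m from the camera at every pixel.
virtual = np.full((2, 2), 1.0)

mask = occlusion_mask(real, virtual)
# The object is hidden at the pixel where a real surface (0.5 m) is closer.
```

A real renderer does the same comparison in a shader against the depth texture, but the principle is exactly this depth test.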
Check out the new Depth Lab app (also available as an open-source Unity project) to try it for yourself. You can play hide-the-hot-dog with Snap, as well as check out an Android-exclusive undersea lens:
Wanna feel like walking directly into the ocean? Try painstakingly isolating an object in frame after frame of video. Having learned to do this in the ’90s (using stone knives & bear skins, naturally), I just as quickly learned that I never wanted to do it again.
Happily the AE crew has kept improving automated tools, and they’ve just rolled out Roto Brush 2 in beta form. Ian Sansevera shows (below) how it compares & how to use it, and John Columbo provides a nice written overview.
In this After Effects tutorial I will explore and show you how to use Rotobrush 2 (which is insane by the way). Powered by Sensei, Roto Brush 2 will select and track the object, frame by frame, isolating the subject automatically.
This is kinda inside-baseball, but I’m really happy that friends from my previous team will now have their work distributed on hundreds of millions, if not billions, of devices:
[A] face contours model — which can detect over 100 points in and around a user’s face and overlay masks and beautification elements atop them — has been added to the list of APIs shipped through Google Play Services…
Lastly, two new APIs are now available as part of the ML Kit early access program: entity extraction and pose detection… Pose detection supports 33 skeletal points, including hands and feet.
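Once a pose detector hands you skeletal landmarks, most apps derive something useful from them — joint angles being the classic example (rep counting, form checking, etc.). A tiny sketch of that downstream math, using made-up 2D coordinates rather than ML Kit’s actual output types:

```python
import math

def joint_angle(a, b, c):
    """Angle at point b, in degrees, between segments b->a and b->c.
    Points are (x, y) tuples, e.g. shoulder/elbow/wrist landmarks."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Hypothetical shoulder, elbow, and wrist positions forming a right angle:
angle = joint_angle((0, 1), (0, 0), (1, 0))
```

On Android the landmarks would come from the pose detector per frame; the geometry afterward is the same.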
Let’s see what rad stuff the world can build with these foundational components. Here’s an example of folks putting an earlier version to use, and you can find a ton more in my Body Tracking category: