Google started adding augmented reality animals to searches last year at Google I/O and has since introduced a veritable menagerie, covering cats, scorpions, bears, tigers, and many more. Now, a herd of dinosaurs has also been added to this list, each of which uses graphics developed for the Jurassic World Alive augmented reality mobile game.
The full list of available dinosaurs includes the Tyrannosaurus rex, Velociraptor, Triceratops, Spinosaurus, Stegosaurus, Brachiosaurus, Ankylosaurus, Dilophosaurus, Pteranodon, and Parasaurolophus.
Today, we’re taking a major step forward and announcing the Depth API is available in ARCore 1.18 for Android and Unity, including AR Foundation, across hundreds of millions of compatible Android devices.
As we highlighted last year, a key capability of the Depth API is occlusion: the ability for digital objects to accurately appear behind real world objects. This makes objects feel as if they’re actually in your space, creating a more realistic AR experience.
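The core of occlusion is simple to state: for each pixel, compare the virtual object's depth against the camera's real-world depth map, and draw the virtual pixel only where it's closer. Here's a toy illustration in plain Python (made-up depth values in meters; this is the idea, not the actual ARCore API):

```python
def composite(virtual_depth, real_depth, virtual_color, camera_color):
    """Per-pixel occlusion test: keep the virtual pixel only where the
    virtual surface is closer to the camera than the real-world surface.
    Depths are in meters; None means no virtual content at this pixel."""
    out = []
    for v_d, r_d, v_c, c_c in zip(virtual_depth, real_depth,
                                  virtual_color, camera_color):
        if v_d is not None and v_d < r_d:
            out.append(v_c)   # virtual object is in front: draw it
        else:
            out.append(c_c)   # real surface is closer: it occludes
    return out

# A virtual cube at 1.5 m, partially behind a real couch at 1.2 m:
print(composite([1.5, 1.5, None, 1.5],   # virtual depth per pixel
                [2.0, 1.2, 1.8, 3.0],    # real-world depth per pixel
                ["cube"] * 4, ["camera"] * 4))
# → ['cube', 'camera', 'camera', 'cube']
```

The middle pixels show the couch occluding the cube, which is exactly the "objects feel like they're in your space" effect the Depth API enables.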
Check out the new Depth Lab app (also available as an open-source Unity project) to try it for yourself. You can play hide-the-hot-dog with Snap, as well as check out an Android-exclusive undersea lens:
This is kinda inside-baseball, but I’m really happy that friends from my previous team will now have their work distributed on hundreds of millions, if not billions, of devices:
[A] face contours model — which can detect over 100 points in and around a user’s face and overlay masks and beautification elements atop them — has been added to the list of APIs shipped through Google Play Services…
Lastly, two new APIs are now available as part of the ML Kit early access program: entity extraction and pose detection… Pose detection supports 33 skeletal points like hands and feet tracking.
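Once a pose detector hands back landmark coordinates, the downstream work is mostly geometry. As a sketch (hypothetical (x, y) landmarks, not the ML Kit API itself), the angle at a joint such as an elbow can be computed from three points:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by landmarks a-b-c,
    each an (x, y) point as a pose detector might report them."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / norm))

# Hypothetical shoulder/elbow/wrist landmarks forming a right angle:
print(round(joint_angle((0, 0), (1, 0), (1, 1))))  # → 90
```

That kind of per-joint math is what turns 33 raw skeletal points into gestures, rep counters, and the body-tracking demos linked below.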
Let’s see what rad stuff the world can build with these foundational components. Here’s an example of folks putting an earlier version to use, and you can find a ton more in my Body Tracking category:
The Land Rover app for iOS & Android, leveraging Unity + ARKit & ARCore, goes beyond the ye olde “spin car in space & maybe change wheels” approach we’ve long seen. Somehow I can’t embed the video, but it’s worth a look, and you can get a taste of the immersive environments it enables here:
Resolve allows project teams to review large BIM files on the Oculus Quest in a collaborative environment. With a few clicks, teams can bring in files from Autodesk BIM 360 and use Resolve to leave speech-to-text annotations, measure, mark up, and inspect BIM properties in VR.
It’s as much about testing/showcasing emerging standards as anything. Per The Verge:
If you’ve got an Android device, just open up the Chrome browser and go to goo.gle/sodar to launch the tool, named SODAR. There’s no app required, though it won’t work on iOS or older Android devices. Your phone will use augmented reality to map the space around you, superimposing a two-meter-radius circle on the view from your camera.
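Under the hood, the check the circle visualizes is just a distance test on the AR ground plane. A toy version (plain Python with made-up coordinates; Sodar itself is a WebXR page, not this code):

```python
import math

RADIUS_M = 2.0  # the two-meter social-distancing radius Sodar draws

def within_radius(point_xz, radius=RADIUS_M):
    """True if a point on the ground plane (x, z offsets from the
    camera, in meters) falls inside the superimposed circle."""
    x, z = point_xz
    return math.hypot(x, z) <= radius

print(within_radius((1.0, 1.0)))  # → True  (about 1.41 m away)
print(within_radius((2.0, 1.5)))  # → False (2.5 m away)
```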
Inspired by her hometown in Congo, Anifa was intentional about shedding light on issues facing the Central African country with a short documentary at the start of the show. From mineral site conditions to the women and children who suffer as a result of these issues, Anifa’s mission was to educate before debuting any clothes. “Serving was a big part of who I am, and what I want to do,” she said in the short documentary.
With the exception of a single yellow chair, it appears that every visual shown during the performance was generated in post. What really sells it, however, is the choreography: throughout, Perry reacts and responds to every visual element shown “on-stage”.