Category Archives: VR/AR

AR dinosaurs stomp into Google search

Check out some fun new work from my team, available now on both Android & iOS:

Google started adding augmented reality animals to searches last year at Google I/O and has since introduced a veritable menagerie, covering cats, scorpions, bears, tigers, and many more. Now a herd of dinosaurs joins the list, each rendered with graphics developed for the Jurassic World Alive augmented reality mobile game.

The full list of available dinosaurs: Tyrannosaurus rex, Velociraptor, Triceratops, Spinosaurus, Stegosaurus, Brachiosaurus, Ankylosaurus, Dilophosaurus, Pteranodon, and Parasaurolophus.

ARCore rolls out depth support

Exciting news from my teammates:

Today, we’re taking a major step forward and announcing the Depth API is available in ARCore 1.18 for Android and Unity, including AR Foundation, across hundreds of millions of compatible Android devices.

As we highlighted last year, a key capability of the Depth API is occlusion: the ability for digital objects to accurately appear behind real world objects. This makes objects feel as if they’re actually in your space, creating a more realistic AR experience.

Check out the new Depth Lab app (also available as an open-source Unity project) to try it for yourself. You can play hide-the-hot-dog with Snap, as well as check out an Android-exclusive undersea lens:
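In a real app the occlusion happens on the GPU inside your rendering engine, but the core idea reduces to a per-pixel depth test: draw a virtual fragment only where it is closer to the camera than the real-world surface the Depth API measured. A minimal conceptual sketch (the function name and array layout are my own, not ARCore's API):

```python
import numpy as np

def composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel depth test for AR occlusion.

    camera_rgb:    (H, W, 3) camera image
    real_depth:    (H, W) metric depth of the real scene (e.g. from a depth API)
    virtual_rgb:   (H, W, 3) rendered virtual object
    virtual_depth: (H, W) depth of the virtual object (inf where it's absent)
    """
    # The virtual fragment wins only where it sits in front of the real surface.
    visible = virtual_depth < real_depth
    out = camera_rgb.copy()
    out[visible] = virtual_rgb[visible]
    return out
```

Everywhere the real depth is smaller, the camera pixel survives, which is exactly why a virtual dog can duck behind your couch.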

ML Kit gets pose detection

This is kinda inside-baseball, but I’m really happy that friends from my previous team will now have their work distributed on hundreds of millions, if not billions, of devices:

[A] face contours model — which can detect over 100 points in and around a user’s face and overlay masks and beautification elements atop them — has been added to the list of APIs shipped through Google Play Services…

Lastly, two new APIs are now available as part of the ML Kit early access program: entity extraction and pose detection… Pose detection supports 33 skeletal points, including hands and feet.

Let’s see what rad stuff the world can build with these foundational components. Here’s an example of folks putting an earlier version to use, and you can find a ton more in my Body Tracking category:
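Once you have those 33 landmarks as image coordinates, a common first trick is computing joint angles, e.g. for rep counting or form checking in a fitness app. A quick sketch of the geometry (the helper is mine, not an ML Kit API):

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at landmark b, formed by landmarks a-b-c.

    For example, the elbow angle from (shoulder, elbow, wrist) points,
    each given as (x, y) image coordinates.
    """
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    # Angles of the two rays b->a and b->c, then their difference.
    ang1 = math.atan2(ay - by, ax - bx)
    ang2 = math.atan2(cy - by, cx - bx)
    deg = math.degrees(ang1 - ang2) % 360
    return 360 - deg if deg > 180 else deg
```

A straight arm gives roughly 180°, a fully bent one approaches 0°, so thresholding this value is enough for a crude curl counter.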

[Via]

Google releases “Sodar,” visualizing social distancing via WebXR

It’s as much about testing/showcasing emerging standards as anything. Per The Verge:

If you’ve got an Android device, just open up the Chrome browser and go to goo.gle/sodar to launch the tool, named SODAR. There’s no app required, though it won’t work on iOS or older Android devices. Your phone will use augmented reality to map the space around you, superimposing a two-meter radius circle on the view from your camera.
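Under the hood, a check like this boils down to Euclidean distance between points in the AR session's world coordinates, which WebXR reports in meters. A toy sketch of the two-meter test (names are mine, not from SODAR's implementation):

```python
import math

def within_two_meters(p, q, radius_m=2.0):
    """True if two 3-D points (in meters, same AR world frame)
    are within the given social-distancing radius."""
    return math.dist(p, q) <= radius_m
```

The superimposed circle is then just this radius drawn on the ground plane around the viewer.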

Garments strut on their own in a 3D fashion show

No models, no problem: Congolese designer Anifa Mvuemba used software to show off her designs swaying in virtual space:

Cool context:

Inspired by her hometown in Congo, Anifa was intentional about shedding light on issues facing the Central African country with a short documentary at the start of the show. From mineral site conditions to the women and children who suffer as a result of these issues, Anifa’s mission was to educate before debuting any clothes. “Serving was a big part of who I am, and what I want to do,” she said in the short documentary.

Katy Perry live pushes the limits of mixed-reality storytelling

I can’t say I love the song, but props to the whole team who pulled off this ambitious set piece:

As VR Scout notes,

With the exception of a single yellow chair, it appears as though every visual shown during the performance was generated in post. What really sells the performance, however, is the choreography. Throughout the entirety of the performance, Perry reacts and responds to every visual element shown “on-stage”.