Depth sensing comes to ARCore

I’m delighted to say that my team has unveiled depth perception in ARCore. Here’s a quick taste:

Check out how it enables real objects to occlude virtual ones:

Here’s a somewhat deeper dive into the whole shebang:

The feature is designed to be widely available, without requiring special sensors:

The Depth API is not dependent on specialized cameras and sensors, and it will only get better as hardware improves. For example, the addition of depth sensors, like time-of-flight (ToF) sensors, to new devices will help create more detailed depth maps to improve existing capabilities like occlusion, and unlock new capabilities such as dynamic occlusion—the ability to occlude behind moving objects.
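To give a feel for how this surfaces to developers, here's a minimal sketch in Java of enabling depth and reading a depth value from a frame. The calls shown (Config.DepthMode, Session.isDepthModeSupported, Frame.acquireDepthImage16Bits) are taken from the ARCore Java SDK as publicly released; the exact API surface available through the collaborators program may differ.

```java
import android.media.Image;
import com.google.ar.core.Config;
import com.google.ar.core.Frame;
import com.google.ar.core.Session;
import com.google.ar.core.exceptions.NotYetAvailableException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

final class DepthSketch {

  /** Enable depth on an ARCore session, if the device supports it. */
  static void enableDepth(Session session) {
    Config config = session.getConfig();
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
      config.setDepthMode(Config.DepthMode.AUTOMATIC);
    }
    session.configure(config);
  }

  /**
   * Returns the estimated distance in millimeters at pixel (x, y) of the
   * current frame's depth image, or -1 if depth isn't available yet.
   */
  static int depthMillimetersAt(Frame frame, int x, int y) {
    try (Image depthImage = frame.acquireDepthImage16Bits()) {
      // The depth image has a single plane of 16-bit unsigned depth samples.
      Image.Plane plane = depthImage.getPlanes()[0];
      int byteIndex = x * plane.getPixelStride() + y * plane.getRowStride();
      ByteBuffer buffer = plane.getBuffer().order(ByteOrder.nativeOrder());
      return Short.toUnsignedInt(buffer.getShort(byteIndex));
    } catch (NotYetAvailableException e) {
      // The depth map is built up from camera motion, so the first frames may not have one.
      return -1;
    }
  }
}
```

Because the depth map comes from motion rather than dedicated hardware, it improves as the user moves the phone, which is exactly why added sensors like ToF can sharpen it further.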

And we’re looking for partners:

We’ve only begun to scratch the surface of what’s possible with the Depth API and we want to see how you will innovate with this feature. If you are interested in trying the new Depth API, please fill out our call for collaborators form.

[Embedded videos: YouTube 1 & 2]
