Category Archives: AR/VR

AR: How the giant Carolina Panther was made

By now you’ve probably seen this big gato bounding around:

https://twitter.com/Panthers/status/1437103615634726916?s=20

I’ve been wondering how it was done (e.g. was it something from Snap, using the landmarker tech that’s enabled things like Game of Thrones dragons to scale the Flatiron Building?). Fortunately the Verge provides some insights:

In short, what’s going on is that an animation of the virtual panther, which was made in Unreal Engine, is being rendered within a live feed of the real world. That means camera operators have to track and follow the animations of the panther in real time as it moves around the stadium, like camera operators would with an actual living animal. To give the panther virtual objects to climb on and interact with, the stadium is also modeled virtually but is invisible.

This tech isn’t baked into an app, meaning you won’t be pointing your phone’s camera in the stadium to get another angle on the panther if you’re attending a game. The animations are intended to air live. In Sunday’s case, the video was broadcast live on the big screens at the stadium.

I look forward to the day when this post is quaint, given how frequently we’re all able to glimpse things like this via AR glasses. I give it 5 years, or maybe closer to 10—but let’s see.

Behind the scenes with Olympians & Google’s AR “Scan Van”

I swear I spent half of last summer staring at tiny 3D Naomi Osaka volleying shots on my desktop. I remain jealous of my former teammates who got to work with these athletes (and before them, folks like Donald Glover as Childish Gambino), even though doing so meant dealing with a million Covid safety protocols. Here’s a quick look at how they captured folks flexing & flying through space:

You can play with the content just by searching:

[Via Chikezie Ejiasi]

AR: Olympians come to Google search

Last summer my former teammates got all kinds of clever in working around Covid restrictions—and the constraints of physics and 3D capture—to digitize top Olympic athletes performing their signature moves. I wish they’d share the behind-the-scenes footage, as it’s legit fascinating. (Also great: seeing Donald Glover, covered in mocap ping pong balls for the making of Pixel Childish Gambino AR content, sneaking up behind my colleague like some weird-ass phantom. 😝)

Anyway, after so much delay and uncertainty, I’m happy to see those efforts now paying off in the form of 3D/AR search results. Check it out:

https://twitter.com/davidiwanow/status/1419913878222393361?s=20

PixARface: Scarface goes Pixar

One, it’s insane what AR can do in realtime.
Two, this kind of creative misuse of tech is right up my alley.

Update/bonus: Nobody effs with the AR Jesus:

“Supernatural” offers home workouts in VR

Hmm—this looks slick, but I’m not sure that I want to have a big plastic box swinging around my face while I’m trying to get fit. As a commenter notes, “That’s just Beat Saber with someone saying ‘good job’ once in a while”—but a friend of mine says it’s great. ¯\_(ツ)_/¯

This vid (same poster frame but different content) shows more of the actual gameplay:

Body Movin’: Adobe Character Animator introduces body tracking (beta)

You’ll scream, you’ll cry, promises designer Dave Werner—and maybe not just due to “my questionable dance moves.”

Live-perform 2D character animation using your body. Powered by Adobe Sensei, Body Tracker automatically detects human body movement using a webcam and applies it to your character in real time to create animation. For example, you can track your arms, torso, and legs automatically. View the full release notes.

Check out the demo below & the site for full details.

Vid2Actor: Turning video of humans into posable 3D models

As I’m on a kick sharing recent work from Ira Kemelmacher-Shlizerman & team, here’s another banger:

Given an “in-the-wild” video, we train a deep network with the video frames to produce an animatable human representation.

This can be rendered from any camera view in any body pose, enabling applications such as motion re-targeting and bullet-time rendering without the need for rigged 3D meshes.

I look forward (?) to the not-so-distant day when a 3D-extracted Trevor Lawrence hucks a touchdown to Cleatus the Fox Sports Robot. Grand slam!!

Check out the Spark AR Master Class

I remain fascinated by what Snap & Facebook are doing with their respective AR platforms, putting highly programmable camera stacks into the hands of hundreds of millions of consumers & hundreds of thousands of creators. If you have thoughts on the subject & want to nerd out some time, drop me a note.

A few months back I wanted to dive into the engine that’s inside Instagram, and I came across the Spark AR masterclass put together & presented by filter creator Eddy Adams. I found it engaging & informative, even if a bit fast for my aging brain 🙃. If you’re tempted to get your feet wet in this emerging space, I recommend giving it a shot.

Niantic sneaks 5G AR “Urban Legends”; what does it all mean?

“‘Augmented Reality: A Land Of Contrasts.’ In this essay, I will…”

Okay, no, not really, but let me highlight some interesting mixed signals. (It’s worth noting that these are strictly my opinions, not those of any current or past employer.)

Pokémon Go debuted almost exactly 5 years ago, and last year, even amidst a global pandemic that largely immobilized people, it generated its best revenue ever—more than a billion dollars in just the first 10 months of the year, bringing its then-total to more than $4 billion.

Having said that…

  • In the five years since its launch, what other location-based AR games (or AR games, period) have you seen really take off? Even with triple-A characters & brands, Niantic’s own Harry Potter title made a far smaller splash, and Minecraft Earth (hyped extensively at an Apple keynote event) is being shut down.
  • When I launched Pokémon Go last year (for the first time in years), I noticed that the only apparent change since launch was that AR now defaults to off. That is, Niantic apparently decided that monster-catching was easier, more fun, and/or less resource-intensive when done in isolation, with no camera overlay.
  • The gameplay remains extremely rudimentary—no use (at least that I could see) of fancy SLAM tracking, depth processing, etc., despite Niantic having acquired startups to enable just this sort of thing, showing demos three years ago.
  • Network providers & handset makers really, really want you to want 5G—but I’ve yet to see it prove to be transformative (even for the cloud-rendered streaming AR that my Google team delivered last year). Even when “real” 5G is available beyond a couple of urban areas, it’s hard to imagine a popular title being 5G-exclusive.

So does this mean I think location-based AR games are doomed? Well, no, as I claim zero prognostication-fu here. I didn’t see Pokémon Go coming, despite my roommate in Nepal (who casually mentioned that he’d helped found Google Earth—as one does) describing it ahead of launch; and given the way public interest in the app dropped after launch (see above), I’d never have guessed that it would be generating record revenue now—much less during a pandemic!

So, who knows: maybe Niantic & its numerous partners will figure out how to recapture lightning in a bottle. Here’s a taste of how they expect that to look:

If I had to bet on someone, though, it’d be Snap: they’ve been doing amazing site-specific AR for the last couple of years, and they’ve prototyped collaborative experiences built on the AR engine that hundreds of millions of people use every day; see below. Game on!

AR: Google is working on indoor walking nav

I spent my last couple of years at Google working on a 3D & AR engine that could power experiences across Maps, YouTube, Search, and other surfaces. Meanwhile my colleagues have been working on data-gathering that’ll use this system to help people navigate via augmented reality. As TechCrunch writes:

Indoor Live View is the flashiest of these. Google’s existing AR Live View walking directions currently only work outdoors, but thanks to some advances in its technology to recognize where exactly you are (even without a good GPS signal), the company is now able to bring this indoors.

This feature is already live in some malls in the U.S.: in Chicago, Long Island, Los Angeles, Newark, San Francisco, San Jose and Seattle. But in the coming months, it’ll come to select airports, malls and transit stations in Tokyo and Zurich as well (just in time for vaccines to arrive and travel to — maybe — rebound). Because Google is able to locate you by comparing the images around you to its database, it can also tell which floor you are on and hence guide you to your gate at the Zurich airport, for example.

“You Look Like A Thing, And I Love You”

I really enjoyed listening to the podcast version of this funny, accessible talk from AI Weirdness writer Janelle Shane, and think you’d get a kick out of it, too.

On her blog, Janelle writes about AI and the weird, funny results it can produce. She has trained AIs to produce things like cat names, paint colors, and candy heart messages. In this talk she explains how AIs learn, fail, adapt, and reflect the best and worst of humanity.

Lego Vidiyo promises AR filmmaking for kids

TikTok Micronaxx, here we come!

The high-key nutty (am I saying that right, kids?) thing is that they’ve devised a whole musical persona to go with it, complete with music videos:

L.L.A.M.A. is the first ever Lego mini-figure to be signed to a major label and the building toy group’s debut attempt at creating its own star DJ/ producer.

A cross between a helmet headed artist like Marshmello and a corporate synergy-prone artificial entity like Lil Miquela, L.L.A.M.A., which stands for “Love, Laughter and Music Always” (not kidding), is introducing himself to the world today with a debut single, “Shake.”

It appears that this guy & pals fly around on giant luckdragon-style copies of our goldendoodle Seamus, and I am here for that.

A quick, cool demo of markerless body tracking

AR fashion star:

No markers, no mocap cameras, no suit, no keyframing. This take uses 3 DSLR cameras, though, and it’s pretty far from being real-time. […]

Under the hood, it uses #OpenPose ML-network for 2d tracking of joints on each camera, and then custom Houdini setup for triangulating the results into 3d, stabilizing it and driving the rig (volumes, CHOPs, #kinefx, FEM – you name it 🙂
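As a rough illustration of that triangulation step (my own sketch, not the artist’s OpenPose/Houdini setup), here’s the simplest possible case: two rectified cameras with parallel optical axes, where a joint’s depth falls out of the disparity between its 2D detections. The focal length and baseline values below are made-up parameters for the example.

```python
# Toy sketch of multi-view joint triangulation (simplified, rectified
# stereo case). Assumes two cameras with parallel optical axes, focal
# length f (pixels), separated by baseline b (meters). Real pipelines
# use full projection matrices and 3+ cameras, but the idea is the same.

def triangulate_joint(x_left, x_right, y, f=1000.0, b=0.5):
    """Recover a 3D joint position from its pixel coords in two cameras."""
    disparity = x_left - x_right          # pixels; larger = closer
    if disparity <= 0:
        raise ValueError("joint must have positive disparity")
    z = f * b / disparity                 # depth along the optical axis
    x = x_left * z / f                    # back-project to 3D
    y3d = y * z / f
    return (x, y3d, z)

# A joint detected at x=400 px (left view) and x=150 px (right view):
print(triangulate_joint(400.0, 150.0, 100.0))  # → (0.8, 0.2, 2.0)
```

With real multi-camera rigs you’d solve a small least-squares system per joint instead, then (as the artist describes) smooth the result over time before driving the rig.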

[Via Tyler Zhu]

Google Lens hits 500 million downloads

This is one of the many Google projects to which I’ve been lucky enough to contribute just a bit (focusing on object tracking & graphical adornments). It’s built into Google Photos, among other surfaces, and I’m really pleased that people are seeking it out:

We at AP don’t think that Lens gets enough praise; we even named it one of our 10 favorite Android features from 2020. Lens is an AR-powered service that can help you translate, identify, and scan things around you. Last year, it added support for solving homework questions, text-to-speech and “copy to computer” functions, and helping choose the best dishes at restaurants. There’s lots of nifty stuff that Lens can do.

AR: Super high-res car interiors arrive in Google Search

Imagine loading multi-gigabyte 3D models nearly instantaneously into your mobile device, then placing them into your driveway and stepping inside. That’s what we’ve now enabled via Google Search on Android:

Take it for a spin via the models listed below, and please let us know what you think!

Volvo: Volvo XC40, Volvo XC40 Recharge, Volvo XC60, Volvo XC90

Porsche: Porsche 911, Porsche Cayenne, Porsche Macan, Porsche Panamera, Porsche Taycan

Fiat Chrysler: Jeep Wrangler 4xE

AR: Google & Fiat-Chrysler bring super detailed models to your driveway

Psst—wanna see a multi-gig, super detailed 3D model appear in your driveway almost instantaneously?

I’m delighted to say that our work in cloud-rendered streaming 3D is paying off via this year’s virtual CES show. Per the Google Cloud blog:

As part of Fiat Chrysler’s Virtual Showroom CES event, you can experience the new innovative 2021 Jeep Wrangler 4xe by scanning a QR code with your phone. You can then see an Augmented Reality (AR) model of the Wrangler right in front of you—conveniently in your own driveway or in any open space. Check out what the car looks like from any angle, in different colors, and even step inside to see the interior with incredible details.

A bit on how it works:

The Cloud AR tech uses a combination of edge computing and AR technology to offload the computing power needed to display large 3D files, rendered by Unreal Engine, and stream them down to AR-enabled devices using Google’s Scene Viewer. Using powerful rendering servers with gaming-console-grade GPUs, memory, and processors located geographically near the user, we’re able to deliver a powerful but low friction, low latency experience.

This rendering hardware allows us to load models with tens of millions of triangles and textures up to 4k, allowing the content we serve to be orders of magnitude larger than what’s served on mobile devices (i.e., on-device rendered assets).

And to try it out:

Scan the QR code below, or check out the FCA CES website. Depending on your OS, device, and network strength, you will see either a photorealistic, cloud-streamed AR model or an on-device 3D car model, both of which can then be placed in your physical environment.

AR: Baby Yoda comes to Google Search

Putting the “AR” in “Galaxy far, far away…” 😌

Just don’t expect to get a Baby Yoda Hellraiser Pinhead version. 😬

Also, remember that if you have a Pixel 5G or a compatible Android 5G device, you can install “The Mandalorian” AR Experience.

#ThisIsTheWay

AR: Come try new cars & makeup in Google Search

I’m delighted to be closing out 2020 on a pair of high notes, welcoming the arrival of my two biggest efforts from the last year+.

First, Google Search now supports 150+ new cars that you can view in 3D and AR (via iPhone or Android device), including in beautiful cloud-rendered quality (provided you have a good connection & up-to-date Android). As we initially previewed in October:

Bring the showroom to you with AR

You can easily check out what the car looks like in different colors, zoom in to see intricate details like buttons on the dashboard, view it against beautiful backdrops and even see it in your driveway. We’re experimenting with this feature in the U.S. and working with top auto brands, such as Volvo and Porsche, to bring these experiences to you soon.

Second, you can try on AR beauty products right through Search:

Now, when you search for a lipstick or eyeshadow product, like L’Oreal’s Infallible Paints Metallic Eyeshadow, you can see what it looks like on a range of skin tones and compare shades and textures to help you find the right products.

To help you find the perfect match, you can now also virtually try makeup products right from the Google app.

New tech promises super fast, high quality background removal

Google researchers Ira Kemelmacher-Shlizerman, Brian Curless, and Steve Seitz have been working with University of Washington folks on tech that promises “30fps in 4K resolution, and 60fps for HD on a modern GPU.”

Our technique is based on background matting, where an additional frame of the background is captured and used in recovering the alpha matte and the foreground layer.
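To make the idea concrete, here’s a toy version of difference matting (my own drastic simplification; the actual paper uses a deep network, not this per-pixel heuristic): given a clean plate of the background, the pixels where the live frame departs from that plate are where the foreground must be.

```python
# Toy difference-matting sketch (my simplification; the research uses a
# deep network). Given a clean background plate and a frame, estimate a
# rough binary alpha matte from the per-pixel difference.

def estimate_alpha(frame, background, threshold=0.1):
    """Return a rough alpha matte: 1 where the frame departs from the plate."""
    return [
        [1.0 if abs(f - b) > threshold else 0.0
         for f, b in zip(frame_row, bg_row)]
        for frame_row, bg_row in zip(frame, background)
    ]

background = [[0.2, 0.2, 0.2],
              [0.2, 0.2, 0.2]]
frame      = [[0.2, 0.9, 0.2],   # a bright foreground pixel mid-row
              [0.2, 0.8, 0.2]]

print(estimate_alpha(frame, background))
# → [[0.0, 1.0, 0.0], [0.0, 1.0, 0.0]]
```

The hard part, and what the network learns, is recovering *fractional* alpha and uncontaminated foreground colors around hair, motion blur, and shadows, where this naive thresholding falls apart.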

Check it out:


See the PDF, abstract, and GitHub repo for details.

Google Photos rolls out Cinematic Photos & more

Nearly 20 years ago, on one of my first customer visits as a Photoshop PM, I got to watch artists use PS + After Effects to extract people from photo backgrounds, then animate the results. The resulting film—The Kid Stays In The Picture—lent its name to the distinctive effect (see previous).

Now I’m delighted that Google Photos is rolling out similar output to its billion+ users, without requiring any effort or tools:

We use machine learning to predict an image’s depth and produce a 3D representation of the scene—even if the original image doesn’t include depth information from the camera. Then we animate a virtual camera for a smooth panning effect—just like out of the movies.

Photos is also rolling out new collages, like this:

And they’re introducing new themes in the stories-style Memories section up top as well:

Now you’ll see Memories surface photos of the most important people in your life… And starting soon, you’ll also see Memories about your favorite things—like sunsets—and activities—like baking or hiking—based on the photos you upload.

Enjoy!

AR: Google Maps can point you towards your friends

I love these simple, practical uses of augmented reality. The Maps team writes,

Last month, we launched Live View in Location Sharing for Pixel users, and we’ll soon expand this to all Android and iOS users around the globe. When a friend has chosen to share their location with you, you can easily tap on their icon and then on Live View to see where and how far away they are–with overlaid arrows and directions that help you know where to go.

Live View in Location Sharing will soon expand to all Android and iOS users globally on ARCore and ARKit supported phones.

They’re also working hard to leverage visual data & provide better localization and annotation.

With the help of machine learning and our understanding of the world’s topography, we’re able to take the elevation of a place into account so we can more accurately display the location of the destination pin in Live View. Below, you can see how Lombard Street—a steep, winding street in San Francisco—previously appeared far off into the distance. Now, you can quickly see that Lombard Street is much closer and the pin is aligned with where the street begins at the bottom of the hill.

50 new AR animals arrive in Google search

“If your dog woke up 10 times its current size, it would lick you; if your cat woke up 10 times bigger, it would eat you,” or so I’ve heard.

In any case, building on the world’s viral (in every sense) adoption of AR animals already in search, my team has added a bunch more:

https://twitter.com/Google/status/1337506620612485120?ref_src=twsrc%5Etfw%7Ctwcamp%5Etweetembed%7Ctwterm%5E1337506620612485120%7Ctwgr%5E%7Ctwcon%5Es1_&ref_url=https%3A%2F%2Fwww.theverge.com%2F2020%2F12%2F12%2F22171601%2Fgoogle-adds-augmented-reality-ar-animals-search-3d

The Verge writes,

When Google started putting 3D animals in Search last year it only had a few standard animals available like a tiger, a lion, a wolf, and a dog. It added more creatures in March, including alligators, ducks, and hedgehogs. In August, Google made prehistoric creatures and historical artifacts available in AR via its Arts and Culture app—and who among us wouldn’t love to check out the ancient crustacean Cambropachycope up close and personal?

Meanwhile my man Seamus abides. 🐕😌

Google gives apps simultaneous on-device face, hand and pose prediction

From sign language to sports training to AR effects, tracking the human body unlocks some amazing possibilities, and my Google Research teammates are delivering great new tools:

We are excited to announce MediaPipe Holistic, […] a new pipeline with optimized pose, face and hand components that each run in real-time, with minimum memory transfer between their inference backends, and added support for interchangeability of the three components, depending on the quality/speed tradeoffs.

When including all three components, MediaPipe Holistic provides a unified topology for a groundbreaking 540+ keypoints (33 pose, 21 per-hand and 468 facial landmarks) and achieves near real-time performance on mobile devices. MediaPipe Holistic is being released as part of MediaPipe and is available on-device for mobile (Android, iOS) and desktop. We are also introducing MediaPipe’s new ready-to-use APIs for research (Python) and web (JavaScript) to ease access to the technology.
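The keypoint tally quoted above is easy to verify: 33 pose landmarks, 21 per hand for two hands, plus 468 facial landmarks.

```python
# Quick sanity check on the MediaPipe Holistic topology quoted above.
pose_landmarks = 33
hand_landmarks = 21    # per hand, two hands
face_landmarks = 468

total = pose_landmarks + 2 * hand_landmarks + face_landmarks
print(total)  # → 543, i.e. the "540+ keypoints" in the announcement
```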

Check out the rest of the post for details, and let us know what you create!

Google’s wearable ML helps blind runners

Call it AI, ML, FM (F’ing Magic), whatever: tech like this warms the heart and can free body & soul. Google’s Project Guideline helps people with impaired vision navigate the world on their own, independently & at speed. Runner & CEO Thomas Panek, who is blind, writes,

In the fall of 2019, I asked that question to a group of designers and technologists at a Google hackathon. I wasn’t anticipating much more than an interesting conversation, but by the end of the day they’d built a rough demo […].

I’d wear a phone on a waistband, and bone-conducting headphones. The phone’s camera would look for a physical guideline on the ground and send audio signals depending on my position. If I drifted to the left of the line, the sound would get louder and more dissonant in my left ear. If I drifted to the right, the same thing would happen, but in my right ear. Within a few months, we were ready to test it on an indoor oval track. […] It was the first unguided mile I had run in decades.
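The feedback loop Panek describes is elegantly simple: lateral drift from the painted line maps to a louder, more dissonant tone in the ear on the side you’re drifting toward. Here’s a toy sketch of that mapping (my own illustration, not Project Guideline’s code, assuming a normalized offset where −1.0 is far left and +1.0 is far right):

```python
# Toy sketch of the audio-guidance scheme Thomas Panek describes (my own
# illustration, not Project Guideline's actual code). A normalized lateral
# offset from the painted line (-1.0 = far left, +1.0 = far right) drives
# the volume of a warning tone in the corresponding ear.

def guidance_volumes(offset):
    """Return (left_ear, right_ear) warning-tone volumes in [0, 1]."""
    offset = max(-1.0, min(1.0, offset))  # clamp to the valid range
    if offset < 0:                        # drifted left -> warn in left ear
        return (-offset, 0.0)
    return (0.0, offset)                  # drifted right -> warn in right ear

print(guidance_volumes(-0.5))  # → (0.5, 0.0): halfway left, left ear warns
print(guidance_volumes(0.0))   # → (0.0, 0.0): on the line, silence
```

The real system, of course, also has to *find* the line with on-device vision, in varying light and weather; the audio mapping is the easy half.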

Check out the journey. (Side note: how great is “Blaze” as a name for a speedy canine running companion? ☺️)

New Google Research witchcraft retimes humans in video

There’s no way the title can do this one justice, so just watch as this ML-based technique identifies moving humans (including their reflections!), then segments them out to enable individual manipulation—including syncing up their motions and even removing people wholesale:

https://youtu.be/2pWK0arWAmU

Here’s the vid directly from the research team, which includes longtime Adobe vet David Salesin:

Google + Disney bring The Mandalorian to life in AR

This is the way:

Google and Lucasfilm have teamed up to bring iconic moments from the first season of “The Mandalorian” to life with “The Mandalorian” AR Experience (available on the Play Store for 5G Google Pixels and other select 5G Android phones) as fans follow the show’s second season.

The app uses ARCore’s new Depth API to enable occlusion for more realistic environmental interactions:

New content will keep rolling out in the app each week on Mando Mondays, so stay tuned—and Pixel owners should keep an eye out for additional exclusive content outside of the app as well.

Celebrate Diwali through Google AR

Visit this page in your mobile browser, or just take a peek (below) at the project, brought to you by the Google Arts & Culture Lab:

Some context for folks like me, who didn’t grow up with a connection to Indian traditions:

Diwali is the Indian festival of lights, usually lasting five days and celebrated during the Hindu Lunisolar month Kartika. One of the most popular festivals of Hinduism, Diwali symbolizes the spiritual “victory of light over darkness, good over evil, and knowledge over ignorance”.

L’Oreal brings exclusive digital makeup to Google Duo, other apps

I’m really pleased to see my last team’s engine (which is already powering makeup try-on in YouTube) getting put to good use:

L’Oréal Paris is also releasing one “exclusive look” on Google Duo, making it the first beauty brand to be used directly within Google’s video conference system. 

It’ll be interesting to see how the market for digital makeup & apparel evolves in a more socially distant, WFH world.

New Adobe tech promises 3D & materials scanning

Probably needless to say, 3D model creation remains hard AF for most people, and as such it’s a huge chokepoint in the adoption of 3D & AR viewing experiences.

Fortunately we may be on the cusp of some breakthroughs. Apple is about to popularize LIDAR on phones, and with it we’ll see interesting photogrammetry apps like Polycam:

Meanwhile Adobe is working to enable 3D scanning using devices without fancy sensors. Check out Project Scantastic:

They’re also working to improve the digitization of materials—something that could facilitate the (presently slow, expensive) digitization of apparel:

Snap puts the AR in graffiti art

The notion of a metaverse, “a collective virtual shared space, created by the convergence of virtually enhanced physical reality and physically persistent virtual space,” has long beguiled those of us captivated by augmented reality. Now Snap has been doing the hard work of making this more real, being able to scan & recognize one’s surroundings and impose a “persistent, shared AR world built right on top of your neighborhood.” Check it out:

This experience (presently available on just one street in London, but presumably destined to reach many others) builds on the AR Landmarkers work the company did previously. (As it happens, I think David Salesin—who led Adobe Research for many years—contributed to this effort during his stopover at Snap before joining Google Research.)

Come see the AR feature I’ve been working on all year!

I’m delighted to share that my team’s work to add 3D & AR automotive results to Google Search—streaming in cinematic quality via cloud rendering—has now been announced! Check out the demo starting around 36:30:

Here’s how we put it on the Google Keyword blog:

Bring the showroom to you with AR

You can easily check out what the car looks like in different colors, zoom in to see intricate details like buttons on the dashboard, view it against beautiful backdrops and even see it in your driveway. We’re experimenting with this feature in the U.S. and working with top auto brands, such as Volvo and Porsche, to bring these experiences to you soon.

Cloud streaming enables us to take file size out of the equation, so we can serve up super detailed visuals from models that are hundreds of megabytes in size:

Right now the feature is in testing in the US, so there’s a chance you can experience it via Android right now (with iOS planned soon). We hope to make it available widely soon, and I can’t wait to hear what you think!

AR in Google Maps can point you to your friends

This is one of the far-flung projects I’ve been glad to help support. New features keep arriving, like this one that’s available on Pixel and coming soon to iOS & Android:

When a friend has chosen to share their location with you, you can easily tap on their icon and then on Live View to see where and how far away they are–with overlaid arrows and directions that help you know where to go.

It’s also getting smarter about recognizing landmarks:

Soon, you’ll also be able to see nearby landmarks so you can quickly and easily orient yourself and understand your surroundings. Live View will show you how far away certain landmarks are from you and what direction you need to go to get there.

Warrior dogs to get AR goggles

“Are they gonna use the Snapchat dancing hot dog to steer them or what?” — Henry Nack, age 11, bringing the 🔥 feature requests 😌

Funded by the US military and developed by a Seattle-based company called Command Sight, the new goggles will allow handlers to see through a dog’s eyes and give directions while staying out of sight and at a safe distance.

While looking through the dog’s eyes thanks to the goggle’s built-in camera, the handler can direct the dog by controlling an augmented reality visual indicator seen by the dog wearing the goggles.

Put the “AR” in “art” via Google Arts & Culture

I’m excited to see the tech my team has built into YouTube, Duo, and other apps land in Arts & Culture, powering five new fun experiences:

Snap a video or image of yourself to become Van Gogh or Frida Kahlo’s self-portraits, or the famous Girl with a Pearl Earring. You can also step deep into history with a traditional Samurai helmet or a remarkable Ancient Egyptian necklace.

To get started, open the free Google Arts & Culture app for Android or iOS and tap the rainbow camera icon at the bottom of the homepage.

Bring cultural treasures into your home in AR through Google Search

I’m pleased to be playing a very small role in making very large things, well, rather small.

Today, Google Arts & Culture has brought together a new collection to help anyone choose their perfect virtual travel with thousands of museums and cultural destinations to explore. And with the help of our partner CyArk, we’ve launched on Google Search 37 cultural heritage sites from across the world in Augmented Reality (AR). Hop from your couch and search on your mobile phone to bring the Moai statues of Ahu Ature Huki, Rapa Nui (Easter Island), the Brandenburg Gate in Germany, or the Maya pyramid of Chichén Itzá, Mexico right into your living room.

Here’s a list of landmarks you can search & explore:

  • El Castillo – Chichen Itza, Mexico
  • Brandenburg Gate, Germany
  • Ayutthaya, Thailand
  • Eim ya kyaung Temple – Bagan, Myanmar
  • Palace of Fine Arts, Mexico
  • Chacmol statue – Templo Mayor, Mexico
  • Thomas Jefferson Memorial, US
  • Lanzón – Chavín de Huántar, Peru
  • War Canoe – Waitangi Treaty Grounds, New Zealand
  • Ahu Ature Huki, Easter Island
  • Tomb of Tu Duc (Complex of Hué Monuments), Vietnam
  • Nine Dome Mosque – Bagerhat, Bangladesh
  • Shait Gumbad Mosque – Bagerhat, Bangladesh
  • Chunakhola Masjid – Bagerhat, Bangladesh
  • Church of Sveta Sofia, North Macedonia
  • Jaulian, Pakistan
  • Flanders Field American Cemetery, Belgium
  • NASA Apollo 1 Mission Memorial, US
  • Rano Rarako, Easter Island
  • Mexico City Metropolitan Cathedral, Mexico
  • Normandy American Cemetery, France
  • Ananias Chapel, Syrian Arab Republic
  • Ancient Corinth, Greece
  • Ahu Nau Nau, Easter Island
  • Lukang Longshan Temple, Taiwan
  • Temple of Apollo (Portara), Greece
  • Audiencia in Palacio Tschudi – Chan Chan, Peru
  • Palacio Tschudi – Chan Chan, Peru
  • Edinburgh Castle, UK
  • Mesa Verde, USA
  • Temple of Echmoun, Lebanon
  • Fort York National Historic Site, Canada
  • Great Mosque – Kilwa Kisiwani, Tanzania
  • Gateway of India
  • Martin Luther King Memorial
  • Lincoln Memorial

AR soups up the humble car manual

The new Ram comes with an augmented reality feature for exploring one’s 700hp whip:

Hovering the camera over the steering wheel will show customers how to use the steering wheel controls or paddle shifters, while pointing at the dashboard will show infotainment functionality.

The app was developed in just three months to roll out on the 2021 Ram TRX. The wild truck will be the first vehicle to use the Know & Go app, and it will be available on other FCA vehicles down the line.

Employee-Developed Know & Go Mobile App Debuts on 2021 Ram 1500 TRX

Give FX the finger in AR

The free new app Diorama pairs with the $99 finger-worn Litho device to let you create AR movies directly inside your phone, using a selection of props & tapping into the Google Poly library:

VR Focus writes,

“Diorama will democratize the creation of special effects in the same way the smartphone democratized photography. It will allow anyone to create beautiful visual effects the likes of which have previously only been accessible to Hollywood studios,” said Nat Martin, Founder at Litho in a statement.

When combined with the Litho controller, users can animate objects simply by dragging them, fine-tuning the path by grabbing specific points. Mood lighting can be added thanks to a selection of filters, plus the app supports body tracking so creators can interact with a scene.