…and got 3D-captured via Google’s volumetric scanning array. “Days of Miracles & Wonder,” part 9,217…
Much like his big bro did last year (see below), our son Henry stepped up to el micrófono to tell tales of meteorological mayhem. It took a village, with mom scoring a green-screen kit from her Adobe video pals & me applying some AR effects created by my talented teammates.
Here’s a behind-the-scenes peek at our advanced VFX dojo/living room. 😌
“With great power…” I’m pleased to see some of my collaborators in augmented reality working to help fight deceptive content:
To make this dataset, over the past year we worked with paid and consenting actors to record hundreds of videos. Using publicly available deepfake generation methods, we then created thousands of deepfakes from these videos. The resulting videos, real and fake, comprise our contribution, which we created to directly support deepfake detection efforts. As part of the FaceForensics benchmark, this dataset is now available, free to the research community, for use in developing synthetic video detection methods.
Potentially cool idea:
Onyx puts the world’s smartest trainer in your pocket. With just the camera on your phone it counts your reps, corrects your form, brings tracking to nearly any exercise, and provides audio workouts personalized to your performance in real time.
Though I have no idea how to create these; "open the app to the camera, navigate to the 3D option in the dropdown menu," and voilà:
The Verge writes,
Starting today, people with an iPhone X or newer can use “3D Camera Mode” to capture a selfie and apply 3D effects, lenses, and filters to it.
Snap first introduced the idea for 3D effects with Snaps when it announced its latest version of Spectacles, which include a second camera to capture depth. The effects and filters add things like confetti, light streaks, and miscellaneous animations.
Look Ma, no depth sensor required.
People seem endlessly surprised not only that one is allowed to use an iPhone at Google, but that we also build great cross-platform tech for developers (e.g. ML Kit). In that vein I’m delighted to say that my team has now released an iOS version (supporting iPhone 6s and above) of the Augmented Faces tech we first released for ARCore for Android earlier this year:
It provides a high-quality, 468-point 3D mesh that lets users attach fun effects to their faces — all without a depth sensor on their smartphone. With the addition of iOS support rolling out today, developers can now create effects for more than a billion users. We’ve also made the creation process easier for both iOS and Android developers with a new face effects template.
Here’s a quick overview from my teammate Sam:
Neat idea from El Pollo Loco + Snapchat:
El Pollo Loco… is looking to bring back lost Latino-themed murals in downtown Los Angeles, if only in virtual form. Beginning Sunday, open the Snapchat smartphone app, tap on the background to activate the World Lenses feature, and point the phone at the now blank wall. With that, the old murals come back to life on the screen.
I know this will seem like small beans—literally—but over time it’ll be a big deal, and not just because it’s an instance of the engine I’m working to enhance.
Through Lens, you’ll get meal recommendations based on your tastes, dietary preferences, and allergies, along with a personalized score for products like Uncle Ben’s Ready Rice, Flavored Grains, Flavor Infusions, and beans.
VentureBeat goes on to note,
The growing list of things Lens can recognize covers over 1 billion products… The new feature follows a Lens capability that highlights top meals at a restaurant and a partnership with Wescover that supplies information about art and design installations. Lens also recently gained the ability to split a bill or calculate a tip after a meal; [and] to overlay videos atop real-world publications.
Check out the latter, from a couple of months ago. As I say, big things have small beginnings.
— NBA (@NBA) May 31, 2019