Do I know what the hell is going on here? No, of course not! (Have you ever met me? 😌) But thankfully my colleagues Noah, Richard, and co. do, and it promises a way to capture & display rich, dimensional photos (see interactive example that lets you play with parallax & see depth; more are on the site). Check it out:
The glue my team developed to connect & coordinate machine learning, computer vision, and other processes is now available for developers:
The main use case for MediaPipe is rapid prototyping of applied machine learning pipelines with inference models and other reusable components. MediaPipe also facilitates the deployment of machine learning technology into demos and applications on a wide variety of hardware platforms (e.g., Android, iOS, workstations).
If you’ve tried any of the Google AR examples I’ve posted in the last year+ (Playground, Motion Stills, YouTube Stories or ads, etc.), you’ve already used MediaPipe, and now you can use it to remove some drudgery when creating your own apps.
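To give a flavor of what "reusable components" means in practice: MediaPipe pipelines are declared as graphs of calculators in a text config. Here's a minimal sketch, loosely modeled on the edge-detection hello-world example in the MediaPipe repo (the calculator names and stream labels are illustrative, not something you should copy verbatim without checking the docs):

```
# Frames flow into the graph on "input_video" and out on "output_video".
input_stream: "input_video"
output_stream: "output_video"

# First node: convert incoming RGB frames to luminance.
node {
  calculator: "LuminanceCalculator"
  input_stream: "input_video"
  output_stream: "luma_video"
}

# Second node: run Sobel edge detection on the luminance stream.
node {
  calculator: "SobelEdgesCalculator"
  input_stream: "luma_video"
  output_stream: "output_video"
}
```

The appeal is that swapping in, say, a face-detection model is a matter of rewiring nodes in the config rather than rewriting app code.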
Today, we’re introducing AR Beauty Try-On, which lets viewers virtually try on makeup while following along with YouTube creators to get tips, product reviews, and more. Thanks to machine learning and AR technology, it offers realistic virtual product samples that work on a full range of skin tones. Currently in alpha, AR Beauty Try-On is available through FameBit by YouTube, Google’s in-house branded content platform.
M·A·C Cosmetics is the first brand to partner with FameBit to launch an AR Beauty Try-On campaign. Using this new format, brands like M·A·C will be able to tap into YouTube’s vibrant creator community, deploy influencer campaigns to YouTube’s 2 billion monthly active users, and measure their results in real time.
As I noted the other day with AR in Google Lens, big things have small beginnings. Stay tuned!
Hey, I’m as surprised as you probably are. 🙃 And yet here we are:
What if creating games could be as easy and fun as playing them? What if you could enter a virtual world with your friends and build a game together in real time? Our team within Area 120, Google’s workshop for experimental projects, took on this challenge. Our prototype is called Game Builder, and it is free on Steam for PC and Mac.
Earlier this week I was messing around with Apple’s new Reality Composer tool, thinking about fun Lego-themed interactive scenes I could whip up for the kids. After 10+ fruitless minutes of trying to get off-the-shelf models into USDZ format, however, I punted—at least for the time being. Getting good building blocks into one’s scene can still be a pain.
This new 3D scanner app promises to make the digitization process much easier. I haven’t gotten to try it, but I’d love to take it for a spin:
It’s a small step, to be sure, but I’m excited to see that lensing a Raptors or (for good people 🙃) Warriors logo lets you see animated results, scores, stats, and more. Things are gonna get really interesting from here.
In addition to moving augmented images (see previous), my team’s tracking tech enables object detection & tracking on iOS & Android:
The Object Detection and Tracking API identifies the prominent object in an image and then tracks it in real time. Developers can use this API to create a real-time visual search experience through integration with a product search backend such as Cloud Product Search.
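For the curious, here's roughly what using that API looks like on Android. This is a hedged sketch based on the Firebase ML Kit object detection API as I understand it; it assumes you already have a `Bitmap` in hand, and it won't run outside an Android app with the ML Kit dependency configured:

```kotlin
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage
import com.google.firebase.ml.vision.objects.FirebaseVisionObjectDetectorOptions

// STREAM_MODE is the live-camera case: the detector assigns each
// prominent object a tracking ID that stays stable across frames.
val options = FirebaseVisionObjectDetectorOptions.Builder()
    .setDetectorMode(FirebaseVisionObjectDetectorOptions.STREAM_MODE)
    .enableClassification()
    .build()

val detector = FirebaseVision.getInstance().getOnDeviceObjectDetector(options)

// `bitmap` is assumed to be a camera frame you've already captured.
val image = FirebaseVisionImage.fromBitmap(bitmap)
detector.processImage(image)
    .addOnSuccessListener { objects ->
        for (obj in objects) {
            val box = obj.boundingBox   // where the object is in the frame
            val id = obj.trackingId     // stable ID for tracking across frames
            // Hand box/id off to your product-search backend here.
        }
    }
    .addOnFailureListener { e ->
        // Detection failed for this frame; log and move on.
    }
```

The "visual search" scenario mentioned above would take the tracked object's cropped region and send it to something like Cloud Product Search to match against a catalog.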