Onyx puts the world’s smartest trainer in your pocket. With just the camera on your phone it counts your reps, corrects your form, brings tracking to nearly any exercise, and provides audio workouts personalized to your performance in real time.
Starting today, people with an iPhone X or newer can use “3D Camera Mode” to capture a selfie and apply 3D effects, lenses, and filters to it.
Snap first introduced the idea for 3D effects with Snaps when it announced its latest version of Spectacles, which include a second camera to capture depth. The effects and filters add things like confetti, light streaks, and miscellaneous animations.
People seem endlessly surprised not only that one is allowed to use an iPhone at Google, but also that we build great cross-platform tech for developers (e.g. ML Kit). In that vein I’m delighted to say that my team has now released an iOS version (supporting iPhone 6s and above) of the Augmented Faces tech we first released for ARCore for Android earlier this year:
It provides a high-quality, 468-point 3D mesh that lets users attach fun effects to their faces — all without a depth sensor on their smartphone. With the addition of iOS support rolling out today, developers can now create effects for more than a billion users. We’ve also made the creation process easier for both iOS and Android developers with a new face effects template.
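Conceptually, attaching an effect to a face mesh like this means picking a vertex (or a named region such as the nose tip) and placing the effect at that vertex’s position, oriented along its surface normal. Here’s a toy sketch of that anchoring math; the function name and inputs are my own illustration, not ARCore’s actual API:

```python
import numpy as np

def effect_anchor(vertices, normals, index):
    """Build a 4x4 model matrix that places an effect at one mesh
    vertex, with the vertex normal as the local +Z axis.
    Illustrative only; not ARCore's real interface."""
    pos = np.asarray(vertices[index], dtype=float)
    n = np.asarray(normals[index], dtype=float)
    n = n / np.linalg.norm(n)
    # Pick a reference "up" vector that isn't parallel to the normal,
    # then complete an orthonormal basis around the normal.
    up = np.array([0.0, 1.0, 0.0])
    if abs(n @ up) > 0.99:
        up = np.array([1.0, 0.0, 0.0])
    x = np.cross(up, n)
    x = x / np.linalg.norm(x)
    y = np.cross(n, x)
    m = np.eye(4)
    m[:3, 0], m[:3, 1], m[:3, 2], m[:3, 3] = x, y, n, pos
    return m
```

An engine would multiply this matrix into the face’s world pose each frame, so the effect rides along as the 468 vertices update.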
El Pollo Loco… is looking to bring back lost Latino-themed murals in downtown Los Angeles, if only in virtual form. Beginning Sunday, open the Snapchat smartphone app, tap on the background to activate the World Lenses feature, and point the phone at the now blank wall. With that, the old murals come back to life on the screen.
This slick tool helps retarget footage to “cinematic 16:9, square 1:1, or vertical 9:16, without losing track of your subject.” PetaPixel writes,
If you’re working with a timeline that includes multiple clips, there’s also an “Auto Reframe Sequence” option that allows you to select the aspect ratio you want and apply it to every clip in your timeline at once. Best of all, the effect isn’t only applied to the video footage; titles and motion graphics are also resized to fit the new aspect ratio.
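The core of this kind of reframing is simple cropping math: take the largest window of the new aspect ratio that fits in the frame, center it on the tracked subject, and clamp it to the frame edges. A minimal sketch, with my own names and a single subject point standing in for Adobe’s actual tracking:

```python
def reframe_crop(frame_w, frame_h, target_ratio, subject_x, subject_y):
    """Largest crop of target_ratio (width/height) that fits the frame,
    centered on the subject and clamped to stay in bounds.
    Illustrative only; not Adobe's implementation."""
    # Biggest crop with the target ratio that fits inside the frame.
    crop_w = min(frame_w, frame_h * target_ratio)
    crop_h = crop_w / target_ratio
    # Center on the subject, then clamp to the frame edges.
    left = min(max(subject_x - crop_w / 2, 0), frame_w - crop_w)
    top = min(max(subject_y - crop_h / 2, 0), frame_h - crop_h)
    return left, top, crop_w, crop_h
```

For example, retargeting a 1920×1080 frame to vertical 9:16 around a subject on the right side yields a full-height 607.5-pixel-wide window slid toward the subject; run per frame (with smoothing on the subject track), that’s the “without losing your subject” behavior the quote describes.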
I know this will seem like small beans—literally—but over time it’ll be a big deal, and not just because it’s an instance of the engine I’m working to enhance.
Through Lens, you’ll get meal recommendations based on your tastes, dietary preferences, and allergies, along with a personalized score for products like Uncle Ben’s Ready Rice, Flavored Grains, Flavor Infusions, and beans.
VentureBeat goes on to note,
The growing list of things Lens can recognize covers over 1 billion products… The new feature follows a Lens capability that highlights top meals at a restaurant and a partnership with Wescover that supplies information about art and design installations. Lens also recently gained the ability to split a bill or calculate a tip after a meal; [and] to overlay videos atop real-world publications.
Check out the latter, from a couple of months ago. As I say, big things have small beginnings.
An app that allows users to try on clothes virtually. Users feed the app basic information, including gender, height, and weight. The app records the user’s head movements and creates a virtual version of the user modeling the clothes.