Hmm—I really like the promise of this app (leveraging depth data to apply realistic lighting effects), but I’m finding the UI vexing & the results highly hit-or-miss. Judge for yourself:
Well, that escalated quickly: For this new set of mobile filmmaking tools (lens, battery, gimbal) Moment hit their $50k funding goal in just over half an hour, and as of this writing they’ve easily cleared the $750k mark. Check ‘em out:
Lens works on photos of business cards, books, landmarks and buildings, paintings in a museum, plants or animals, and flyers and event billboards. When you use Lens on a photo that has phone numbers or an address, you can automatically save this information as a contact on your phone, while events will be added to your calendar.
Get ready for a whole new wave of AR gaming:
Unity integration will also allow developers to customize maps with what appears to be a great deal of flexibility and control. Things like buildings and roads are turned into objects, which developers can then tweak in the game engine. During a demonstration, Google showed off real-world maps that were transformed into sci-fi landscapes and fantasy realms, complete with dragons and treasure chests.
Jacoby says that one of the goals of the project was to help developers build detailed worlds using Maps data as a base to paint over. Developers can do things like choose particular kinds of buildings or locations — say, all stores or restaurants — and transform each one. A fantasy realm could turn all hotels into restorative inns, for instance, or anything else.
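The “transform each kind of place” idea is easy to picture in code. Here’s a minimal, purely illustrative Python sketch (the real workflow is Unity/C# against Google’s Maps data; the feature dicts, type names, and asset names below are all invented for the example):

```python
# Hypothetical sketch: re-skin real-world place types as fantasy assets.
# Feature records and asset names are made up; the actual Maps/Unity API differs.
FANTASY_SWAPS = {
    "hotel": "restorative_inn",
    "store": "alchemy_shop",
    "restaurant": "tavern",
}

def reskin(features):
    """Return a copy of each map feature with a game asset chosen by its type."""
    return [
        {**f, "asset": FANTASY_SWAPS.get(f["type"], "generic_building")}
        for f in features
    ]

places = [
    {"name": "Hotel Rex", "type": "hotel"},
    {"name": "Joe's Diner", "type": "restaurant"},
    {"name": "City Hall", "type": "civic"},
]

for p in reskin(places):
    print(p["name"], "->", p["asset"])
```

The point is just the lookup-by-category pattern: every feature of a chosen type gets swapped for a themed object, with a fallback for everything else.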
Continuing our series of Research Blog posts (see realtime segmentation, motion tracking), my teammates have provided an inside look at the tech they’ve developed—this time covering how motion photos get stabilized on the fly:
By combining software-based visual tracking with the motion metadata from the hardware sensors, we built a new hybrid motion estimation for motion photos on the Pixel 2.
Check out the blog post for details, or just enjoy lots of good before/after examples of stabilization in action.
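To get a feel for what “hybrid motion estimation” means, here’s a toy Python sketch (my own simplification, not Google’s pipeline): blend a shift estimated by visual feature tracking with one derived from gyroscope metadata, weighted by how much you trust the visual track. The function name and confidence weighting are assumptions for illustration only.

```python
import numpy as np

def hybrid_motion_estimate(visual_shift, gyro_shift, visual_confidence):
    """Blend a software visual-tracking shift with a hardware-sensor shift.

    Toy model: both inputs are 2-D per-frame shifts in pixels. When the
    visual tracker is confident (textured, sharp frames) we lean on it;
    when it isn't (blur, low texture) we fall back toward the gyro data.
    """
    w = np.clip(visual_confidence, 0.0, 1.0)
    return w * np.asarray(visual_shift, float) + (1.0 - w) * np.asarray(gyro_shift, float)

# Texture-rich frame: mostly trust the visual tracker.
sharp = hybrid_motion_estimate([2.0, -1.0], [1.0, 0.0], 0.9)

# Blurry frame: mostly trust the gyroscope reading.
blurry = hybrid_motion_estimate([2.0, -1.0], [1.0, 0.0], 0.1)

print(sharp, blurry)
```

The real system estimates full frame transforms rather than simple shifts, but the core idea is the same: two imperfect motion signals, fused so each covers the other’s failure modes.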
Check it out:
As Ars Technica explains,
Flutter apps don’t directly compile to native Android and iOS apps; they run on the Flutter rendering engine (written in C++) and Flutter Framework (written in Dart, just like Flutter apps), both of which get bundled up with every app, and then the SDK spits out a package that’s ready to go on each platform. You get your app, a new engine to run the Flutter code on, and enough native code to get the Flutter platform running on Android and iOS.
Also, I’m totally creating a band called Stateful Hot Reload. 🙂
Augment all the humans! Check out this new perceptual enhancement:
The app, Soundscape, calls out roads and landmarks as they’re passed, and lets users set audio beacons at familiar destinations. If at any time you’re unsure of where you are, or which direction to head in, you can simply hold the phone flat in your hand and use the buttons on the bottom of the screen to locate nearby roads and familiar destinations.
This app (sadly unavailable in the US, it seems) looks really creative & fun:
“To achieve a seamless transition from the TV ad to Augmented Reality we use computer vision to detect the quattro coaster TV ad. Then, we sync and position the augmented content on the screen. What’s interesting is that the car remains in the room even after the ad has ended.” [more]
Here’s what it looks like in action: