I haven’t yet seen it, but my friend Andy swears that this quirky dive into the world of industrial musicals is a delight. Sign me up!
[YouTube] [Via Andy Russell]
- Support for iOS 12.
- Support for iPhone XR, XS and XS Max.
- Support for iPad Pro 11 and 12.9.
Ooh—I’ll have to show this story to my coin- and metal detector-loving 9yo son Henry.
Although Peter has been doing this for decades, his success rate at uncovering historic finds has grown since the launch of Google Earth, which helps him research farmlands to search and saves him from relying on outdated aerial photography. In December 2014, Peter spotted a square mark in a field via Google Earth, and it was in this area that the Weekend Wanderers discovered the £1.5 million cache of Saxon coins. The coins are now on display in the Buckinghamshire County Museum by the Queen’s decree.
In addition to moving augmented images (see previous), my team’s tracking tech enables object detection & tracking on iOS & Android:
The Object Detection and Tracking API identifies the prominent object in an image and then tracks it in real time. Developers can use this API to create a real-time visual search experience through integration with a product search backend such as Cloud Product Search.
I hope you’ll build some rad stuff with it (e.g. the new Adidas app)!
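If you feel like kicking the tires, here’s a minimal Kotlin sketch of the Android side, assuming the Firebase-era ML Kit SDK (the `firebase-ml-vision` library plus its object-detection model artifact); the `detect` helper and its `Bitmap` input are hypothetical stand-ins for whatever your camera pipeline produces:

```kotlin
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage
import com.google.firebase.ml.vision.objects.FirebaseVisionObjectDetectorOptions

// Configure the on-device detector for live camera input:
// STREAM_MODE follows the most prominent object across frames
// and gives it a stable tracking ID; classification is optional.
val options = FirebaseVisionObjectDetectorOptions.Builder()
    .setDetectorMode(FirebaseVisionObjectDetectorOptions.STREAM_MODE)
    .enableClassification()
    .build()

val detector = FirebaseVision.getInstance().getOnDeviceObjectDetector(options)

// Hypothetical helper: feed each camera frame (as a Bitmap) to the detector.
fun detect(frame: Bitmap) {
    val image = FirebaseVisionImage.fromBitmap(frame)
    detector.processImage(image)
        .addOnSuccessListener { objects ->
            for (obj in objects) {
                // boundingBox locates the object in the frame; trackingId
                // stays stable across frames in stream mode.
                val box = obj.boundingBox
                val id = obj.trackingId
            }
        }
        .addOnFailureListener { e -> /* handle detection errors */ }
}
```

That stable tracking ID on the prominent object is the hook for a visual-search flow: crop the tracked region and hand it off to a product-search backend, per the docs above.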
Hmm—I’ve never had occasion to use this solipsistic-but-cool flight mode on my drone, but now I’m tempted to try capturing some epic dronies. (Just gotta figure out where I misplaced my moody Scottish highlands…)
Apropos of nothing, you might enjoy this dog’s journey as much as I did 🐕🏎🚀:
[YouTube] [Via Bryan O’Neil Hughes]
A couple of years ago, we used Google’s super-resolution tech (think “Genuine Fractals gone wild,” fellow oldsters) to dramatically reduce bandwidth costs for users without any perceptible loss in visual quality. Now that tech is used in the Pixel phone camera, and this quick video gives a nice overview of how it works:
Who knew that the goofball mannequin challenge could generate a 2000-video dataset that could help train AI to compute depth, segment humans, and (optionally) content-aware fill them out of existence? This new work from Google Research handles scenes where both the camera & human subjects are moving. Check it out:
What happens when you use machine learning & the capacitive-sensing properties of fruit to make music? The Flaming Lips teamed up with Google to find out:
During their performance that night, Steven Drozd from The Flaming Lips, who usually plays a variety of instruments, played a “magical bowl of fruit” for the first time. He tapped each fruit in the bowl, which then played different musical tones, “singing” the fruit’s own name. With help from Magenta, the band broke into a brand-new song, “Strawberry Orange.”
The Flaming Lips also got help from the audience: At one point, they tossed giant, blow-up “fruits” into the crowd, and each fruit was also set up as a sensor, so any audience member who got their hands on one played music, too. The end result was a cacophonous, joyous moment when a crowd truly contributed to the band’s sound.