During their performance that night, Steven Drozd from The Flaming Lips, who usually plays a variety of instruments, played a “magical bowl of fruit” for the first time. He tapped each fruit in the bowl, and each one played a different musical tone, “singing” the fruit’s own name. With help from Magenta, the band broke into a brand-new song, “Strawberry Orange.”
The Flaming Lips also got help from the audience: At one point, they tossed giant, blow-up “fruits” into the crowd, and each fruit was also set up as a sensor, so any audience member who got their hands on one played music, too. The end result was a cacophonous, joyous moment when a crowd truly contributed to the band’s sound.
New research from Samsung Moscow can turn a single image (or, for better quality results, a series of images) into a puppet that can be driven by another person’s performance. (Hmm, new feature for Google Arts & Culture’s artistic doppelgänger-finder? 😌)
This is the first time people will be able to use Tilt Brush on a completely wireless VR system. It costs $19.99, though if you previously purchased it on Oculus Home, you’ll have it for free on Oculus Quest.
The original Glass will be to AR wearables as the Apple Newton was to smartphones—ambitious, groundbreaking, unfocused, premature. After that first… well, learning experience… Google didn’t give up, and folks have cranked away quietly to find product-market fit. Check out the new device—dramatically faster, more extensible, and focused on specific professionals in medicine, manufacturing, and more:
My team has been collaborating with TensorFlow Lite & researchers working on human-pose estimation (see many previous posts) to accelerate on-device machine learning & enable things like the fun “Dance Like” app on iOS & Android:
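I won’t claim this is Dance Like’s actual pipeline, but many on-device pose models output per-keypoint heatmaps, and the decoding step can be sketched in a few lines of NumPy. Everything here (array shapes, the `decode_keypoints` helper, the toy values) is illustrative, not the app’s real code:

```python
import numpy as np

def decode_keypoints(heatmaps):
    """For each keypoint channel in an (H, W, K) heatmap tensor,
    return the (row, col, score) of its highest-scoring cell."""
    h, w, num_keypoints = heatmaps.shape
    keypoints = []
    for k in range(num_keypoints):
        flat_idx = int(np.argmax(heatmaps[:, :, k]))
        row, col = divmod(flat_idx, w)
        keypoints.append((row, col, float(heatmaps[row, col, k])))
    return keypoints

# Toy 4x4 heatmap with two keypoint channels.
hm = np.zeros((4, 4, 2), dtype=np.float32)
hm[1, 2, 0] = 0.9   # keypoint 0 peaks at (row 1, col 2)
hm[3, 0, 1] = 0.7   # keypoint 1 peaks at (row 3, col 0)
print(decode_keypoints(hm))
```

In a real deployment the heatmaps would come from a TensorFlow Lite interpreter running the pose model, and the decoded keypoints would then drive the pose-matching effect.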
Hmm… am I a size 10.5 or 11 in this brand? These questions are notoriously tough to answer without trying on physical goods, and cracking the code for reliable size estimation promises to enable more online shoe buying with fewer returns.
Now Nike seems to have cracked said code. The Verge writes,
With this new AR feature, Nike says it can measure each foot individually — the size, shape, and volume — with accuracy within 2 millimeters and then suggest the specific size of Nike shoe for the style that you’re looking at. It does this by matching your measurements to the internal volume already known for each of its shoes, and the purchase data of people with similar-sized feet.
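Nike hasn’t published its matching algorithm, but the “match your measurements to the internal volume already known for each shoe” step could be as simple as a nearest-neighbor lookup. Here’s a minimal sketch of that idea; the size chart and all volumes below are invented for illustration, not Nike’s data:

```python
# Hypothetical size chart: US size -> internal shoe volume in cm^3.
# These numbers are made up; a real chart would be per-style.
SIZE_CHART_CM3 = {
    9.0: 910.0,
    9.5: 935.0,
    10.0: 960.0,
    10.5: 985.0,
    11.0: 1010.0,
}

def suggest_size(measured_volume_cm3):
    """Return the size whose known internal volume is closest
    to the foot volume measured by the AR scan."""
    return min(SIZE_CHART_CM3,
               key=lambda s: abs(SIZE_CHART_CM3[s] - measured_volume_cm3))

print(suggest_size(980.0))  # closest to 985 cm^3 -> 10.5
```

The purchase-data signal Nike mentions would presumably re-weight this lookup using return rates from customers with similar measurements, but that part is beyond a toy sketch.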
My team has been accelerating machine learning on devices and enabling AR face effects for developers (via ARCore & ML Kit). In recent months we’ve worked with Care OS, makers of smart mirror technology, to enable virtual try-ons via their hardware. Here’s a quick demo from Google I/O: