Flaming Lips + Google AI + … fruit?

What happens when you use machine learning & the capacitive-sensing properties of fruit to make music? The Flaming Lips teamed up with Google to find out:

During their performance that night, Steven Drozd from The Flaming Lips, who usually plays a variety of instruments, played a “magical bowl of fruit” for the first time. He tapped each fruit in the bowl, which then played different musical tones, “singing” the fruit’s own name. With help from Magenta, the band broke into a brand-new song, “Strawberry Orange.”

The Flaming Lips also got help from the audience: At one point, they tossed giant, blow-up “fruits” into the crowd, and each fruit was also set up as a sensor, so any audience member who got their hands on one played music, too. The end result was a cacophonous, joyous moment when a crowd truly contributed to the band’s sound.


[YouTube]
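For a rough sense of how a rig like this could work, here's a toy sketch that maps capacitive touch events to MIDI notes. The read_touched_pads() helper, the fruit-to-note assignments, and the use of the mido library are hypothetical stand-ins for illustration; the real setup involved Magenta and custom hardware:

```python
# Illustrative sketch only: map capacitive touch events on a bowl of fruit
# to MIDI notes. read_touched_pads() is a hypothetical stand-in for whatever
# the real capacitive-sensing hardware reports.
import time
import mido

# Each fruit gets its own MIDI note (middle C upward) -- made-up mapping.
FRUIT_NOTES = {
    "strawberry": 60,
    "orange": 62,
    "banana": 64,
    "apple": 65,
}

def read_touched_pads():
    """Hypothetical: return the names of fruits currently being touched."""
    return []  # replace with real sensor polling

with mido.open_output() as synth:  # default MIDI output port
    previously_touched = set()
    while True:
        touched = set(read_touched_pads())
        for fruit in touched - previously_touched:   # newly tapped fruit
            synth.send(mido.Message("note_on", note=FRUIT_NOTES[fruit], velocity=100))
        for fruit in previously_touched - touched:   # fruit released
            synth.send(mido.Message("note_off", note=FRUIT_NOTES[fruit]))
        previously_touched = touched
        time.sleep(0.01)
```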

Google Glass resurrected, this time for enterprise

The original Glass will be to AR wearables as the Apple Newton was to smartphones—ambitious, groundbreaking, unfocused, premature. After that first… well, learning experience… Google didn’t give up, and folks have cranked away quietly to find product-market fit. Check out the new device—dramatically faster, more extensible, and focused on specific professionals in medicine, manufacturing, and more:


[YouTube]

Check out “Dance Like,” a fun ML-driven app from Google

My team has been collaborating with TensorFlow Lite & researchers working on human-pose estimation (see many previous posts) to accelerate on-device machine learning & enable things like the fun “Dance Like” app on iOS & Android:

[YouTube]
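If you're curious what on-device pose inference looks like in code, here's a minimal sketch using the TensorFlow Lite Python interpreter. The model file name ("posenet.tflite"), the single-image input, and the output interpretation are assumptions for illustration; the real Dance Like pipeline (live video, pose matching, scoring) is much more involved:

```python
# Minimal sketch of on-device pose estimation with TensorFlow Lite.
# Assumes a PoseNet-style model file ("posenet.tflite") that takes a single
# RGB image and outputs keypoint heatmaps -- both are illustrative assumptions.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="posenet.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Build a dummy frame sized to whatever the model expects (e.g. 1x257x257x3).
_, height, width, channels = input_details[0]["shape"]
frame = np.random.uniform(-1, 1, size=(1, height, width, channels)).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

heatmaps = interpreter.get_tensor(output_details[0]["index"])
print("keypoint heatmap tensor shape:", heatmaps.shape)
```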

Here’s brave Aussie PM Tim Davis busting the coldest emotes [note to my sons: am I saying this right?] while demoing on stage at Google I/O (starting at 42:55, in case the embed gets wonky):


[YouTube]

AR: Nike aims for visual foot-size estimation, jersey try-on

Hmm… am I a size 10.5 or 11 in this brand? These questions are notoriously tough to answer without trying on physical goods, and cracking the code for reliable size estimation promises to enable more online shoe buying with fewer returns.

Now Nike seems to have cracked said code. The Verge writes,

With this new AR feature, Nike says it can measure each foot individually — the size, shape, and volume — with accuracy within 2 millimeters and then suggest the specific size of Nike shoe for the style that you’re looking at. It does this by matching your measurements to the internal volume already known for each of its shoes, and the purchase data of people with similar-sized feet.

Seems like size estimation could be easily paired with visualization a la Wanna Kicks.
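Nike hasn't published the matching algorithm, but the description above suggests something like a nearest-fit lookup of your measurements against known internal dimensions per size. Here's a toy sketch with made-up numbers and a hypothetical catalog; it ignores the shape, volume, and purchase-history signals Nike describes:

```python
# Toy illustration only: recommend a shoe size by matching a measured foot
# length against a catalog of known internal lengths per size. The numbers
# and the catalog are made up.

# Hypothetical internal length (mm) for each US size of one shoe style.
INTERNAL_LENGTH_MM = {
    9.0: 270.0,
    9.5: 274.0,
    10.0: 278.0,
    10.5: 282.0,
    11.0: 286.0,
}

def recommend_size(foot_length_mm: float, toe_room_mm: float = 8.0) -> float:
    """Pick the size whose internal length best fits foot length plus toe room."""
    target = foot_length_mm + toe_room_mm
    return min(INTERNAL_LENGTH_MM, key=lambda size: abs(INTERNAL_LENGTH_MM[size] - target))

print(recommend_size(275.0))  # -> 10.5 with these made-up numbers
```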


On a semi-related note, Nike has also partnered with Snapchat to enable virtual try-on of soccer jerseys:

[image]