Google collaborated with artist Es Devlin to help London passersby contribute to an ever-evolving poem projected on Nelson’s Column in Trafalgar Square.
Cast in 1867, the four monumental lions in Trafalgar Square have been sitting as silent British icons at the base of Nelson’s Column for the past 150 years. Overnight on Monday 17 September, a fifth fluorescent red lion will join the pride. This new lion will roar poetry, and the words it roars will be chosen by the public. Everyone is invited to “feed the lion”, but this lion only eats words.
Ah—so this is the backstory on the large installation now populating our lobby.
The flowers are built using Raspberry Pi running Android Things, our Android platform for everyday devices like home speakers, smart screens and wearables. An “alpha flower” has a camera in it and uses an embedded TensorFlow neural net to analyze which emotion it sees, and the surrounding flowers change colors based on the image the camera captures of your face. All processing is done locally, so no data is saved or sent to any servers.
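The plumbing here is simple to picture: classify the camera frame locally, then map the detected emotion to a flower color. Below is a minimal sketch of that loop; the emotion labels, colors, and the `classify_emotion()` stub are my own placeholders, not the project's actual code — on the real device that stub would be the embedded TensorFlow net running on the Raspberry Pi, with nothing leaving the device.

```python
# Hypothetical emotion -> RGB mapping for the surrounding flowers.
EMOTION_COLORS = {
    "joy": (255, 200, 0),      # warm yellow
    "sadness": (0, 80, 255),   # blue
    "anger": (255, 0, 0),      # red
    "calm": (0, 255, 120),     # green
}

def classify_emotion(frame):
    """Stand-in for the on-device TensorFlow net.

    A real implementation would run local inference on the camera frame;
    this stub just returns a fixed label for illustration.
    """
    return "joy"

def update_flowers(frame):
    """Map the detected emotion to an RGB color for the flowers."""
    emotion = classify_emotion(frame)
    return EMOTION_COLORS.get(emotion, (255, 255, 255))

print(update_flowers(frame=None))  # -> (255, 200, 0)
```

The key property the quote emphasizes — no data saved or sent anywhere — falls out of keeping both steps on the device.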
The primary innovation in Sononym is something called “similarity search”, which enables users to find similar-sounding samples in their sample collection based on any source sound. Essentially, it’s a bit like how Google’s reverse image search works, but with audio.
The initial release focuses strictly on the core functionality of the software: offering similarity search that works with large collections of samples. Technically, our approach is a combination of feature extraction, machine learning and modern web technologies.
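The idea reduces to: turn each sample into a feature vector, then rank the library by distance to the query's vector. Here's a toy sketch of that ranking step — the feature vectors are made up (imagine spectral descriptors), and cosine similarity stands in for whatever learned metric Sononym actually uses.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Hypothetical per-sample feature vectors, e.g. from audio feature extraction.
library = {
    "kick_01.wav": [0.90, 0.10, 0.00],
    "snare_03.wav": [0.10, 0.80, 0.30],
    "kick_77.wav": [0.85, 0.15, 0.05],
}

def most_similar(query_vec, library):
    """Rank library samples by similarity to the query sound, best first."""
    return sorted(library,
                  key=lambda name: cosine_similarity(query_vec, library[name]),
                  reverse=True)

print(most_similar([0.90, 0.12, 0.02], library))
# the two kick samples rank above the snare
```

A real system would get those vectors from the audio itself (and use an index rather than a full sort for large collections), but the “reverse image search, for audio” analogy is exactly this: nearest neighbors in a feature space.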
Not entirely dissimilar: Font Map helps you see relationships across more than 750 web fonts.
To demonstrate what’s possible, we built an animated short over the course of three days.
To do it, we invited some like-minded artists who share our vision to set up a live cloud-based animation studio on the second floor of Moscone Center. These artists worked throughout the three days of the show to model, animate, and render the spot, and deliver a finished short. […]
We used Zync Render, a Renderfarm-as-a-Service running on GCP that can be deployed in minutes and works with major 3D applications and renderers. The final piece was rendered in V-Ray for Maya.
Zync is able to deploy up to 500 render workers per project, up to a total of 48,000 vCPUs.
Pretty dope—though in my heart, these dabbing robots won’t ever compete with my then-5yo son Finn as a dancing robot:
My crazy-talented buddy Dave (whose hiring at Adobe is one of the best things for which I can take fragmentary credit) has created an interactive mystery using—and showing off—Adobe Character Animator: