Google showcased its cloud-rendering & collaboration chops by deploying a cloud-based animation studio, enabling a creative team to design & render this short over three days:
To demonstrate what’s possible, we built an animated short over the course of three days.
To do it, we invited some like-minded artists who share our vision to set up a live cloud-based animation studio on the second floor of Moscone Center. These artists worked throughout the three days of the show to model, animate, and render the spot, and deliver a finished short. […]
We used Zync Render, a render-farm-as-a-service running on GCP that can be deployed in minutes and works with major 3D applications and renderers. The final piece was rendered in V-Ray for Maya.
Zync is able to deploy up to 500 render workers per project, up to a total of 48,000 vCPUs.
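For scale, those quoted ceilings imply roughly 96 vCPUs per render worker. A quick back-of-envelope sketch (this even split is my assumption; the quote doesn't say how vCPUs are actually distributed across workers):

```python
# Back-of-envelope from the quoted Zync limits.
# Assumes vCPUs divide evenly across the maximum worker count (not a documented spec).
max_workers = 500
max_vcpus = 48_000

vcpus_per_worker = max_vcpus // max_workers
print(vcpus_per_worker)  # → 96
```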
Pretty dope—though in my heart, these dabbing robots won’t ever compete with my then-5yo son Finn as a dancing robot:
[YouTube 1 & 2]
NASA, using a digital 3D model of the Moon built from Lunar Reconnaissance Orbiter global elevation maps and image mosaics, produced this lovely tour of our nearby neighbor. The lighting is derived from actual Sun angles during lunar days in 2018.
The filmmakers write,
The visuals were composed like a nature documentary, with clean cuts and a mostly stationary virtual camera. The viewer follows the Sun throughout a lunar day, seeing sunrises and then sunsets over prominent features on the Moon. The sprawling ray system surrounding Copernicus crater, for example, is revealed beneath receding shadows at sunrise and later slips back into darkness as night encroaches.
Gonna be a hot time in the (deeply ill-conceived) virtual town tonight:
Some cool making-of details:
The results look so realistic that they could almost be stop-motion. “I built a big virtual set, I guess, is how you could describe it,” he said. “The characters are like stop-motion marionettes in a way; they have joints to the arms and the knees and all of that, and controllers.” He then used a low-budget motion-capture process — a D.I.Y. version of Hollywood’s green screens and Ping-Pong-ball suits — using the Xbox Kinect and special software. “It sees you doing the motions you want the character to do, and then you can transfer that to the animation so you can transfer that onto your characters,” he said. He considered having a giant robot attack Cardboard City, and then settled on fire: that looked kind of cool, too.
#BobRossIsABoss—weirdly brilliant! Kottke writes,
As a fundraiser for the Leukemia & Lymphoma Society, Micah Sherman and Mark Stetson produced a web series called The Bob Ross Challenge in which 13 comedians attempt to paint along with Bob Ross as he does his thing with the trees and little fluffy clouds. Here’s the first episode, featuring Aparna Nancherla:
My crazy-talented buddy Dave (whose hiring at Adobe is one of the best things for which I can take fragmentary credit) has created an interactive mystery using—and showing off—Adobe Character Animator:
As a special bonus, you can download the rigged puppets from Dave’s site. (Hat tip to AE superfans who grok some of the character names. 😌)
Fun stuff from the Shanghai office:
In order to give everyone the opportunity to experience just how natural AI-powered interactions can now be, we’re launching 猜画小歌 (“Guess My Sketch”) from Google AI, a fun, social WeChat Mini Program in which players team up with our AI to sketch everyday items in a race against the clock. In each round, players sketch the given word (like “dog”, “clock”, or “shoe”) for their AI teammate to guess correctly before time runs out.
When the AI successfully guesses your sketch, you’ll move on to the next round and increase your sketching streak. You can invite friends and family to compete for the longest streak, share interesting sketches with each other, and collect new words and drawings as you continue playing.
Nifty, even if it doesn’t include the actual images produced on-device. More details.
“So we beat on, boats against the current, borne back ceaselessly into the past…”
Sitting in my parents’ house, surrounded by my dad’s old college books and mine, I’m struck by a certain melancholy—a mix of memory, gratitude, and loss. As it happens, Margot just told me about Insta Repeat, a feed that catalogs the repetitiousness of Instagram photography. This makes me think of “vemödalen” (“the frustration of photographing something amazing when thousands of identical photos already exist”)—and a quick search of this blog for that term shows just how unoriginal my own use of it has become:
Ah well—so it goes. Until next time…
Heh—here’s to making things gratuitously nicer & more fun than strictly needed:
Bean machine busts the freshest emotes of 2018* while rocking out to Blue Danube. Enjoy:
*Why yes, I am constantly being force-fed a bunch of Fortnite gibberish by a 10-year-old. 😌