Monthly Archives: February 2019

3-wheelin’ EV feeling

“Oh, that’s easy,” said my wife on our first date, answering my question about what kind of car she’d be: “I’d be one of those little three-wheeled French jobs like Audrey Hepburn drove in Funny Face.” Ever since then we’ve had a thing for three-wheelers, putting one on our save-the-date wedding card.

It’s hard to imagine the electric Nobe car really hitting the highways, but I love the look of it:



[YouTube] [Via]

Googlers win VFX Oscars

Congrats to Paul Debevec, Xueming Yu, Wan-Chun Alex Ma, and their former colleague Timothy Hawkins for the recognition of their groundbreaking Light Stage work!


Now they’re working with my extended team:

“We try to bring our knowledge and background to try to make better Google products,” Ma says. “We’re working on improving the realism of VR and AR experiences.”

I go full SNL Sue thinking about what might be possible.


Oh, and they worked on Ready Player One (nominated for Best Visual Effects this year) and won for Blade Runner 2049 last year:

Just prior to heading to Google, they worked on “Blade Runner 2049,” which took home the Oscar for Best Visual Effects last year and brought back the character Rachael from the original “Blade Runner” movie. The new Rachael was constructed with facial features from the original actress, Sean Young, and another actress, Loren Peta, to make the character appear to be the same age she was in the first film.

Check out their work in action:

[YouTube 1 & 2]

A Google UX opening to work on cool streaming tech

The listing for Interaction Designer, Enterprise Design in Munich might sound a touch dry at first, but it’s related to a very interesting project:

We’ve been working on Project Stream, a technical test to solve some of the biggest challenges of streaming. For this test, we’re going to push the limits with one of the most demanding applications for streaming—a blockbuster video game.

We’ve partnered with one of the most innovative and successful video game publishers, Ubisoft, to stream their soon-to-be released Assassin’s Creed Odyssey® to your Chrome browser on a laptop or desktop. Starting on October 5, a limited number of participants will get to play the latest in this best-selling franchise at no charge for the duration of the Project Stream test.

“Would You Like To Know More?” Check out the listing to see if it’s a fit for you.


A peek at Oppo’s new 10x optical zoom for phones

Looks pretty nifty, though it’s interesting that it doesn’t (at least currently) work for capturing video or macro shots:

The Verge explains,

The key component to Oppo’s system is a periscope setup inside the phone: light comes in through one lens, gets reflected by a mirror into an array of additional lenses, and then arrives at the image sensor, which sits perpendicular to the body of the phone. That’s responsible for the telephoto lens in Oppo’s array, which has a 35mm equivalence of 160mm. Between that lens, a regular wide-angle lens, and a superwide-angle that’s 16mm-equivalent, you get the full 10x range that Oppo promises.
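The "10x" claim is just the ratio of the longest to the shortest focal length in the array. A quick back-of-the-envelope sketch using the 35mm-equivalent figures quoted above (variable names are mine):

```python
# 35mm-equivalent focal lengths quoted in the Verge piece
ultrawide_mm = 16    # superwide-angle lens
telephoto_mm = 160   # periscope telephoto lens

# Optical zoom range = longest focal length / shortest focal length
zoom_range = telephoto_mm / ultrawide_mm
print(f"{zoom_range:.0f}x")  # → 10x
```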


ZOMG: A life-sized Lego Westy

Ladies & gentlemen, we are approaching Peak JNack…

Using 400,000 LEGO® bricks, two experienced LEGO® model makers have built what is probably the world’s biggest camper made from LEGO® bricks. The full-size T2 was revealed at the leisure and travel fair in Munich, where visitors young and old (20 – 24 February) will be able to admire the 700 kg Bulli up close. The vehicle that served as the blueprint for the model was the T2a camper van, built from 1967 to 1971 – to this day the truly iconic camper for globetrotters.

See more photos & details here.




Photography: Beautiful orbital shots of “The World Below”

Bruce Berry (not Neil Young’s late roadie) created some beautiful time lapse imagery from images captured aboard the International Space Station:

On Vimeo he writes,

All footage has been edited, color graded, denoised, deflickered, and stabilized by myself. Some of the 4K video clips were shot at 24 frames/sec, reflecting the actual speed of the space station over the Earth. Shots taken at wider angles were sped up a bit to match the flow of the video.

Some interesting facts about the ISS: The ISS maintains an orbit above the earth with an altitude of between 330 and 435 km (205 and 270 miles). The ISS completes 15.54 orbits per day around the earth and travels at a speed of 27,600 km/h (17,100 mph).
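Those two figures are consistent with each other. A back-of-the-envelope check, assuming a circular orbit at the midpoint of the quoted altitude range and a mean Earth radius of 6,371 km:

```python
import math

EARTH_RADIUS_KM = 6371          # mean Earth radius
altitude_km = (330 + 435) / 2   # midpoint of the quoted altitude range
speed_kmh = 27_600              # quoted orbital speed

# Circumference of the (assumed circular) orbit
circumference_km = 2 * math.pi * (EARTH_RADIUS_KM + altitude_km)
period_h = circumference_km / speed_kmh
orbits_per_day = 24 / period_h

print(f"orbital period: {period_h * 60:.0f} minutes")
print(f"orbits per day: {orbits_per_day:.2f}")
```

This lands at roughly 92 minutes per orbit and about 15.6 orbits per day, right in line with the quoted 15.54.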

The yellow line that you see over the earth is Airglow/Nightglow. Airglow/Nightglow is a layer of nighttime light emissions caused by chemical reactions high in Earth’s atmosphere. A variety of reactions involving oxygen, sodium, ozone, and nitrogen result in the production of a very faint amount of light (Keck A and Miller S et al. 2013).

I love the choice of music & wondered whether it comes from Dunkirk. Close: that somewhat anxious tock-tock undertone is indeed a Hans Zimmer jam, but from 20 years earlier (The Thin Red Line).


Machine learning in your browser tracks your sweet bod

A number of our partner teams have been working on both the foundation for browser-based ML & on cool models that can run there efficiently:

We are excited to announce the release of BodyPix, an open-source machine learning model which allows for person and body-part segmentation in the browser with TensorFlow.js. With default settings, it estimates and renders person and body-part segmentation at 25 fps on a 2018 15-inch MacBook Pro, and 21 fps on an iPhone X. […]

This might all make more sense if you try a live demo here.

Check out this post for more details.

David Salesin joins Google Research

“Man, that dude looks eerily like David Salesin,” I thought the other day as I was getting coffee, “but nah, he’s wearing a new-employee badge. But wait, holy crap… that dude is David Salesin wearing an employee badge!”

Perhaps you don’t know his name, but for 11+ years David (a tango-dancing Aikido black belt) led a wing of Adobe Research, and we collaborated on more projects than I can begin to count. Now he’s at the Goog (having led Snapchat research in the interim), teaming back up with several of our fellow Adobe alums. I can’t wait to see what he does here!

Last Monday I began work as a Principal Scientist / Director at Google AI Perception, based in San Francisco. I’m excited to collaborate with so many good friends and colleagues who are already at Google, and, in time, to hire many more. Google’s products and reach are incredibly broad, and so is the mandate for my lab: I look forward to continuing to invent tools for creative expression, as well as to begin working on some brand new far-reaching challenges potentially well outside my area of expertise, like applying AI to healthcare. In my new role, I’m energized to grow in new ways, working on projects that, in Larry Page’s words, are “uncomfortably exciting”!