I’m stupid-excited to say that I’ve just joined Google’s Skynet Machine Perception team to build kickass creative, expressive experiences, delivering augmented reality to (let’s hope) a billion+ people. I told you sh*t just got real. 🙂
Now, the following career bits may be of interest only to me (and possibly my mom), but in case you’re wondering, “Wait, don’t you work on Google Photos…?”
Well, like SNL’s Stefon, “I’ve had a weird couple of years…”
The greatly smoothed version goes basically like this:
- I joined Google in early 2014 to work on Photos. I liked to say I was “Teaching Google Photoshop,” meaning getting computers to see & synthesize like humans (making your Assistant your artist!). Among other things, we created a brand-new image editor, did some early AR face-painting work (a year+ ahead of Snapchat et al), and made movies for tens of millions of people.
- After a bit over a year, I wanted to explore some crazier photo- and video-related ideas (stuff not ready for Photos to include then, if ever), so I left the team & walked across the hall to work with & learn from Luke Wroblewski. Thus I was “working at Google on photos, just not Photos.” This was a subtle distinction, and as I was working on secret stuff, I didn’t spend time publicizing it. I remained closely involved with the ex-Nik Photos folks in building out Snapseed & the next rev of the new editor we’d started.
- Meanwhile I spent the better part of the next year thinking up, prototyping, and iterating on a bunch of little photo apps. It was a tough but enlightening process. I knew we were on to something, but I also felt like Edison saying some variant of “I have not failed. I’ve just found 10,000 ways not to make a light bulb.”
- Somewhat tired from the process & eager to make concrete contributions, I was set to join an imaging hardware team. When project plans changed, however, I agreed to help improve photography experiences on social apps including Google+.
- Having witnessed on Photos the massive importance of speed, I teamed up with my future teammates in Research to build out RAISR, a machine-learning image-upscaling library, and ship it in Google+, saving users immense amounts of bandwidth (critical in the developing world).
- Since then, and up until this week, I’ve been focusing on enterprise social needs. Though it wasn’t an area I sought out, I ended up really digging the experience, and I look forward to eventually sharing some of the rad stuff my team was building.
- And then, Google bought this little company in Belarus & my old Research friends came calling…
So now we’ve come full circle, and to capture my feelings, I’ll cite SNL yet again. Wish me luck. 🙂
More power & speed for the millions of people who use Snapseed every day:
We’re excited to announce that Snapseed 2.18 has started rolling out today to users on Android and iOS. This update includes a fresh new UI, designed for faster editing with more efficient access to your favorite features.
You’ll find Looks are now available from the main screen, making it easier than ever to apply your customized filters to your photos. Looks are a powerful way to save your favorite combinations of edits and apply them to multiple images. We’ve added 11 beautiful new presets (handcrafted by the Snapseed team) to help you get started – give them a try!
We’re also bringing the Perspective tool to iOS to allow you to easily adjust skewed lines and perfect the geometry of horizons or buildings.
Creeptastic! But quick, cool, and impressive: Visit The University of Nottingham’s demo site, and check out the project site for more details. As The Verge writes,
“3D face reconstruction is a fundamental computer vision problem of extraordinary difficulty.” You usually need multiple pictures of the same face from different angles in order to map every contour. But, by feeding a bunch of photographs and corresponding 3D models into a neural network, the researchers were able to teach an AI system how to quickly extrapolate the shape of a face from a single photo.
[Via Alex Kauffmann]
“Arbiter of Focus”—that’s how David Lieb, former CEO of Bump and now head of product for Google Photos, describes a PM’s job. Elsewhere I’ve heard, “What game are we playing, and how do we keep score?” In a similar vein, I found resonance in these remarks from Francis Ford Coppola:
Q. What is the one thing to keep in mind when making a film?
A. When you make a movie, always try to discover what the theme of the movie is in one or two words. Every time I made a film, I always knew what I thought the theme was, the core, in one word. In “The Godfather,” it was succession. In “The Conversation,” it was privacy. In “Apocalypse,” it was morality. The reason it’s important to have this is because most of the time what a director really does is make decisions. All day long: Do you want it to be long hair or short hair? Do you want a dress or pants? Do you want a beard or no beard? There are many times when you don’t know the answer. Knowing what the theme is always helps you.
Here’s the rest of the interview.
“That’s not an explosion—it’s just a rapid unscheduled disassembly!”
Think of the swagger a company must have to own its history of mishaps this hard. Respect!
Check out this multi-track song explorer from a cool podcast (see previous) & Google’s WebVR team:
What if you could step inside a song? Inside Music is a simple experiment that explores that idea. It features the music of Phoenix, Natalia Lafourcade, Perfume Genius, Alarm Will Sound, Clipping, and Ibeyi.
If you’re a musician, you can explore your own songs in VR or put them up on the web for others to explore.
[YouTube 1 and 2] [Via]
I’m eager to try this out:
When framing a subject, you’ll have a number of different lighting options to choose from for giving your portrait different looks — things like Contour Light, Natural Light, Studio Light, Stage Light, and Stage Light Mono.
These “aren’t filters,” Apple says. Instead, the phone studies your subject’s face and uses machine learning to calculate the look based on light that’s actually in the scene.
Check out PetaPixel or Apple’s site for larger sample images.
Anyone I’ve interviewed at Google will immediately know why I find this so great—but it would be great no matter what: