Monthly Archives: November 2016

Snapseed enhances White Balance, raw support, and more

More powerful & easier to use; I’ll let the team explain:

Snapseed 2.13 started rolling out today. This version includes an improved UI for selecting and changing parameters. In addition to swiping up and down to choose parameters, you can now also tap the adjust icon on the bottom bar to bring up a tap-enabled parameter selector. The selected parameter will always be shown as a slider at the top of the screen. To adjust the parameter you can still swipe left and right anywhere on the screen as before.

On iOS, this release now also includes the dedicated White Balance tool that got launched on Android a while ago. This tool allows you to adjust the colors in your photo to look more natural. Just choose the auto correct option, or use the included color picker for fine control.

Finally, when opening raw images that have been captured with a creative setting on your camera Snapseed will now show the default raw colors. Previously the embedded color profiles sometimes limited the creative choices in editing your raw files.

Your feedback is, as always, most welcome!


Google Quickdraw is magical

Just go try this thing, or watch the short demo below. Meanwhile I’ll be trying to pry my guffawing 8-year-old Finn away from the iPad. :-p

Wired notes,

Of course, AI Experiments isn’t just a free education for neural network nitwits. Every interaction, be it with Quick, Draw! or one of the other applets in this virtual playground, improves Google’s ability to more nimbly recognize images and language. That makes the company’s products stronger, but it also services users. The data fuels apps like Google Photos, which uses AI to swiftly organize all your pictures. It’s a system of give and take—and with games like Quick, Draw!, it’s fun, too.


A paper robot owl helps teach kids coding

 The Micronaxx really want this guy (currently on Kickstarter). (Too bad I didn’t get an autograph from creator Brent Bushnell, who keynoted Google I/O Youth this year.):

Oomiyu (pronounced “umiyoo”) is a maker kit that allows beginner inventors (8 years and up) to build a fun, customized and interactive paper craft robotic owl while getting a taste of basic mechanical principles, electronics and programming based on an Arduino 101.


“Lost in Light” — lovely timelapses of starry skies

By day Googler Sriram Murali keeps spam out of your inbox; by night he captures thrilling images of the stars whirling past us—or rather, of us whirling past them:

He writes,

Lost in Light, a short film on how light pollution affects the view of the night skies. Shot mostly in California, the movie shows how the view gets progressively better as you move away from the lights. Finding locations to shoot at every level of light pollution was a challenge and getting to the darkest skies with no light pollution was a journey in itself. Here’s why I think we should care more.

The night skies remind us of our place in the Universe. Imagine if we lived under skies full of stars. That reminder we are a tiny part of this cosmos, the awe and a special connection with this remarkable world would make us much better beings – more thoughtful, inquisitive, empathetic, kind and caring. Imagine kids growing up passionate about astronomy looking for answers and how advanced humankind would be, how connected and caring we’d feel with one another, how noble and adventurous we’d be. How compassionate with fellow species on Earth and how one with Nature we’d feel. Imagine a world where happiness of the soul is more beautiful. Ah, I feel so close to inner peace. I can only wonder how my and millions of other lives would have changed.

On a related note, check out how two towns in Colorado have become a haven for star watchers.



How To: Changing photo dates in Google Photos

So, now that you’ve downloaded PhotoScan & digitized a bunch of images, how can you give them proper dates? Here’s how:

In Google Photos on the Web, just select the group of photos you’d like to adjust and click “Edit date & time” in the dropdown menu. You’ll be able to shift or set the time stamps, and preview the changes before saving.


Here’s a quick video demo (showing how to edit one image, but applicable to multiple simultaneously):


“Like Bob Ross meets Lawnmower Man”: Adobe previews “Wet Brush”

Old Man Nack needs a trigger warning for stuff like this… There’s almost no better way for a Photoshop PM to break his or her own heart than to ship new painting features. I remember crestfallen Kevin Connor after Photoshop 7 introduced a ton of new power, and when we shipped the 3D Mixer Brush in CS5, we heard crickets again. Even amazing tech like MoXi, Fresh Paint, Expresii, and Mischief never seems to find much of an audience. Most people can’t paint, will never paint, and DGAF.

But, what the heck? WetBrush looks cool, and the 3D extrusion looks like a good fit for the laser carving/printing techniques Russell Brown debuted a few years back.



SprayPrinter: Bluetooth-powered graffiti writing

How cool is this thing?

SprayPrinter lets you spray paint images from your phone. It knows where to release paint so it’s perfect for creatives at any level.

While SprayPrinter may seem like it operates on unicorn magic, it’s not quite so mysterious: SprayPrinter communicates to your phone via bluetooth and an LED-light on the printer. The image is printed layer by layer, pixel by pixel, and you can use as many colors as you like.
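SprayPrinter hasn’t published its protocol, but the “layer by layer, pixel by pixel” description suggests a simple planning model: quantize the image to the paint colors you have on hand, then emit, per color, the positions where the nozzle should fire. Here’s a minimal sketch of that idea — everything here (function name, palette format) is my own invention, not the real SprayPrinter API:

```python
import numpy as np

def plan_spray_passes(img, palette):
    """Quantize an (H, W, 3) image to a small paint palette and return,
    per palette color, the pixel coordinates where the can should fire.
    A guess at how a SprayPrinter-style device might plan its passes;
    the real protocol isn't public."""
    h, w, _ = img.shape
    pixels = img.reshape(-1, 3).astype(float)
    pal = np.asarray(palette, dtype=float)
    # Nearest palette color per pixel (Euclidean distance in RGB).
    dists = np.linalg.norm(pixels[:, None, :] - pal[None, :, :], axis=2)
    nearest = dists.argmin(axis=1).reshape(h, w)
    passes = {}
    for ci in range(len(pal)):
        ys, xs = np.nonzero(nearest == ci)
        passes[ci] = list(zip(ys.tolist(), xs.tolist()))
    return passes

# Toy image: left half red-ish, right half blue-ish; two cans of paint.
img = np.zeros((2, 4, 3))
img[:, :2] = [200, 10, 10]
img[:, 2:] = [10, 10, 220]
passes = plan_spray_passes(img, palette=[[255, 0, 0], [0, 0, 255]])
print(len(passes[0]), len(passes[1]))  # 4 4
```

One pass per color matches the “as many colors as you like” claim: you’d swap cans between passes and spray only that color’s coordinate list.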



Google Brain enables realtime remixing of styles

Style transfer gets super fast & interactive:

TechCrunch writes,

The latest work from Google Brain, however, makes style transfer almost trivial to compute… Instead of learning the look of a single painting, the new style transfer network learns the style shared by multiple paintings by the same artist. […]

In this case, it’s all being done by one super-efficient neural network that knows and can combine dozens of styles based on lower-level features. That may sound academic, but it’s actually a major step forward — a highly generalizable model.




An art director’s rather brilliant Instagram self-promo

Hats off to the clever & industrious Aric Guite:

The idea began with Aric making a list of his top 30 art directors. He combed through each of their Instagram feeds and selected one iconic photo. Using the photo as inspiration, Aric shot a second photo that complemented the subject matter. The two photos were then posted to Aric’s feed, with each art director tagged along with a caption asking to collaborate. Together, the photos create an entirely fresh and one-of-a-kind promo piece that is unique to each art director.


[Vimeo] [Via]

Google Photos introduces new concept movies

The first batch of movie concepts (“the kinds of movies you might make yourself, if you just had the time”) that Photos introduced in September has been really well received, and now the team is rolling out more:

More automatic movies, made for you: baby’s first months, holiday traditions, highlights from the year, and more.

As before, just live your life, back up your pics, and keep an eye out for movies arriving via the Assistant in Photos.



Adobe shows off “Project Dali” for painting in VR

“Why doesn’t designing feel like dancing?” I used to ask Photoshop teammates. Then they’d stare back blankly and I’d say, “Yeah yeah—crack don’t smoke itself…”

But here’s to the crazy ones, and Erik Natzke’s work has long inspired me. Seeing a talk of his years ago, in which he showed how he’d build custom interfaces in Flash that let other artists customize images & animation, sent me on a years-long inquiry into what could happen if Flash or HTML were a layer type in Adobe apps. The point is, he tends to open eyes & get juices flowing.

Thus I’m excited to see Erik & co. working on “Project Dali”:

Erik writes,

I don’t think of Project Dali as digital or analog. It’s something that mixes the two and comes out completely unique. It could incorporate texture (think of the exquisite feel of graphite) and time (your paint is drying) with the unending flexibility of digital. It takes art that used to feel static and lets us manipulate it in three-dimensional space. In the process, the art becomes different, magical.

I’m starting to think about it like a musical instrument: If you are a musician, your instrument enables your creativity; it doesn’t stand between you and the idea in your head. And just like with VR, you learn by playing.

I can’t wait to take it for a spin & see how it evolves.



The Google Photos editor gets smarter & more powerful

“Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.” – Antoine de Saint-Exupéry


When I joined the Google Photos team, they’d just integrated Snapseed into Google+ (the predecessor of Photos). As I hope is obvious, I’m a huge Snapseed fan, but when we looked at what most users actually did in G+ (crop, rotate, tweak brightness, and maybe apply a filter), it became clear that Snapseed was dramatically more complex & powerful than they needed.

Therefore we made the hard decision to reset & build a new editor from scratch. We aimed to deliver great results in a single tap, offer just a few powerful sliders (which under the hood adjusted numerous parameters), and keep Snapseed just one extra tap away (via the overflow menu) for nerds like me.

The vision was always to keep learning from users’ behavior, then thoughtfully enable additional controls only where they deliver real extra power. I’m delighted to say that Photos now does just that: the update released Tuesday on iOS, Android, and Web (try it here) manages to keep a simple top-level UI while revealing a lot more of the power under the hood.

The filters UI applies Auto (which can now produce more accurate results) as part of every filter:

These unique looks make edits based on the individual photo and its brightness, darkness, warmth, or saturation, before applying the style. All looks use machine intelligence to complement the content of your photo.


In the adjustments section, in addition to the Light, Color, and Pop sliders:

  • Light opens to reveal Exposure, Contrast, Highlights, Shadows, Whites, Blacks, and Vignette
  • Color opens to reveal Saturation, Warmth, Tint, Skin Tone, and Deep Blue

I continue to find Auto to be highly effective for the bulk of my images, but I like being able to pop the hood when needed.

Please take the new features for a spin & let us know what you think!

Oh, and since you’ve been kind enough to read this far, here are some useful shortcuts for use on desktop:

  • E to enter & exit the editor
  • R to enter & exit crop/rotate
  • Shift-R to rotate 90°
  • A to Auto Enhance
  • O (press & hold) to see original
  • Z to zoom
  • Left/right arrows to move among images
  • Cmd-C/V to copy/paste edits among images
  • After clicking a slider, use arrow keys to adjust it & press Tab to put focus onto the next slider

Check out Google Earth in VR

This looks totally bananas:

Ten years ago, Google Earth began as an effort to help people everywhere explore our planet. And now, with more than two billion downloads, many have. Today, we are introducing Google Earth VR as our next step to help the world see the world. With Earth VR, you can fly over a city, stand at the top of the highest peaks, and even soar into space. 

You can grab it now for the HTC Vive.



Introducing Google PhotoScan

“Photos from the past, meet scanner from the future.” I think you’re gonna dig this (available now on Android & iOS). 🙂

Don’t just take a picture of a picture. Create enhanced digital scans, with automatic edge detection, perspective correction, and smart rotation.

PhotoScan stitches multiple images together to remove glare and improve the quality of your scans.
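Google hasn’t spelled out PhotoScan’s exact pipeline, but the core trick of killing glare with multiple captures is easy to picture: glare is view-dependent, so a hotspot that appears in one shot is absent in the others, and once the frames are aligned, a per-pixel minimum (or a gentler robust median) suppresses it. A toy NumPy sketch, assuming the frames are already aligned:

```python
import numpy as np

def remove_glare(frames):
    """Combine several aligned captures of the same print.

    Glare is specular and view-dependent: it shows up as a bright patch
    in a different place in each frame. A per-pixel minimum across the
    frames keeps the un-glared value of every pixel. (A median is a
    gentler alternative if exposure varies between shots.)
    """
    stack = np.stack(frames, axis=0).astype(float)  # (N, H, W)
    return stack.min(axis=0)

# Toy example: a flat gray "photo" with a blown-out glare blob in a
# different corner of each of three captures.
base = np.full((8, 8), 100.0)
captures = []
for r, c in [(0, 0), (0, 6), (6, 0)]:
    f = base.copy()
    f[r:r + 2, c:c + 2] = 255.0  # glare patch
    captures.append(f)

clean = remove_glare(captures)
print(clean.max())  # 100.0 -- every glare patch is suppressed
```

The real app obviously has to do the alignment (that’s the “stitching”), but this is why it asks you to move the phone to four corners: each position moves the glare somewhere new.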

Check it out:

So, how does it work? Let’s hear right from the team:

Enjoy, and as always please let us know what you think!


[YouTube 1 & 2]

Check out Google RAISR: Sharp images with machine learning

If you share a picture of a tree in a forest, but no one can see it, did you really share it?

Working at Google, where teams aspire to “three-comma moments” (i.e. reaching 1,000,000,000 users), it’s become overwhelmingly clear to me that all the fancy features in the world don’t mean squat if people can’t access them. And traveling in Nepal, I got a taste of just how slow & expensive connectivity can be. Anything that helps deliver content faster & more cheaply means more democratic access to ideas & inspiration.

That’s why I’ve been really excited about RAISR (“Rapid and Accurate Image Super-Resolution”). The Google Research team writes,

[It’s] a technique that incorporates machine learning in order to produce high-quality versions of low-resolution images. RAISR produces results that are comparable to or better than the currently available super-resolution methods, and does so roughly 10 to 100 times faster, allowing it to be run on a typical mobile device in real-time.
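What makes RAISR so fast is that it isn’t a deep network at inference time: per the public paper, it does a cheap upscale, then applies one of many small pre-learned filters, chosen per patch by hashing local gradient statistics (angle, strength, coherence). Here’s a heavily simplified sketch of that structure — a two-filter “bank” and a strength-only hash, both stand-ins for the thousands of filters the real method learns from low-res/high-res pairs:

```python
import numpy as np

def cheap_upscale(img, s=2):
    """Nearest-neighbor upscale -- a stand-in for RAISR's cheap initial
    interpolation (the paper uses bilinear)."""
    return np.kron(img, np.ones((s, s)))

def hash_bucket(patch):
    """Toy stand-in for RAISR's gradient hash: the real method buckets
    patches by gradient angle, strength, and coherence. Here we bucket
    only by mean gradient strength."""
    gy, gx = np.gradient(patch)
    return 1 if np.hypot(gx, gy).mean() > 5.0 else 0

# Tiny "learned" filter bank: identity for flat patches, a sharpening
# kernel for edgy ones.
FILTERS = {
    0: np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], float),      # flat: pass through
    1: np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], float),  # edge: sharpen
}

def raisr_like(img, s=2):
    up = cheap_upscale(img, s)
    out = up.copy()
    H, W = up.shape
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            patch = up[i - 1:i + 2, j - 1:j + 2]
            out[i, j] = (patch * FILTERS[hash_bucket(patch)]).sum()
    return out

lo = np.zeros((4, 4))
lo[:, 2:] = 50.0          # a vertical edge
hi = raisr_like(lo)
print(hi.shape)  # (8, 8)
```

Because selection is a hash lookup and each filter is a tiny convolution, the whole thing runs in real time on a phone — no GPU-sized model required, which is exactly the speed claim in the quote above.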


I’ve been championing this tech within the company and—because the research paper is public—encouraging friends at Facebook, Adobe, Apple, and elsewhere to check it out. Fast, affordable access is good for everyone.

It’s funny: I came here to “teach Google Photoshop” (i.e. to make computers see & create like artists), yet if I do my job right, you’ll never spot a thing. I’ve come to prioritize access far ahead of synthesis. Funny ol’ world.

PS—Obligatory (?) Old Man Nack remark: “In my day, Genuine Fractals, blah blah…”

Adobe demos automatic sky-swapping

My old Photoshop boss Kevin used to show a chart that nicely depicted the march of tools from simple & broad (think Clone Stamp) to sharp & purposeful (Healing Brush), smartly tailored to specific needs. I love to see how computer vision is helping to extend that arc, as demonstrated here:

Adobe says SkyReplace uses deep learning to automatically figure out the boundary lines between the sky and the rest of the shot (e.g. buildings and ground). It can then not only swap out the old sky and insert a completely new one, but it can adjust the rest of the photo to take on the same look and feel as the new sky, creating a more realistic look.
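Adobe hasn’t detailed SkyReplace’s internals, but the description breaks into two steps: segment the sky, then composite the new one and push the foreground’s colors toward the replacement so the scene reads as one. Here’s a toy NumPy sketch of the compositing half, assuming the hard part (the deep-learned sky mask) is already given:

```python
import numpy as np

def swap_sky(img, sky_mask, new_sky, harmonize=0.3):
    """Composite a new sky into an image and nudge the foreground's
    color statistics toward the new sky's mean color.

    img, new_sky: float arrays of shape (H, W, 3) in [0, 1]
    sky_mask: boolean (H, W), True where the original sky is
    harmonize: 0 = leave foreground alone, 1 = fully match the sky's mean
    """
    out = img.copy()
    out[sky_mask] = new_sky[sky_mask]          # drop in the new sky

    # Shift the foreground's mean color part-way toward the new sky's.
    fg = ~sky_mask
    fg_mean = img[fg].mean(axis=0)
    sky_mean = new_sky.mean(axis=(0, 1))
    out[fg] = np.clip(img[fg] + harmonize * (sky_mean - fg_mean), 0, 1)
    return out

# Toy scene: gray ground under a blown-out white sky, swapped for sunset.
H, W = 6, 6
img = np.full((H, W, 3), 0.5)
img[:3] = 1.0                                  # top half: blown-out sky
mask = np.zeros((H, W), bool)
mask[:3] = True
sunset = np.zeros((H, W, 3))
sunset[...] = [1.0, 0.5, 0.2]

result = swap_sky(img, mask, sunset)
print(result[0, 0])  # sunset color where the old sky was
```

The real feature surely does something far subtler than a global mean shift (the demo relights the whole scene), but the mask-then-harmonize structure is the recognizable skeleton.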


The N-up UI reminds me of Photoshop’s early-90’s Variations dialog. Maybe graphically, as well as politically, everything old is new again.



YouTube VR arrives on Daydream

I’d love to experience a next-gen Awesome; I Fuckin’ Shot That! as a series of 360º video bubbles that I can jump among (landing on stage, in the crowd, etc.). (Ah, but can I get Moby Dick there?)

The team writes,

This new standalone app was built from the ground up and optimized for VR. You just need a Daydream-ready phone like Pixel and the new Daydream View headset and controller to get started. Every single video on the platform becomes an immersive VR experience, from 360-degree videos that let you step inside the content to standard videos shown on a virtual movie screen in the new theater mode. The app even includes some familiar features like voice search and a signed in experience so you can follow the channels you subscribe to, check out your playlists and more.


Google desktop radar can identify materials, body parts

Seriously (unless, of course, the UI demo is just some elaborate trolling). I can’t wait for social media to let you apply a “Facepalm” reaction by literally jamming your phone/palm against your face. Check out the demo & read on for details:

(Of course, in the current political climate I can’t help but think, “Great, I’m glad this is the critically important shit we spend our biggest brains on.”)



[YouTube] [Via]

Adobe demos “Audio Photoshopping”

“You’re a witch!” says Jordan Peele, having just heard words inserted in his mouth. Oh, there’s no way this gets misused, right? 😏

When recording voiceovers, dialogue, and narration, wouldn’t you love the option to edit or insert a few words without the hassle of recreating the recording environment or bringing the voiceover artist in for another session? #VoCo allows you to change words in a voiceover simply by typing new words.



Visual storytelling: Harry Caray calls the Cubs’ final out

I couldn’t take it; I just couldn’t take it.

In 1984, for the first time since 1945 (when my lifelong-fan dad was 5 years old), the Cubs won their division championship & reached the postseason. I remember sitting in the living room with my dad watching Harry Caray call Rick Sutcliffe’s winning pitch on WGN. Holy Cow!!

The Cubs proceeded to take a two-games-to-none lead in the best-of-five series with the San Diego Padres. No problem! Just win one game—one of three!—and they were through to the Series. Then… then they went full-on Cub, and during that fateful game 5 my cousin Andy & I headed upstairs to play Lego boats in the tub while the team blew their lead, the series, their shot. I just couldn’t take it.

And then… this past Wednesday. You know what happened. We went insane.

I’ll be honest, though, and confess just a little melancholy. It’s not 1984, and I’m not a kid watching the game with my dad. I’m thrilled for the team, but I can still tell you more players’ names from ’84 than from ’16. I’ve long lived in California and won’t tell you I follow the game or the team as I once did. The victory is great—amazing!—but… you can’t quite go home again.

Ah, but what if you could, just a little? Thanks to some clever & lightning-fast editing, Budweiser brought back to life that iconic voice of my youth, Harry “Cub fan, Bud man” Caray. Just watch. I can’t stop.

And Harry, wherever you are, thanks. This Bud’s for you.



Illustration: Hilariously unsatisfying things

Resistentialism: 1. “a jocular theory to describe ‘seemingly spiteful behavior manifested by inanimate objects,’ where objects that cause problems (like lost keys or a runaway bouncy ball) are said to exhibit a high degree of malice toward humans.” 2. Pretty much my wife’s entire life (according to her).

Like Bill Clinton, the creators of The Unsatisfying Challenge feel our pain. Come, let us groan together:

During the summer of 2016, we created and directed a video about unsatisfying situations: the frustrating, annoying, disappointing little things of everyday life that are so painful to live or even to watch. We quickly realized that there are a lot of other situations that would be fun to see animated, so we decided to run an animation challenge around this idea.



An expressive new camera app coming from Facebook

Snapchat often describes itself as being “a camera company,” and clearly Facebook (owning Instagram, and having bought MSQRD) is taking a similar path:

The new camera will be accessible by simply swiping right while you’re looking at your news feed, making it easy to quickly snap a filtered moment to share.

 TechCrunch reports that there will be filters for your face (selfie masks, overlaid graphics, and geofilters), art-themed filters, and filters that “respond to your body’s movements.” 

We’ll have to see, though: A lot of the appeal of Snapchat’s filters has been in their social context: Here today, gone today—transient fun. As I’ve often said, the genius of Instagram was in helping regular people be a bit better, and the genius of Snapchat has been in letting them not care. Posting lasting stuff to FB generally equals caring. But who knows, perhaps like Instagram FB will introduce “here today, gone today” ephemeral stories. (I’d certainly use them.) Interesting times.



Adobe + Reuters, sittin’ in a tree

Another day, another recollection from Old Man Nack. 👴🏻

It’s funny to see things come to fruition… at their own… damn… pace. Back when we shipped Bridge in 2005, photographers were extraordinarily pissed. They hated the presence of royalty-free stock search in that app. They saw Adobe as making them purchase & underwrite the seeds of their livelihoods’ destruction. They weren’t wrong, though that first incarnation of Adobe Stock Photos cratered with barely a whimper.

Cut to 2016, Adobe has spent $800 million buying a stock photo provider (about which no one made a peep), and now A) you can upload your own photos to Adobe Stock right within Bridge, and B) you can search for stock photos—now including those from Reuters—right within Photoshop proper (something I tried to enable in Mini Bridge).

What does all this mean? I dunno—blah blah about things being ahead of their time, the inevitable commodification/increasing access to art, something like that. “And on it goes, this thing of ours…”


Demo: Photoshop adds in-app search

I’m blessed & cursed by a long memory, especially for useless details.

Seeing Photoshop’s new Find feature, I think back to 10+ years ago, when I was pushing hard for Photoshop to offer a universal search feature for both commands and help content. Beyond basic search, I wanted the panel to enable running simple script-like commands for replicating objects, applying fills, etc. Inspired in part by InDesign’s Quick Apply panel, the vision was similar to Maya’s MEL language: show history steps as script, and let users copy/modify/run those to do awesome things.

Outside of search in the Layers panel, I failed* to get the feature shipped (partly because Mac OS’s then-new Help menu feature did a lot of the job). Illustrator, however, managed to ship the Knowhow panel:


Almost no one (knowone? #dadjokes) used it or, I’m sure, remembers it today (including inside Adobe)—but try, try again. Photoshop’s new Find command offers some slick integration with Adobe Stock. Take it away, Julieanne:

*Update: Wait, I did get it shipped—at least as a feature in Configurator (oh hai, Knowledge Panel!). Thanks to my collaborators on that, Amy Casillas & Victor Gavenda, for the reminder.


You can now edit your panoramas in Snapseed

I was really annoyed & frankly embarrassed to find not long ago that when I used Snapseed to edit a panorama (captured via either my iPhone’s built-in pano mode or via Google Street View’s 360º capture), then tried to post it to Facebook, its magical pano-ness would be lost & the image would be rendered as a flat JPEG instead of as an interactive pano.
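For the curious: the “magical pano-ness” lives in embedded Photo Sphere XMP metadata (the GPano namespace, e.g. `ProjectionType=equirectangular`), and sites like Facebook key off it to decide whether to render a JPEG interactively. An editor that rewrites the file without carrying that block over yields a flat image. A quick way to smoke-test whether an edited file kept it — a crude byte scan, not a proper XMP parser:

```python
import os
import tempfile

def has_gpano_metadata(path):
    """Crude check for Photo Sphere XMP metadata in a JPEG.

    Interactive-pano rendering keys off GPano XMP properties. This just
    scans the start of the file for the namespace markers rather than
    parsing XMP properly, so treat it as a smoke test only.
    """
    with open(path, "rb") as f:
        head = f.read(256 * 1024)  # XMP lives near the start of the file
    return b"GPano" in head or b"ns.google.com/photos/1.0/panorama" in head

# Example: write tiny fake files with and without the marker.
tmp = tempfile.gettempdir()
pano_path = os.path.join(tmp, "pano_test.jpg")
flat_path = os.path.join(tmp, "flat_test.jpg")
with open(pano_path, "wb") as f:
    f.write(b"\xff\xd8fake http://ns.google.com/photos/1.0/panorama/ \xff\xd9")
with open(flat_path, "wb") as f:
    f.write(b"\xff\xd8fake\xff\xd9")

print(has_gpano_metadata(pano_path), has_gpano_metadata(flat_path))  # True False
```

If a freshly edited pano fails this check, the editor stripped the metadata and the upload will render flat.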

Happily this has been fixed, and if you install the latest update to Snapseed, you should be able to edit panos, then upload them in interactive form. (This works for spheres, too.) Take it for a oh God don’t let me say spin and let me know if you hit any snags.