Category Archives: Photography

Photography: “Jay Myself” is terrific

Many years ago I had the chance to drop by Jay Maisel’s iconic converted bank building in the Bowery. (This must’ve been before phone cameras got good, as otherwise I’d have shot the bejesus out of the place.) It was everything you’d hope it to be.

As luck would have it, my father-in-law (having no idea about the visit) dialed up the documentary “Jay Myself” last night, and the whole family (down to my 12yo budding photographer son) loved it. I think you would, too!

Animation: Gmunk & Light

I’ve admired the motion graphics of Bradley Munkowitz since my design days in the 90’s (!), and I enjoyed this insight into one of his most recent creations:

What I didn’t know until now is that he collaborated with the folks at Bot & Dolly—who created the brilliant work below before getting acquired by Google and, as best I can tell, having their talent completely wasted there 😭.

Gone Fishin’, 2021 edition

Hey all—greetings from somewhere in the great American west, which I’m happily exploring with my wife, kids, and dog. Being an obviously crazy person, I can’t just, y’know, relax and stop posting for a while, but you may notice that my cadence here drops for a few days.

In the meantime, I’ll try to gather up some good stuff to share. Here’s a shot I captured while flying over the Tehachapi Loop on Friday (best when viewed full screen).


Just for fun, here’s a different rendering of the same file (courtesy of running the Mavic Pro’s 360º stitch through Insta360 Studio):

And, why not, here’s another shot of the trains in action. I can’t wait to get some time to edit & share the footage.


Google Pixel brings video to astrophotography

Psst, hey, Russell Brown, tell me again when we’re taking our Pixels to the desert… 😌✨

Pixel owners love using astrophotography in Night Sight to take incredible photos of the night sky, and now it’s getting even better. You can now create videos of the stars moving across the sky all during the same exposure. Once you take a photo in Night Sight, both the photo and video will be saved in your camera roll. Try waiting longer to capture even more of the stars in your video. This feature is available on Pixel 4 and newer phones and you can learn more at g.co/pixel/astrophotography.

Google makes strides on equitable imaging

“I’m real black, like won’t show up on your camera phone,” sang Childish Gambino. It remains a good joke, but ten years later, it’s long past time for devices to be far fairer in how they capture and represent the world. I’m really happy to see my old teammates at Google focusing on just this area:

Apply for the Adobe Stock Artist Development Fund

I’m really happy to see Adobe putting skin in the game to increase diversity & inclusion in stock imagery:

Introducing the Artist Development Fund, a new $500,000 creative commission program from Adobe Stock. As an expression of our commitment to inclusion we’re looking for artists who self-identify with and expertly depict diverse communities within their work.

Here’s how it works:

The fund also ensures artists are compensated for their work. We will be awarding funding of $12,500 each to a total of 40 global artists on a rolling basis during 2021. Artist Development Fund recipients will also gain unique opportunities, including having their work and stories featured across Adobe social and editorial channels to help promote accurate and inclusive cultural representation within the creative industry.

VFX & photography: Fireside chat tonight with Paul Debevec

If you liked yesterday’s news about Total Relighting, or pretty much anything else related to HDR capture over the last 20 years, you might dig this SIGGRAPH LA session, happening tonight at 7pm Pacific:

Paul Debevec is one of the most recognized researchers in the field of CG today. LA ACM SIGGRAPH’s “fireside chat” with Paul and Carolyn Giardina, of the Hollywood Reporter, will allow us a glimpse at the person behind all the innovative scientific work. This event promises to be one of our most popular, as Paul always draws a crowd and is constantly in demand to speak at conferences around the world.

“Total Relighting” promises to teleport(rait) you into new vistas

This stuff makes my head spin around—and not just because the demo depicts heads spinning around!

You might remember the portrait relighting features that launched on Google Pixel devices last year, leveraging some earlier research. Now a number of my former Google colleagues have created a new method for figuring out how a portrait is lit, then imposing new light sources in order to help it blend into new environments. Check it out:

Interesting, interactive mash-ups powered by AI

Check out how StyleMapGAN (paper, PDF, code) enables combinations of human & animal faces, vehicles, buildings, and more. Unlike simple copy-paste-blend, this technique permits interactive morphing between source & target pixels:

From the authors, a bit about what’s going on here:

Generative adversarial networks (GANs) synthesize realistic images from random latent vectors. Although manipulating the latent vectors controls the synthesized outputs, editing real images with GANs suffers from i) time-consuming optimization for projecting real images to the latent vectors, ii) or inaccurate embedding through an encoder. We propose StyleMapGAN: the intermediate latent space has spatial dimensions, and a spatially variant modulation replaces AdaIN. It makes the embedding through an encoder more accurate than existing optimization-based methods while maintaining the properties of GANs. Experimental results demonstrate that our method significantly outperforms state-of-the-art models in various image manipulation tasks such as local editing and image interpolation. Last but not least, conventional editing methods on GANs are still valid on our StyleMapGAN. Source code is available at https://github.com/naver-ai/StyleMapGAN.
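If the AdaIN-vs-spatial-modulation distinction sounds abstract, here’s a toy illustration (my own plain-Python sketch, nothing to do with the authors’ actual code): classic AdaIN modulates a whole feature channel with one scale/shift pair, while a spatially variant modulation applies a different scale/shift at every pixel—which is what lets you edit one region of a face without touching the rest.

```python
def normalize(feat):
    """Zero-mean, unit-variance normalization of a 2D feature map."""
    vals = [v for row in feat for v in row]
    mu = sum(vals) / len(vals)
    var = sum((v - mu) ** 2 for v in vals) / len(vals)
    std = var ** 0.5 or 1.0  # guard against a constant map
    return [[(v - mu) / std for v in row] for row in feat]

def adain(feat, gamma, beta):
    """Classic AdaIN: one scalar (gamma, beta) modulates the whole channel."""
    return [[gamma * v + beta for v in row] for row in normalize(feat)]

def spatial_modulation(feat, gamma_map, beta_map):
    """StyleMapGAN-flavored idea: per-pixel (gamma, beta) maps modulate
    each spatial location independently."""
    norm = normalize(feat)
    return [[gamma_map[i][j] * norm[i][j] + beta_map[i][j]
             for j in range(len(norm[0]))] for i in range(len(norm))]

feat = [[1.0, 2.0], [3.0, 4.0]]
print(adain(feat, 2.0, 0.5))  # same style applied everywhere
print(spatial_modulation(feat,
                         [[2.0, 0.0], [0.0, 2.0]],   # style applied only at
                         [[0.5, 0.0], [0.0, 0.5]]))  # chosen spatial locations
```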

A little fun with Bullet Time

During our epic Illinois-to-California run down Route 66 in March, my son Henry and I had fun capturing all kinds of images, including via my Insta360 One X2 camera. Here are a couple of “bullet time” slow-mo vids I thought were kind of fun. The first comes from the Round Barn in Arcadia, OK…

…and the second from the Wigwam Motel in Holbrook, AZ (see photos):

It’s a bummer that the optical quality here suffers from having the company’s cheap-o lens guards applied. (Without the guards, one errant swipe of the selfie stick can result in permanent scratches to the lens, necessitating shipment back to China for repairs.) They say they’re working on more premium glass ones, for which they’ll likely get yet more of my dough. ¯\_(ツ)_/¯

What a difference four years makes in iPhone cameras

“People tend to overestimate what can be done in one year and to underestimate what can be done in five or ten years,” as the old saying goes. Similarly, it can be hard to notice one’s own kid’s progress until confronted with an example of that kid from a few years back.

My son Henry has recently taken a shine to photography & has been shooting with my iPhone 7 Plus. While passing through Albuquerque a few weeks back, we ended up shooting side by side—him with the 7, and me with an iPhone 12 Pro Max (four years newer). We share a camera roll, and as I scrolled through I was really struck seeing the output of the two devices placed side by side.

I don’t hold up any of these photos (all unedited besides cropping) as art, but it’s fun to compare them & to appreciate just how far mobile photography has advanced in a few short years. See gallery for more.

Tutorial: Light painting tips from Russell Brown

Back in February I got to try my hand at some long-exposure phone photography in Death Valley with Russell Brown, interspersing chilly morning & evening shoots with low-key Adobe interviewing. 😌

Here’s a long-exposure 360º capture I made with Russell’s help in the ghost town of Rhyolite, NV:

Stellar times chilling (literally!) with Russell Preston Brown. 💫

Posted by John Nack on Thursday, February 4, 2021

Russell never stops learning & exploring, and here he shares some of his recent findings, using a neutral density filter on a phone to prevent blown-out highlights:

A post shared by Russell Preston Brown (@dr_brown)

Getting our kicks

After driving 2,000+ miles down Route 66 and beyond in six days—the last of which also included getting onboarded at Adobe!—I’ve only just begun to breathe & go through the titanic number of photos and videos my son & I captured. I’ll try to share more good stuff soon, but in the meantime you might get a kick (heh) out of this little vid, captured via my Insta360 One X2:

Now one of these days I just need to dust off my After Effects skills enough to nuke the telltale pole shadows. Someday…!

2-minute tour: ProRAW + Lightroom

Over the last 20 years or so, photographers have faced a slightly Faustian bargain: shoot JPEG & get the benefits of a camera manufacturer’s ability to tune output with a camera’s on-board smarts; or shoot raw and get more dynamic range and white balance flexibility—at the cost of losing that tuning and having to do more manual work.

Fortunately Adobe & Apple have been collaborating for many months to get Apple’s ProRAW variant of DNG supported in Camera Raw and Lightroom, and here Russell Brown provides a quick tour of how capture and editing work:

A post shared by Russell Preston Brown (@dr_brown)

Happy St. Paddy’s from one disgruntled leprechaun

We can’t celebrate in person with pals this year, but here’s a bit of good cheer from our wee man (victim of the old “raisin cookie fake-out”):

Saturday, March 16, 2019

Meanwhile, I just stumbled across this hearty “Sláinte” from Bill Burr. 😌

And on a less goofball note,

May the road rise to meet you,
May the wind be always at your back.
May the sun shine warm upon your face,
The rains fall soft upon your fields.
And until we meet again,
May God hold you in the palm of His hand.

☘️ J.

Insta360 GO 2: Finally a wearable cam that doesn’t suck?

Photo-taking often presents a Faustian bargain: be able to relive memories later, but at the cost of being less present in the experience as it happens. When my team researched why people do & don’t take photos, wanting to be present & not intrusive/obnoxious were key reasons not to bring out a camera.

So what if you could wear not just a lightweight, unobtrusive capture device, but actually wear a photographer—an intelligence that could capture the best moments, leaving your hands & mind free in the moment? Even naive, interval-based capture could produce a really interesting journey through space, as Blaise Agüera y Arcas demonstrated at Microsoft back in 2013:

It’s a long-held dream that products like Google’s Clips camera (which Blaise led at Google) have tried to achieve, thus far without any notable success. Clips proved to be too large & heavy for many people to wear comfortably, and training an AI model to find “good” moments ends up being much harder than one might imagine. Google discontinued Clips, though as a consolation prize I ended up delighting my young son by bringing home reams of unused printed circuit boards (which for some reason resembled the Millennium Falcon). Meanwhile Microsoft discontinued Photosynth.

The need remains & the dream won’t die, however, so I was excited ~18 months ago when Insta360 introduced the GO, a $199 “20-gram steadicam.” It promised ultra-lightweight wearability, photo capture, and a slick AirPods-style case for both recharging & data transfer. The wide FOV capture promised post-capture reframing driven by (you guessed it) mythical AI that could select the best moments.

Others (including many on the Insta forum) were skeptical, but I was enamored enough that my wife bought me one for Christmas. Sadly, buying Insta products is a little like Russian Roulette (e.g. I have loved the One X & subsequent X2, while the One R has been a worthless paperweight), and the GO ended up on the bummer side of the ledger. I found it way too hard to reliably start/stop & to transfer data. It’s been another paperweight.

To their possible credit (TBD), though, Insta has persisted with the product and has released the GO 2—now more expensive ($299) but promising a host of improvements (wireless preview & transfer, better storage & battery, etc.). Check it out:

“Looks perfect for a proctologist, which is where Insta can shove it,” said one salty user on the Insta forum. Will it finally work well? I don’t know—but I’m just hungry/sucker enough to pull the trigger, F around & find out. Hopefully it’ll arrive in advance of the road trip I’m planning with my son, so stay tuned for real-world findings.

Meanwhile, here’s a review I found thorough & informative—not least in its innovative use of gummi bears as a unit of measure 🙃:

Oh, and I did not order the forthcoming Minion mod (a real thing, they swear):

Lego: Nanonaxx Conquer Death Valley

Do I seem like the kind of guy who’d have tiny Lego representations of himself, his wife, our kids (the Micronaxx), and even our dog? What a silly question. 😌

I had a ball zipping around Death Valley, unleashing our little crew on sand dunes, lonesome highways, and everything in between. In particular I was struck by just how often I got more usable shallow depth-of-field images from my iPhone (which, like my Pixel, lets me edit the blur post-capture) than from my trusty, if aging, DSLR & L-series lens.

Anyway, in case this sort of thing is up your alley, please enjoy the results.

Snowflakes materialize in reverse

“Enjoy your delicious moments,” say the somewhat Zen pizza boxes from our favorite local joint. In that spirit, let’s stay frosty:

PetaPixel notes,

Jens writes that the melting snowflake video was shot on his Sony a6300 with either the Sony 90mm macro lens or the Laowa 60mm 2:1 macro lens. He does list the Sony a7R IV as his “main camera,” but it’s still impressive that this high-resolution video was shot thanks to one of Sony’s entry-level offerings.

3D dronie!

Inspired by the awesome work of photogrammetry expert Azad Balabanian, I used my drone at the Trona Pinnacles to capture some video loops as I sat atop one of the structures. My VFX-expert friend & fellow Google PM Bilawal Singh Sidhu used it to whip up this fun, interactive 3D portrait:

The file is big enough that I’ve had some trouble loading it on my iPhone. If that affects you as well, check out this quick screen recording:

The facial fidelity isn’t on par with the crazy little 3D prints of my head I got made 15 (!) years ago—but for footage coming from an automated flying robot, I’ll take it. 🤘😛

“The World Deserves Witnesses”

Lovely work from Leica, testifying to the power of presence, capture, and connection.

Per PetaPixel,

“A witness, someone who sees what others simply watch,” the company writes in a description of the campaign. “When Leica invented the first 35mm camera in 1914, it allowed people to capture their world and the world around them and document its events, no matter how small or big they were. Today, as for more than one century, Leica keeps celebrating the witnesses, the ones who see the everyday beauty, grace and poetry, and the never ending irony and drama of our human condition, and bring their cameras to the eye in order to frame it and fix it forever.”

Facebook improves automatic image description

I love seeing progress towards making the world more accessible, and tech that’s good for inclusion can also benefit all users & businesses. Here the researchers write,

To make our models work better for everyone, we fine-tuned them so that data was sampled from images across all geographies, and using translations of hashtags in many languages. We also evaluated our concepts along gender, skin tone, and age axes. The resulting models are both more accurate and culturally and demographically inclusive — for instance, they can identify weddings around the world based (in part) on traditional apparel instead of labeling only photos featuring white wedding dresses.

PetaPixel writes,

Facebook says that this new model is more reliably able to recognize more than 1,200 concepts, which is more than 10 times as many as the original version launched in 2016.

From refugee to… squirrel photographer?

Our kids were born with such voluminous, Dizzy Gillespie-grade cheeks that we immediately dubbed them “The Squirrels,” and we later gave our van the license plate SQRLPOD. This has nothing to do with anything, but I thought of it fondly upon seeing this charming 1-minute portrait:

Niki Colemont is a wildlife photographer and a survivor of the Rwandan genocide, which he fled at just four years old, arriving in Belgium as a refugee. The National Geographic 2019 finalist finds peace today in photographing squirrels, whom he considers “the perfect models.”

Drone rescues drone

“For he is truly his brother’s keeper, and the finder of lost children…”

Photographer Ty Poland tells the story, including this MacGyver-y bit:

Next up, we needed a lasso. Thankfully our good friend Martin Sanchez had an extra pair of shoes in the trunk. On top of that, he had just polished off a fresh iced coffee from Dunkin with a straw. With these two ingredients, we were able to construct an open lasso. By simply putting the straw over the shoelace and adding a small key chain for weight, we were able to center the lasso to the Mavic 2 Pro for the rescue.

Samsung adds one-tap object erasing

If you want to be successful, says Twitter founder Evan Williams, “Take a human desire, preferably one that has been around for a really long time…Identify that desire and use modern technology to take out steps.” 

My old Photoshop boss Kevin Connor liked to cite the Healing Brush as an example of how tech kept evolving to offer more specialized, efficient solutions (in this case, from the more general Clone Stamp to something purpose-built). Content-Aware Fill, which we shipped back in 2010, was another such optimization, and now its use is getting even more specialized/direct.

PetaPixel writes,

Samsung added Object Eraser, a tool powered by AI that appears to work by combining object recognition with something like Adobe’s Content-Aware Fill. In any photo captured on an S21 series phone, simply tap the button to activate Object Eraser, then tap on the people you want to remove, and the phone automatically does all the work.
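Under the hood, fill techniques like these synthesize plausible pixels from the surroundings of the erased region. As a toy stand-in (nothing like Samsung’s or Adobe’s actual algorithms, which are far smarter about texture and structure), here’s a naive diffusion-style inpaint that repeatedly replaces masked pixels with the average of their neighbors:

```python
def naive_inpaint(img, mask, iters=50):
    """Fill mask==True pixels by iteratively averaging their neighbors.
    img: 2D list of floats; mask: 2D list of bools. Toy illustration only."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for _ in range(iters):
        nxt = [row[:] for row in out]
        for i in range(h):
            for j in range(w):
                if mask[i][j]:
                    nbrs = [out[i + di][j + dj]
                            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                            if 0 <= i + di < h and 0 <= j + dj < w]
                    nxt[i][j] = sum(nbrs) / len(nbrs)
        out = nxt
    return out

img = [[10.0, 10.0, 10.0],
       [10.0,  0.0, 10.0],   # center pixel is the "object" to erase
       [10.0, 10.0, 10.0]]
mask = [[False, False, False],
        [False, True,  False],
        [False, False, False]]
print(naive_inpaint(img, mask)[1][1])  # blends to 10.0, matching the surround
```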

Night Photo Summit coming soon

I was excited to learn today that Adobe’s Russell Brown, together with a large group of other experts, is set to teach night photography techniques February 12-14:

28 speakers in 6 categories will present 30+ talks over three days. Our goal: “Inspiring night photographers across the galaxy.”

Check out the schedule & site for more. Meanwhile I’m hoping to get out to the desert with Russell in a couple of weeks, and I hope to help produce some really cool stuff. 🤞

A post shared by National Parks At Night (@nationalparksatnight)

Behind the scenes: Drone light painting

I’m a longtime admirer of Reuben Wu’s beautiful light painting work, and planning to head to Death Valley next month, I thought I’d try to learn more about his techniques. Happily he’s shared a quick, enlightening (heh) peek behind the scenes of his process:

I also enjoyed this more detailed how-to piece from Daniel James. He’s convinced me to spring for the Lume Cube Mavic Pro kit, though I welcome any additional input!

🎄Blinding Lights & Beach Boys

Although we can’t travel far this holiday season (bye bye, Mendocino! see you… someday?), we are, in the words of Tony Stark, “bringing the party to you” via our decked-out VW Westy (pics). I’m having fun experimenting with my new Insta360 One X2, mounting it on a selfie stick & playing with timelapse mode. Here’s a taste of the psychedelic stylings, courtesy of The Weeknd…

…and Brian Wilson:

Astro Adobeans

A couple of my old Adobe pals (who happen to dwell in the dark, beautiful wilderness around Santa Cruz) have been sharing some great astrophotography-related work lately.

First, Bryan O’Neil Hughes shares tips on photographing the heavens, including the Jupiter-Saturn Conjunction and the Ursids Meteor Shower:

Meanwhile Michael Lewis has been capturing heavenly shots which, in the words of my then ~4-year-old son, “make my mind blow away.” Check out his Instagram feed for images like this:

And if you’re shooting with a phone—especially with a Pixel—check out these tips from former Pixel imaging engineer Florian Kainz (who’s now also at Adobe—hat trick!).

Google Photos rolls out Cinematic Photos & more

Nearly 20 years ago, on one of my first customer visits as a Photoshop PM, I got to watch artists use PS + After Effects to extract people from photo backgrounds, then animate the results. The resulting film—The Kid Stays In The Picture—lent its name to the distinctive effect (see previous).

Now I’m delighted that Google Photos is rolling out similar output to its billion+ users, without requiring any effort or tools:

We use machine learning to predict an image’s depth and produce a 3D representation of the scene—even if the original image doesn’t include depth information from the camera. Then we animate a virtual camera for a smooth panning effect—just like out of the movies.
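The “virtual camera” trick boils down to parallax: when the camera translates sideways a little, each pixel shifts in proportion to its inverse depth, so near things slide more than far things—which is exactly what sells the 3D effect. A hedged sketch of the core relation (my own toy model, not Google’s pipeline):

```python
def parallax_shift(depth_m, baseline_m, focal_px):
    """Horizontal pixel shift for a virtual camera translated by baseline_m.
    Classic disparity relation: shift = focal * baseline / depth."""
    return focal_px * baseline_m / depth_m

# Virtual camera slides 2 cm; focal length 1000 px.
# The close subject moves 20x more than the distant background:
for depth in (0.5, 2.0, 10.0):   # subject, midground, background (meters)
    print(depth, parallax_shift(depth, 0.02, 1000))
```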

Photos is also rolling out new collages, like this:

And they’re introducing new themes in the stories-style Memories section up top as well:

Now you’ll see Memories surface photos of the most important people in your life…  And starting soon, you’ll also see Memories about your favorite things—like sunsets—and activities—like baking or hiking—based on the photos you upload.

Enjoy!

Tilt-shift takes off

Remember Obama’s first term, when faked tilt-shift photos were so popular that Instagram briefly offered a built-in feature for applying the look? The effect got burned out, but I found it surprisingly fun to see it return in this short video.

In a brief interview, Sofia-based photographer Pavel Petrov shares some behind-the-scenes details.

I have used Adobe Premiere Pro for post processing with some compound blur (for the narrow depth of field) and some oversaturation and speed up to 300%.
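The gist of a fake tilt-shift (these are not Petrov’s actual Premiere settings, just the general recipe) is that blur increases with distance from a horizontal in-focus band, mimicking the razor-thin depth of field of a macro shot:

```python
def tiltshift_blur_radius(row, focus_row, band, max_radius):
    """Blur radius for a scanline in a fake tilt-shift: zero inside the
    in-focus band around focus_row, ramping linearly up to max_radius."""
    d = abs(row - focus_row)
    if d <= band:
        return 0.0
    return min(max_radius, (d - band) / band * max_radius)

# 1080-row frame focused on a band around row 540:
for row in (540, 600, 800, 0):
    print(row, tiltshift_blur_radius(row, 540, 100, 12.0))
```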

Google Photos gets HDR & sky palette transfer on Pixel

A couple of exciting new features have landed for Pixel users. My colleague Navin Sarma writes,

Sky palette transfer in Photos – Sky palette transfer allows users to quickly improve their images that contain sky, achieving a dramatic, creative, and professional effect. It localizes the most dramatic changes to color and contrast to the sky, and tapers the effect to the foreground. It’s especially powerful to improve images of sunsets or sunrises, or where there are complex clouds and contrasty light. 

Dynamic/HDR in Photos – The “Dynamic” suggestion is geared towards landscape and “still life” photography, where images can benefit from enhanced brightness, contrast, and color. This effect uses local tone mapping, which allows more control of where brightness and contrast changes occur, making it especially useful in tricky lighting situations. You can use this effect on any photo by using the “Dynamic” suggestion, or navigating to Adjust and moving the “HDR” slider.
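To give a flavor of what tone mapping does, here’s a classic global Reinhard-style curve (a much simpler cousin of the local operator Google describes, which adapts the compression per region rather than applying one curve everywhere):

```python
def reinhard(luminance):
    """Global Reinhard tone map: compresses highlights into [0, 1)
    while leaving shadows roughly linear."""
    return luminance / (1.0 + luminance)

# Scene luminances spanning a huge dynamic range squeeze into display range:
for L in (0.05, 1.0, 10.0, 100.0):
    print(L, round(reinhard(L), 3))
```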