All posts by jnack

Snowflakes materialize in reverse

“Enjoy your delicious moments,” say the somewhat Zen pizza boxes from our favorite local joint. In that spirit, let’s stay frosty:

PetaPixel notes,

Jens writes that the melting snowflake video was shot on his Sony a6300 with either the Sony 90mm macro lens or the Laowa 60mm 2:1 macro lens. He does list the Sony a7R IV as his “main camera,” but it’s still impressive that this high-resolution video was shot thanks to one of Sony’s entry-level offerings.

Adobe enhances Photoshop, Illustrator cloud collaboration

In one of my very earliest interactions with Adobe—in 1999, I believe, before I worked there—a PM called me with questions about how my design team collaborated across offices. Now, 20+ years later, I find myself married to an Adobe PM charged with enhancing just that. 😌

Check out some of the latest progress they’re making with PS, AI, and the mobile drawing app Fresco:

Invite to Edit in Photoshop, Illustrator and Fresco

The Invite to Edit feature in Photoshop, Illustrator, and Fresco allows asynchronous editing on all surfaces across the desktop, iPad, and iPhone (Fresco). Now collaborators can edit a shared cloud document, one at a time. Just save your .PSD or .AI files as cloud documents and send invitations for others to edit them. You can also edit files that have been shared with you. In addition, you can access your shared cloud documents on assets.adobe.com and the Creative Cloud Desktop app.

The Verge adds some helpful details:

Collaborators will not be able to work on the file live alongside you, but they will be able to open up your work, make changes of their own, save it, and have those changes sync back to your machine. If someone is already editing the file, the new user will be given the choice to either make a copy or wait until the current editor is finished. It’s not quite Google Docs-style editing for Photoshop, but it should be easier than emailing a file back and forth.

Rhyolite star trails

A week ago I found myself shivering in the ghost town of Rhyolite, Nevada, alongside Adobe’s Russell Brown as we explored the possibilities of shooting 360º & traditional images at night. I’d totally struck out days earlier at the Trona Pinnacles as I tried to capture 360º star trails via either the Ricoh Theta Z or the Insta360 One X2, but this time Russell kindly showed me how to set up the Theta for interval shooting & additive exposure. I’m kinda pleased with the results:

 

Stellar times chilling (literally!) with Russell Preston Brown. 💫

Posted by John Nack on Thursday, February 4, 2021
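
For the curious, the “additive” trick Russell showed me boils down to a lighten blend across all the interval frames. Here’s a rough Python sketch of the idea (the folder and file names are just placeholders, not anything from the Theta’s own tooling):

```python
# Hypothetical sketch: simulate additive star-trail stacking from interval shots.
# Assumes a folder of exported JPEG frames; paths are illustrative only.
from pathlib import Path

import numpy as np
from PIL import Image

frames = sorted(Path("theta_frames").glob("*.jpg"))
stack = None

for path in frames:
    frame = np.asarray(Image.open(path), dtype=np.uint8)
    # "Lighten" blend: keep the brightest value seen at each pixel,
    # which is what turns point stars into continuous trails.
    stack = frame if stack is None else np.maximum(stack, frame)

Image.fromarray(stack).save("star_trails.jpg")
```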


3D dronie!

Inspired by the awesome work of photogrammetry expert Azad Balabanian, I used my drone at the Trona Pinnacles to capture some video loops as I sat atop one of the structures. My VFX-expert friend & fellow Google PM Bilawal Singh Sidhu used it to whip up this fun, interactive 3D portrait:

The file is big enough that I’ve had some trouble loading it on my iPhone. If that affects you as well, check out this quick screen recording:

The facial fidelity isn’t on par with the crazy little 3D prints of my head I got made 15 (!) years ago—but for footage coming from an automated flying robot, I’ll take it. 🤘😛

Photoshop is hiring

I’m excited to see this great team growing, especially as they’ve expanded the Photoshop imaging franchise to mobile & Web platforms. Check out some of the open roles:

———-

Photoshop Developers

Photoshop Quality Engineers

Full list of all Adobe opportunities.

A quick, cool demo of markerless body tracking

AR fashion star:

No markers, no mocap cameras, no suit, no keyframing. This take uses 3 DSLR cameras, though, and is pretty far from being real-time. […]

Under the hood, it uses #OpenPose ML-network for 2d tracking of joints on each camera, and then custom Houdini setup for triangulating the results into 3d, stabilizing it and driving the rig (volumes, CHOPs, #kinefx, FEM – you name it 🙂

[Via Tyler Zhu]
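
For anyone curious about the triangulation step he mentions, here’s a small, generic sketch (not his Houdini setup) of how 2D joint detections from multiple calibrated cameras get lifted to 3D with a DLT-style solve. The projection matrices and pixel coordinates below are toy values:

```python
# Hypothetical sketch of the triangulation step: given 2D joint detections from
# each calibrated camera, recover a 3D joint position via linear (DLT) triangulation.
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Triangulate one joint from N >= 2 views.

    proj_mats: list of 3x4 camera projection matrices.
    points_2d: list of (x, y) pixel coordinates, one per camera.
    """
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.stack(rows)
    # Homogeneous least-squares solution: smallest right singular vector.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Example with two toy cameras observing a point at (0, 0, 5):
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                  # camera at the origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # camera shifted 1 unit on x
print(triangulate_point([P1, P2], [(0.0, 0.0), (-0.2, 0.0)]))  # ~[0, 0, 5]
```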

Google Lens hits 500 million downloads

This is one of the many Google projects to which I’ve been lucky enough to contribute just a bit (focusing on object tracking & graphical adornments). It’s built into Google Photos, among other surfaces, and I’m really pleased that people are seeking it out:

We at AP don’t think that Lens gets enough praise; we even named it one of our 10 favorite Android features from 2020. Lens is an AR-powered service that can help you translate, identify, and scan things around you. Last year, it added support for solving homework questions, text-to-speech and “copy to computer” functions, and helping choose the best dishes at restaurants. There’s lots of nifty stuff that Lens can do.

Google turns offices into vaccination sites, dedicates $150M to education & access

I love seeing people with the means—material, technical, organizational—to help fight the pandemic stepping up to do so. As one step:

To help with vaccination efforts, starting in the United States, we’ll make select Google facilities—such as buildings, parking lots and open spaces—available as needed. These sites will be open to anyone eligible for the vaccine based on state and local guidelines. We’ll start by partnering with health care provider One Medical and public health authorities to open sites in Los Angeles and the San Francisco Bay Area in California; Kirkland, Washington; and New York City, with plans to expand nationally. We’re working with local officials to determine when sites can open based on local vaccine availability. 

Google is also adding $150 million to previous commitments around education & access:

Our efforts will focus heavily on equitable access to vaccines. Early data in the U.S. shows that disproportionately affected populations, especially people of color and those in rural communities, aren’t getting access to the vaccine at the same rates as other groups. To help, Google.org has committed $5 million in grants to organizations addressing racial and geographic disparities in COVID-19 vaccinations, including Morehouse School of Medicine’s Satcher Health Leadership Institute and the CDC Foundation.

Fight on. 💪

“The World Deserves Witnesses”

Lovely work from Leica, testifying to the power of presence, capture, and connection.

Per PetaPixel,

“A witness, someone who sees what others simply watch,” the company writes in a description of the campaign. “When Leica invented the first 35mm camera in 1914, it allowed people to capture their world and the world around them and document its events, no matter how small or big they were. Today, as for more than one century, Leica keeps celebrating the witnesses, the ones who see the everyday beauty, grace and poetry, and the never ending irony and drama of our human condition, and bring their cameras to the eye in order to frame it and fix it forever.”

Facebook improves automatic image description

I love seeing progress towards making the world more accessible, and tech that’s good for inclusion can also benefit all users & businesses. Here the researchers write,

To make our models work better for everyone, we fine-tuned them so that data was sampled from images across all geographies, and using translations of hashtags in many languages. We also evaluated our concepts along gender, skin tone, and age axes. The resulting models are both more accurate and culturally and demographically inclusive — for instance, they can identify weddings around the world based (in part) on traditional apparel instead of labeling only photos featuring white wedding dresses.

PetaPixel writes,

Facebook says that this new model is more reliably able to recognize more than 1,200 concepts, which is more than 10 times as many as the original version launched in 2016.

From refugee to… squirrel photographer?

Our kids were born with such voluminous, Dizzy Gillespie-grade cheeks that we immediately dubbed them “The Squirrels,” and we later gave our van the license plate SQRLPOD. This has nothing to do with anything, but I thought of it fondly upon seeing this charming 1-minute portrait:

Niki Colemont is a wildlife photographer and a survivor who fled the Rwandan genocide at just four years old, arriving in Belgium as a refugee. The National Geographic 2019 finalist photographer finds peace today in photographing squirrels, whom he considers “the perfect models.”

AR: Super high-res car interiors arrive in Google Search

Imagine loading multi-gigabyte 3D models nearly instantaneously into your mobile device, then placing them into your driveway and stepping inside. That’s what we’ve now enabled via Google Search on Android:

Take it for a spin via the models listed below, and please let us know what you think!

Volvo: Volvo XC40, Volvo XC40 Recharge, Volvo XC60, Volvo XC90

Porsche: Porsche 911, Porsche Cayenne, Porsche Macan, Porsche Panamera, Porsche Taycan

Fiat Chrysler: Jeep Wrangler 4xE

Drone rescues drone

“For he is truly his brother’s keeper, and the finder of lost children…”

Photographer Ty Poland tells the story, including this MacGyver-y bit:

Next up, we needed a lasso. Thankfully our good friend Martin Sanchez had an extra pair of shoes in the trunk. On top of that, he had just polished off a fresh iced coffee from Dunkin with a straw. With these two ingredients, we were able to construct an open lasso. By simply putting the straw over the shoelace and adding a small key chain for weight, we were able to center the lasso to the Mavic 2 Pro for the rescue.

Track.AI helps fight blindness in children

On a day of new hope & new vision, I’m delighted to see Google, Huawei, and the medical community using ML to help spot visual disorders in kids around the world:

This machine learning framework performs classification and regression tasks for early identification of patterns, revealing different types of visual deficiencies in children. This AI-powered solution reduces diagnosis time from months to just days, and trials are available across 5 countries (China, UAE, Spain, Vietnam and Mexico).

Samsung adds one-tap object erasing

If you want to be successful, says Twitter founder Evan Williams, “Take a human desire, preferably one that has been around for a really long time…Identify that desire and use modern technology to take out steps.” 

My old Photoshop boss Kevin Connor liked to cite the Healing Brush as an example of how tech kept evolving to offer more specialized, efficient solutions (in this case, from the more general Clone Stamp to something purpose-built). Content-Aware Fill, which we shipped back in 2010, was another such optimization, and now its use is getting even more specialized/direct.

PetaPixel writes,

Samsung added Object Eraser, a tool powered by AI that appears to work by combining object recognition with something like Adobe’s Content-Aware Fill. In any photo captured on an S21 series phone, simply tap the button to activate Object Eraser, then just tap on the people you want to remove, and the phone automatically does all the work.
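
Samsung hasn’t published how Object Eraser works internally, but the general tap-to-erase flow (build a mask around the tapped object, then synthesize a fill from its surroundings) can be sketched with plain OpenCV inpainting. In a real product a segmentation model would generate the mask from the tap; here it’s just a hard-coded circle:

```python
# A rough stand-in for the "tap to erase" flow, far simpler than Samsung's or
# Adobe's ML-based fills, but it illustrates the mask-then-synthesize structure.
import cv2
import numpy as np

image = cv2.imread("photo.jpg")            # placeholder input photo
mask = np.zeros(image.shape[:2], np.uint8)

tap_x, tap_y, radius = 640, 360, 80        # pretend the user tapped here
cv2.circle(mask, (tap_x, tap_y), radius, 255, thickness=-1)

# Fill the masked region from its surroundings (Telea's method).
result = cv2.inpaint(image, mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)
cv2.imwrite("photo_erased.jpg", result)
```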

Night Photo Summit coming soon

I was excited to learn today that Adobe’s Russell Brown, together with a large group of other experts, is set to teach night photography techniques February 12-14:

28 speakers in 6 categories will present 30+ talks over three days. Our goal: “Inspiring night photographers across the galaxy.”

Check out the schedule & site for more. Meanwhile I’m hoping to get out to the desert with Russell in a couple of weeks, and I hope to help produce some really cool stuff. 🤞

A post shared by National Parks At Night (@nationalparksatnight)

Guido Quaroni joins Adobe

Awesome Dad-flex: Telling your tiny, Cars-loving kids that you know Guido the forklift. 😌

Granted, it was a little confusing to explain that I knew the voice of the cartoon forklift & that he was actually a brainy Italian guy who worked at Pixar—but it worked. In any case, Guido Quaroni—who spent 20 years at Pixar & who was always a fantastic host during Adobe customer visits—has now joined the Big Red A:

“I’ve been a customer of Adobe’s software for a number of years, and I always admired Adobe’s commitment to provide top of the line tools to creatives,” said Quaroni. “When I heard about Adobe’s renewed interest in entering into the 3D market, given how much more pervasive the consumption of 3D content is becoming, I knew it was something I wanted to be a part of. I’m excited to be joining the Adobe team to help accelerate and grow their 3D offerings for creatives worldwide.”

I remain proud to have delivered, at Guido’s urging, perhaps the most arcane feature request ever: he asked for per-layer timestamps in Photoshop so that Pixar’s rendering pipeline could discern which layers had actually been changed by artists, thereby saving a lot of rendering time. We got this done, and somehow it gives me roughly as much pleasure as having delivered a photo editor that’s used by hundreds of millions of people every month. 😌

Anyway, here’s to great things for Guido, Adobe, and 3D creators everywhere!

Behind the scenes: Drone light painting

I’m a longtime admirer of Reuben Wu’s beautiful light painting work, and planning to head to Death Valley next month, I thought I’d try to learn more about his techniques. Happily he’s shared a quick, enlightening (heh) peek behind the scenes of his process:

I also enjoyed this more detailed how-to piece from Daniel James. He’s convinced me to spring for the Lume Cube Mavic Pro kit, though I welcome any additional input!

AR: Google & Fiat-Chrysler bring super detailed models to your driveway

Psst—wanna see a multi-gig, super detailed 3D model appear in your driveway almost instantaneously?

I’m delighted to say that our work in cloud-rendered streaming 3D is paying off via this year’s virtual CES show. Per the Google Cloud blog:

As part of Fiat Chrysler’s Virtual Showroom CES event, you can experience the new innovative 2021 Jeep Wrangler 4xe by scanning a QR code with your phone. You can then see an Augmented Reality (AR) model of the Wrangler right in front of you—conveniently in your own driveway or in any open space. Check out what the car looks like from any angle, in different colors, and even step inside to see the interior with incredible details.

A bit on how it works:

The Cloud AR tech uses a combination of edge computing and AR technology to offload the computing power needed to display large 3D files, rendered by Unreal Engine, and stream them down to AR-enabled devices using Google’s Scene Viewer. Using powerful rendering servers with gaming-console-grade GPUs, memory, and processors located geographically near the user, we’re able to deliver a powerful but low friction, low latency experience.

This rendering hardware allows us to load models with tens of millions of triangles and textures up to 4k, allowing the content we serve to be orders of magnitude larger than what’s served on mobile devices (i.e., on-device rendered assets).

And to try it out:

Scan the QR code below, or check out the FCA CES website. Depending on your OS, device, and network strength, you will see either a photorealistic, cloud-streamed AR model or an on-device 3D car model, both of which can then be placed in your physical environment.

[Image: AR_Wrangler-4xe.jpg]

🎄Blinding Lights & Beach Boys

Although we can’t travel far this holiday season (bye bye, Mendocino! see you… someday?), we are, in the words of Tony Stark, “bringing the party to you” via our decked-out VW Westy (pics). I’m having fun experimenting with my new Insta360 One X2, mounting it on a selfie stick & playing with timelapse mode. Here’s a taste of the psychedelic stylings, courtesy of The Weeknd…

…and Brian Wilson:

Animation: “Mecha”

Apropos of nothing, check out 60 lovingly rendered seconds commissioned by YouTube:

Maciej Kuciara writes,

MECHA – the love letter to our youth. Watching anime classics as kids left a mark that has stayed with us to this day. So we felt it was due time to celebrate our love of mecha and pay proper homage with this piece we did for YouTube.

An epic 50,000-Lego sculpture 🌊

Jumpei Mitsui’s work is staggering. Colossal writes,

During the course of 400 hours, Mitsui snapped together 50,000 cobalt and white LEGO into an undulating wave that mimics the original woodblock print.

To recreate this iconic work in three dimensions, Mitsui studied videos of waves crashing and pored over academic papers on the topic. He then sketched a detailed model before assembling the textured water, three boats, and Mount Fuji that span more than five feet.

Astro Adobeans

A couple of my old Adobe pals (who happen to dwell in the dark, beautiful wilderness around Santa Cruz) have been sharing some great astrophotography-related work lately.

First, Bryan O’Neil Hughes shares tips on photographing the heavens, including the Jupiter-Saturn Conjunction and the Ursids Meteor Shower:

Meanwhile Michael Lewis has been capturing heavenly shots which, in the words of my then ~4-year-old son, “make my mind blow away.” Check out his Instagram feed for images like this:

And if you’re shooting with a phone—especially with a Pixel—check out these tips from former Pixel imaging engineer Florian Kainz (who’s now also at Adobe—hat trick!).

AR: Baby Yoda comes to Google Search

Putting the “AR” in “Galaxy far, far away…” 😌

Just don’t expect to get a Baby Yoda Hellraiser Pinhead version. 😬

Also, remember that if you have a Pixel 5G or a compatible Android 5G device, you can install “The Mandalorian” AR Experience.

#ThisIsTheWay

AR: Come try new cars & makeup in Google Search

I’m delighted to be closing out 2020 on a pair of high notes, welcoming the arrival of my two biggest efforts from the last year+.

First, Google Search now supports 150+ new cars that you can view in 3D and AR (via iPhone or Android device), including in beautiful cloud-rendered quality (provided you have a good connection & up-to-date Android). As we initially previewed in October:

Bring the showroom to you with AR

You can easily check out what the car looks like in different colors, zoom in to see intricate details like buttons on the dashboard, view it against beautiful backdrops and even see it in your driveway. We’re experimenting with this feature in the U.S. and working with top auto brands, such as Volvo and Porsche, to bring these experiences to you soon.

Second, you can try on AR beauty products right through Search:

Now, when you search for a lipstick or eyeshadow product, like L’Oreal’s Infallible Paints Metallic Eyeshadow, you can see what it looks like on a range of skin tones and compare shades and textures to help you find the right products.

To help you find the perfect match, you can now also virtually try makeup products right from the Google app.

New tech promises super fast, high quality background removal

Google researchers Ira Kemelmacher-Shlizerman, Brian Curless, and Steve Seitz have been working with University of Washington folks on tech that promises “30fps in 4K resolution, and 60fps for HD on a modern GPU.”

Our technique is based on background matting, where an additional frame of the background is captured and used in recovering the alpha matte and the foreground layer.
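
The authors use a trained network, but the underlying idea is easy to caricature in a few lines: compare the live frame against the captured background plate to get a rough alpha, then composite onto a new scene. The file names below are placeholders:

```python
# Very simplified sketch of the core idea (not the authors' network): with a clean
# plate of the background, the difference between the live frame and that plate
# gives a crude alpha matte for compositing the subject onto a new background.
import numpy as np
from PIL import Image

frame = np.asarray(Image.open("frame.png"), dtype=np.float32) / 255.0
plate = np.asarray(Image.open("background.png"), dtype=np.float32) / 255.0
new_bg = np.asarray(Image.open("new_background.png"), dtype=np.float32) / 255.0

# Per-pixel difference as a crude alpha estimate (the paper learns this instead).
alpha = np.clip(np.abs(frame - plate).max(axis=-1) * 4.0, 0.0, 1.0)[..., None]

composite = alpha * frame + (1.0 - alpha) * new_bg
Image.fromarray((composite * 255).astype(np.uint8)).save("composite.png")
```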

Check it out:


See the PDF, abstract, and GitHub repo for details.

Google Photos rolls out Cinematic Photos & more

Nearly 20 years ago, on one of my first customer visits as a Photoshop PM, I got to watch artists use PS + After Effects to extract people from photo backgrounds, then animate the results. The resulting film—The Kid Stays In The Picture—lent its name to the distinctive effect (see previous).

Now I’m delighted that Google Photos is rolling out similar output to its billion+ users, without requiring any effort or tools:

We use machine learning to predict an image’s depth and produce a 3D representation of the scene—even if the original image doesn’t include depth information from the camera. Then we animate a virtual camera for a smooth panning effect—just like out of the movies.
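
Google hasn’t shared its implementation, but you can approximate the effect crudely with an image plus a depth map: shift near pixels more than far ones while a virtual camera pans. A toy sketch (file names and the depth convention are assumptions):

```python
# Toy approximation of the effect, not Google's pipeline: remap pixels with a
# depth-weighted horizontal shift per frame and write the frames to a short clip.
import cv2
import numpy as np

image = cv2.imread("photo.jpg")
depth = cv2.imread("depth.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

h, w = depth.shape
xs, ys = np.meshgrid(np.arange(w, dtype=np.float32), np.arange(h, dtype=np.float32))
writer = cv2.VideoWriter("cinematic.mp4", cv2.VideoWriter_fourcc(*"mp4v"), 30, (w, h))

for t in np.linspace(-1.0, 1.0, 60):
    # Closer pixels (larger depth value in this convention) get a bigger shift.
    shift = t * 15.0 * depth
    frame = cv2.remap(image, xs + shift, ys, interpolation=cv2.INTER_LINEAR)
    writer.write(frame)

writer.release()
```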

Photos is also rolling out new collages, like this:

And they’re introducing new themes in the stories-style Memories section up top as well:

Now you’ll see Memories surface photos of the most important people in your life…  And starting soon, you’ll also see Memories about your favorite things—like sunsets—and activities—like baking or hiking—based on the photos you upload.

Enjoy!

AR: Google Maps can point you towards your friends

I love these simple, practical uses of augmented reality. The Maps team writes,

Last month, we launched Live View in Location Sharing for Pixel users, and we’ll soon expand this to all Android and iOS users around the globe. When a friend has chosen to share their location with you, you can easily tap on their icon and then on Live View to see where and how far away they are–with overlaid arrows and directions that help you  know where to go.

Live View in Location Sharing will soon expand to all Android and iOS users globally on ARCore and ARKit supported phones.

They’re also working hard to leverage visual data & provide better localization and annotation.

With the help of machine learning and our understanding of the world’s topography, we’re able to take the elevation of a place into account so we can more accurately display the location of the destination pin in Live View. Below, you can see how Lombard Street—a steep, winding street in San Francisco—previously appeared far off into the distance. Now, you can quickly see that Lombard Street is much closer and the pin is aligned with where the street begins at the bottom of the hill.

50 new AR animals arrive in Google search

“If your dog woke up 10 times its current size, it would lick you; if your cat woke up 10 times bigger, it would eat you,” or so I’ve heard.

In any case, building on the world’s viral (in every sense) adoption of AR animals already in search, my team has added a bunch more:

https://twitter.com/Google/status/1337506620612485120?ref_src=twsrc%5Etfw%7Ctwcamp%5Etweetembed%7Ctwterm%5E1337506620612485120%7Ctwgr%5E%7Ctwcon%5Es1_&ref_url=https%3A%2F%2Fwww.theverge.com%2F2020%2F12%2F12%2F22171601%2Fgoogle-adds-augmented-reality-ar-animals-search-3d

The Verge writes,

When Google started putting 3D animals in Search last year it only had a few standard animals available like a tiger, a lion, a wolf, and a dog. It added more creatures in March, including alligators, ducks, and hedgehogs. In August, Google made prehistoric creatures and historical artifacts available in AR via its Arts and Culture app—and who among us wouldn’t love to check out the ancient crustacean Cambropachycope up close and personal?

Meanwhile my man Seamus abides. 🐕😌

Tilt-shift takes off

Remember Obama’s first term, when faked tilt-shift photos were so popular that Instagram briefly offered a built-in feature for applying the look? The effect eventually wore out its welcome, but I found it surprisingly fun to see it return in this short video.

In a brief interview, Sofia-based photographer Pavel Petrov shares some behind-the-scenes details.

I have used Adobe Premiere Pro for post-processing, with some compound blur (for the narrow depth of field), some oversaturation, and a speed-up to 300%.
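
If you’d rather fake the look in code than in Premiere, the recipe is roughly “blur everything outside a horizontal band, then oversaturate.” A rough Python/OpenCV sketch (not Petrov’s actual setup; the file name and band placement are just placeholders):

```python
# Rough sketch of the faux tilt-shift look: blend a blurred copy of the image back
# in everywhere except a sharp horizontal "in focus" band, then boost saturation.
import cv2
import numpy as np

image = cv2.imread("aerial.jpg")
h, w = image.shape[:2]

blurred = cv2.GaussianBlur(image, (0, 0), sigmaX=9)

# Weight mask: 1.0 inside the focus band, fading to 0 above and below it.
ys = np.arange(h, dtype=np.float32)
band_center, band_half_width, feather = h * 0.55, h * 0.08, h * 0.15
dist = np.clip((np.abs(ys - band_center) - band_half_width) / feather, 0.0, 1.0)
mask = (1.0 - dist)[:, None, None]

sharp_mix = (mask * image + (1.0 - mask) * blurred).astype(np.uint8)

# Oversaturate to complete the miniature look.
hsv = cv2.cvtColor(sharp_mix, cv2.COLOR_BGR2HSV).astype(np.float32)
hsv[..., 1] = np.clip(hsv[..., 1] * 1.4, 0, 255)
result = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
cv2.imwrite("tiltshift.jpg", result)
```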