Monthly Archives: June 2020

Enjoy the Silence

When the fam & I headed out on Friday for two weeks (twoooo weeeeks!) of road-tripping adventures, I didn’t expect to have *zero* connectivity with which to share updates, but so it goes; hence the radio silence here.

The disconnection (something I can rarely grant myself) has been mostly a blessing, and I’ll try to be good about staying off the keyboard until I return. Still, I’ll try to share good stuff when time permits. I hope that you, too, get a little downtime & get to go outside—where I hear that the graphics are amazing. 😌🤘

PS: Orbital greetings from Mesa Verde yesterday.

[Instagram embed: “A quick spin around Mesa Verde,” a post shared by John Nack (@jnack)]

Google Photos gets a Map view & improved Memories

Overall the app is now organized into three tabs. Notably:

As part of the new search tab, you’ll see an interactive map view of your photos and videos, which has been one of our most-requested features since we launched Google Photos. You can pinch and zoom around the globe to explore photos of your travels…

In addition, the “Stories”-style strip up top is getting upgrades:

We’re adding more types of Memories, like the best pics of you and your closest friends and family over the years, trips, and even just the highlights from last week… We’ve also moved our automatic creations–like movies, collages, animations, stylized photos and more–from the “For you” tab (which is now gone) and into Memories.

ARCore rolls out depth support

Exciting news from my teammates:

Today, we’re taking a major step forward and announcing the Depth API is available in ARCore 1.18 for Android and Unity, including AR Foundation, across hundreds of millions of compatible Android devices.

As we highlighted last year, a key capability of the Depth API is occlusion: the ability for digital objects to accurately appear behind real world objects. This makes objects feel as if they’re actually in your space, creating a more realistic AR experience.
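If you're curious what this looks like from the Android side, here's a minimal Kotlin sketch (mine, not from the announcement) of opting into depth and sampling the depth image; the occlusion shader itself is left out, and the calls reflect the ARCore 1.18 Java/Kotlin surface as I understand it:

```kotlin
// A minimal sketch (not from the post) of the ARCore 1.18 depth calls on Android.
// Assumes an existing ARCore Session and a Frame obtained from session.update().
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Opt into depth only on devices that support it.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
        session.configure(config)
    }
}

// Sample the depth image at one pixel; an occlusion shader does the same per fragment,
// comparing real-world depth against the virtual object's depth before drawing it.
fun depthAtPixelMm(frame: Frame, x: Int, y: Int): Int? =
    try {
        frame.acquireDepthImage().use { depthImage ->
            // Single-plane DEPTH16 image: each 16-bit sample is a distance in millimeters.
            val plane = depthImage.planes[0]
            val buffer = plane.buffer.order(ByteOrder.nativeOrder())
            val byteIndex = x * plane.pixelStride + y * plane.rowStride
            buffer.getShort(byteIndex).toInt() and 0xFFFF
        }
    } catch (e: NotYetAvailableException) {
        null // Depth data may not be ready for the first few frames.
    }
```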

Check out the new Depth Lab app (also available as an open-source Unity project) to try it for yourself. You can play hide-the-hot-dog with Snap, as well as check out an Android-exclusive undersea lens:

Roto Brush 2: Semantic Boogaloo

Back in 2018 I wrote,

Wanna feel like walking directly into the ocean? Try painstakingly isolating an object in frame after frame of video. Learning how to do this in the ’90s (using stone knives & bear skins, naturally), I just as quickly learned that I never wanted to do it again.

Happily the AE crew has kept improving automated tools, and they’ve just rolled out Roto Brush 2 in beta form. Ian Sansevera shows (below) how it compares & how to use it, and John Columbo provides a nice written overview.

In this After Effects tutorial I will explore and show you how to use Rotobrush 2 (which is insane by the way). Powered by Sensei, Roto Brush 2 will select and track the object, frame by frame, isolating the subject automatically.

ML Kit gets pose detection

This is kinda inside-baseball, but I’m really happy that friends from my previous team will now have their work distributed on hundreds of millions, if not billions, of devices:

[A] face contours model — which can detect over 100 points in and around a user’s face and overlay masks and beautification elements atop them — has been added to the list of APIs shipped through Google Play Services…

Lastly, two new APIs are now available as part of the ML Kit early access program: entity extraction and pose detection… Pose detection supports 33 skeletal points like hands and feet tracking.
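To make that concrete, here's a rough Kotlin sketch (my own, not Google's sample) of running the pose detector on a single bitmap and reading one of those 33 landmarks; the class names follow the pose-detection library as it later shipped publicly, so the early-access surface may differ:

```kotlin
// A rough sketch (mine) of ML Kit pose detection on a single still image.
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.pose.PoseDetection
import com.google.mlkit.vision.pose.PoseLandmark
import com.google.mlkit.vision.pose.defaults.PoseDetectorOptions

fun detectPose(bitmap: Bitmap) {
    // SINGLE_IMAGE_MODE for stills; STREAM_MODE is the low-latency option for video.
    val options = PoseDetectorOptions.Builder()
        .setDetectorMode(PoseDetectorOptions.SINGLE_IMAGE_MODE)
        .build()
    val detector = PoseDetection.getClient(options)
    val image = InputImage.fromBitmap(bitmap, 0 /* rotationDegrees */)

    detector.process(image)
        .addOnSuccessListener { pose ->
            // Each of the 33 skeletal points is exposed as a PoseLandmark with a
            // 2D position and an in-frame likelihood.
            pose.getPoseLandmark(PoseLandmark.LEFT_WRIST)?.let { wrist ->
                println("Left wrist at ${wrist.position}, likelihood ${wrist.inFrameLikelihood}")
            }
        }
        .addOnFailureListener { it.printStackTrace() }
}
```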

Let’s see what rad stuff the world can build with these foundational components. Here’s an example of folks putting an earlier version to use, and you can find a ton more in my Body Tracking category:

[Via]

Camera Raw gets a facelift

To my slight chagrin—having been a naysayer about turning Camera Raw into a filter one can use in Photoshop, on the grounds that doing so would be a crutch at a time when Adobe should do the hard work of revamping a motley set of disparate, 30-year-old adjustment dialogs—I find myself hitting Shift-Cmd-A all the damn time. Thus I’m glad to see the UI freshened up & tools made easier to access:

As for getting the rest of the adjustments-house in order, I wasn’t wrong, but ACR-in-PS gives me fewer reasons to care. On we go!

DeepFaceDrawing makes something out of (almost) nothing

‘Allo, my name is Simon, and I like to do drawerings with the help of me cheeky robot, Mr. Deepfakes:

As PetaPixel explains,

They’ve achieved this by treating each facial feature locally first, and then the face as a whole, basically assigning a probability to each feature. That way you don’t need a professional sketch to generate a realistic-looking image, but the better the sketch, the better and more accurate the results become. What’s more, the software can work in near-real-time…

Photography: “Black Ice”

Constraint -> Creativity. Christopher Dormoy writes,

Having to stay home does not mean less creation. It is time to observe and experience elements and details of our daily life that we find at home.

I wanted to play with ice, flowers, and ink and see what kind of universe I could create with macro and motion-timelapse techniques. I spent many hours observing and experimenting with the ice and how it reacts with liquids like ink, but also oil, paint, and soap. Some effects are hypnotic and surprising.

“Stare. It is the way to educate your eye, and more. Stare, pry, listen, eavesdrop. Die knowing something. You are not here long.” ― Walker Evans.

[Via]

Adobe releases Photoshop Camera

Announced last fall, Photoshop Camera is now free to download for iOS & Android.

TBH I’m a little underwhelmed by the specific effects shown here, but I remain intrigued by the idea of a highly accessible, results-oriented app that could also generate layered imagery for further tweaking in Photoshop and other more flexible tools.

As for the rationale, PetaPixel notes,

The main goal of apps like this might simply be to introduce more people to the Adobe ecosystem. Adobe CTO Abhay Parasnis said as much in an interview with The Verge, in which he calls Photoshop Camera “the next one in that journey for us.” Photoshop Camera could act as the “gateway drug” to a Creative Cloud subscription for anybody who discovers a dormant love of photo editing.

Google Maps adds features to help navigate the age of Covid

Lots of good details here, among them:

Starting today, you can easily see the times when a transit station is historically more or less busy to plan your trip accordingly or you can look at live data showing how busy it is right now compared to its usual level of activity. Simply search for a station in Google Maps or tap on the station on the map to see the departure board and busyness data, where available.

The Nik Collection returns, adds new features

I’m very pleased to see that after happily finding a new post-Google home with DxO, the Nik Collection has hit version 3.0, offering non-destructive editing, perspective correction, and more.

Among the notable changes:

By taking advantage of the TIFF MULTIPAGE file format, the plugin suite is able to combine “the input image, the saved Nik editing parameters, and the output file” into a single file. DxO claims this as a “first” for a suite of creative photo plugins, resulting in “unparalleled versatility.”
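If you're wondering what a multi-page TIFF container looks like from code, here's a small Kotlin sketch (mine, using the JDK's built-in TIFF ImageIO reader available since Java 9, not DxO's actual format spec) that just enumerates the images packed into one file:

```kotlin
// A small sketch (mine) that lists the pages inside a multi-page TIFF. It only
// illustrates that one TIFF container can hold several full images; where Nik
// stores its editing parameters within the file isn't documented here.
import java.io.File
import javax.imageio.ImageIO

fun listTiffPages(path: String) {
    ImageIO.createImageInputStream(File(path)).use { stream ->
        val readers = ImageIO.getImageReaders(stream)
        require(readers.hasNext()) { "No TIFF reader available for $path" }
        val reader = readers.next()
        reader.input = stream
        val pages = reader.getNumImages(/* allowSearch = */ true)
        for (i in 0 until pages) {
            val img = reader.read(i)
            println("Page $i: ${img.width} x ${img.height}")
        }
        reader.dispose()
    }
}
```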

As for pricing & availability:

The Nik Collection 3 was launched early this morning and is available to purchase right away for a “special introductory price” of $100 for new users, or $60 for upgrades. Come July 1st, the collection will go back up to its MSRP of $150 for new users or $80 for upgrades.

You can download the trial version here.

Spinning my world

As society whirls around us, I take a certain kind of comfort in seeing the planet keep whirling as well.

Colossal writes,

With a camera peering out over the landscape of Tivoli, Namibia, Bartosz Wojczyński focused on the sky. The Polish photographer created a hypnotic timelapse spanning 24 hours that has a focal point in the atmosphere rather than on the land. Each minute, he snapped a frame that subsequently was looped 60 times to create the final 24-minute version that’s a mesmerizing look at Earth’s cycles.