Damn… I never thought I’d see this—not ever to such a degree. Thank God at least that people are taking social distancing so seriously.
Join my old friends & colleagues Phil Clevenger & Rick Miller tomorrow for what promises to be an informative online class/discussion. Topics include:
- Quick history of the Lightroom UI and its influence on modern software design
- The importance of choosing the right color space when editing your photos
- Creating custom camera profiles for your DSLR, cellphone, and drone cameras to achieve the best color fidelity
- The RAW advantage: recovering data from overexposed/underexposed images
- Using the Map module and GPS coordinates for location scouting
- Soft Proofing your photos to determine the most appropriate print color settings
- Questions & Answers
About your hosts:
Phil Clevenger: Senior Director, Experience Design, Adobe Experience Cloud. Original UI designer for Adobe Lightroom and named on two patents for UI innovations in the Lightroom 1.0 interface.
Rick Miller: Former Sr. Solutions Engineer and color-management expert at Adobe Systems (Rick’s name appeared on the credit screens for Photoshop and Premiere Pro), professional photographer, and currently a professor at USC. Rick previously taught at the Art Center College of Design in Pasadena and at Cal Poly Pomona, and assisted the LAPD’s Scientific Investigation Division in the forensic application of Photoshop.
It’s really cool to see companies stepping up to help creative people make the most of our forced downtime. PetaPixel writes,
If you’re a photographer stuck at home due to the coronavirus pandemic, Professional Photographers of America (PPA) has got your back. The trade association has made all of its 1,100+ online photography classes free for the next two weeks. […]
Meanwhile, Unity writes:
During the COVID-19 crisis, we’re committed to supporting the community with complimentary access to Unity Learn Premium for three months (March 19 through June 20). Get exclusive access to Unity experts, live interactive sessions, on-demand learning resources, and more.
“This is certainly the coolest thing I’ve ever worked on, and it might be one of the coolest things I’ve ever seen.”
My Google Research colleague Jon Barron routinely makes amazing stuff, so when he gets a little breathless about a project, you know it’s something special. I’ll pass the mic to him to explain their new work around capturing multiple photos, then synthesizing a 3D model:
I’ve been collaborating with Berkeley for the last few months and we seem to have cracked neural rendering. You just train a boring (non-convolutional) neural network with five inputs (xyz position and viewing angle) and four outputs (RGB+alpha), combine it with the fundamentals of volume rendering, and get an absurdly simple algorithm that beats the state of the art in neural rendering / view synthesis by *miles*.
You can change the camera angle, change the lighting, insert objects, extract depth maps — pretty much anything you would do with a CGI model, and the renderings are basically photorealistic. It’s so simple that you can implement the entire algorithm in a few dozen lines of TensorFlow.
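If you’re curious what “a few dozen lines” might look like, here’s a rough sketch of the idea as I understand it. This isn’t the Berkeley team’s code; the layer sizes, ray-sampling scheme, and the way viewing angles get fed in are my own guesses. A plain MLP maps a 5D input (xyz plus two viewing angles) to color and density, and samples along each camera ray are alpha-composited with textbook volume rendering:

```python
import tensorflow as tf

def make_nerf_style_mlp(hidden=256, depth=8):
    # 5 inputs (x, y, z, plus two viewing angles) -> 4 outputs (RGB + density).
    inputs = tf.keras.Input(shape=(5,))
    x = inputs
    for _ in range(depth):
        x = tf.keras.layers.Dense(hidden, activation="relu")(x)
    outputs = tf.keras.layers.Dense(4)(x)  # raw (r, g, b, sigma)
    return tf.keras.Model(inputs, outputs)

def render_ray(model, origin, direction, near=0.1, far=4.0, n_samples=64):
    """Query the MLP at points along one camera ray and alpha-composite the results."""
    t = tf.linspace(near, far, n_samples)                     # sample depths, shape [N]
    pts = origin[None, :] + t[:, None] * direction[None, :]   # 3D sample points, [N, 3]
    # Crude stand-in for the two viewing angles: reuse two components of the ray direction.
    view = tf.broadcast_to(direction[None, :2], (n_samples, 2))
    raw = model(tf.concat([pts, view], axis=-1))               # [N, 4]
    rgb = tf.sigmoid(raw[:, :3])                               # per-sample color
    sigma = tf.nn.relu(raw[:, 3])                              # per-sample density
    delta = (far - near) / float(n_samples)                    # spacing between samples
    alpha = 1.0 - tf.exp(-sigma * delta)                       # per-sample opacity
    # Transmittance: how much light survives to reach each sample along the ray.
    transmittance = tf.math.cumprod(1.0 - alpha + 1e-10, exclusive=True)
    weights = alpha * transmittance
    return tf.reduce_sum(weights[:, None] * rgb, axis=0)       # composited RGB for this ray

# Toy usage with a made-up ray; the real method trains on many posed photos of one scene.
model = make_nerf_style_mlp()
color = render_ray(model,
                   origin=tf.constant([0.0, 0.0, 0.0]),
                   direction=tf.constant([0.0, 0.0, 1.0]))
```

Training is of course the part that makes it work (you render rays from known camera poses and minimize the difference against the captured photos), but the sketch above is essentially the whole rendering core.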
Check it out in action:
TBH the last thing I want is for coronavirus talk to infect (ahem) my escapist art-posting, but I’ve gotta give Markus Hofstätter props for the sheer effort he put into making this 7-frame animation with the archaic tintype technique (or as my wife asked, lacking all context, “Why did that dude put a picture into a panini press?”). You can watch his process from the beginning (and check out PetaPixel for the full story), or just jump to the finished animation at the end:
So cool! I’d never actually watched these Apollo 16 clips on their own, unedited & with original dialog intact.
For this particular project, Shiryaev used the stabilized version of the footage that NASA itself released in July of 2019 as a baseline. He then fed it through the same AI software that he’s been using to upscale all of the videos he’s released: DAIN to interpolate frames and achieve 60fps, and Topaz Labs’ Gigapixel AI to upscale each frame and achieve 4K resolution.
More about the mission from NASA:
[Please note: I don’t work on the Pixel team, and these opinions are just those of a guy with a couple of phones in hand, literally shooting in the dark.]
In Yosemite Valley on Friday night, I did some quick & unscientific but illuminating (oh jeez) tests shooting with a Pixel 4 & iPhone 11 Pro Max. I’d had fleeting notions of trying some proper astrophotography (side note: see these great tips from Pixel engineer & ILM vet Florian Kainz), but between the moon & the clouds, I couldn’t see a ton of stars. Therefore I mostly held up both phones, pressed the shutter button, and held my breath.
Check out the results in this album. You can see which camera produced which images by tapping each image, then tapping the little comment icon. I haven’t applied any adjustments.
Overall I’m amazed at what both devices can produce, but I preferred the Pixel’s interpretations. They were darker, but truer to what my eyes perceived, and very unlike the otherworldly, day-for-night iPhone renderings (which persisted despite a few attempts I made to set focus, then drag down the exposure before shooting).
Check out the results, judge for yourself, and let me know what you think.
Oh, and for a much more eye-popping Pixel 4 result, check out this post from Adobe’s Russell Brown:
A Pixel 4 Night Site Moment – OK, I am officially amazed at the wonders of mobile photography. The Pixel 4 has won me over as the best Astrophotography phone camera. How they made this magic happen is totally unbelievable. This is a 4 minute exposure consisting of a composite of 15 exposures at 16 seconds each. They are magically merged together into this amazing image. I processed the image in Lightroom Mobile on my iPad Pro. #pixel4 #lightroom #wanaka #newzealand #nightsite #milkyway #pixel4photography #lightroommobile #nightphotography #astrophotography #mobilephotography #reallyrightstufftripod #myrrs
What a fascinating 90-second peek into a clever trick that saved millions of dollars in production costs on Titanic. As a friend asks, “I wonder what became of all those reverse WHITE STAR LINE sweaters?”
the starboard side of the 90% scale ship was built for the film, but for the Southampton launch, which depicts the port side next to the doc, all props, costumes and signage were built flopped to double for the port side of the ship – the footage was then horizontally flopped pic.twitter.com/15llmvJ8z1
— Todd Vaziri (@tvaziri) February 29, 2020