Back in 2014, Action Movie Dad posted a delightful vid of his niño evading the hot foot:
But now, instead of needing an hour-long tutorial on how to create this effect, you can do it in realtime, with zero effort, on your friggin’ telephone. (Old Man Nack does wonder just how much this cheapens the VFX coin—but on charges progress.)
Man, who knew just how much cultural identity could be wrapped up in a style of printing?
This excellent 99% Invisible episode covers the origins of blackletter printing (faster & more reliable for medieval scribes), the culture wars (from Luther to Napoleon) in which it battled Roman faces, its association with (and revilement by!) Nazis, and more.
Bonus: stick around for a discussion of revanchist, Trumpian mandates around government architecture, featuring that delightful term of art, CHUD. *chef’s kiss*
Good lord… seems like I was just posting about the 25th anniversary (see fun links from then), and before that the 20th (do I even dare to listen to whatever presumable ear poison I recorded back then? (“A toast to Photoshop and 20 years of pain and pleasure! A toast to our guests and the spontaneous creation of a drinking game based on every time Bryan says “Scott Kelby” or John says “Configurator!””))… and yet here we are, old friends.
The team is marking the occasion by releasing new features for desktop & mobile versions of the app, including ML-powered object selection in both:
As for mobile, it sounds like things are going in the right direction. Per TechCrunch:
It’s no secret that the original iPad app wasn’t exactly a hit with users as it lacked a number of features Photoshop users wanted to see on mobile. Since then, the company made a few changes to the app and explained some of its decisions in greater detail. Today, Adobe notes, 50% of reviews give the app five stars and the app has been downloaded more than 1 million times since November.
Heh—here’s a super fun application of body tracking tech (see the whole category here for previous news) that shows off how folks have been working to redefine what’s possible with realtime machine learning on the Web (!):
I’ve long heard that 19th-century audiences would faint or jump out of their seats upon seeing gripping, O.G. content like “Train Enters Station.” If that’s true, imagine the blown minds that would result from this upgraded footage. Colossal writes,
Shiryaev first used Topaz Lab’s Gigapixel AI to upgrade the film’s resolution to 4K, followed by Google’s DAIN, which he used to create and add frames to the original file, bringing it to 60 frames per second.
Check out the original…
…and the enhanced version:
Update: Conceptually related:
there’s an app called remini that enhances your blurry pictures & i’m never taking anything but macbook selfies ever again pic.twitter.com/KrRbcO4Ndp
I’m pleased to say researchers have built on my team’s open-source MediaPipe framework to create AutoFlip, helping make video look good across a range of screens.
Taking a video (casually shot or professionally edited) and a target dimension (landscape, square, portrait, etc.) as inputs, AutoFlip analyzes the video content, develops optimal tracking and cropping strategies, and produces an output video with the same duration in the desired aspect ratio.
Check out the team’s post for more details. I don’t know how the results compare to Adobe’s Auto Reframe tech, but hopefully this release will help more people collaborate in delivering great tools.
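If you want to kick the tires yourself, AutoFlip ships as a desktop demo in the MediaPipe repo. A rough sketch of the documented invocation follows; the build target, graph config path, and `aspect_ratio` side packet are taken from Google's docs at the time of this release and may well have changed, so treat this as a starting point rather than gospel:

```shell
# Build the AutoFlip desktop demo (CPU-only) from a MediaPipe checkout.
bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 \
  mediapipe/examples/desktop/autoflip:run_autoflip

# Reframe a landscape video to square (1:1); paths must be absolute.
GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/autoflip/run_autoflip \
  --calculator_graph_config_file=mediapipe/examples/desktop/autoflip/autoflip_graph.pbtxt \
  --input_side_packets=input_video_path=/path/to/input.mp4,output_video_path=/path/to/output.mp4,aspect_ratio=1:1
```

Swapping `aspect_ratio` to `9:16` or `4:5` targets portrait and social-feed crops, per the same docs.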