I’ve long heard that 19th-century audiences would faint or jump out of their seats upon seeing gripping, O.G. content like “Train Enters Station.” If that’s true, imagine the blown minds that would result from this upgraded footage. Colossal writes,
Shiryaev first used Topaz Labs' Gigapixel AI to upgrade the film's resolution to 4K, followed by DAIN (Depth-Aware video frame Interpolation), which he used to create and add frames to the original file, bringing it to 60 frames per second.
Check out the original…
…and the enhanced version:
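The second step, frame interpolation, is conceptually simple: synthesize new in-between frames so a 30 fps clip plays back as 60 fps. DAIN does this with depth-aware optical flow; the crudest possible version just blends neighboring frames. Here's a tiny illustrative sketch of that naive approach (not DAIN's method, and the pixel lists are stand-ins for real frames):

```python
# Naive frame interpolation by linear blending: insert an averaged
# frame between each consecutive pair, roughly doubling the frame
# rate. Depth-aware tools like DAIN estimate motion instead of
# blending, which avoids ghosting -- this is just the basic idea.

def interpolate_frames(frames):
    """Given a list of frames (each a flat list of pixel values),
    return a new list with a blended frame inserted between each
    consecutive pair (30 fps in -> ~60 fps out)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(pa + pb) / 2 for pa, pb in zip(a, b)])
    out.append(frames[-1])
    return out

clip = [[0, 0], [10, 20], [20, 40]]  # three tiny 2-pixel "frames"
print(interpolate_frames(clip))
# -> [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]
```

The blended frames are exact midpoints of their neighbors, which is why fast motion turns to ghostly smears under this scheme and why motion-aware interpolators exist.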
Update: Conceptually related:
there’s an app called remini that enhances your blurry pictures & i’m never taking anything but macbook selfies ever again pic.twitter.com/KrRbcO4Ndp
I’m pleased to say researchers have built on my team’s open-source MediaPipe framework to create AutoFlip, helping make video look good across a range of screens.
Taking a video (casually shot or professionally edited) and a target dimension (landscape, square, portrait, etc.) as inputs, AutoFlip analyzes the video content, develops optimal tracking and cropping strategies, and produces an output video with the same duration in the desired aspect ratio.
Check out the team’s post for more details. I don’t know how the results compare to Adobe’s Auto Reframe tech, but hopefully this release will help more people collaborate on delivering great tools.
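The baseline AutoFlip improves on is a static, centered crop to the target aspect ratio, which happily lops off any subject that wanders away from frame center. That baseline can be sketched in a few lines (an illustrative snippet of my own, not AutoFlip's actual code):

```python
# Static center crop to a target aspect ratio -- the dumb baseline
# that AutoFlip's content-aware tracking and cropping improves on.
# (Illustrative only; AutoFlip analyzes the video to move the crop
# window with the salient content instead of pinning it to center.)

def center_crop(width, height, target_w, target_h):
    """Return (x, y, w, h) of the largest centered window in a
    width x height frame matching the target_w:target_h ratio."""
    target = target_w / target_h
    if width / height > target:
        # Frame is wider than the target: keep full height, trim sides.
        w = round(height * target)
        h = height
    else:
        # Frame is taller than the target: keep full width, trim top/bottom.
        w = width
        h = round(width / target)
    return ((width - w) // 2, (height - h) // 2, w, h)

# Crop a 1920x1080 landscape frame to square (1:1):
print(center_crop(1920, 1080, 1, 1))  # -> (420, 0, 1080, 1080)
```

AutoFlip's contribution is deciding, per scene, where that window should sit and how it should move, so the output doesn't decapitate the subject the way a fixed center crop routinely does.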
I’m a longtime superfan of (now longtime!) Adobe designer Dave Werner. Just as the one good thing to come of my first two futile years at Adobe was meeting my wife, the clear good that came from my final project was hiring Dave. Our project mostly crashed & burned, but Dave hung in there & smartly made his way onto Character Animator. Here he provides a delightful inside look at how the app’s little red mascot came to be.
To achieve this level of detail he shot the whole thing in 8K on a $25,000 RED Helium camera using a Canon 100mm f/2.8L Macro and a Canon MP-E 65mm f/2.8 1-5X Macro, and then edited the final product down to 4K resolution.