I took a video of a guy zip lining in full Harry Potter costume and edited out the zip lines to make it look like he was flying. I mainly used Content-Aware Fill and the free Red Giant/Maxon script 3D Plane Stamp to achieve this.
For the surprise bit at the end, I used Midjourney and Runway’s Motion Brush to generate and animate the clothing.
Trapcode Particular was used for the rain in the final shot.
I also did a full sky replacement in each shot and used assets from ProductionCrate for the lighting and magic wand blast.
To start, we’re exploring a range of concepts, including:
Text-to-color enhancements: Change color schemes, time of day, or even the seasons in already-recorded videos, instantly altering the mood and setting to evoke a specific tone and feel. With a simple prompt like “Make this scene feel warm and inviting,” the time between imagination and final product can all but disappear.
Advanced music and sound effects: Creators can easily generate royalty-free custom sounds and music to reflect a certain feeling or scene for both temporary and final tracks.
Stunning fonts, text effects, graphics, and logos: With a few simple words and in a matter of minutes, creators can generate subtitles, logos, title cards, and custom contextual animations.
Powerful script and B-roll capabilities: Creators can dramatically accelerate pre-production, production, and post-production workflows using AI analysis of scripts to automatically create storyboards and previsualizations, as well as to recommend B-roll clips for rough or final cuts.
Creative assistants and co-pilots: With personalized generative AI-powered “how-tos,” users can master new skills and accelerate processes from initial vision to creation and editing.
Frame.io for Creative Cloud includes real-time review and approval tools with commenting and frame-accurate annotations, accelerated file transfers for fast uploading and downloading of media, 100GB of dedicated Frame.io cloud storage, the ability to work on up to 5 different projects with another user, free sharing with an unlimited number of reviewers, and Camera to Cloud.
Back in 2013 I found myself on a bus full of USC film students, and I slowly realized that the guy seated next to me had created the Take On Me vid. Not long after, I was at Google & my friend recreated the effect in real-time AR. Perhaps needless to say, they didn’t do anything with it. ¯\_(ツ)_/¯
In any event, now Action Movie Dad Daniel Hashimoto has created a loving homage as a tutorial video (!).
As industries evolve, core infrastructure gets built and commoditized, and differentiation moves up the hierarchy of needs from basic functionality to non-basic functionality, to design, and even to fashion.
For example, there was a time when chief buying concerns included how well a watch might tell time and how durable a pair of jeans was.
Now apps like FaceTune deliver what used to be Photoshop-only levels of power to millions of people, and Runway ML promises to let you just type words to select & track objects in video—using just a Web browser. 👀
Wanna feel like walking directly into the ocean? Try painstakingly isolating an object in frame after frame of video. Learning how to do this in the ’90s (using stone knives & bear skins, naturally), I just as quickly learned that I never wanted to do it again.
Happily the AE crew has kept improving automated tools, and they’ve just rolled out Roto Brush 2 in beta form. Ian Sansevera shows (below) how it compares & how to use it, and John Columbo provides a nice written overview.
In this After Effects tutorial I will explore and show you how to use Roto Brush 2 (which is insane, by the way). Powered by Sensei, Roto Brush 2 will select and track the object, frame by frame, isolating the subject automatically.