Back in 2013 I found myself on a bus full of USC film students, and I slowly realized that the guy seated next to me had created the Take On Me vid. Not long after I was at Google & my friend recreated the effect in realtime AR. Perhaps needless to say, they didn’t do anything with it. ¯\_(ツ)_/¯
In any event, now Action Movie Dad Daniel Hashimoto has created a loving homage as a tutorial video (!).
As industries evolve, core infrastructure gets built and commoditized, and differentiation moves up the hierarchy of needs from basic functionality to non-basic functionality, to design, and even to fashion.
For example, there was a time when chief buying concerns included how well a watch might tell time and how durable a pair of jeans was.
Now apps like FaceTune deliver what used to be Photoshop-only levels of power to millions of people, and Runway ML promises to let you just type words to select & track objects in video—using just a Web browser. 👀
Wanna feel like walking directly into the ocean? Try painstakingly isolating an object in frame after frame of video. I learned how to do this in the 90’s (using stone knives & bear skins, naturally), and just as quickly learned that I never wanted to do it again.
Happily the AE crew has kept improving automated tools, and they’ve just rolled out Roto Brush 2 in beta form. Ian Sansevera shows (below) how it compares & how to use it, and John Columbo provides a nice written overview.
In this After Effects tutorial I will explore and show you how to use Roto Brush 2 (which is insane by the way). Powered by Sensei, Roto Brush 2 will select and track the object, frame by frame, isolating the subject automatically.
Today a small group of users will become the first Adobe Creative Cloud members to find Beta versions of the Adobe video and audio apps available in the Creative Cloud Desktop app. This marks the start of a public Beta program which will roll out incrementally over the coming months, until it is available to all Creative Cloud members.
Now O.G.’s David Simons & Jason Levine have given a live overview and Q&A covering the project:
The new public Beta program started rolling out to Creative Cloud members earlier this year. On Friday, join this discussion and live Q&A with Jason Levine and David Simons, Adobe Fellow, who has been leading the initiative. If the name rings a bell: Dave is one of the inventors of After Effects, for which he won a technical Academy Award, and Adobe Character Animator, which won him a technical Emmy.
Not to be confused with cloud-hosted Team Projects, this new feature is geared towards editors working together on premises:
Productions connects Premiere Pro project files, making them into components of the larger workflow.
Media referencing across projects means you can reuse assets within your Production without creating duplicate files. Using shared local storage, multiple editors can work on different projects in the same production. Project Locking ensures that no one overwrites your work.
You control your content: Productions use shared local storage and can be used without an internet connection.
My wife & her team have been working hard to support freelancers & other video professionals collaborating remotely. Here’s some great news:
[W]e are pleased to extend the availability of Adobe’s Team Projects video collaboration capabilities to Premiere Pro and After Effects users with a Creative Cloud for individual license… until August 17, 2020, at no additional cost. […]
Team Projects is a cloud-hosted collaboration service that allows editors and motion graphics artists to work within Premiere Pro and After Effects. With Team Projects, colleagues can collaborate on video projects from anywhere by syncing changes through the cloud. All you need to do is connect to the Team Projects service and create a team project in Adobe Premiere Pro or After Effects.
Project files are stored and saved in Creative Cloud, so you can revert and sync project files across multiple workstations.
Check out the rest of the post for an FAQ and other details.
We used PHYX on almost every composite to separate the “newscaster” from the “scene” he was reporting on, such as standing in front of the Taj Mahal. We used the PHYX filters with After Effects and Premiere Pro.
The chance to work with the After Effects team is among the things that drew me to work at Adobe. Here’s a look at the folks whose work has helped define motion graphics and who keep breaking new ground:
Creative Cloud means continuous innovation. Check out the newest batch:
Adobe Premiere Pro CC has seen four new releases this year – all within the six months since the CC version was announced. Guided by user requests, the Adobe Premiere Pro CC December 2013 release adds OpenCL performance enhancements, media management improvements like multiple Media Browser tabs, new editing enhancements for even greater workflow efficiency, and delivers more intuitive voiceover recording.
The After Effects CC December 2013 release offers customizable output of file name and path templates, improved snapping behavior, enhanced scripting options, and the ability to migrate user settings when updating to newer versions.
The December 2013 releases also include updates to SpeedGrade CC, Prelude CC, Adobe Media Encoder CC and Adobe Anywhere for video. Along with performance enhancements, SpeedGrade also offers expanded camera format support in Direct Link mode. Prelude CC has added support for the latest Adobe Anywhere protocols. Adobe Media Encoder expands Sony XAVC format support, and Adobe Anywhere introduces performance improvements and diagnostic tools for monitoring system status.
I like what my colleague Steve from After Effects had to say:
“Our team turned around this release in a matter of weeks based on direct feedback from our users,” said Steve Forde, senior product manager for After Effects. “With regular Creative Cloud updates, we’re able to continually evolve and enhance our feature set. Your tools just keep getting better.”
I’m delighted to announce that our team designer Dave Werner, together with our teammate Shaun Saperstein, has released the 2013 edition of his “Extraneous Lyrics” series:
As you might remember from last year’s edition, the videos (now with over 1 million views) feature Dave giving “some of the year’s most popular songs a wordier acoustic mashup treatment.” Each year he raises his technical game: 2012 was all about motion-tracked text in After Effects, and 2013 is a full-on green-screen extravaganza. Here Dave & Shaun take you behind the scenes:
I made a fool of myself in an empty street. (image)
From the same vantage point as the previous shot, I took footage of just cars going by. I made sure to shoot them at a high shutter speed, reducing the motion blur and making them easier to rotoscope. Then I roto-ed them out to place in the footage where I was running. (image)
I slowly built up the traffic and choreographed the cars to give the illusion of almost hitting me. (image)
Paul McDonnel just won an Emmy Award for Outstanding Main Title Design for his work on the titles of Da Vinci’s Demons. The team at Behance has posted an interview about how he used After Effects & Photoshop to complete the job.
Our animated typefaces are Adobe After Effects files with each glyph in a separate composition. A controller-composition serves as a central point from which you can customize all the glyphs in one go.
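The wiring behind a controller composition like this is typically done with After Effects expressions. As a rough illustration only (this is an AE expression fragment that evaluates inside After Effects, not standalone JavaScript, and the “Controller” comp, “Global Controls” layer, and “Glyph Color” effect names are hypothetical, not taken from the actual product files), a glyph’s Fill color could read its value from a Color Control effect in the controller comp:

```javascript
// After Effects expression, applied to a glyph layer's Fill > Color property.
// Comp, layer, and effect names below are hypothetical examples.
comp("Controller").layer("Global Controls").effect("Glyph Color")("Color")
```

Because every glyph composition reads the same controller values, dragging one slider or color swatch restyles all the glyphs in one go, which is presumably the point of the central controller comp described above.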
Andrew Kramer & company used After Effects, Premiere Pro, and more to create titles & HUDs for the most recent Star Trek installment. Check out their detailed notes, as well as the video below. And yes, they talk about how they made the lens flares. 🙂
“A stunning 90-minute documentary visualizing key events from World War II from the vantage point of space,” World War II From Space just won an Emmy for Outstanding Graphic Design and Art Direction. Featuring 300 animations and 79 VFX shots, it made heavy use of an Adobe workflow (script writing in Adobe Story, 3D integration with After Effects & Cinema 4D, editing in Premiere Pro). Check out an in-depth interview on how the team made it happen.
Who the heck welcomes a new baby by slimming down, dressing better, and spending more time making bits of art? Our designer Dave, apparently. During his just-ended paternity break he started surprising us with unexpected looks at his domestic life. It started simple & totally unannounced:
Hasselblad portraits -> After Effects -> pixie dust: A “person” ages 65 years in 5 minutes.
Last Thanksgiving, Cerniello traveled to his friend Danielle’s family reunion and with still photographer Keith Sirchio shot portraits of her youngest cousins through to her oldest relatives with a Hasselblad medium format camera. Then began the process of scanning each photo with a drum scanner at the U.N. in New York, at which point he carefully edited the photos to select the family members that had the most similar bone structure. Next he brought on animators Nathan Meier and Edmund Earle, who worked in After Effects and 3D Studio Max to morph and animate the still photos to make them as lifelike as possible. Finally, Nuke (a kind of 3D visual effects software) artist George Cuddy was brought on to smooth out some small details like the eyes and hair.
For this specific animated typeface we have rounded up 110 talented animators from all over the world. We asked every animator to pick a glyph and animate it using no more than 4 colors, 25 frames and a 500 x 600 px canvas in Adobe After Effects. The animators had complete freedom to work their magic within those 25 frames. The result is a wide variety of styles and techniques. The color palette and letterforms tie it all together.
The downloadable source file contains all the keyframes, expressions and artwork from the artists. This makes it a great learning source for motion students and professionals.
Create photo-real visual content fast with awesome new advancements, such as the Live 3D Pipeline between After Effects and CINEMA 4D, an enhanced 3D Camera Tracker, and layer and mask snapping for faster composition construction. Save hours of tedious rotoscoping work with the Refine Edge tool. Be more creative, thanks to advancements in stabilization and other refinements for a more responsive workflow.
“Today,” writes After Effects PM Steve Forde, “Adobe announced it is entering into a strategic alliance with MAXON, the makers of CINEMA 4D.” He goes on to hint at future integration:
“Do what you know, and be the best at it.” Hand in hand with this idea is the notion that you DON’T do a whole lot of stuff you don’t know. With this relationship announcement you have two companies who focus on being the very best at what they do…
I wish I could go into more detail right now – but stay tuned. This area is about to get very exciting.
See also the Maxon announcement which says, “As part of the alliance, both companies are expected to collaborate and engineer a pipeline between Adobe® After Effects® software and CINEMA 4D to give users a seamless 2D/3D foundation.”
Just as interesting to me, from a geeky perspective, is the way the famous & simple Ken Burns effect has morphed into something richer & more ambitious, imparting parallax movement to the various pans & zooms. In fact, the clip above prominently credits After Effects artist Elliot Cowan. Let’s hear it for Content-Aware Fill, “postcards in space,” and more.
What starts out as a few simple repeating elements soon becomes a chaotic collage of video snippets that take on a life of their own. He says that he uses Photoshop and After Effects for most of his animations, which I find totally astonishing. I’d suggest watching this video several times so that you can fully appreciate the amount of work he had to put into this incredible music video.
Reminds me of Michel Gondry’s impossibly* brilliant video for Kylie Minogue’s Come Into My World:
*if nothing else, in that it gets me to willingly listen to a Kylie Minogue song
Did you know that the first version of AE didn’t even have a timeline? Check out this screenshot from Dan Wilk (click to enlarge):
As part of AE’s 20th Anniversary celebration, you can:
Join After Effects creators Dave Simons and Dan Wilk as they take you on a trip down memory lane to see how After Effects started—from concept to initial user interface. See how much After Effects has changed throughout the years and why things are simply the way they are.
“I can’t believe I’m talking to these guys,” I thought. “They’re spending their time talking to me–and they’re so down-to-earth!”
That was in 2000, when I first met the brains behind After Effects. (I’d just joined Adobe, aspiring to build “AE for the Web.”) 13 years later, I still feel just the same. In any industry full of half-hit wonders acting like they’ve just cured cancer, I find Dave, Dan, and all the AE guys as relentlessly humble & passionate as can be.
So “Happy 20th anniversary to After Effects, the video package you keep promising yourself you’ll learn someday,” as I saw Matt May quip the other day. Here some pros salute this game-changing app:
That’s what I found myself wondering as I watched Supralude‘s That Night In Williamsburg. He won’t spill many beans, but what do you think? Were those lights added in post, and how can you tell one way or the other?
Our own Dave Werner is not just a kickass designer, he’s a musician with a penchant for goofing on popular music. His Extraneous Lyrics series, “where some of the year’s most popular songs are given a wordier acoustic mashup treatment,” is closing in on 1 million YouTube views (!).
Now that he works at Adobe, Dave’s traded his guitar-in-front-of-tablecloth aesthetic for After Effects motion tracking and more. So without further ado, check out “Extraneous Lyrics 2012”:
Includes Call Me Maybe by Carly Rae Jepsen, Boyfriend by Justin Bieber, We Are Never Ever Getting Back Together by Taylor Swift, Gangnam Style by Psy, Somebody That I Used To Know by Gotye, What Makes You Beautiful by One Direction, and We Are Young by Fun.
In August I pointed out the inspired lunacy of Old Spice’s Muscle Music (see below, especially if you have Flash installed as it becomes interactive at the end). Now Jake Friedman, creative director and co-founder of LA-based Wildlife, offers a behind-the-scenes tour of the ambitious project.
We worked with Adobe Flash, Flash Builder, After Effects, Photoshop, and Media Encoder. There were a huge number of assets moving back and forth across these products, so it was important that they could integrate to the pipeline seamlessly. We also benefited greatly from Photoshop’s recent addition of video integration and support. […]
We also had to crop these videos to their minimum canvas area in order to speed up performance for both pieces of software and avoid layering dozens of full-screen clips over one another. Photoshop was a champion here…
I had to import and customize the NEF files before equalizing them with the great LR-Timelapse from Gunther Wegner (Adobe Lightroom is required). The resulting JPEGs then had to be dropped into VirtualDub and rendered as AVI. Once that was done, I had to stabilize the sequences manually, frame by frame (using the AE motion tracker), and render each of them in three different sizes (4928×3264, 1920×1080, and 1024×768 pixels). Last but not least, the snippets were edited to fit the beautiful track “Diving Through The Blue” by the respected composer and musician Valentin Boomes.
“Ross Ching, the director,” writes Gizmodo, “used Adobe Photoshop, After Effects and Premiere to delete every human and moving car from all the timelapse sequences. His short, the first of a series called Empty America, shows every landmark from the Golden Gate Bridge to Fisherman’s Wharf to Lombard Street to Ghirardelli Square to the Bay Bridge, ‘wiped empty of tourists and traffic.'”
Here’s a peek behind the scenes:
Pro tip: You can shoot videos like this any day of the week here in San Jose (population 1 million) and never need to do any post-processing. “It’s more necropolis than metropolis,” says my wife. [Via Dave Helmly]
Well now I feel bad: Not only have I failed to send any of our guys’ innumerable Thomas engines to space, I’ve also neglected to learn After Effects well enough to animate their faces. Big props all around, Ron Fugelseth.
Despite kind of burning itself out in the 90’s, image morphing remains an interesting storytelling device:
“All visual effects for this sequence were created entirely in After Effects, by Morgan Préleur and the team at noside.fr, using mettle’s FreeForm Pro and FreeForm V2 plug-in.” I’d like to see a making-of piece.
I’m delighted that Adobe has officially unveiled Adobe Anywhere, our collaborative workflow platform for video. You can use After Effects, Premiere Pro, and Prelude to manipulate assets on a server, letting people team up across locations, devices, and networks.
Seeing is believing: I’ve gotten to sit next to PM Michael Coleman as he cruises through high-res video on his MacBook Air, and you’d swear he was tethered to a brawny machine under the desk–not talking via WiFi to a server hundreds of miles away. Here’s a quick demo:
I’m especially proud as this is the project that the other leading Adobe Nack, my wife Margot, has been working on for quite some time. Congrats to the whole team!
How much character of movement can be conveyed just by moving dots? Apparent crazy person Colin Rozee set out to find out, saying “I manually keyframed 19 mask paths in AE. There’s over 20,000 keyframes in the piece, but it needed to be that detailed to achieve the fluidity of movement….” He used the Plexus particle-system plug-in on the project. [Via David Simons]
The VFX team at Cantina Creative sat down with Adobe to discuss the incredible attention to detail they put into creating on-screen graphics for Marvel’s The Avengers. From consulting with an A-10 pilot about his “ultimate HUD” to animating thousands of Illustrator elements in After Effects, their process makes for a really interesting read. The move to 3D demanded even tighter craftsmanship:
We focused a lot of time on how widgets and graphics would actually function, so that everything was clearly readable. Everything in the HUD, even down to the tiny micro-text, relates precisely to the current story-point.