If I someday fulfill my dream of becoming a T1000, I promise I’ll dance for you, baby—oh, I’ll dance my head literally off.
About the project:
The AICP awards celebrate global creativity within commercial production. Method Design wanted to create an entertaining piece of design that encapsulates the innovative and prolific nature of this industry. Our aim was to showcase the AICP sponsors as various dancing avatars, which playfully reference the visual effects used throughout production. Motion capture, procedural animation and dynamic simulations combine to create a milieu of iconic pop dance moves that become an explosion of colorful fur, feathers, particles and more.
Side bonus of my liquid-metalification: I’ll also turn my head into animated art.
I admired the app Directr (especially as I was PM’ing the similarly template-centric Adobe Premiere Clip), and now it’s reborn as YouTube Director (download). TechCrunch writes,
YouTube is launching a new suite of products for advertisers under the umbrella name of YouTube Director. Collectively, these products are supposed to make it easier for businesses (particularly the smaller ones that don’t have their own production capabilities and aren’t going to hire an ad agency) to shoot and edit video ads that can run on YouTube.
As Google did with Android, Apple will package the raw data in Adobe’s Digital Negative (DNG) format, a move that makes it easier for software such as Photoshop to view the files.
Third-party camera apps will also be able to take Live Photos — Apple’s technology for taking a short video clip, currently available only in Apple’s camera app. And on supported hardware, cameras will be able to record a wider range of colors, too, for more vivid photos.
Imagine watching a live ballgame while being able to fly around the field or court in VR, viewing the action from any angle.
That’s the sort of future that could be enabled by new research from Microsoft. A team there has devised a way to capture live performances, generate 3D models, and stream the results. Check it out:
The team writes,
We present the first fully automated end-to-end solution to create high-quality free-viewpoint video encoded as a compact data stream. Our system records performances using a dense set of RGB and IR video cameras, generates dynamic textured surfaces, and compresses these to a streamable 3D video format. Four technical advances contribute to high fidelity and robustness: multimodal multi-view stereo fusing RGB, IR, and silhouette information; adaptive meshing guided by automatic detection of perceptually salient areas; mesh tracking to create temporally coherent subsequences; and encoding of tracked textured meshes as an MPEG video stream. Quantitative experiments demonstrate geometric accuracy, texture fidelity, and encoding efficiency. We release several datasets with calibrated inputs and processed results to foster future research.
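The abstract describes a four-stage pipeline: multi-view fusion, adaptive meshing, mesh tracking, and stream encoding. As a purely illustrative sketch (not Microsoft’s code, and operating on dummy stand-in data rather than real camera frames), the stages can be modeled as plain functions chained end to end:

```python
# Illustrative sketch of the four pipeline stages described in the abstract.
# All data structures here are hypothetical stand-ins, not the real system.

def fuse_multiview(rgb_frames, ir_frames, silhouettes):
    """Multimodal multi-view stereo: fuse RGB, IR, and silhouette cues
    into per-frame geometry (here, just merged tuples)."""
    return [("points", r, i, s) for r, i, s in zip(rgb_frames, ir_frames, silhouettes)]

def adaptive_mesh(frames, salient_fraction=0.5):
    """Adaptive meshing: spend more triangles on perceptually salient
    regions (faces, hands); here we simply tag frames dense vs. coarse."""
    cut = int(len(frames) * salient_fraction)
    return [("dense", f) for f in frames[:cut]] + [("coarse", f) for f in frames[cut:]]

def track_meshes(meshes):
    """Mesh tracking: group frames into a temporally coherent subsequence
    anchored on a keyframe, so one topology is reused across frames."""
    return {"keyframe": meshes[0], "tracked": meshes[1:]}

def encode_stream(tracked):
    """Encoding: pack the tracked textured meshes into a compact stream
    (a stand-in for the MPEG video encoding in the paper)."""
    return f"stream({1 + len(tracked['tracked'])} frames)"

# Toy end-to-end run over three dummy frames.
rgb, ir, sil = ["r0", "r1", "r2"], ["i0", "i1", "i2"], ["s0", "s1", "s2"]
stream = encode_stream(track_meshes(adaptive_mesh(fuse_multiview(rgb, ir, sil))))
print(stream)  # -> stream(3 frames)
```

The real system’s contribution is doing each of these stages robustly at high fidelity; the sketch only shows how the stages compose into one streamable output.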
Why do I keep waiting for the apocalyptic flash here?
“Las Vegas In Infrared” is a new 4-minute short film by Philip Bloom, who visited Las Vegas with a Sony RX100 IV that had been converted for infrared photography by having its internal IR-cut filter removed. Most of what you see was shot from a moving vehicle in 2-second bursts at 250fps through a 665nm filter.
Now you can quickly correct perspective in a photograph with precision and control using the new Transform Panel, Guided Upright tool, and Offset sliders. Watch as Julieanne demonstrates how to manually position guides to automatically correct converging vertical and horizontal lines in images, which can then be repositioned within the canvas area.