Check out “Light Fields, Light Stages, and the Future of Virtual Production”

“Holy shit, you’re actually Paul Debevec!”

That’s what I said—or at least what I thought—upon seeing Paul next to me in line for coffee at Google. I’d known his name & work for decades, especially via my time PM’ing features related to HDR imaging—a field in which Paul is a pioneer.

Anyway, Paul & his team have been at Google for the last couple of years, and he’ll be giving a keynote talk at VIEW 2020 on Oct 18th. “You can now register for free access to the VIEW Conference Online Edition,” he notes, “to livestream its excellent slate of animation and visual effects presentations.”

Here’s how Paul describes the talk:

“In this talk I’ll describe the latest work we’ve done at Google and the USC Institute for Creative Technologies to bridge the real and virtual worlds through photography, lighting, and machine learning. I’ll begin with our new DeepView solution for light field video: immersive motion pictures that you can move around in after they’ve been recorded. Our latest light field video techniques record six-degrees-of-freedom virtual reality where subjects can come close enough to be within arm’s reach. I’ll also present how Google’s new Light Stage system, paired with machine learning, enables lighting estimation from faces for AR and interactive portrait relighting on mobile phone hardware. Finally, I’ll talk about how both of these techniques may enable the next advances in virtual production filmmaking, infusing light fields and relighting into the real-time image-based lighting techniques now revolutionizing how movies and television are made.”
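If the relighting part sounds abstract, the classic light-stage idea (going back to Paul’s early reflectance-field work) is surprisingly simple at its core: photograph the subject under each stage light one at a time (“OLAT” images), then relight by summing those photos weighted by how much light the target environment sends from each direction. The sketch below is just my own back-of-the-envelope NumPy illustration of that principle, with made-up array shapes and a crude lat-long lookup; it’s not code from the talk or from Google’s actual pipeline.

```python
import numpy as np

def relight(olat, light_dirs, env_map):
    """Relight a light-stage subject against a new environment.

    olat       -- (L, H, W, 3) one-light-at-a-time photos, one per stage light
    light_dirs -- (L, 3) unit vectors pointing from the subject toward each light
    env_map    -- (Eh, Ew, 3) HDR lat-long environment map to relight with

    Each OLAT image is scaled by the environment's RGB radiance arriving from
    that light's direction, then everything is summed. (Per-light solid angles
    are ignored here to keep the sketch short.)
    """
    Eh, Ew, _ = env_map.shape
    x, y, z = light_dirs.T

    # Map each light direction to (row, col) in the lat-long environment map.
    col = ((np.arctan2(x, -z) / (2 * np.pi)) % 1.0) * (Ew - 1)
    row = (np.arccos(np.clip(y, -1.0, 1.0)) / np.pi) * (Eh - 1)
    weights = env_map[row.astype(int), col.astype(int)]   # (L, 3) RGB weights

    # Relit image = sum over lights of (OLAT image * per-channel weight).
    return np.einsum('lc,lhwc->hwc', weights, olat)

# Toy usage with random data, just to show the shapes line up:
rng = np.random.default_rng(0)
olat = rng.random((156, 64, 64, 3))              # e.g. a 156-light stage
dirs = rng.normal(size=(156, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
env = rng.random((32, 64, 3))                    # stand-in HDR environment
relit = relight(olat, dirs, env)                 # (64, 64, 3) relit image
```

The ML work Paul describes goes well beyond this (estimating lighting from a single face, running relighting interactively on a phone), but the weighted-sum-of-basis-images picture is a useful mental model for why light-stage data makes those problems tractable.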
