October 5, 2021

In-Camera VFX — How Industry Experts are Using It


In-camera VFX and LED wall virtual production methods — sometimes also referred to as on-set virtual production — help alleviate some of the most common issues with live shoots.

I recently had the pleasure of sitting down with Michael McKenna — CEO, Co-Founder, and virtual production specialist at global creative studio Final Pixel — to discuss creating content using in-camera VFX and LED walls. Below, you can read his thoughts on these methods and how his team puts them into practice.

But first, let’s cover some basics:

What Is In-Camera VFX?

In-camera VFX is a type of virtual production that combines real actors and physical sets with photorealistic virtual backdrops displayed on LED walls, allowing content creators to capture extremely high-quality shots in-camera and on-set. This method helps reduce or even eliminate the need for post-production visual effects (VFX) in the delivery of a final product.

How Does In-Camera VFX with an LED Wall Work?

At a high level, virtual worlds are created by artists and designers prior to a shoot. These scenes can be based on real locations or be completely fictional. Then they’re rendered in real time on a large LED display that forms the backdrop of the set.

The set itself can contain any number of physical and live elements. The most well-known example of this type of production is Jon Favreau’s “The Mandalorian.” Many of the scenes in this series were produced using this exact method, and the results caught the attention of filmmakers everywhere.

Final Pixel crew in front of LED wall

Why Use In-Camera VFX with an LED Wall?

In-camera VFX is becoming more and more popular with content producers for many reasons. Below are the most notable advantages associated with this production method.

LED Walls and Getting More from Your On-Screen Talent

Visual effects and virtual worlds have been commonplace in film and television for decades. However, the need to capture the live performance first and then add in virtual aspects later led to one major problem: actors couldn’t see what they were interacting with.

Working with LED walls has all but eliminated this problem. They allow actors to interact with the environment more organically, resulting in a more realistic and higher-quality scene. That means actors can nail the scene in fewer takes, saving teams money.

Location Options are Limitless with In-Camera VFX and LED Walls

No longer will set locations be limited by time, budget, permits, or reality. Shooting with a virtual world backdrop means location options are only limited by the creator’s imagination.

From a single stage, you can produce scenes in any number of locations. In fact, you can shoot different locations in the same day. This helps you maximize your on-screen talent’s time and ensure you stay on budget. It also helps reduce your travel budget!

Plus, backgrounds and locations can be tweaked to fit the director’s exact requirements. Don’t like that building where it is? Move it in a matter of moments. The weather isn’t quite right for the mood? Change it! Virtual worlds allow directors to have complete control.


Want to Learn More About In-Camera VFX?

Become a pro at virtual production with free, expert-led training. Sign up for Perforce U College of Virtual Production.

START LEARNING


Interview with Michael McKenna, CEO and Co-Founder of Final Pixel

After more than 15 years in the television industry, and in the midst of the pandemic, former BBC Studios executive Michael McKenna founded global creative studio Final Pixel. His team works with producers to create revolutionary work using virtual production technology like in-camera VFX and LED walls. Let’s dive into his thoughts on the current and future state of this tech, the possibilities it opens for content creators, and how to start taking advantage of it.

Q: When Did You First Learn About Virtual Production?

A: “The Mandalorian” included a number of behind-the-scenes videos on Disney+. I watched them in awe of the technology Jon Favreau and the team from Industrial Light & Magic put together. They had created a method of shooting that could place actors in any environment imaginable, lit by the environment itself: giant LED screens projecting a photorealistic world built with 3D modeling in Unreal Engine. As they moved the camera, the world moved in sync. You could look around corners into virtual worlds. It blew my mind.

It was also blowing the mind of my brother, Chris McKenna (Creative Director of Final Pixel), in Los Angeles. It came up on one of our weekly family Zoom calls. We were both hooked.

Q: What Was Your First Experience with Using In-Camera VFX?

A: Chris and I teamed up with Monica Hinden (Executive Producer, Final Pixel) and funded two demos in the autumn of 2020, one in the UK and one in Los Angeles. We brought together a team from all over and partnered with some other local companies for our first foray into virtual production.

We tried to create realistic-looking sets that could be used as a replacement for location shoots. The results were staggering. We could immediately see the huge potential for this method of filmmaking. So much so that we formally launched Final Pixel shortly afterwards.

We learned loads on these shoots, in particular the need for tight file management and version control. It was following this that we began using Perforce Helix Core, which has dramatically improved our pipeline and efficiency when working on Unreal projects.

Q: Where Are Your Studios?

A: Typically, we do our shoots on private stages which are set up specifically for our clients and not available to the public. These can be anywhere in the world.

We can establish these wherever our clients need to be, and they’re often determined by the talent involved. Our core bases are in New York, Los Angeles, and London. We also occasionally partner with other emerging virtual production stages where the project requires it.

Q: What Tools Do You Use For In-Camera VFX Production?

A: Virtual production works by tracking the “real-world” camera in a studio. This tracking information is combined with the “virtual” camera tool within Unreal Engine, live and in real time.

This virtual camera can be programmed to move in sync with the real-world camera with little noticeable lag. The result is that we can then project what the virtual camera is seeing onto a massive LED wall, basically a huge television.

Real-time rendering of 3D models makes this possible. The typical VFX pipeline includes a large portion of time devoted to rendering images. With Unreal Engine, this happens right before your eyes.

This doesn’t mean the end of physical set builds. On Final Pixel shoots, the art department is a critical component of the crew. To create a believable in-camera VFX effect, the foreground props need to blend seamlessly with the virtual world.

In-camera VFX: Camera tracking with LED wall
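
To make that sync loop concrete, below is a minimal sketch in C++ of one way a per-frame tracking cycle can be structured. Everything in it is a stand-in: the Pose struct and the readTrackedCameraPose, setVirtualCameraPose, and renderFrameToLedWall functions are hypothetical placeholders for a real tracking SDK and Unreal Engine’s own camera and rendering APIs, not Final Pixel’s actual pipeline.

    #include <chrono>
    #include <thread>

    // Hypothetical pose sample from an on-set camera tracking system.
    struct Pose {
        float position[3];  // stage-space position of the physical camera
        float rotation[4];  // orientation as a quaternion
    };

    // Stub stand-ins for a tracking SDK and the render engine; a real
    // pipeline would replace these with actual API calls.
    Pose readTrackedCameraPose() { return Pose{}; }
    void setVirtualCameraPose(const Pose&) {}
    void renderFrameToLedWall() {}

    int main() {
        using clock = std::chrono::steady_clock;
        // At 24 fps there are roughly 41.7 ms to spend on each frame.
        const auto frameBudget = std::chrono::microseconds(41667);

        for (int frame = 0; frame < 240; ++frame) {  // ~10 seconds of frames
            const auto frameStart = clock::now();

            // 1. Sample where the physical camera is right now.
            const Pose tracked = readTrackedCameraPose();

            // 2. Move the virtual camera to the same pose so the wall
            //    shows the world from the real lens's point of view.
            setVirtualCameraPose(tracked);

            // 3. Render the updated view onto the LED wall.
            renderFrameToLedWall();

            // 4. Hold the frame cadence; any overrun shows up as lag.
            std::this_thread::sleep_until(frameStart + frameBudget);
        }
        return 0;
    }

In production the cadence is driven by the engine and the display hardware rather than a sleep loop, but the ordering (sample the tracker, update the virtual camera, render) is what keeps the wall in sync with the physical camera.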

Q: What’s the Coolest Project You Have Used In-Camera VFX In?

A: Using in-camera VFX and virtual production has been a very cool and exciting process. Recently, we created a promo for ABC, where we got to recreate a physical set for “Dancing With the Stars” in Unreal Engine with controllable DMX lighting so that we could seamlessly combine stage lighting with the virtual set.

Q: What Advice Would You Give to Content Creators Looking to Try In-Camera VFX?

A: Don’t be afraid to ask for help, whether that is through toolchain vendors like Perforce or those in the business doing virtual production. We often consult with creators at an early stage and can help guide them as to whether virtual production is the right route to go down. There are limitations, and this technology is still in its infancy, so there are bugs to work through. However, it is improving at a rapid pace.

With the proliferation of online streaming and publishing services, there is certainly a future that will see more choice and relevancy for audiences of all backgrounds. Perhaps this will be one of the more wholesome, positive impacts to come out of this awful pandemic.

Unreal Engine & In-Camera VFX: Tools You Need

You need a game engine like Unreal Engine to render virtual backdrops onto an LED wall in real time. And real-time rendering is important because as the tracked camera moves, the backdrop can move with it seamlessly; at 24 frames per second, that gives the engine roughly 42 milliseconds to draw each new frame of the virtual world.

Game engines are changing everything within the VFX space. But using them requires managing the many massive digital assets they create. To make the most of their investment in game engines and in-camera VFX tools, teams need foundational tools like version control.

Helix Core + Digital Asset Management

Helix Core is known as the game development and media standard for version control. It is the only version control tool with the performance required to manage the numerous, extremely large files associated with Unreal Engine and in-camera VFX. If you are looking to create assets for an LED wall, you need Helix Core to capitalize on the benefits of real-time rendering.
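
One concrete example of that foundation: Unreal Engine assets such as .uasset and .umap files are binary, so they cannot be merged, and teams typically configure the server’s typemap so those files are stored as binary and exclusively locked while someone edits them. Below is a minimal sketch of such a table, opened for editing with the p4 typemap command; the depot paths are illustrative placeholders, not a recommended universal setup.

    TypeMap:
        binary+l //depot/....uasset
        binary+l //depot/....umap
        binary //depot/....fbx
        binary //depot/....png

The +l modifier grants exclusive opens, so two artists cannot check out the same un-mergeable asset at the same time.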

Perforce Federated Architecture is the secret behind lightning-fast delivery around the globe. It offers superior performance whether you are on-set or connecting to the cloud. It can transport large binary files where you need them without the WAN wait. Plus, Helix Core offers integrations with the tools digital creators are already using — 3ds Max, Maya, and more.
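
As one small illustration of moving large binaries quickly, Helix Core can sync a workspace over several parallel network connections at once. The command below is a hedged example; the depot path is a placeholder, and the thread and batch values are starting points to tune for your own network rather than recommendations.

    p4 sync --parallel=threads=8,batch=8,batchsize=512,min=9,minsize=589824 //depot/MyUnrealProject/...

Each thread transfers its own batch of files, which is where the biggest gains show up on binary-heavy Unreal projects.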

See for yourself why so many studios choose Helix Core for LED wall & in-camera VFX. You can get free tools and deploy them your way.


FREE VIRTUAL PRODUCTION TOOLS