Camera Tracking for Live-Action Virtual Production

Photo Credit: Disney

Real-time camera tracking makes blending 3D data and live-action footage possible. With advancements in processing power and software, camera tracking has become more affordable and easier to accomplish.

Camera tracking is, essentially, the process of matching the movement of live-action footage with the movement of a virtual camera. Below are the key components of camera tracking and how they work together to blend live-action production with in-camera assets.

Photo Credit: ILM

MOTION CAPTURE

System Calibration

There are two forms of calibration on an LED wall stage. The first is the standard motion capture system calibration, which defines the motion capture camera locations and the capture volume in the real world. The second is camera sensor calibration, which ensures that the images on the LED wall match what the real-world camera should see.

Ex. If there were a one-foot cube in the real world five feet from the camera, and an image of that cube on the LED wall made to appear five feet away, the two would look identical in size and shape through the film camera.
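
To make the math behind that check concrete, here is a minimal sketch using the pinhole camera model. The focal length and wall distance are illustrative assumptions, not values from a real stage.

```python
# A hedged sketch of the cube check above, using the pinhole camera model.
# All numbers are illustrative, not measurements from a real rig.

def pixels_spanned(size_ft: float, distance_ft: float, focal_px: float) -> float:
    """Apparent width in pixels of an object of size_ft at distance_ft."""
    return focal_px * size_ft / distance_ft

FOCAL_PX = 2000.0    # assumed camera focal length, in pixels
CUBE_FT = 1.0        # the one-foot cube
VIRT_DIST_FT = 5.0   # where the virtual cube should appear to be
WALL_DIST_FT = 12.0  # assumed physical distance from camera to the LED wall

# To read as a one-foot cube at five feet, the cube drawn on the wall must be
# scaled up in proportion to the wall's greater distance from the camera:
size_on_wall_ft = CUBE_FT * WALL_DIST_FT / VIRT_DIST_FT

real_px = pixels_spanned(CUBE_FT, VIRT_DIST_FT, FOCAL_PX)
wall_px = pixels_spanned(size_on_wall_ft, WALL_DIST_FT, FOCAL_PX)
assert abs(real_px - wall_px) < 1e-9  # calibration holds when these match
print(f"both cubes span {real_px:.1f} px in the film camera")
```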

Active LEDs

Each active marker carries LEDs that emit their own light rather than reflecting it, so the cameras tracking the actor can automatically identify each individual marker.

Continuous Calibration

Once an initial system calibration has been performed, the system can continuously monitor where objects, like a mocap marker, are expected to be. If there is a variation from the expected location, say after a motion capture camera has been inadvertently moved, the system can adjust its initial calibration to account for the discrepancy.
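
As a rough illustration of that idea, the sketch below compares a marker's observed position against its expected position and folds any drift back into a stored camera offset. The threshold, names, and one-camera simplification are all assumptions.

```python
# A minimal sketch of continuous calibration, assuming a single mocap camera
# whose stored offset may drift after a physical bump. Values are hypothetical.
import numpy as np

DRIFT_THRESHOLD_MM = 2.0  # assumed tolerance before recalibrating

def update_calibration(camera_offset_mm: np.ndarray,
                       expected_marker_mm: np.ndarray,
                       observed_marker_mm: np.ndarray) -> np.ndarray:
    """Return a corrected camera offset if the marker residual exceeds tolerance."""
    residual = observed_marker_mm - expected_marker_mm
    if np.linalg.norm(residual) > DRIFT_THRESHOLD_MM:
        # The camera was likely nudged: fold the discrepancy back into its
        # stored offset so future readings line up with expectations again.
        return camera_offset_mm - residual
    return camera_offset_mm

offset = np.zeros(3)
expected = np.array([1000.0, 500.0, 2000.0])  # where the marker should be
observed = np.array([1004.0, 500.0, 2001.0])  # where the camera now sees it
offset = update_calibration(offset, expected, observed)
print(offset)  # [-4.  0. -1.] : the calibration absorbed the bump
```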

Fusion Software

This is a software algorithm that combines information from multiple sensors in real time. In the case of camera tracking, this means combining the optical capture data with data from an Inertial Measurement Unit (IMU); for example, taking position from the optical system and rotation from the IMU.
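
A minimal sketch of one fusion tick under those assumptions follows: position comes from the optical system when a fresh sample is available, rotation always comes from the IMU, and the blend weight is illustrative rather than taken from any shipping tracker.

```python
# A hedged sketch of a fusion step: position from the optical system (accurate
# but lower rate), orientation from the IMU (fast but drifts), with the IMU's
# dead-reckoned position used to coast between optical updates.
from __future__ import annotations

from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    position: np.ndarray     # metres, world space
    orientation: np.ndarray  # quaternion (x, y, z, w)

def fuse(optical_pos: np.ndarray | None,
         imu_pos_prediction: np.ndarray,
         imu_orientation: np.ndarray,
         blend: float = 0.9) -> Pose:
    """One fusion tick: trust optical position when a fresh sample exists,
    otherwise coast on the IMU's dead-reckoned prediction."""
    if optical_pos is not None:
        # Weighted blend pulls the drifting IMU estimate toward ground truth.
        position = blend * optical_pos + (1.0 - blend) * imu_pos_prediction
    else:
        position = imu_pos_prediction
    # Rotation comes straight from the IMU, per the example in the text.
    return Pose(position=position, orientation=imu_orientation)

pose = fuse(np.array([0.0, -3.0, 1.6]),      # fresh optical sample
            np.array([0.01, -3.02, 1.61]),   # IMU prediction, slightly drifted
            np.array([0.0, 0.0, 0.0, 1.0]))  # IMU orientation
print(pose.position)
```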

Photo Credit: StageCraft

UNREAL ENGINE

Live Link

Live Link is Epic Games’ common interface for receiving streamed animation data into the engine. It offers a standard method for connecting applications directly to the engine in real time.

Ex. A motion capture system streaming skeleton animation from the motion capture software into Unreal, where it’s retargeted to a character.
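
Live Link sources are normally implemented natively against Epic’s provider API, so the following is only an illustrative sketch: it streams a single transform as JSON over UDP to a hypothetical custom Live Link source you would implement on the engine side. The port and packet layout are assumptions, not part of Epic’s protocol.

```python
# Purely illustrative: stream one rigid-body transform as JSON over UDP to a
# hypothetical custom Live Link source. Port and schema are assumptions.
import json
import socket
import time

UE_HOST, UE_PORT = "127.0.0.1", 54321  # hypothetical listener in the engine

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for frame in range(600):  # stream ten seconds at 60 fps
    packet = {
        "subject": "TrackedCamera",        # Live Link subject name
        "location": [0.0, 0.0, 150.0],     # cm, Unreal's native units
        "rotation": [0.0, 0.0, 0.0, 1.0],  # quaternion (x, y, z, w)
        "frame": frame,
    }
    sock.sendto(json.dumps(packet).encode("utf-8"), (UE_HOST, UE_PORT))
    time.sleep(1.0 / 60.0)
```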

Data Output / Streaming Data

The streaming of real-time animation data from the motion capture system to the Unreal Engine. The data is usually represented as a “skeleton”: a human performer or a simple rigid object.
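
As a hedged sketch of what that streamed data might contain, here is a simple named-bone layout; real systems each define their own schema, so the field names below are assumptions.

```python
# A minimal sketch of streamed skeleton data as a named-bone hierarchy.
from __future__ import annotations

from dataclasses import dataclass

@dataclass
class BoneTransform:
    name: str
    parent: str | None                            # None for the root bone
    translation: tuple[float, float, float]       # cm, relative to the parent
    rotation: tuple[float, float, float, float]   # quaternion (x, y, z, w)

# A rigid object (a prop, or the film camera itself) is just the one-bone case:
camera_rig = [BoneTransform("root", None, (0.0, 0.0, 150.0), (0, 0, 0, 1))]

# A human performer streams a full hierarchy, one transform per joint:
skeleton = [
    BoneTransform("pelvis", None,     (0.0, 0.0, 95.0), (0, 0, 0, 1)),
    BoneTransform("spine",  "pelvis", (0.0, 0.0, 15.0), (0, 0, 0, 1)),
    BoneTransform("head",   "spine",  (0.0, 0.0, 40.0), (0, 0, 0, 1)),
]
```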

VIRTUAL PRODUCTION WITH LED WALLS

Camera Tracking

This is a method for deducing the position and orientation of the camera. That information can be fed into Unreal Engine so the video displayed on the wall matches what the camera should be seeing. In effect, it ensures that the virtual camera’s movement precisely matches that of the real camera on set.
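
To illustrate the hand-off, this sketch converts a tracked position and orientation into the 4x4 camera-to-world matrix a renderer would consume; send_to_engine is a hypothetical stand-in for whatever transport is actually used.

```python
# A hedged sketch: tracked pose in, 4x4 camera-to-world matrix out.
import numpy as np

def quat_to_matrix(x: float, y: float, z: float, w: float) -> np.ndarray:
    """Convert a unit quaternion to a 3x3 rotation matrix (standard formula)."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

def camera_to_world(position: np.ndarray, quat_xyzw: tuple) -> np.ndarray:
    """Assemble a 4x4 rigid transform from tracked position and orientation."""
    m = np.eye(4)
    m[:3, :3] = quat_to_matrix(*quat_xyzw)
    m[:3, 3] = position
    return m

pose = camera_to_world(np.array([0.0, -3.0, 1.6]), (0.0, 0.0, 0.0, 1.0))
# send_to_engine(pose)  # hypothetical: drives the virtual camera each frame
print(pose)
```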

Lens Calibration

This is a crucial process in VFX compositing and camera tracking. It allows the user to correctly map and calculate the lens focal length, distortion, and entrance pupil. The process is done on a per-serial-number basis, as each individual lens may vary significantly from the next.
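
As an illustration of what such a calibration captures, here is a minimal sketch of the common Brown-Conrady radial distortion model with two terms; the coefficients shown are made up, where a real per-serial-number calibration would measure them from chart captures.

```python
# A minimal sketch of the Brown-Conrady radial distortion model (two terms).
import numpy as np

def distort(xn: float, yn: float, k1: float, k2: float) -> tuple:
    """Apply radial distortion to normalized (pinhole) image coordinates."""
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return xn * scale, yn * scale

def project(point_cam: np.ndarray, f_px: float, cx: float, cy: float,
            k1: float, k2: float) -> tuple:
    """Project a 3D camera-space point to distorted pixel coordinates."""
    xn, yn = point_cam[0] / point_cam[2], point_cam[1] / point_cam[2]
    xd, yd = distort(xn, yn, k1, k2)
    return f_px * xd + cx, f_px * yd + cy

# Illustrative coefficients; a real lens report supplies measured values.
print(project(np.array([0.3, 0.1, 2.0]), 2000.0, 960.0, 540.0, -0.12, 0.03))
```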
