In the world of computer graphics, creating realistic digital humans has been something of a holy grail for a couple of decades now. Many have achieved the goal of making a still image of a digital human that is indistinguishable from reality. But often, it’s when these characters are called on to perform—especially if they are required to be rendered in real time—that the “Uncanny Valley” creeps in.
Epic Games’ MetaHuman team, which includes digital human innovators 3Lateral and Cubic Motion, is on a mission to change all that. First, with MetaHuman Creator, they made the creation of realistic digital humans accessible to everyone. Then, with Mesh to MetaHuman, they took the technology a step further by enabling you to create a MetaHuman based on a sculpt of a character or a scan of an existing person.
In their most recent release, the team introduced MetaHuman Animator, enabling you to capture an actor’s performance and turn it into high-fidelity facial animation for your MetaHuman in minutes, using a stereo head-mounted camera, or even just an iPhone. MetaHuman Animator faithfully recreates every nuance of the actor’s performance on a digital character—something that would have previously taken a team of experts months.
To push MetaHuman Animator to its limits during its development, the Serbia-based 3Lateral team collaborated with local artists and filmmakers to produce Blue Dot, a short film featuring renowned actor Radivoje Bukvić, with cinematographer Ivan Šijak acting as director of photography. The entire sequence, including the hair, was rendered in Unreal Engine and runs in real time.
The film demonstrates how MetaHuman Animator enables teams to create cinematics of stunning fidelity and impact, using a creative on-set process typical of traditional filmmaking to direct and capture a performance. What’s more, the quality of animation delivered straight out of the box was so high that only a small team of animators was required for final polishing.
Bringing traditional filmmaking techniques to digital productions
To get the project underway, Bukvić’s likeness was captured at 3Lateral, using the company’s custom 4D scanning techniques. While the 3Lateral team created a bespoke MetaHuman rig from this data, the animation data created by MetaHuman Animator from a captured performance can be applied to any digital character whose facial rig uses the control logic corresponding to the MetaHuman Facial Description Standard—including those created with MetaHuman Creator or Mesh to MetaHuman.
Although the piece was to be entirely digital, Šijak and his team drew heavily on their traditional filmmaking experience throughout the process.
To design the lighting exactly as they would for a live-action shoot, they brought in physical lights and adjusted them to get the look they wanted on Bukvić. With the chosen lighting setup recreated digitally in Unreal Engine, they could quickly preview how the lighting was working with Bukvić’s animated MetaHuman while the actor was still on the motion capture set, and get another take right away if required. And of course—unlike with physical lighting—the lighting could be tweaked after the fact. The benefits were not lost on Šijak.
“To see the results of the light on the actor—even when light was not present on set—and to change the light later on, you want to work in this kind of environment,” he says. “It felt natural and the workflow was really, really amazing.”
For added realism, real-world movie cameras, complete with dolly tracks, were brought into the mocap studio. These were tracked, along with Bukvić’s body, and of course, his face. All this enabled the team to precisely recreate the camera motions in Unreal Engine, with Bukvić’s MetaHuman acting directly to the camera.
“One of the main engines of acting is imagination,” says Bukvić. “If you follow your inner emotions, your body will move spontaneously in front of the camera. [...] In this work, I was trying to explore my inner world and the result is really stunning.”
Getting up close and personal
Perhaps the key challenge the team set for themselves was that the action should focus almost entirely on the character’s face.
“The close-up shot is really the base of the movie and the cinema,” says Šijak. “That’s really how the viewer is communicating and exchanging emotions with the character. Everything shows up.”
3Lateral Business Development Lead Uroš Sikimić explains how this challenge was used to drive the development of MetaHuman Animator. “The human face is tremendously complex,” he says. “We needed a tool powerful enough to be able to analyze all of that information, instead of trying manually to recreate each individual muscle movement on a human face.”
Aleksandar Popov, who is Art Director at 3Lateral, takes up the thread. “One of the most important aspects is obviously eyes,” he says. “That is what makes or breaks the whole impression. And so we put in a lot of effort to actually design from that kind of artistic standpoint and allow the technology to do the rest.”
Bukvić’s facial performance was recorded using a pair of stereo head-mounted cameras. The video and calculated depth data were then processed in MetaHuman Animator using a 4D solver, capturing every subtle detail and nuance of the performance and even reconstructing the eye gaze movements.
With MetaHuman Animator, the results are available to review in minutes. If required, the animation can be further tweaked in Unreal Engine for refinement or dramatic effect—in practice, though, the Blue Dot team found that the output from MetaHuman Animator needed only minimal manual polishing.
“I was not expecting that my performance would be as natural as it is,” says Bukvić. “I was blown away when I saw all the tiny details are there.”
It’s the immediacy of these results and the iterative process this facilitates between artists, animators, and actors—combined with the fidelity of the capture—that makes MetaHuman Animator such a powerful tool for creating cinematics. Animation studios can now work with an actor on set, using their creative intuition to direct a performance that will translate faithfully into animated content, creating emotional cinematic pieces.
“You cannot distinguish the quality of the output that was done through this technology or shot on set with a real camera,” says Šijak. “The camera work and the performance of the actor and everything that gets the audience involved in the shot, it’s there. Nothing’s lost.”
High-quality facial animation from your iPhone
One of the beauties of MetaHuman Animator is that it doesn’t require a professional stereo head-mounted camera or other expensive hardware: it also works with an iPhone (12 or above) and a desktop PC, so you can create your own cinematic pieces in Unreal Engine with any MetaHuman character, using equipment you may already have at home.
You can find out more about how to get started, browse the complete documentation, and discuss your projects with other developers and creators on our MetaHuman community hub.
Get MetaHuman Animator
MetaHuman Animator is ready and waiting to put a smile on the face of your digital human. It’s part of the MetaHuman Plugin for Unreal Engine, and requires Unreal Engine 5.2 or later. Download the latest version to get started today.