Using NVIDIA RTX and Unreal Engine, Digital Domain creates realistic virtual characters that capture emotions and actions in real time.
Meet DigiDoug, the first digital human to give a TED talk in real time.
DigiDoug is the virtual version of Dr. Doug Roble, senior director of Software R&D at Digital Domain, the award-winning visual effects studio behind the characters and visual effects for movies like The Curious Case of Benjamin Button, Maleficent, Disney’s Beauty and the Beast and Avengers: Endgame.
Roble and Digital Domain’s Digital Human Group have presented DigiDoug at multiple events, showcasing their state-of-the-art digital human technology, which is driven by an inertial motion-capture suit and single-camera capture for facial animation.
But to capture and recreate emotions and actions in real time, the Los Angeles-based studio turned to more powerful and advanced technology: machine learning and real-time rendering.
With NVIDIA RTX technology and Unreal Engine from Epic Games, Digital Domain is bringing photorealistic digital humans to life — all in real time.
The Machine (Learning) Behind Digital Humans
Creating realistic digital humans in film and television is known as the final frontier of visual effects. It typically takes thousands of hours and hundreds of artists to create a virtual human that contains all the physical details of a real-life person.
With machine learning, Digital Domain can speed up the process while achieving an unprecedented level of photorealism in their characters.
To create DigiDoug, Digital Domain integrated machine learning into its creative process, but needed enormous amounts of data for the technique to work. So the team started by taking thousands of images of Roble’s face from different angles and under varied lighting conditions to capture as much data as possible.
The team built a deep neural network that would take those images and learn all the details about Roble’s face — from the mix of expressions to physical details — so it could compute all the information that would allow DigiDoug to act like a real person.
DigiDoug is then rendered in real time using NVIDIA RTX technology and Unreal Engine, the game engine developed by Epic Games. Roble’s movements and expressions are captured using a special motion-capture suit and camera. That data is transferred to the machine learning software, and then translated to the rendered digital human in real time.
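The per-frame flow described above can be sketched in miniature. The following is a hypothetical illustration only, not Digital Domain’s actual pipeline: the feature size, blendshape count, and the tiny stand-in network are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 64     # assumed size of a per-frame face-capture feature vector
N_BLENDSHAPES = 50  # assumed number of facial blendshape weights

# Stand-in for a trained network: a single dense layer followed by a
# sigmoid, so each output can be read as a blendshape weight in [0, 1].
W = rng.standard_normal((N_BLENDSHAPES, N_FEATURES)) * 0.1
b = np.zeros(N_BLENDSHAPES)

def infer_blendshapes(features: np.ndarray) -> np.ndarray:
    """Map one frame of capture features to blendshape weights."""
    return 1.0 / (1.0 + np.exp(-(W @ features + b)))

def process_frame(frame_features: np.ndarray) -> dict:
    """One iteration of the capture -> inference -> render handoff."""
    weights = infer_blendshapes(frame_features)
    # In a real system this packet would be streamed to the game engine
    # (e.g. via a live-link connection) to drive the rendered character
    # within the same frame.
    return {"blendshapes": weights}

# Simulate a few frames of incoming capture data.
for _ in range(3):
    packet = process_frame(rng.standard_normal(N_FEATURES))
    assert packet["blendshapes"].shape == (N_BLENDSHAPES,)
```

The point of the sketch is the shape of the loop, not the model: per-frame capture data goes through learned inference, and the resulting animation parameters are handed straight to the renderer rather than to an offline artist pass.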
“Capturing all the details is what makes the digital human realistic,” said Roble. “What’s cool about machine learning is you can teach it to get better. With NVIDIA and Epic Games, Digital Domain is building the technology that allows artists to create the most photorealistic digital humans possible, and it’s all happening in real time.”
For more information, watch the recent panel at SIGGRAPH on “Progress Towards Real-Time, Photorealistic Digital Humans,” moderated by Danielle Costa, vice president of visual effects at Marvel Studios. In addition to Roble, the talk features Simon Yuen, director of graphics at NVIDIA, and Vladimir Mastilovic, director of digital humans at Epic Games, who discuss the journey of digital humans and where the technology is headed.
Learn more about the latest NVIDIA technology by watching on-demand sessions or developer talks from SIGGRAPH.
Light Sail VR Scales Creative Business with OpenDrives Data Management Infrastructure
OpenDrives, Inc., a leading provider of software-defined media storage workflow solutions, today announced that Light Sail VR, a pioneer specializing in immersive media storytelling, has selected the Atlas storage platform to modernize and scale its creative production business. Atlas boasts superior performance, enabling customers like Light Sail VR to increase data storage and processing on the fly, accelerating creative workflows by as much as threefold.

The transition to Atlas has greatly improved the team’s productivity, streamlined its workflows, and optimized its data management capabilities. With features like mount management and containerization, Light Sail VR was able to scale its existing storage and add resources, enabling it to onboard more commercial projects swiftly using various third-party tools available through OpenDrives’ containers marketplace.

Light Sail VR co-founder and creative director Matthew Celia, a recognized leader in virtual reality, comments on the transformation OpenDrives has brought to the studio’s operation: “OpenDrives was our most important investment at the very outset, as the new infrastructure would set the stage for greater efficiency.”