Wednesday, Sep. 25, 2019
Advanced Imaging Society's Lumiere Award winners include tech deployed in such films as "The Lion King," "Gemini Man"
HOLLYWOOD, Calif.
The Advanced Imaging Society has unveiled the winners of its 2019 Entertainment Technology Lumiere Awards. The honorees are Dolby Laboratories, DreamWorks Animation, Epic Games, Felix and Paul Studios, Glassbox Technologies, LG, Magnopus, Pixelworks, Radiant Images, Skydance/Paramount Pictures, Sony Innovation Studios, Unity Technologies, and Varjo.
Several of the winning achievements involved deployment on major motion pictures, including Skydance/Paramount's Multi Format Production work on Ang Lee's Gemini Man, which also features a CGI "human" character developed from massive amounts of data taken of star Will Smith. Additionally, the Lumiere Award-winning Tycoon Virtual Production System from Magnopus was used in the creation of Jon Favreau's The Lion King, utilizing headsets and software to afford filmmakers the freedom to view their scenes and surroundings in VR.
The Advanced Imaging Society was formed in 2009 by such stalwarts as Walt Disney Studios Motion Pictures, DreamWorks Animation, Sony, Paramount, IMAX, Dolby, Panasonic and MasterImage, among others, to advance the creative arts and sciences and to recognize cutting-edge, innovative technologies.
This year’s Lumiere Awards will be formally presented on Oct. 28 in a gala ceremony at the Four Seasons Hotel in Beverly Hills.
Here’s a rundown of this year’s honorees:
Dolby Laboratories--Pulsar Professional Reference Monitor
Dolby’s Pulsar monitor played a pioneering role in putting HDR on the map by enabling creative teams to experience never-before-seen HDR picture quality in the color grading suite. Dolby and the Pulsar monitor played a significant role in jump-starting the UHD/HDR industry by expanding the availability of Hollywood movies and episodic TV shows powered by Dolby Vision.
DreamWorks Animation--MoonRay/Arras Lighting Workflow
The MoonRay/Arras Lighting Workflow is a Monte Carlo Ray Tracing film production rendering system that can assemble multiple shots simultaneously, bringing full production quality scenes to artist desktops in seconds.
Epic Games--Unreal Engine
Unreal Engine 4.22 has provided creatives with a highly flexible and scalable real-time visualization platform. The technology provides real-time ray tracing, collaborative multi-user editing, advanced compositing, and new support for HoloLens 2.
Felix and Paul Studios
To shoot correct stereoscopic VR in otherwise impossibly close-proximity spaces (such as the International Space Station), the company’s technical team created a special algorithm. The result is a system that enables parallax-tolerant capture for close-proximity cinematic VR.
Glassbox Technologies--BeeHive
BeeHive is a collaborative virtual scene synching, editing and review system allowing users to see live changes from multiple users at the same time, regardless of their location or the tools they use.
LG--OLED Flatscreen
The LG OLED Flatscreen system has shown itself capable of producing the pixel light and color strength necessary to display impressive entertainment content. The “organic light emitting diode” system allows each individual “smart” pixel to emit its own light and to be controlled individually (including being turned off), producing bright colors and deep blacks.
Magnopus--Tycoon Virtual Production System
The Virtual Production system used to create this year’s The Lion King utilized headsets and software to allow filmmakers the freedom to view their scenes and surroundings in VR. The system combined an estimated 58 square miles of computer-generated African scenery elements, which were viewable by wearing a VR headset on the Playa Vista soundstage. The system incorporated traditional live-action production techniques, allowing on-set decisions to be made in minutes.
Pixelworks--TrueCut Grading Software
TrueCut Motion Grading software gives filmmakers the ability to cinematically minimize the challenges of motion blur, judder, and frame-rate appearance. The system allows filmmakers to shoot at any frame rate, then deliver at a cinematically tuned high frame rate with options for a range of desired motion appearances.
Radiant Images--AXA Volumetric Light Field Stage
The Radiant AXA Volumetric Light Field Stage utilizes highly accurate camera positioning for AI, Volumetric and Light Field software. Forged from lightweight but rigid carbon fiber, the stage pairs an extremely low coefficient of thermal expansion with demanding vibration-absorption requirements. Combined with fully synchronized sensors and expandable density capable of managing 100+ cameras, the stage offers creative teams an adaptive and accurate capture environment.
Skydance/Paramount Pictures--Multi Format Production, Gemini Man
Director Ang Lee’s Gemini Man will be the world’s first theatrical release to be widely distributed in 120 and 60 frames per second, 4K and 3D. Additionally, the creative team produced a complete CGI “human” character developed from massive amounts of data taken of star Will Smith.
Sony Innovation Studios--Atom View
Atom View software allows creators to bring the real world and the computer-generated world together in real time, with output to film, TV and virtual reality. Atom View unifies content and creation for film and games with high quality volumetric assets and rendering technology.
Unity Technologies--Data Oriented Technology Stack
Unity’s “game-engine” technology is transforming media creation, becoming the entertainment industry’s “creative engine”. In 2019, the company strengthened its efforts driving real-time filmmaking with the software’s Data-Oriented Technology Stack. The software has now become an integral part of creative processes for motion pictures, episodic television, video games and commercial/industrial content.
Varjo--XR-1 Developer Edition Headset
The XR-1 mixed reality professional headset blends real and virtual content to deliver extremely photorealistic imaging, or “Hard AR.” The device employs cameras to digitize the world in real time, then multiplexes that content inside the GPU, blending it with virtual content assets. The result is a high-resolution, extremely low latency visual experience.