• Tuesday, May. 28, 2019
Vislink to showcase advanced connectivity at BroadcastAsia
Vislink HCAM wireless system
SINGAPORE -- 

Vislink Technologies will showcase a wide selection of its state-of-the-art products at BroadcastAsia 2019 (Stand 6J2-01, Level 6). The company’s presence at the show represents its continued commitment to bringing the next generation of broadcast solutions to markets around the world. Vislink’s lineup includes the award-winning HCAM+ULRX-LD HEVC 4K UHD wireless system, the DVE6100 encoder and IRD6200 receiver/decoder, the IMT DragonFly transmitter and its MSAT (man-portable satellite) antenna system. In addition, Vislink’s new INCAM-HS integrated wireless camera transmitter for Sony’s new HDC-5500 4K camera will be on display at the Sony stand.

Vislink product manager David Edwards will share his expertise on 4K during a presentation on Wednesday, June 19, at 11:30am in the Innovation Hub, level 4. His presentation, entitled “4K Wireless Delivery with HDR--and why it’s not for Pandas,” will discuss the latest trends and considerations in deploying 4K and HDR wireless productions.

Vislink will be showcasing its low-latency HCAM transmitter and UltraReceiver-LD (ULRX-LD), a solution that won a Best of Show award at NAB 2019. With single-frame end-to-end latency, this latest wireless camera system is well suited to broadcast sports and ENG coverage applications. The first to deliver single-frame, end-to-end latency in 4K, the HCAM is the most widely deployed HEVC 4K UHD wireless transmitter on the market today. It features user-interchangeable RF modules and a range of software options--including HDR-ready capability--and supports quad 3/6/12G SDI, HDMI, fiber-optic, SMPTE 2022-6 and HD/SDI-over-IP interfaces. With highly flexible and configurable mounting options and intuitive video interfaces, the unit can be mounted to broadcast, ENG and other professional-grade cameras. It also features integrated camera control with Vislink FocalPoint compatibility and direct-docking battery plates with integral power feed-through. The associated ULRX-LD is a 1RU half-width, rackmount chassis receiver that fills the need for reliable HEVC transmission and allows for better compression without sacrificing quality. The versatile receiver features four UHF inputs with maximum-ratio combining, DVB-T and proprietary LMS-T demodulation, as well as ASI and IP capabilities.

Updates to Vislink’s satellite communications line will also be on display with the new DVE6100 encoder and IRD6200 receiver/decoder for applications where size and weight are at a premium. The company’s DVE6100 is a compact, multi-format, multi-channel 4K HEVC exciter ideally suited to flyaway and vehicle-mounted uses. The IRD6200 is a compact, multi-format, multi-channel 4K integrated receiver/decoder (IRD) ideally suited to news and events applications. Both solutions maximize satellite bandwidth efficiency by utilizing the latest HEVC video compression and DVB-S2X satellite modulation technology, allowing a 50-percent reduction in leased satellite bandwidth compared to MPEG-4/DVB-S2 technology--dramatically reducing satellite OPEX. The DVE6100 and IRD6200 are, respectively, the smallest and lightest 4K UHD DVB-S2X exciter and 4K UHD HEVC DVB-S2X IRD on the market.
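The claimed saving follows directly from halving the video bitrate and filling the transponder more efficiently. A back-of-the-envelope sketch, using illustrative bitrates and spectral efficiencies rather than published Vislink figures:

```python
# Rough estimate of leased satellite bandwidth for one contribution feed.
# All numbers below are illustrative assumptions, not Vislink specifications.

def occupied_bandwidth_mhz(video_bitrate_mbps, spectral_efficiency_bps_per_hz):
    """Occupied transponder bandwidth = bitrate / modem spectral efficiency."""
    return video_bitrate_mbps / spectral_efficiency_bps_per_hz

# Legacy chain: MPEG-4 AVC (~16 Mbps for HD) over DVB-S2 (~2.0 bit/s/Hz)
legacy = occupied_bandwidth_mhz(16.0, 2.0)

# New chain: HEVC (~8 Mbps for comparable quality) over DVB-S2X (~2.0 bit/s/Hz).
# HEVC alone roughly halves the bitrate; DVB-S2X's finer MODCODs and lower
# roll-off can shave more in practice.
modern = occupied_bandwidth_mhz(8.0, 2.0)

saving = 1 - modern / legacy
print(f"legacy: {legacy:.1f} MHz, modern: {modern:.1f} MHz, saving: {saving:.0%}")
```

With these assumed figures, the leased bandwidth drops from 8 MHz to 4 MHz per feed, matching the 50-percent figure cited above.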

  • Thursday, May. 23, 2019
DP Dos Reis selects Cooke lenses for "David Makes Man" TV series
Todd A. Dos Reis, ASC
LEICESTER, UK -- 

David Makes Man is the first TV series written by Tarell Alvin McCraney, who won an Academy Award for co-writing the 2017 Best Picture winner Moonlight, based on his play “In Moonlight Black Boys Look Blue.” McCraney, serving as the series’ executive producer along with Michael B. Jordan (Creed) and Oprah Winfrey, tells a semi-autobiographical story, as Moonlight did, delving into childhood trauma and the power of imagination as a means of survival.

To shoot this story of a 14-year-old prodigy looking for a way out of his poor neighborhood in Miami, Florida, DP Todd A. Dos Reis, ASC, relied heavily on a set of refurbished Cooke Speed Panchro prime lenses and Cooke Anamorphic/i primes.

“We didn’t shoot a typical pilot and wait for the series to be picked up, but shot ten episodes back-to-back,” said Dos Reis. “I was able to get almost a complete set of refurbished original Speed Panchros from Otto Nemenz--18mm-100mm, except for the 65mm--and because of late changes in production, I never got a chance to test them. I knew from my experience with the Cooke S4 lenses that I would get what I wanted. I started shooting with the Speed Panchros on day one--almost always starting a scene with the 32mm and with the 40mm being my favorite--and I got exactly what I expected.”

However, during the prep for the pilot, director Michael Francis Williams decided he wanted to shoot anamorphically for the Ville housing project scenes, while using the Speed Panchro sphericals for scenes set outside the Ville. (This lens choice would be reversed for episodes six to ten to show a drastic change in the main character’s life.) But this late change to anamorphic meant that Dos Reis had to get a set of anamorphic lenses from another manufacturer.

“Those anamorphic lenses we used on the pilot...they just weren’t right,” explained Dos Reis. “They had quick falloff but it was the softness that I wasn’t happy with. I’m very familiar with Cooke’s spherical lenses from working with S4 primes on HBO’s Entourage, so I reached out to Dana Gonzales, ASC, as I knew he shot with the Cooke Anamorphic/i primes for Amazon Prime’s Legion. He said, ‘Get them, they’re great, they’ll give you exactly what you want,’ and that’s what made me switch to the Cookes, with a set of 25mm-180mm Anamorphic/i primes.”

As well as switching between spherical and anamorphic lenses, Dos Reis used different color temperatures, styles of camera work and styles of lighting to further highlight the separation between the Ville housing project and everything outside of that.

“While most folks think of Miami as being lush, the Ville is a housing project,” he said. “Our colors are desaturated, with a coolness in the shadows and skin tones. The interiors were lit for daylight with ARRI LED SkyPanels, and the night exteriors with tungsten, HMI Fresnels, sodium vapor and mercury vapor for soft moonlight — like the Ville was alive, a place where you have to have your head on a swivel to be aware of all your surroundings. That meant a handheld look.”

Dos Reis used three Alexa Minis for all 10 episodes, with two being handheld and one on a crane, dolly or Steadicam rig. In addition, he had a Blackmagic Micro Cinema Camera to use as a bodycam or on a swinging door.

  • Thursday, May. 23, 2019
CATV TOKUSHIMA taps into Blackmagic
Performers at the Awa dance festival
FREMONT, Calif. -- 

CATV TOKUSHIMA, Inc. in Tokushima, Japan, replaced its 2K live broadcast system with a new 4K live broadcast infrastructure built around Blackmagic Design products, including the URSA Broadcast, ATEM Television Studio Pro 4K, ATEM Camera Control Panel and HyperDeck Studio 12G.

CATV TOKUSHIMA produces local news, high school baseball games and various live broadcast programs such as the recent Awa dance festival. The Awa dance is a traditional Japanese performing art that has been around for 400 years, since the Edo period. The city of Tokushima has a population of approximately 260,000, and the festival attracts more than 1.2 million tourists from around the country every year.

With the new 4K live broadcast workflow, CATV TOKUSHIMA, Inc. set up four URSA Broadcasts at the main dance venue. Each camera was equipped with a Blackmagic Camera Fiber Converter connected via an SMPTE fiber optic cable to a Studio Fiber Converter at a relay headquarters several dozen meters away.

The feed from the URSA Broadcasts was sent to ATEM Television Studio Pro 4K, which then sent signals to an ATEM Camera Control Panel, SmartView 4K and SmartScope Duo 4K. The ATEM Television Studio Pro 4K was used for switching, inputting titles and logos and to manage talkback direct to the cameraman, with a HyperDeck Studio 12G providing backup recording.

Takushi Ichisaka, program production manager of CATV TOKUSHIMA, Inc., said, “ATEM Television Studio Pro 4K supports eight 12G-SDI inputs with format conversion, so it can take any signal, which is really easy because you don’t need external converters. The switcher area has become very clean, as no additional equipment such as cables or converters is needed compared to before. The ATEM Television Studio Pro 4K has a very easy-to-understand interface, so anyone can operate it intuitively, and the combination of ATEM Television Studio Pro 4K and ATEM Software Control makes it easy to manage titles and logos. The 2K titles and logos are also converted to 4K automatically, which is very useful.”

He added, “Blackmagic products are all very compact, and all the necessary functions are in place. The amount of equipment has been reduced to one-third compared to before, and we have been able to improve efficiency in various ways. Live broadcasts have different environments every time, so saving space on equipment is a great advantage. All these products can be stored in a 20U rack, making them much easier to move.”
 

  • Tuesday, May. 21, 2019
Submissions open for 71st Engineering Emmy Awards
NOHO ARTS DISTRICT, Calif. -- 

Submissions for the 71st Engineering Emmy® Awards are now open through Friday, June 7. The Engineering Emmy Award honors an individual, company or organization for engineering developments that either considerably improve existing methods or are so innovative that they materially affect the transmission, production, recording or reception of television.

The 2019 Engineering Emmy Awards entry form can be downloaded from the Television Academy’s website at TelevisionAcademy.com/downloads.

Recipients of the Engineering Emmy, the Charles F. Jenkins Lifetime Achievement Award and the Philo T. Farnsworth Corporate Achievement Award will be selected by the Engineering Awards Committee, which is made up of highly qualified Academy members appointed from technically oriented Peer Groups. Winners will be presented with their Engineering Emmy at a ceremony on October 23, 2019.

Previous Engineering Emmy Award winners include AVID, Canon, Dolby Laboratories, Disney, FUJI, Netflix, NASA, Sony Corporation and YouTube.

  • Thursday, May. 16, 2019
DigitalFilm Tree deploys DaVinci Resolve to color “The 100”
Bob Morley (l) and Eliza Taylor in season 6 of "The 100"
HOLLYWOOD, Calif. -- 

DigitalFilm Tree (DFT) has brought color to the post-apocalyptic sci-fi drama The 100 with a team that includes senior colorist Dan Judy and VP of post Chad Gunderson. Just prior to the premiere of its sixth season on April 30, The CW renewed The 100 for a seventh season.

Crafting environments like ice worlds, desolate desert-scape cities and acid cloud storms, Judy is enthusiastic about the support he and DFT have received from Grant Petty and Blackmagic Design. Judy gave a presentation on his approach to color correction with Resolve at this year’s NAB, where Blackmagic announced DaVinci Resolve 16, the latest version of the software.

“I worked with the fourth DaVinci machine ever built, and hand-in-hand with the people down at the facility that invented DaVinci back in the day,” he said. 

During his talk, Judy cited several advantages of the DaVinci Resolve software that have proven particularly useful on The 100, such as OpenFX plug-ins--advanced tools that expand the colorist’s ability to treat everything from beauty work with Boris and Sapphire effects to fixing damaged images, all available within Resolve’s color page. Unlimited Power Windows offer strong tracking abilities. Working clip-based gives proper access to the client’s camera-original materials, maintained and managed via DFT’s object-storage SAN, and provides a nondestructive approach that gives Judy the full dynamic range needed to protect the show’s color design. Judy also has the freedom to use a LUT from production, applying it at his discretion when it benefits the scene at hand.
 
Shooting on the ARRI Alexa provides extra dynamic range when matching footage shot at different times of day, and that latitude also helps when matching to the other cameras used in production: GoPro, Red and Canon. With The 100’s color design constantly evolving across locations and seasons, Resolve’s evolving toolset helps the DFT team meet those challenges. The previous iteration, version 15, incorporated the Fusion platform, which allows edit, color and VFX to contribute to the project’s final outcome and gives the client real-time reviews of that work, even remotely.

Co-executive producer and director Tim Scanlan, Judy’s liaison on The 100, can monitor and provide feedback from the production offices in Santa Monica or from his home office in Newport Beach using DFT’s remote services, while Judy works on a dedicated Resolve workstation at DFT’s facilities in Hollywood. The duo has partnered for more than 20 years, with a lineage in color that traces back to wildly popular broadcast shows including ABC’s Charlie’s Angels and The CW’s Smallville, the latter a collaboration of more than a decade.

“Over the years, Tim has pushed me very hard to utilize my tools to their fullest,” Judy continued, adding his thanks to DP Michael C. Blundell for his contributions to The 100. “Michael has been a real feather in our cap helping us out throughout the past five seasons.” Judy also expressed appreciation for associate producer Emanuel “E” Fidalgo. Gunderson thanked post supervisor Mark Knoob, whom he communicates with on a daily basis, and sr. editor Thomas Galyon.

“Dan Judy has a massive skill set, and can bend this software, somehow, to do magical things,” added Gunderson, who joined the post team for The 100 four seasons ago. “I help facilitate everything that the production needs around here from the time it hits online to finish. As this show is being managed primarily in Resolve’s remote workflows, it’s imperative that you have a strong management team, strong editorial team, and a strong color team, all working simultaneously, to see all of this through.”

  • Tuesday, May. 14, 2019
Christie partners with Cannes Film Festival for 13th year
CANNES, France -- 

Christie® is once again the projection technology partner for the Cannes Film Festival. For the 2019 festival, which runs May 14-25, Christie is supplying 36 digital cinema projectors from across its Solaria™ Series, along with accessories. It is the 13th year that Christie has supported the event.

The ongoing partnership with the Cannes Festival underscores Christie’s reputation in the cinema industry as a provider of market-leading digital cinema solutions, and its continued commitment to offering the best image quality for one of cinema’s most prestigious events.

“It’s an honor for Christie to be involved in such an important and globally renowned festival, now in its 72nd year,” said Francis Zee, consultant, Christie. “The Cannes Festival is all about the celebration of cinema and delivering the director’s vision to an audience of their peers. Our technical approach is to deliver rock-solid reliability of image quality with our projectors, and let the movie speak for itself.”

Technical specification for the projection technology in the festival and market screening rooms is overseen by the CST (Commission Supérieure Technique de l’Image et du Son). “The technicians that make up CST are very experienced, and the operators inside all the projection rooms are hand-picked by the Festival,” added Zee. “It is a pleasure to work alongside them.”

In the festival’s five rooms--also known as Competition rooms--a combination of Christie’s CP4230, CP2230 and CP4220 projectors will be used. The main competition rooms will have both 2K and 4K projectors to match the content on show. Celebrating its 60th anniversary this year, the Marché du Film is the largest international gathering of professionals in the sector and will feature the Christie CP2215, a popular, compact 2K DCI Xenon projector. 

This year’s curtain raiser is The Dead Don’t Die, directed by Jim Jarmusch and starring Tilda Swinton, Adam Driver, Bill Murray, Selena Gomez, Chloe Sevigny and Iggy Pop. 
 

  • Wednesday, May. 8, 2019
AWS Cloud empowers Tangent Animation for Netflix’s “Next Gen” 
Tangent Animation studio at work in Toronto
TORONTO -- 

Animated sci-fi feature “Next Gen” tells the story of a lonely, rebellious teenage girl who teams up with a top-secret robot to stop a madman’s plot for world domination. Featuring the voice talent of John Krasinski, Charlyne Yi, Jason Sudeikis, Michael Peña, David Cross and Constance Wu, the CG action-adventure was created as a joint effort by Baozou and Tangent Animation, a subsidiary of Tangent Studios.

Released globally by Netflix, “Next Gen” marks the largest project that Tangent Animation has tackled to date both in size and scope, requiring four 2K resolution deliverables, including mono and stereo versions in English and Mandarin. With only 25 percent of the project’s rendering completed and three months until delivery, Tangent looked to AWS Thinkbox to help them scale compute resources with Amazon Web Services (AWS) Elastic Compute Cloud (EC2), ultimately completing more than 65 percent of the film’s rendering with AWS.

“AWS Cloud was a godsend for us on ‘Next Gen;’ it allowed us to render about two and a half versions of the movie in just 36 days and far outstripped our on-premises capabilities. Without it, lighting and rendering would have to have started nearly eight months ahead of time and that would have required an entirely different creative strategy,” said Jeff Bell, Tangent Studios COO and co-founder. “We’ve always used Deadline for managing resources on premises, but our per-hour compute costs are so low, we weren’t sure if the cloud was the right option for us until it became clear we needed a lot more CPU to get the job done. We couldn’t have hit our deadlines without AWS.”

Ken Zorniak, CEO and president, Tangent Studios, added, “Removing the limitations of a physical farm allows creatives to make changes to the story and look until the last minute, then let the computers do the heavy lifting. Since artists receive shots back faster using AWS, they can work more effectively. It’s definitely improved our studio’s flow and throughput and that helps keep everyone motivated and engaged.”

Tangent deploys open source software for 3D production, primarily using Blender for content creation. Tangent’s local 600-node farm is housed at the studio’s Winnipeg headquarters, which has about 60 people who typically focus on asset creation, lighting and VFX. Much of Tangent’s animation is done out of its Toronto studio, which is about twice the size staff-wise and is designed to be highly flexible.

“Our Toronto office is run out of a data center, so looking to the cloud wasn’t a foreign concept,” Bell explained. “After just two months of setup, configuration and testing--which AWS Thinkbox helped us with--we went from being inexperienced to spinning up 3,000 AWS instances--five times the capacity of our local resources.”

Already well into production once they decided to leverage the AWS cloud, Tangent was able to use AWS Snowball, a data transport solution that can be sent to a studio’s location, to quickly load upwards of 100TB of data onto AWS servers. AWS Thinkbox helped them determine where to locate the data, and how to balance machine power and RAM needs with pricing and core availability, making use of economical Spot Instances where possible.  
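The appeal of physical transport at that scale is easy to see with rough numbers. A quick sketch, assuming an illustrative 1 Gbps uplink rather than Tangent's actual connectivity:

```python
# Why ship ~100 TB on a Snowball appliance instead of uploading it?
# The link speed and utilization below are illustrative assumptions.

data_tb = 100          # approximate dataset size from the article
link_gbps = 1.0        # assumed sustained uplink
utilization = 0.8      # real links rarely sustain full line rate

seconds = (data_tb * 1e12 * 8) / (link_gbps * 1e9 * utilization)
days = seconds / 86400
print(f"~{days:.1f} days to upload {data_tb} TB over a {link_gbps} Gbps link")
```

At these assumed rates the upload alone would take well over a week, which is why shipping an appliance and loading the data directly onto AWS servers was the faster path.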

Looking to the future, Bell envisions broadening Tangent’s relationship with AWS. He shared, “I can see us moving our whole production pipeline to AWS: disk tiering for cold storage, remote users, backups, virtual workstations, and beyond. With AWS, we see a partner that we can rely on long term, not just to bring more cores online to finish a project but also a resource for the work we do beyond the animation studio in developing SaaS technology.”

Currently, Tangent Animation is working on a new project that is expected to be announced in the coming weeks.

  • Wednesday, May. 8, 2019
Cooke Optics to showcase wares at Cine Gear Expo
Cooke 50mm anamorphic full frame lens
LEICESTER, UK -- 

Cooke Optics will demonstrate its strength in the full frame arena at Cine Gear Expo 2019, with lenses from its S7/i and Anamorphic/i Full Frame Plus lens sets on Stand 67. The lens manufacturer will also present the latest developments in its /i Technology metadata system, which provides detailed lens data to VFX and postproduction teams, and Cooke Optics TV will broadcast live interviews from the stand throughout the show. All nine of Cooke’s lens families will be represented on the stand.

The new Anamorphic/i Full Frame Plus range has been designed to meet the growing appetite for large format production, while offering the popular anamorphic characteristics including flare and oval bokeh. This range is also available with Cooke’s SF ‘Special Flair’ coating, which enables an exaggerated flare that gives yet more choice to cinematographers.

The 18mm and 180mm lenses from the S7/i full frame spherical range will also be featured on the Cooke stand. These, together with the 27mm, are going into production over the coming months to round out the range.

The Panchro/i Classic lenses, which emulate the look of the old Cooke Speed Panchros, are rapidly growing in popularity for their painterly vintage look paired with the conveniences of a modern housing and the ability to match lenses across the Panchro/i Classic range. Visitors to Stand 67 will see the recently announced 65mm Macro lens--a 2:1 macro--which also covers the full frame sensor.

Cooke will also present /i3 (/i Cubed), the latest version of its /i Technology metadata system that provides detailed lens data to VFX and postproduction teams. /i3 firmware now provides distortion mapping--not just a theoretical measurement for spherical lenses of a particular focal length, but for the specific lens in use.

“We have been pushing /i for a very long time as a standard for the industry, and we believe this latest version represents a sea-change for postproduction and producers to really understand the value of lens metadata to reduce time and costs in post,” said Les Zellan, chairman, Cooke Optics. “When we can literally show how lens data collected on set reduces tasks in post from hours to seconds--why wouldn’t you use it?”

In addition, the team from Cooke Optics TV will be on the stand shooting and broadcasting live to the Cooke Optics Facebook and Cooke Optics TV YouTube channels throughout the show, interviewing cinematographers, camera department and film production professionals. Cooke Optics TV is an educational content channel for the film industry, which is lens agnostic.

  • Tuesday, Apr. 30, 2019
Drone used to aid 3D remake of Japanese internment camp
In this Nov. 16, 2007, file photo, Bob Fuchigami looks through one of the albums of photographs that he has collected on Camp Amache during an interview at his home near Evergreen, Colo. Fuchigami was 12 years old when he and his family were forced to leave their 20-acre farm in Northern California for the Japanese-American internment camp in Granada, Colo. A University of Denver team is using a drone to create a 3D reconstruction of the camp in southern Colorado. The Amache effort is part of a growing movement to identify and preserve historical sites connected to people of color in the U.S. (AP Photo/Ed Andrieski, File)
DENVER (AP) -- 

A University of Denver team is using drone images to create a 3D reconstruction of a World War II-era Japanese internment camp in southern Colorado, joining a growing movement to restore U.S. historical sites linked to people of color.

Researchers last week dispatched the drone from the Switzerland-based company senseFly as part of a mapping project to help future restoration work at Camp Amache in Granada, Colorado.

The senseFly eBee X drone flew over the 1-square-mile (1.6-square-kilometers) site and took more than 4,000 images as part of a project to document where barracks, schools and other buildings once stood, said Adam Zylka, the senseFly pilot who flew the drone.
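An image count in the thousands is typical for mapping flights, where high forward and side overlap means each ground point appears in many photos. A rough sketch with generic photogrammetry assumptions (not the actual eBee X mission parameters):

```python
# Rough check on the image count for a mapping flight over ~1 square mile.
# Footprint and overlap figures are generic photogrammetry assumptions,
# not the actual eBee X mission parameters.

site_area_m2 = 1.6e6                      # ~1 square mile (1.6 sq km)
footprint_w, footprint_h = 110.0, 80.0    # assumed per-image ground footprint (m)

forward_overlap = 0.80   # typical along-track overlap for 3D reconstruction
side_overlap = 0.70      # typical between adjacent flight lines

# New ground covered per image shrinks with overlap in both directions
new_area_per_image = (footprint_w * (1 - side_overlap)
                      * footprint_h * (1 - forward_overlap))
images_needed = site_area_m2 / new_area_per_image
print(f"~{images_needed:.0f} images")
```

Under these assumptions the flight plan needs roughly 3,000 images, on the same order as the more than 4,000 the eBee X actually captured.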

Currently, the site contains only concrete foundations, artifacts, a handful of restored buildings and a cemetery for internees who died at the camp.

But Zylka said researchers can use the information gathered by the drone to create virtual reality and augmented reality apps so that visitors can experience what life was like at the internment camp with almost precisely reconstructed images.

"This is a game changer," said Jim Casey, a geographic information system specialist with the University of Denver who has been working to create digital maps of Amache. "You could be standing at the site, looking at nothing but sagebrush and weeds. Then, you can point your smartphone at the view and see what was once there."

Casey said people who cannot go to the isolated location around 230 miles (370 kilometers) southeast of Denver will be able to visit the site virtually after researchers process the new drone data.

From 1942 to 1945, more than 7,000 Japanese-Americans and Japanese immigrants were forcibly relocated to Camp Amache. They were among the more than 110,000 Japanese-Americans ordered to camps in California, Colorado, Idaho, Arizona, Wyoming, Utah, Arkansas, New Mexico and other sites.

Executive Order 9066, signed by President Franklin D. Roosevelt, forced Japanese-Americans, regardless of loyalty or citizenship, to leave the West Coast and other areas for the camps surrounded by barbed wire and military police. Half of those detainees were children.

At Amache, internees lived in an area next to poor Mexican-American farm workers. They produced a newspaper, tried farming and formed football and baseball teams.

Casey said the recreation of the camp is important for the U.S. to come to terms with this dark period in history.

"Children and grandchildren of internees also are trying to learn about what their parents went through," he said. "That's because they rarely talked about it."

The Amache drone project is the latest example of preservation advocates working to save and restore historical sites connected to black, Latino and Asian American history.

A digital project headed up by Brown University professor Monica Martinez seeks to locate sites connected to racial violence along the Texas border with Mexico. Some of the sites she and other researchers have identified have resulted in historic markers documenting acts of violence against Mexican Americans from 1900 to 1930.

Advocates also are working to restore the birthplace of civil rights leader Dolores Huerta in Dawson, New Mexico. The old mining community in northern New Mexico is now a ghost town and there is no marker commemorating Huerta's connection to the area.

  • Monday, Apr. 29, 2019
Tim Burton's "Dumbo" delivered in Dolby Vision with Blackmagic Design
A scene from "Dumbo"
FREMONT, Calif. -- 

Blackmagic Design has announced that DaVinci Resolve Studio was used throughout the full color pipeline on Disney’s live action remake of Dumbo. Directed by Tim Burton with a screenplay by Ehren Kruger, the DI was delivered by Goldcrest Post’s Adam Glasman, who collaborated with DP Ben Davis, BSC.

Using an ACES workflow in DaVinci Resolve, Glasman and Davis began preproduction by defining a warm, golden-hour period look inspired by the layered colors of the original cel animation’s minimalist production design. The film was finished in 2K to enhance the soft, filmic quality of the rushes, and the team had to cater to a variety of different deliveries, including Dolby Vision 2D/3D, SDR 2D/3D, and both HDR and SDR Rec 709.

“While a lot of Dumbo was built and shot in-camera with amazing sets and many extras, a decision was made early on that the animals and all the skies would be CG,” said Glasman, explaining that purpose-built sets constructed against blue screen backgrounds would be used to film Dumbo. The integration of fully CG skies was crucial to reflect the expressionist, dramatic painted backdrops of the original animation.

Using an ATEM Television Studio HD switcher as part of the DIT workflow, the team was able to key in several different dramatic sky reference images shot by Davis during preproduction together with a live feed from the camera. With feedback from Burton, these were then used to inform the lighting and mood of the entire set.

“Tim was keen on keeping a good level of contrast in everything to help integrate the computer generated assets with the background,” Glasman continued. “The VFX vendors (MPC) were given references for how a scene would probably look and lit their CG accordingly, so I had to be very careful not to spoil that.”

This was especially important for the Dolby Vision deliveries, said Glasman. “The CG skies, for instance, look amazing. If you compare a traditional DLP projection at 48 nits to the Dolby Vision version at 1,000 nits, you instantly notice a far wider color gamut with added dimension in Dolby Vision. The sky is just as bright as it would be in the real world, so you have to treat it very sensitively.”
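That jump from a 48-nit cinema projection to a 1,000-nit Dolby Vision grade is carried by the SMPTE ST 2084 (PQ) transfer function, which allocates signal values perceptually across luminances up to 10,000 nits. A minimal sketch of the encoding, using the constants from the ST 2084 specification:

```python
# SMPTE ST 2084 (PQ): absolute luminance (nits) -> normalized signal value.
# Constants are taken from the ST 2084 specification.

m1 = 2610 / 16384            # 0.1593017578125
m2 = 2523 / 4096 * 128       # 78.84375
c1 = 3424 / 4096             # 0.8359375
c2 = 2413 / 4096 * 32        # 18.8515625
c3 = 2392 / 4096 * 32        # 18.6875

def pq_encode(nits):
    """Encode an absolute luminance in cd/m^2 (nits) to a PQ signal in [0, 1]."""
    y = nits / 10000.0        # normalize to PQ's 10,000-nit ceiling
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# A 48-nit cinema peak and a 1,000-nit Dolby Vision peak sit far apart on the
# signal scale, leaving much more code-value headroom for bright highlights.
print(pq_encode(48), pq_encode(1000))
```

The 48-nit peak lands near the middle of the PQ signal range, while 1,000 nits sits much higher, which is the extra highlight headroom a colorist has to handle carefully when grading elements like CG skies.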

The DI wasn’t just about maintaining the integrity of the picture, however. Working with Tim Burton also meant there were plenty of opportunities to experiment with color too.

“Tim’s genius came to light in a scene with Dumbo’s mother in a cage, with a strong red light on her,” Glasman concluded. “There are all these animals dressed up as monsters in the cages surrounding Dumbo’s mother, and Tim just decided we should give those other cages strong colors too. I had a lot of fun making each monster a different hue, from bright green to ultraviolet. It adds to the scene. Between the production design, cinematography, and Tim’s vision, the whole film is visually stunning.”
