
Toolbox

  • Tuesday, Jul. 24, 2018
Dr. Michael Neuhaeuser named exec board member for technology at the ARRI Group
Professor Franz Kraus (l) and Dr. Michael Neuhaeuser
MUNICH, Germany -- 

The supervisory board of the ARRI Group, headquartered in Munich, has appointed Dr. Michael Neuhaeuser as the new executive board member responsible for technology, effective September 1, 2018. He succeeds Professor Franz Kraus who, after more than 30 years of highly successful work for the ARRI Group, joins the supervisory board and will remain closely associated with the company.

With his vision and many innovative developments, Kraus has played a decisive role in ARRI's success over the last few decades. He made especially significant contributions during the digitalization of the film industry, with the development of the ALEXA digital camera system and early expertise in multi-channel LED technology for ARRI lighting. During Kraus's time in charge of research and development at ARRI, the company received nine Scientific and Technical Awards from the Academy of Motion Picture Arts and Sciences for its outstanding technical achievements. In 2011, together with two colleagues, he was personally honored with an Academy Award of Merit, an Oscar statuette—the highest award in the film industry worldwide—for the design and development of the ARRILASER digital film recorder, which pioneered the development of digital products at ARRI.

Neuhaeuser previously served as VP of automotive microcontroller development at Infineon Technologies in Munich. He studied electrical engineering at the Ruhr-University Bochum, Germany, subsequently completed his doctorate in semiconductor devices, and can look back on a 30-year international career in the electronics industry. He started his industrial career at Siemens Semiconductor in Villach, Austria, and later took over development leadership at MICRAM Microelectronic in Bochum. He joined Infineon Technologies in 1998, where he held various management roles in Germany and abroad. Notably, he took responsibility for the digital cordless business in 2005 and, together with his team, developed the world’s first fully integrated DECT chip. In 2009, he was appointed VP and general manager at Infineon Technologies Romania in Bucharest where, as country manager, he built up various local activities with more than 300 engineers. In 2012, he was asked to head the automotive microcontroller development division, for which he and his team developed the highly successful AURIX product family.

  • Monday, Jul. 23, 2018
DaVinci Resolve Studio deployed for editing, grading and audio post on Elton John projects
Chris Sobchack
FREMONT, Calif. -- 

Blackmagic Design has announced that DaVinci Resolve Studio and DaVinci Resolve Mini Panel are being used by video director John Steer and on-tour postproduction specialist Chris Sobchack for end-to-end post on a variety of video productions for Grammy-winning legend and singer/songwriter Elton John.

A Micro Studio Camera 4K is also used on tour, in conjunction with a Micro Cinema Camera, Video Assist and Video Assist 4K to shoot interview and behind-the-scenes (BTS) footage, along with HyperDeck Studio Pro and HyperDeck Studio Mini to record the live performances, and MultiView 4 to monitor camera feeds. Teranex Mini and UltraStudio Express are also used to send camera signals to the video wall on stage.

“While on tour, John and I are responsible for shooting and post on a variety of video productions, such as content for Elton’s YouTube channel, clips for broadcast television and award shows, packages for fan clubs and VIPs, dedication videos, and more,” said Sobchack. “It can involve archival footage, current tour footage and new footage that we, or outside video production companies, shoot, such as interviews and BTS, so we rely heavily on DaVinci Resolve Studio in post to bring everything together, often on very tight deadlines.”

The process involves taking tour footage with reference audio tracks from the sound engineer and deconstructing it, along with archival elements, BTS and newly shot footage, into component shots for editing, grading and audio sweetening, all in DaVinci Resolve Studio. According to Sobchack, audio post can be as simple as compression and leveling, or as complex as using some or all of the multi-track files the team records each night to augment, or even create, a complete studio mix. “I’ll also add and keyframe audience microphones to enhance the live ambiance or use the Fairlight page to minutely fix any visual sync issues,” he noted.
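
The simple end of that audio work amounts to compression and leveling of the reference tracks. As a rough offline illustration of that idea only (the team's actual work happens in Fairlight), a sketch using the pydub library might look like the following; the file names and settings are placeholders, not the production's real values.

```python
# Illustrative only: basic compression and leveling of a reference audio
# track, the "simple" end of the audio post described above. This stands in
# for the Fairlight tools actually used; names and values are placeholders.
from pydub import AudioSegment
from pydub.effects import compress_dynamic_range, normalize

# Load the sound engineer's reference mix for one song (hypothetical file)
reference = AudioSegment.from_file("night_12_reference.wav")

# Gentle compression to tame peaks; threshold and ratio are placeholder values
compressed = compress_dynamic_range(
    reference, threshold=-18.0, ratio=3.0, attack=5.0, release=60.0
)

# Level the result to a consistent loudness, leaving a little headroom
leveled = normalize(compressed, headroom=1.0)

leveled.export("night_12_reference_leveled.wav", format="wav")
```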

For editing, Steer splices the archival footage, BTS and newly shot footage with the raw footage from the tour’s live shows. “I handle the offline edit, while Chris handles the online, and that’s where DaVinci Resolve Studio works so well, as we can work simultaneously by sharing files back and forth,” he said. “I also use DaVinci Resolve Studio to make copies of the whole show in a lower resolution, so we have a backup viewing copy. We use DaVinci Resolve Studio to put together everything from video idents to full songs from the show while we are on the road touring, and I find it so intuitive and easy to use. Also, with Fairlight, it’s so easy to sweeten the audio, and its features keep expanding.”
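
Resolve also ships with a Python scripting API, and a routine task like Steer's lower-resolution backup copies could in principle be queued through it. The sketch below is an assumption-laden illustration of that idea rather than the team's actual process; the paths, resolution and codec choices are placeholders.

```python
# Hypothetical sketch: queuing a reduced-resolution viewing copy of the
# current timeline via DaVinci Resolve's Python scripting API. This is not
# the production's actual workflow; paths and settings are placeholders.
import DaVinciResolveScript as dvr_script

resolve = dvr_script.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Render the whole current timeline at 720p as an H.264 MP4 review copy
project.SetCurrentRenderFormatAndCodec("mp4", "H264")
project.SetRenderSettings({
    "TargetDir": "/Volumes/TourDrive/viewing_copies",   # placeholder path
    "CustomName": "full_show_backup_720p",
    "FormatWidth": 1280,
    "FormatHeight": 720,
})

job_id = project.AddRenderJob()
project.StartRendering(job_id)
```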

As Steer noted, Sobchack is responsible for online editing, grading, audio editing and sweetening, and delivery.

“During the live performances, the lighting is constantly changing, and overall, the footage is darker than what’s needed for broadcast or the web, as the concerts are lit for the human eye rather than for a camera. My main objective is to retain the flair of the live show’s lighting design, but also be able to see Elton’s face. I also have to make the performance footage cohesive with any BTS or newly shot footage,” explained Sobchack. “In DaVinci Resolve Studio, I use gradients, vignettes on faces, HSL qualifiers and Power Windows to brighten things up and meld the radically different colors in the shots.”

He continued, “I also reframe shots on occasion and rely on DaVinci Resolve Studio’s temporal noise reduction. Since we don’t shoot in light that is really video project-friendly, when we make the kinds of adjustments we need for broadcast, especially if it’s being up-resed for a prime-time network for instance, this feature can take a shot from a zoomed-in camera that was 60 yards away from the stage and make it look perfect.”
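
Resolve's temporal noise reduction is motion-compensated and built into the Color page, so the following is only a conceptual stand-in for the underlying idea of borrowing detail from neighbouring frames. It applies OpenCV's multi-frame denoiser to a hypothetical window of extracted frames.

```python
# Conceptual stand-in only: this is NOT Resolve's temporal NR, just a simple
# multi-frame denoise with OpenCV to illustrate the idea of using several
# neighbouring frames to suppress noise. Frame paths and values are invented.
import cv2

# Five consecutive frames around the one to be cleaned up
frames = [cv2.imread(f"shot_a/frame_{i:04d}.png") for i in range(100, 105)]

denoised = cv2.fastNlMeansDenoisingColoredMulti(
    frames,   # window of consecutive frames
    2,        # index (within the window) of the frame to denoise
    5,        # temporal window size: neighbouring frames to consider
    None,     # output buffer (let OpenCV allocate one)
    6,        # luma filter strength
    6,        # chroma filter strength
    7,        # template patch size
    21,       # search window size
)

cv2.imwrite("shot_a/frame_0102_denoised.png", denoised)
```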

As multi-camera recording has not been feasible on the tour, the footage also has burned-in transitions, so when grading, Sobchack picks a cut point from shot A to shot B and implements an animated color transition using keyframes, ensuring the first frame of shot B matches the last frame of shot A. “Instead of using primary wheels, I use levels, and being able to jot down numbers and easily match them using the DaVinci Resolve Mini Panel is great. The panel not only adds a great tactile feel to my workflow, but since it has both dedicated and soft, page-specific knobs, it really lets me dial in and drop down to exact values, which helps with getting everything to match really quickly,” added Sobchack.
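
The frame-matching idea Sobchack describes can be sketched numerically: ease the level values across the cut so that shot B's first frame lands on shot A's last graded values. The snippet below is a generic illustration with made-up numbers, not a Resolve API call.

```python
# Minimal sketch of the matching idea described above: interpolate grade
# levels from shot A's values to shot B's across a burned-in transition.
# All level values and frame numbers are invented for illustration.
def interpolate_levels(levels_a, levels_b, start_frame, end_frame):
    """Per-frame level values moving linearly from shot A's grade to shot B's."""
    span = end_frame - start_frame
    keyframes = {}
    for frame in range(start_frame, end_frame + 1):
        t = (frame - start_frame) / span
        keyframes[frame] = {
            channel: (1 - t) * levels_a[channel] + t * levels_b[channel]
            for channel in levels_a
        }
    return keyframes

# Last graded frame of shot A vs. target levels for shot B
shot_a = {"lift": 0.02, "gamma": 1.05, "gain": 0.95}
shot_b = {"lift": 0.00, "gamma": 1.12, "gain": 1.08}
for frame, levels in interpolate_levels(shot_a, shot_b, 480, 492).items():
    print(frame, levels)
```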

He concluded, “John and I have our regular tour duties on top of the video production work, so there’s no time for transcoding and bouncing between programs. DaVinci Resolve Studio is a one-stop-shop that gives us the capability to go from media ingest all the way through to final output in one system, and that capability is huge.”

  • Friday, Jul. 20, 2018
Avid Customer Association Vote yields insights into technology priorities
Avid CEO and president Jeff Rosica
BURLINGTON, Mass. -- 

Avid and the Avid Customer Association (ACA) have announced the results of the 2nd annual ACA Vote. The ACA Vote gives TV, film and music professionals a uniquely powerful hand in influencing Avid’s future offerings. The results help prioritize Avid’s product and innovation roadmap for 2018 and beyond, while calling attention to its customers’ most pressing requirements and concerns. The findings on emerging technology and new business requirements also provide valuable understanding of the media industry’s plans and challenges related to multiplatform content delivery, remote production, 4K/UHD, multichannel/immersive audio, and file-based/tapeless workflows.

“Each year, the ACA Vote uncovers the most important needs of Avid’s customers—and indeed the industry as a whole,” said ACA co-chair Richard Friedel, EVP and general manager for Fox Networks Engineering & Operations. “The results help Avid identify where it needs to focus its efforts and contribute to the ongoing dialogue between Avid and the ACA so that we can work together to address the most pressing industry challenges.”

This year, more than 4,800 people working in all aspects of media and representing over 3,500 organizations in 117 countries participated in the vote. Their collective input ensures that Avid is focused on helping them secure their most promising opportunities while fostering deeper collaboration between Avid and its customers. ACA leadership and Avid appointed Devoncroft Partners, a leading media technology research provider, to help survey, capture and analyze opinions from the community on evolving business dynamics, emerging trends and technology directions that are important to the media industry.

Major Findings
The ACA Vote revealed that across all sectors—broadcast, video and audio—the top three technology trends facing media companies are multiplatform content delivery, remote production and 4K/UHD. Broadcast and video professionals (but not audio professionals) also consider file-based/tapeless workflows and high dynamic range (HDR) to be significant trends.

  • 56.9% indicate that multiplatform content delivery is the most important trend because it allows content to be available across all distribution platforms, with 17% believing it provides the potential for new revenue streams.
  • 9.8% of respondents believe multiplatform content delivery helps retain consumers’ attention, while 9.2% believe it helps them achieve a competitive advantage in the marketplace. 

According to respondents, remote production is the second most important trend:

  • 22.6% of voters indicate that remote production saves costs compared to on-premise event production.
  • 17.4% believe remote production better utilizes technology infrastructures at existing facilities.
  • 13.9% say remote production improves the quality of event production capabilities.

4K/UHD is another trend that continues to be top of mind for media companies:

  • Nearly half of media professionals (47.9%) believe that viewer demand for 4K will make it a necessity to remain competitive.
  • Nearly 20% of respondents believe 4K/UHD adds image quality in the production process, while 19.7% see it as helping to futureproof content.

The ACA Vote results also showed meaningful UHD penetration across several categories, including encoding/transcoding, graphics and branding, production servers and video editing.

According to report findings, the most significant obstacles to achieving goals within these trend areas were cost and interoperability. When looking at end users’ budgets and their plans to allocate resources over the next 12-18 months:

  • 50% say they will upgrade audio capabilities.
  • 25% say they will invest in cloud services and technology.
  • 25% say they will upgrade their infrastructure for 4K/UHD operations.

“The Avid Customer Association continues to take important strides to give anyone in the media industry a voice that can positively influence the direction of the industry and Avid’s technical contributions,” said Avid CEO and president Jeff Rosica. “Over the past 12 months, we’ve delivered two major waves of product innovations based on last year’s ACA Vote, and we’ll begin to deliver on these latest voter priorities with the next big wave we’ll be announcing at IBC 2018.”

  • Tuesday, Jul. 17, 2018
Why is Facebook keen on robots? It's just the future of AI
This photo shows Yann LeCun, Facebook's chief AI scientist. Facebook is announcing several academic hires in artificial intelligence, including Carnegie Mellon researcher Jessica Hodgins, who's known for her work making animated figures move in more human-like ways. (Facebook via AP)
SAN FRANCISCO (AP) -- 

Facebook announced several new hires of top academics in the field of artificial intelligence Tuesday, among them a roboticist known for her work at Disney making animated figures move in more human-like ways.

The hires raise a big question — why is Facebook interested in robots, anyway?

It's not as though the social media giant is suddenly interested in developing mechanical friends, although it does use robotic arms in some of its data centers. The answer is even more central to the problem of how AI systems work today.

Today, most successful AI systems have to be exposed to millions of data points labeled by humans — like, say, photos of cats — before they can learn to recognize patterns that people take for granted. Similarly, game-playing bots like Google's computerized Go master AlphaGo Zero require tens of thousands of trials to learn the best moves from their failures.

Creating systems that require less data and have more common sense is a key goal for making AI smarter in the future.

"Clearly we're missing something in terms of how humans can learn so fast," said Yann LeCun, Facebook's chief AI scientist, in a call with reporters last week. "So far the best ideas have come out of robotics."

Among the people Facebook is hiring are Jessica Hodgins, the former Disney researcher; and Abhinav Gupta, her colleague at Carnegie Mellon University who is known for using robot arms to learn how to grasp things.

Pieter Abbeel, a roboticist at the University of California, Berkeley, and co-founder of Covariant.ai, says the robotics field has benefits and constraints that push progress in AI. For one, the real world is naturally complex, so robotic AI systems have to deal with unexpected, rare events. And real-world constraints like a lack of time and the cost of keeping machinery moving push researchers to solve difficult problems.

"Robotics forces you into many reality checks," Abbeel said. "How good are these algorithms, really?"

There are other more abstract applications of learnings from robotics, says Berkeley AI professor Ken Goldberg. Just like teaching a robot to escape from a computerized maze, other robots change their behavior depending on whether actions they took got them closer to a goal. Such systems could even be adapted to serving ads, he said — which just happens to be the mainstay of Facebook's business.

"It's not a static decision, it's a dynamic one," Goldberg said.

For Facebook, planting a flag in the hot field also allows it to be competitive for AI talent emerging from universities, Facebook's LeCun said.

Bart Selman, a Cornell computer science professor and AI expert, said it's a good idea for Facebook to broaden its reach in AI and take on projects that might not be directly related to the company's business — something that's a little more "exciting" — the way Google did with self-driving cars, for example.

This attracts not just attention, but students, too. The broader the research agenda, the better the labs become, he said.

AP Technology Writer Barbara Ortutay in New York contributed to this report.

  • Tuesday, Jul. 17, 2018
Foundry unveils Nuke 11.2
Nuke 11.2
LONDON -- 

Creative software developer Foundry has launched Nuke 11.2, bringing a range of new features and updates to the compositing toolkit. The latest instalment of the series lets artists work more quickly through upgraded UI features and performance improvements, alongside a new API for deep compositing that can increase the speed of script processing.

Jeff Ranasinghe, VFX supervisor, commented: “Faster playback performance, UI updates such as drag and drop user knobs, and masks on Smart Vectors, all come together to make the experience far more gratifying and productive. With Nuke Studio, it’s great to see what was always an inspired concept become even more complete as the workhorse for production projects.”

Key features for Nuke 11.2 include:

  • New API for deep compositing. Foundry tests have delivered 1.5x faster processing with a new API for deep compositing which manages memory efficiently. Larger scripts have scope for even faster processing speeds. Nuke 11.2 also includes updates to the DeepExpression node and the ability to use Nuke’s metadata nodes within a deep stream. (A brief node-graph sketch follows this list.)
  • Faster node and user parameter creation. In this instalment, the Nuke Tab menu and UI for creating user knobs have been enhanced to improve user experience for some of the most common tasks: adding nodes and creating Gizmos.
  • The updated Tab menu allows artists to find nodes using partial words, set “favorite” nodes and organize them via a weighting system. These improvements add up to substantial time savings when building scripts with a large number of nodes.
  • A new UI allows user knobs to be linked between nodes by simply dragging and dropping. Artists can add, rearrange or remove user parameters using the same interface. This replaces the drop-down menus for picking user knobs, dramatically speeding up the setup of Live Groups and Gizmos and reducing the average number of clicks required from seven to three.
  • Smart Vector live output and mask input. The Smart Vector toolset is now even faster to use and more effective in shots with occluding objects. Smart Vector and Vector Distort have been optimized for the GPU, allowing users to generate Smart Vectors on the fly and preview the result without needing to pre-render the vectors. A new mask input allows artists to identify areas of motion to ignore when generating the Smart Vectors and warping the paint or texture. As a result, the Smart Vector toolset can now be used on shots with occluding objects with less laborious manual clean-up, speeding up the use of the toolset in more complex cases.
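
The deep compositing speed-up comes from Foundry's internal API rather than anything a script author calls directly, but the kind of deep node graph that benefits can be sketched with Nuke's standard Python API. The snippet below assembles a minimal deep comp; the file paths are placeholders.

```python
# Sketch of a minimal deep compositing node graph assembled with Nuke's
# standard Python API. The 1.5x speed-up mentioned above comes from Foundry's
# internal deep API, not from anything called here; paths are placeholders.
import nuke

fg = nuke.nodes.DeepRead(file="/shots/sh010/fg_deep.%04d.exr")
bg = nuke.nodes.DeepRead(file="/shots/sh010/bg_deep.%04d.exr")

# Combine the two deep streams; samples are merged by depth
merge = nuke.nodes.DeepMerge()
merge.setInput(0, bg)
merge.setInput(1, fg)

# DeepExpression (updated in 11.2) can manipulate samples within the deep stream
expr = nuke.nodes.DeepExpression()
expr.setInput(0, merge)

out = nuke.nodes.DeepWrite(file="/shots/sh010/comp_deep.%04d.exr")
out.setInput(0, expr)
```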

Upgrades for Nuke Studio
Nuke Studio now benefits from an updated project panel UI, providing the artist with new visual controls for managing and organizing complex projects. For quick visual reference, artists can assign colors to items in the project bin and the timeline, based on file type and other parameters accessible via the UI and Python API.

Artists can also set the poster frame for single or multiple clips in the bin or from the viewer: useful when working with clips that have black frames or slates at the start. These improvements will help artists to visually distinguish between different file types in a far easier way.

Nuke 11.2 brings improved sorting and searching features to Nuke Studio, allowing artists to easily arrange project bins in custom orders. Artists can also search through project items using multiple keywords, either against specific file metadata fields or across all metadata. This improved project panel functionality will aid artists managing larger projects.

Christy Anzelmo, sr. commercial product manager at Foundry, commented: “With Nuke 11.2, we’ve listened to our customers and built on the features introduced in previous Nuke 11 releases. Our focus has been on improving artists’ day-to-day experience and speeding up time-intensive tasks like deep compositing. This release will help teams tackle complex VFX work faster.”

Nuke 11.2 goes live today and will be available for purchase--alongside full release details--on Foundry’s website and via accredited resellers.

  • Tuesday, Jul. 17, 2018
Caledonia Investments acquires majority stake in Cooke Optics
Les Zellan
LEICESTER, UK -- 

Caledonia Investments plc, a self-managed investment trust, has acquired a majority stake in Cooke Optics. The current Cooke Optics management team, including chairman Les Zellan, CEO Robert Howard and COO Alan Merrills, remains in place, and day-to-day activities at the Leicester-based company will continue unchanged.

“We have been experiencing a sustained period of growth, and the time was right to look for a new investor to help us develop further,” said Zellan. “Caledonia, another historic British company, has a reputation for long-term investment and for supporting management teams to grow their businesses. With a strong financial partner in Caledonia, Cooke is in an excellent position to continue designing and making more of our coveted lenses for the film and television industry.”

The lineup of Cooke lenses is designed from the ground up, not repurposed from existing components, and every lens is hand crafted in the Leicester factory.

Cooke’s most recent developments include the acclaimed S7/i full frame lens range, which correctly anticipated the current large format trend; the Panchro/i Classic range, which replicates the beloved look of vintage Speed Panchro lenses but with modern housing, mounts and glass; and the Anamorphic/i SF range, which adds more exaggerated anamorphic attributes, including lens flare and oval bokeh. The company is also behind the /i Technology lens metadata standard, which captures valuable lens information for use on-set and in postproduction.

The latest Cooke lens ranges will be on display on Stand 12.D10 at IBC 2018, held in Amsterdam, Netherlands from September 14-18, 2018.

  • Thursday, Jul. 12, 2018
Carlo Bolognesi to lead Broadcast Pix EMEA sales
Carlo Bolognesi
CHELMSFORD, Mass. -- 

Carlo Bolognesi has joined Broadcast Pix to lead sales in the Europe, Middle East, and Africa (EMEA) region. Since 1994, Bolognesi has owned IREL Engineering, a systems integration and consulting firm based in Alessandria, Italy.

“Carlo has been delivering Broadcast Pix systems and other equipment to European broadcasters and video professionals through his company for more than 20 years,” said Russell Whittaker, worldwide director of channel sales. “We are excited to have him join our team to help drive sales of our BPswitch integrated production switchers, VOX visual radio systems, and ioGates cloud-based media management solutions.”

  • Thursday, Jul. 12, 2018
25+ major summer feature films used Blackmagic Design
This image released by Twentieth Century Fox shows Ryan Reynolds in a scene from "Deadpool 2." (Twentieth Century Fox via AP, File)
FREMONT, Calif. -- 

Blackmagic Design announced that more than 25 of the 2018 summer season’s worldwide film releases used numerous Blackmagic Design products during production and post, including its digital film cameras, its DaVinci Resolve Studio editing, color correction and audio postproduction application, and more. These included some of the summer’s biggest blockbusters and expected blockbusters, such as “Deadpool 2” and “Jurassic World: Fallen Kingdom.”

Blackmagic Design products were used at nearly every stage of production and post production on various summer films created around the world. DaVinci Resolve continues to be the go-to application for many of the world’s leading editors, colorists and postproduction facilities, such as EFILM’s Skip Kimball on “Deadpool 2,” Harbor Picture Company’s Joe Gawler for “Solo: A Star Wars Story” and “Mile 22” by Stefan Sonnenfeld of Company 3.

The summer films that used Blackmagic Design products for production include:

  • Marvel Studios’ “Avengers: Infinity War” DIT Kyle Spicer used DaVinci Resolve, UltraStudio, Smart Videohub 40x40, DeckLink Quad 2, DeckLink Mini Monitor and a variety of converters for on-set work;
  • Marvel Studios’ “Ant-Man and the Wasp” DIT Daniele Colombera used a variety of Blackmagic Design capture and playback devices, Mini Converter, Mini Monitor, SmartScope Duo, Smart Videohub and DaVinci Resolve for on-set work;
  • “The Darkest Minds” Action Unit DP Paul Hughen, ASC, and Action Unit Director Jack Gill used four Micro Studio Camera 4Ks as small and lightweight stunt cameras;
  • “Deadpool 2” DIT Simon Jori used DaVinci Resolve and ATEM 1M/E for greenscreen/bluescreen and on-set work;
  • “Dog Days” DIT Dane Brehm used Smart Videohub CleanSwitch 12x12, SmartScope Duo and SmartView Duo for on-set work;
  • “The First Purge” DIT Lewis Rothenberg used UltraStudio 4K, HDLinks and Smart Videohub 20x20 and Smart Videohub 16x16 for on-set work;
  • “Hotel Artemis” DIT Lonny Danler used DaVinci Resolve, a Micro Videohub and DeckLink Mini Monitor for on-set work;
  • “Mile 22” DIT Urban Olsson used Teranex Mini SDI to HDMI, MultiView 4, Smart Videohub 20x20, DeckLink 4K Extreme 12G, UltraStudio 4K, UltraStudio and DaVinci Resolve for on-set work and for the creation of an on-set VR monitoring solution for the film’s director;
  • “Overboard” DIT Mark Allan used DaVinci Resolve, HDLinks and UltraStudio Express for on-set work;
  • “Show Dogs” DIT Thomas Patrick used DaVinci Resolve, HyperDeck Studio, SmartView monitors and Videohub monitors for on-set work;
  • “Siberia” DP Eric Koretz used a Micro Studio Camera 4K with a Video Assist 4K, as well as DaVinci Resolve and a DaVinci Resolve Micro Panel for testing different looks on-set;
  • “Sicario: Day of the Soldado” DIT Ryan Nguyen used various routers, Mini Monitors and DeckLink cards, along with DaVinci Resolve on-set; and
  • “Skyscraper” 2nd Unit DIT Mark Allan used HDLink, UltraStudio Express and DaVinci Resolve for on-set work.
     

For Post Production using DaVinci Resolve Studio:

  • “Adrift” by Stefan Sonnenfeld of Company 3;
  • “American Animals” by Rob Pizzey of Goldcrest for The Orchard;
  • “Deadpool 2” by Skip Kimball of Deluxe’s EFILM;
  • “Hearts Beat Loud” by Mike Howell of Color Collective;
  • “Juliet, Naked” by Nat Jencks of Goldcrest;
  • “Jurassic World: Fallen Kingdom” by Adam Glasman of Goldcrest;
  • “A Kid Like Jake” by Nat Jencks of Goldcrest;
  • “Madeline’s Madeline” by Nat Jencks of Goldcrest;
  • “Mile 22” by Stefan Sonnenfeld of Company 3;
  • “The Miseducation of Cameron Post” by Nat Jencks of Goldcrest;
  • “On Chesil Beach” by Tom Poole of Company 3;
  • “Overboard” by Trent Johnson of MTI Film;
  • “RBG” by Ken Sirulnick of Glue Editing & Design, along with Fusion Studio for stabilizing work;
  • “Sicario: Day of the Soldado” by Stephen Nakamura of Company 3;
  • “Skyscraper” by Stefan Sonnenfeld of Company 3;
  • Disney Lucasfilm’s “Solo: A Star Wars Story” by Joe Gawler of Harbor Picture Company;
  • “Sorry to Bother You” by Sam Daley at Goldcrest;
  • “Sorry to Bother You” for VFX by Darren Orr of Beast; and
  • “SuperFly” by David Hussey of Company 3.

  • Tuesday, Jul. 10, 2018
Dejero, Draganfly partner on real-time video transport
Draganfly's Commander UAV bundled with the Dejero EnGo mobile transmitter
WATERLOO, Ontario -- 

Dejero, known for cloud-managed solutions that provide video transport and Internet connectivity while mobile or in remote locations, has formed a technology partnership with Canada-based sUAS (small Unmanned Aircraft System) industry experts Draganfly Innovations Inc. The collaboration sees Draganfly’s Commander UAV (unmanned aerial vehicle) quadcopter bundled with the Dejero EnGo mobile transmitter, providing real-time video transport from the air. In addition, the companies’ combined expertise will bring new and innovative solutions and services to Dejero’s broadcast customers and to Draganfly’s customers across the many industry verticals they serve.
 
This collaboration enables broadcasters to integrate live video captured with UAVs into their newsgathering, sports and event coverage, and video production for television and online audiences. It will also facilitate Dejero in reaching new industries and applications, providing real-time on-board video transport over IP to the military, public safety, and government sectors that Draganfly has traditionally operated in.
 
The Draganflyer Commander UAV is a remotely operated, unmanned, miniature helicopter designed to carry wireless camera systems. The powerful, professional-quality, easy-to-fly aerial platform is specifically designed for high-endurance applications such as public safety, search and rescue, agriculture, mapping, aerial photography, and more. Dejero’s highly versatile EnGo mobile transmitter will be instrumental in reliably providing high-quality live video from Draganfly’s Commander, which will in turn allow Draganfly to elevate its offering.
 
“Historically, UAV use in broadcast has been challenging, in particular when it comes to providing high-quality video with low latency and with the reliability needed for live broadcasts,” explained Kevin Fernandes, VP of sales at Dejero. “Through our collaboration with Draganfly, we can provide an effective solution for broadcast and media organizations, as well as other industries that demand the same reliability and picture quality.”
 
“We are thrilled to be adding broadcast-quality live video feeds to our Commander vehicle,” said Draganfly president Zenon Dragan. “The timing couldn’t be better as we’ve recently expanded into contract engineering and custom product development. Our partnership with Dejero will greatly support this.”
 
Well-versed in the design of sophisticated multi-rotor aircraft, ground-based robots, and fixed wing aircraft, Draganfly also provides custom payloads, ground-up software design, electronics, UAV program development, and flight training.

  • Thursday, Jul. 5, 2018
Encore's Pankaj Bajpai turns to Baselight to color the story of Picasso
A scene from "Genius: Picasso" (photo courtesy of National Geographic)
LONDON -- 

The second season of National Geographic’s popular series Genius focuses on the extraordinary life of painter Pablo Picasso. Colorist Pankaj Bajpai at Encore in Hollywood was charged with creating a look that evokes the time and place as well as the art.

Bajpai was the colorist on the first season of Genius, which featured Einstein and was set in Germany. Season two inhabits the much warmer climes of Spain and Paris. Working with DP Mathias Herndl, Bajpai found that it was the place that led them to the look: “We anchored ourselves in the quality, color and texture of the Spanish light in Málaga, the birthplace of Picasso,” he explained.

The series tells the story of Picasso (played by Antonio Banderas) and his complicated, chaotic lifestyle which was the source of his vibrant paintings. National Geographic and the show’s creator Kenneth Biller were concerned with creating the sense and style of the first half of the 20th century. “A key challenge was to maintain the authenticity of the period, and yet somehow keep a contemporary flair,” Bajpai recalled.

“We start with Picasso’s father and his memory of the bullfights--it’s all incredibly warm,” he continued. “Then Picasso as a young man, with many candlelit interiors. And towards the very end the palette becomes sparse and cold, as his life becomes more isolated.”

Herndl shot the series using the ARRI ALEXA, and Bajpai was involved from the earliest days of the production. “Mathias and I have a long working relationship, and much of our understanding is intuitive--it’s a partnership where few words are exchanged. I know Mathias’s instincts when he is shooting, and he knows how I might approach the captured image. When it all comes together, it’s wonderful.”

At Encore, Bajpai used the latest Version 5.0 of Baselight software, giving him access to Base Grade, a popular feature of FilmLight’s colorist toolkit. “It allowed me to approach grading using the classic zone system for the first time, which was tremendous. It is possibly one of the most practical and significant advances in grading technology in a long time.

“There are many scenes in the show where there are old European-style big and bright windows,” he said. “To be able to maintain details in the high-high-highlights and low-low-lowlights and still keep everything in between was unbelievably fast and clean.”
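
Base Grade itself is FilmLight's own operator, but the zone-system thinking Bajpai describes, working in stops of exposure around a chosen tonal region rather than in video levels, can be roughly sketched on linear-light values. The toy function below, with invented sample values, nudges one zone while leaving the others largely untouched.

```python
# Rough illustration only: this is not Base Grade, just a toy model of
# zone-weighted exposure adjustment on linear-light values. The sample
# pixel values and stop adjustments are invented for the example.
import numpy as np

def zone_exposure(linear_rgb, stops, zone_center=0.18, zone_width=2.0):
    """Exposure shift in stops, weighted toward one tonal zone.

    The weight falls off with a pixel's distance (in stops) from the zone
    center, so shadows, mids and highlights can be nudged independently.
    Input values must be positive linear-light.
    """
    log_dist = np.abs(np.log2(linear_rgb / zone_center))
    weight = np.clip(1.0 - log_dist / zone_width, 0.0, 1.0)
    return linear_rgb * (2.0 ** (stops * weight))

# Deep shadow, mid grey and a bright window, in linear light
samples = np.array([0.01, 0.18, 2.5])
print(zone_exposure(samples, stops=+0.75))                   # lift the mids
print(zone_exposure(samples, stops=-1.0, zone_center=2.5))   # tame the window
```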

Executive produced by Ron Howard and Brian Grazer, Genius is a major success for National Geographic. Season one was the network’s most-watched show of 2017 and earned the network a record 10 Emmy nominations.
