
Toolbox

  • Monday, Dec. 11, 2017
RED WEAPON camera with MONSTRO sensor introduced to marketplace
RED WEAPON camera with the MONSTRO 8K VV sensor
IRVINE, Calif. -- 

RED Digital Cinema® has announced that its cinematic full frame WEAPON® camera with the MONSTRO™ 8K VV sensor is available for purchase. MONSTRO is an evolutionary step in large-format sensor technology, with improvements in image quality including dynamic range and shadow detail.

RED’s newest camera and sensor combination, WEAPON 8K VV, offers full frame lens coverage, captures 8K full format motion at up to 60 fps, produces ultra-detailed 35.4 megapixel stills, and delivers incredibly fast data speeds of up to 300 MB/s. And like all of RED’s DSMC2 cameras, WEAPON records REDCODE® RAW simultaneously with Apple ProRes or Avid DNxHD/HR and adheres to the company’s dedication to OBSOLESCENCE OBSOLETE® — a core operating principle that allows current RED owners to upgrade their technology as innovations are unveiled and move between camera systems without having to purchase all new gear.

The WEAPON 8K VV is priced starting at $79,500 (for the camera BRAIN), with upgrades available for carbon fiber WEAPON customers. RED is also offering RED® ARMOR-W, an upgraded coverage program for RED WEAPON® cameras that includes increased warranty protection and a sensor swap service.

  • Thursday, Dec. 7, 2017
URSA Mini Pros, DaVinci Resolve Studio deployed on "The S.P.A.A.C.E. Program"
Lensing "The S.P.A.A.C.E. Program"
FREMONT, Calif. -- 

Blackmagic Design announced that director, editor, colorist and post supervisor Alex Ferrari used Blackmagic URSA Mini Pro digital film cameras and DaVinci Resolve Studio to shoot, edit, grade and finish the streaming series “The S.P.A.A.C.E. Program.”
 
“The Scientific Pop and Also Cultural Explorations Program” aka “The S.P.A.A.C.E. Program” is a new streaming series from Nerdist and Legendary Digital Networks available on the Alpha streaming platform. The eight-episode series blends together science and pop culture by visiting different fictional planets and realms, such as Tatooine, Krypton, Arrakis and Westeros, and examining them through a scientific lens. Host Kyle Hill and his robot assistant AI visit a different place from pop culture each episode and break down the big scientific questions, such as what it is really like to live on a planet with two suns or what makes a White Walker a White Walker.
 
Led by Ferrari, the series was shot using two URSA Mini Pros. “We only had four days to shoot all eight episodes, so it was a very fast-paced shoot,” Ferrari explained. “We decided to shoot with the URSA Mini Pros because we knew they’d be reliable and fast, and they’d get us the cinematic look we were going for. You can take them straight out of the box and they’re ready to go with no fuss. The menu and operating system is intuitive and easy to use, so you don’t waste any time while shooting, and having the timecode on the side was helpful. Reliability can be a rarity, so the fact that we could count on them when we were in the heat of battle really made a difference.”
 
“We shot everything in a practical spaceship set that showed the cockpit, hallway and war room,” Ferrari continued. “All the windows were green screen, and we created outer space and the surrounding worlds in post. Being able to cleanly pull keys was crucial, and the camera’s sensors made it easy. We shot the whole series in 4.6K ProRes, which gave us a lot of latitude in post.”
 
DaVinci Resolve Studio was used on set by the DIT and then in post by Ferrari for the series’ full editing, grading and finishing.
 
“Using DaVinci Resolve Studio on set allowed us to organize and sync everything in real time. At the end of the shoot, we easily exported everything and went right into editing,” said Ferrari. “By keeping everything in the ecosystem, I was able to directly edit the entire series in native 4.6K ProRes without having to transcode to a smaller proxy file. Doing everything soup to nuts in DaVinci Resolve Studio saved us a lot of time that would have been spent roundtripping.
 
“Moreover, it’s allowed me to evolve my editing process so color is intertwined rather than a separate function. As I edit and select shots, I can easily jump from the Edit Page to the Color Page to see if I can save a shot that might be too blown out or too dark. I can work on the lighting in real-time to see if I can make the shot work, which is invaluable in the creative process. Using DaVinci Resolve Studio, I can make editorial decisions based on what I know I can make work in color, rather than just hoping something will work down the line or scrapping what might be the best take because it initially seems unusable.”
 
When grading the series, Ferrari was inspired by the planets and realms Hill and AI visited. “We wanted the series to look cohesive from episode to episode, but we also wanted each to have its own look that mirrors the land we’re visiting. For example, the episode on LV-426 is colder and more desaturated. I used a greenish overtone for the episode with the Borg, whereas I used a very warm palette for King Kai’s planet in the Dragon Ball Z episode. Since each place the series visited has such a strong look already associated with it, we wanted to pay homage to that,” Ferrari concluded.

  • Wednesday, Dec. 6, 2017
Dalet placed at core of BBC Wales' new broadcast center
BBC Wales
PARIS -- 

Dalet, a provider of solutions and services for broadcasters and content professionals, announced that BBC Wales has selected the enterprise Dalet Galaxy Media Asset Management (MAM) and Orchestration platform to facilitate all workflows and asset management requirements at its new state-of-the-art media facility located in Cardiff, Wales. Once deployed, Dalet Galaxy will offer a centralized content repository and provide tools to orchestrate workflows and media processes across production, news, studios and delivery departments. The large-scale installation design and multi-year deployment will be managed by Dalet Professional Services, which will guide the broadcaster’s transition and help maximize return on investment (ROI).

“BBC Wales is pleased to be working with Dalet to provide an asset management system for our new home in Central Square, Cardiff. Dalet was chosen after a very competitive process, and will provide an important part of the technology solution at Central Square within a state-of-the-art broadcast center. We are looking forward to the successful delivery of the project,” said Gareth Powell, chief operating officer, BBC Wales.

Dalet Galaxy will be deployed as the core media hub and cornerstone of the new digital facility; all systems and sub-systems deployed in future phases will connect to it. The state-of-the-art, BPMN-compliant Dalet Workflow Engine will enable the BBC to orchestrate a combination of user tasks and media services ranging from ingest, transcoding and QC, to logging, editing, media packaging and distribution. A simple-to-use workflow designer interface allows users to model business processes, picking from a palette of stencil operations such as user tasks and notifications, media and metadata services, gateways, timeout and error management, and much more.

The comprehensive and open Dalet Galaxy API will allow the BBC to tightly connect storage and infrastructure technologies, media services and post-production applications, and traffic and business platforms, orchestrating a fluid workflow that tracks assets and associated metadata across the media enterprise.

“We have been working with the BBC on a multitude of projects for more than fifteen years,” commented Adrian Smith, regional manager, Dalet UK. “Dalet Galaxy’s flexible architecture provides a future-proof framework on which the BBC can evolve to meet new requirements and production needs that arise over coming months and even years. The Dalet Professional Services team’s experience in managing such enterprise rollouts will help them navigate this multi-year, large-scale deployment.”

In addition to Dalet Galaxy, Dalet will be supplying a new Dalet HTML application for simplified management of camera card ingests and its Dalet Brio video server. Supporting both SDI and IP, the versatile, high-density Dalet Brio ingest and playout platform adheres to the SMPTE 2110 standards, allowing broadcasters to step into the future of IP while retaining the security of SDI.

  • Monday, Dec. 4, 2017
Foundry launches Cara VR 2.0 
Cara VR 2.0's new GlobalWarp feature
LONDON -- 

Creative software developer Foundry has announced the launch of Cara VR 2.0, the next chapter for the cutting-edge virtual reality plug-in toolset for Nuke.

Building on the first-of-its-kind plug-in debuted in 2016, Cara VR 2.0 boasts improved stitching and stabilization, allowing for more efficient creation of seamless VR and 360 video content with the highest levels of quality. The new version features major updates to stitching, introduces 360 match-move tracking and automatic stabilization, and adds new tools for stereoscopic corrections using cutting-edge algorithms from Ocula.
 
Craig Rodgerson, CEO of Foundry, commented: “To fully realize the potential of VR, we need to enable content creators to build experiences that are more immersive than anything before. The first iteration of our Cara VR toolkit was hugely well-received, and this latest version will help usher in the next level of VR experiences. Artists can now better meet the demand for VR content thanks to our industry-leading creative toolset.”
 
Cara VR 2.0’s new GlobalWarp node speeds up delivery of stitches while producing a high quality 360 stitch with minimal ghosting. GlobalWarp adds additional controls for lining up key features in overlapping areas and allows you to add constraints to reduce warping on known straight lines, even for objects overlapping multiple camera views, helping users achieve the highest quality stitch faster.

Cara VR 2.0 includes a redesigned Tracker, which accelerates the process of stabilization and match-moving for a more comfortable VR experience and easier alignment of 3D elements. Users can automatically track a 360 stitch for stabilization and create a 360 match-move camera to assist in 3D corrections and CG insertion. The Tracker node now simplifies stabilization, adding 3D stabilization to remove parallax changes, and brings match-moving to Cara VR.
 
Cara VR 2.0 also includes a suite of tools adapted from the powerful Ocula toolset which have now been optimized to work with 360 video, making these powerful algorithms accessible to VR content creators and more efficient to use in a 360 video workflow.

These tools take the headache out of stereo cleanup, allowing for efficient correction of alignment, focus and color across stereo views resulting in sharp and accurate stereoscopic content.  This release includes updated versions of the Disparity Generator node, which generates high quality disparity vectors for depth estimation, Disparity To Depth for calculating depth from the disparity vectors, a new Stereo Colour Matcher for unifying color between left and right views, and New View, allowing you to rebuild one stereo view from another, all optimized for use with 360 footage.
 
Cara VR 2.0 is available for purchase on Foundry’s website and via accredited resellers.

  • Friday, Oct. 13, 2017
Cinematographers play key role in Tech Emmy win for Fujinon Cine Zooms
During Fujinon Day Atlanta, Bill Wages, ASC (r), gives feedback on FUJINON Cabrios with Radames Gonzalez from Arri Rental in lens projection room.
WAYNE, NJ -- 

The Optical Devices Division of FUJIFILM has been awarded an Engineering Emmy® for its FUJINON “4K Cine Zoom Lenses providing imagery in television” by the Television Academy, and will receive the honor at the Academy’s October 25 Engineering Awards ceremony at Loews Hollywood Hotel. The introduction of FUJINON’s Cabrio and Premier series of cinema zoom lenses brought about the ability to cover Super 35mm imagers and efficiently shoot the full gamut of television production without sacrificing image quality.

“The willingness of some of the top cinematographers and their rental houses to test, explore and provide feedback about our lenses is an integral part of this Emmy win,” states Thomas Fletcher, director of sales, FUJIFILM Optical Devices Division. “They’re a very loyal group, devoted to their lens choice. To test a new cinema lens is not something that’s considered lightly. Winning an honor as prestigious as an Emmy is an affirmation of Fujifilm’s dedication to the art and craft of cinematography. We thank the Academy for their recognition of our work and for the support we’ve received from the cinematography community.”

In fact, two cinematographers won Creative Arts Emmys this year using FUJINON cine zooms: David Miller, ASC, for Veep won Outstanding Cinematography for a Single-Camera Series (Half Hour) honors; and Donald A. Morgan, ASC, was awarded an Emmy for The Ranch in the Outstanding Cinematography for a Multi-Camera Series category.

Others who’ve embraced the new FUJINON zoom lenses include three-time ASC Award winner William Wages, ASC (Burn Notice, Containment, Revolution, Sun Records). For Wages, the FUJINON Cabrio 19-90 and 85-300mm zooms have “changed the way I shoot.” Wages added: “With their speed alone, they’re virtually the only lenses I’m using. The optical quality, small size and speed are unequaled. The combination of these two lenses is ideal for television production.” Wages is a recipient of the ASC Career Achievement in Television Award.

This marks the sixth Engineering Emmy award granted to Fujifilm and Fujinon. Past awards include:

  • “Development of the new high-speed color negative film A250 Color Negative Film” in 1982
  • “Developments in Metal Tape Technology” in 1990
  • “Implementation in Lens Technology to Achieve Compatibility with CCD sensors” in 1996
  • “Lens technology developments for solid-state imager cameras in high definition formats” in 2005
  • The world's first autofocus system, "Precision Focus," in 2009

  • Wednesday, Oct. 11, 2017
Facebook gets real about broadening virtual reality's appeal
In this Jan. 6, 2016, file photo, Peijun Guo wears the Oculus Rift VR headset at the Oculus booth at CES International in Las Vegas. (AP Photo/John Locher, File)
SAN FRANCISCO (AP) -- 

Facebook CEO Mark Zuckerberg seems to be realizing a sobering reality about virtual reality: His company's Oculus headsets that send people into artificial worlds are too expensive and confining to appeal to the masses.

Zuckerberg on Wednesday revealed how Facebook intends to address that problem, unveiling a stand-alone headset that won't require plugging in a smartphone or a cord tethering it to a personal computer like the Oculus Rift headset does.

"I am more committed than ever to the future of virtual reality," Zuckerberg reassured a crowd of computer programmers gathered in San Jose, California, for Oculus' annual conference.

Facebook's new headset, called Oculus Go, will cost $199 when it hits the market next year. That's a big drop from the Rift, which originally sold for $599 and required a PC costing at least $500 to become immersed in virtual reality, or VR.

Recent discounts lowered the Rift's price to $399 at various times during the summer, a markdown Oculus now says will be permanent.

"The strategy for Facebook is to make the onboarding to VR as easy and inexpensive as possible," said Gartner analyst Brian Blau. "And $199 is an inexpensive entry for a lot of people who are just starting out in VR. The problem is you will be spending that money on a device that only does VR and nothing else."

Facebook didn't provide any details on how the Oculus Go will work, but said it will include built-in headphones for audio and have an LCD display.

The Oculus Go will straddle the market between the Rift and the Samsung Gear, a $129 headset that runs on some of Samsung's higher-priced phones. It will be able to run the same VR software as the Samsung Gear, leading Blau to conclude the Go will rely on the same Android operating system as the Gear and likely include processors similar to those in Samsung phones.

The Gear competes against other headsets, such as Google's $99 Daydream View, that require a smartphone. Google is also working on a stand-alone headset that won't require a phone, but hasn't specified when that device will be released or how much it will cost.

Zuckerberg promised the Oculus Go will be "the most accessible VR experience ever," and help realize his new goal of having 1 billion people dwelling in virtual reality at some point in the future.

Facebook and other major technology companies such as Google and Microsoft that are betting on VR have a long way to go.

About 16 million head-mounted display devices were shipped in 2016, a number expected to rise to 22 million this year, according to the research firm Gartner Inc. Those figures include headsets for what is known as augmented reality.

Zuckerberg, though, remains convinced that VR will evolve into a technology that reshapes the way people interact and experience life, much like Facebook's social networks and smartphones already have. His visions carry weight, largely because Facebook now has more than 2 billion users and plays an influential role in how people communicate.

But VR so far has been embraced mostly by video game lovers, despite Facebook's efforts to bring the technology into the mainstream since buying Oculus for $2 billion three years ago.

Facebook has shaken up the Oculus management team since then in a series of moves that included the departure of founder Palmer Luckey earlier this year.

Former Google executive Hugo Barra now oversees Facebook's VR operations.

  • Tuesday, Oct. 10, 2017
Digital Nirvana to demo media management wares at NAB Show NY
Digital Nirvana will showcase its sports clipping service at NAB New York
FREMONT, Calif. -- 

Digital Nirvana will showcase its full suite of media management products and services at the upcoming NAB Show New York. Booth highlights will include closed captioning solutions, automated sports clipping service, and the newest version of the MonitorIQ media management platform. NAB Show New York takes place October 18-19 at the Javits Convention Center, and Digital Nirvana will exhibit in booth N662.

“We enjoy NAB New York because it gives us a chance to connect with many current and potential regional customers who may not have attended recent international shows, such as IBC,” said Hiren Hindocha, president and CEO, Digital Nirvana. “We create smart media management solutions that streamline workflows, and our newest solutions and services were developed in response to consumer demand in the ever-changing broadcast and content creation landscape.”

One booth highlight will be the company’s all-in-one automated sports clipping service. Introduced earlier this year, the service enables broadcasters to easily capture and share every fast-paced moment in a game. Offering a state-of-the-art workflow and customization options, the service automatically analyzes sports broadcasts in real time and generates ready-to-publish highlight clips. Digital Nirvana’s sports clipping service is coupled with automated caption synchronization, enabling sports broadcasters to publish sports media content online and via social media without any considerable time delay while complying with all FCC regulations.

Another NAB Show New York highlight will be the company’s cloud-based closed captioning, subtitling, and video logging services. The company provides high-quality postproduction, pop-on, and roll-up caption generation for all pre-recorded and online video content through an automated, cloud-based process. Digital Nirvana’s cloud-based caption synchronization technologies use audio fingerprinting to automate near-live synchronization of live broadcast captions. Automated speech-to-text conversion, coupled with a state-of-the-art workflow and experienced captioners, reduces the time and cost to publish and provides better search engine discoverability, all while complying with FCC guidelines.

On the product side, Digital Nirvana will showcase its MonitorIQ media management platform, which delivers a full range of multi-channel signal monitoring, repurposing, logging, compliance and archiving functions. The latest version of MonitorIQ, V5.0, features cloud-based recording, OTT stream monitoring functions, and HTML5 and HTTP Live Streaming (HLS) support, and incorporates the ability to record from Matrox’s Monarch HDX streaming appliance. Digital Nirvana will also showcase its standalone media management products, including the CAR/TS (Capture, Analyze, Replay – Transport Stream) transport stream recorder, which records and monitors the transport stream, provides alerts of non-compliance, offers time-shifted playout, and allows users to cut segments and export sections of the transport stream for more detailed analysis. Other standalone product highlights include AnyStreamIQ for cloud-based OTT monitoring and MediaPro for content repurposing.

  • Friday, Oct. 6, 2017
RED Digital Cinema unveils Monstro 8K VV sensor
RED WEAPON with MONSTRO sensor
IRVINE, Calif. -- 

RED Digital Cinema® announced a new cinematic full frame sensor for WEAPON® cameras, MONSTRO™ 8K VV. MONSTRO is an evolutionary step beyond the DRAGON 8K VV sensor, with improvements in image quality including dynamic range and shadow detail.

This newest camera and sensor combination, WEAPON 8K VV, offers full frame lens coverage, captures 8K full format motion at up to 60 fps, produces ultra-detailed 35.4 megapixel stills, and delivers incredibly fast data speeds of up to 300 MB/s. And like all of RED’s DSMC2 cameras, WEAPON records REDCODE® RAW simultaneously with Apple ProRes or Avid DNxHD/HR and adheres to the company’s dedication to OBSOLESCENCE OBSOLETE® — a core operating principle that allows current RED owners to upgrade their technology as innovations are unveiled and move between camera systems without having to purchase all new gear.

“RED’s internal sensor program continues to push the boundaries of pixel design and MONSTRO is the materialization of our relentless pursuit to make the absolute best image sensors on the planet,” said Jarred Land, president of RED Digital Cinema. “The Full Frame 8K VV MONSTRO provides unprecedented dynamic range and breathtaking color accuracy with full support for our IPP2 pipeline.”

The new WEAPON will be priced at $79,500 (for the camera BRAIN) with upgrades for carbon fiber WEAPON customers available for $29,500. MONSTRO 8K VV will replace the DRAGON 8K VV in RED’s line-up, and customers that had previously placed an order for a DRAGON 8K VV sensor will be offered this new sensor beginning today. New orders will start being fulfilled in early 2018.

RED has announced a comprehensive service offering for all carbon fiber WEAPON owners called RED ARMOR-W. RED ARMOR-W offers enhanced and extended protection beyond RED ARMOR, and also includes one sensor swap each year.

Additionally, RED has made its enhanced image processing pipeline (IPP2) available in-camera with the company’s latest firmware release (v7.0) for all cameras with HELIUM and MONSTRO sensors. IPP2 offers a completely overhauled workflow experience, featuring enhancements such as smoother highlight roll-off, better management of challenging colors, an improved demosaicing algorithm, and more. 

  • Wednesday, Oct. 4, 2017
Frame.io gains infusion of capital, looks to advance technology
Frame.io review page
NEW YORK -- 

Frame.io, developers of the video review and collaboration platform for content creators, has raised $20 million in Series B growth funding led by FirstMark Capital. The latest round of funding was also supported by return backers Accel Partners, SignalFire and Shasta Ventures. This latest infusion of capital brings Frame.io’s total funding to date to $32 million, which the startup will use to develop in key areas including the core video review and collaboration product, cloud and content security, and the Frame.io developer ecosystem. Founded in 2014, Frame.io is also backed by Hollywood heavyweights Jared Leto and Kevin Spacey.

Emery Wells, co-founder and CEO of Frame.io, described the new round of funding as “an exciting milestone for our team, and more importantly our community, and will help us in elevating our mission from reimagining video collaboration to reimagining postproduction itself.”

Frame.io makes the process of sharing and collaborating on video projects incredibly simple, through an intuitive user interface where users can upload and organize projects, then share internally or with clients to review and add feedback. Used by leading media and entertainment companies including TechCrunch, BBC, Vice, The Onion and Facebook, Frame.io has helped countless organizations in the transition to video, as more and more companies implement video into their branding strategies.

With the Series B funding, Frame.io will be making a sizable investment in iterating the core product, with a significant focus on cloud and content security. Frame.io is trusted by some of the world’s largest media corporations, for whom security is top of mind; as such, security has become a core pillar of the Frame.io product offering and will continue to expand with features such as watermarking and a host of new security/compliance certifications, including MPAA. This investment in security will also extend to the inclusion of artificial intelligence and machine learning in the core and enterprise product roadmap.

“Artificial Intelligence is going to play a huge role in Frame.io’s future,” stated Matthew Ruttley, head of data at Frame.io, who spearheads the company’s data science initiatives. “Enterprise customers will benefit from a whole host of powerful, proprietary Machine Learning systems. These apply to everything from streamlining video review workflows, to robust, all-important security features.”

2017 has been a year of milestones for the New York City-based Frame.io, with the release of Frame.io 2 followed by the official launch of Frame.io Enterprise--the company’s enterprise-grade product designed to help the largest media clients, including Turner Broadcasting System and Buzzfeed, collaborate at scale. With over 370,000 users (and counting) in over 170 countries, Frame.io will be using this investment to double down on strategic product innovation, offering content creators a platform that connects the many different creative tools, publishing tools, stock services, asset management and storage systems, and many other specialty products involved in the business of creating video.

The new funding will also help Frame.io expand its rapidly growing team, which has doubled in the past year, across the board.

  • Wednesday, Oct. 4, 2017
Corso, Kalas, Silverman, Yedlin among new members of Academy’s Science and Tech Council
Leon Silverman, general manager, Digital Studio for the Walt Disney Studios
BEVERLY HILLS, Calif. -- 

Nafees Bin Zafar, Maryann Brandon, Bill Corso, Andrea Kalas, Ai-Ling Lee, Leon Silverman and Steve Yedlin have accepted invitations to join the Science and Technology Council of the Academy of Motion Picture Arts and Sciences, bringing the Council’s 2017–2018 membership roster to 25.

Bin Zafar, a technology development supervisor at Digital Domain, has worked in live-action visual effects and feature animation for the past 17 years. He received a 2007 Academy Scientific and Engineering Award for his work on the development of Digital Domain’s fluid simulation system, and a 2014 Academy Technical Achievement Award for the development of large-scale destruction simulation systems. His software has been used in a diverse set of films, including “Pirates of the Caribbean: At World’s End,” “2012,” “The Croods” and “Kung Fu Panda 3.” He also serves on the Academy’s Scientific and Technical Awards Committee and the Digital Imaging Technology Subcommittee. He became a member of the Visual Effects Branch in 2017.

Film editor Brandon earned an Oscar® nomination for her work on “Star Wars: The Force Awakens.” Her credits include such films as “Star Trek,” “Star Trek Into Darkness” and “Passengers,” and she is currently working on the feature “The Darkest Minds” for 20th Century Fox. Brandon has been a member of the Academy since 1998, and also is active in the Directors Guild of America (DGA), American Cinema Editors (ACE) and Women in Film (WIF). This month she appears in TCM’s “Trailblazing Women” series.

Corso is an Oscar-winning makeup artist and designer, whose recent credits include “Deadpool,” “Kong: Skull Island,” “Blade Runner 2049” and the upcoming “Star Wars: The Last Jedi.” His desire to bridge the gap between practical, on-set makeup and digital technology led him to create Digital Makeup Group (DMG), specializing in CG beauty work, age manipulation and makeup effects done from an expert makeup artist’s perspective. Corso has been a member of the Academy since 2004 and has served as governor of the Makeup Artists and Hairstylists Branch and chair of its executive committee. He has also served on the Academy’s Preservation and History Board Committee.

Kalas, vice president of archives at Paramount Pictures, has restored or preserved more than 2,000 films and is a technical innovator in systems for digital preservation and archive-based analytics. She also is a public advocate for preservation and film history through the Association of Moving Image Archivists, where she currently serves as president. She recently joined the Academy as a Member-at-Large.

Born in Singapore, sound designer Lee earned Oscar nominations for Sound Editing and Sound Mixing for “La La Land.” Her credits include “Buena Vista Social Club,” “Spider-Man 2,” “Transformers: Dark of the Moon,” “Godzilla,” “Wild,” “Deadpool” and “Battle of the Sexes.” She has been a member of the Academy’s Sound Branch since 2014.

As general manager, Digital Studio for the Walt Disney Studios, Silverman oversees digital studio services, which provide post production on-lot infrastructure, mastering, digital distribution services and workflow expertise. He is a past president and founder of the Hollywood Professional Association (HPA), a trade association focused on the professional media content creation industry. He currently serves as governor-at-large of the Society of Motion Picture and Television Engineers (SMPTE) and is an associate member of the American Society of Cinematographers (ASC) and affiliate member of ACE. He has been an Academy Member-at-Large since 2015 and serves on the Members-at-Large executive committee.

Yedlin is a cinematographer best known for his collaboration with director Rian Johnson on his films “Brick,” “The Brothers Bloom,” “Looper” and “Star Wars: The Last Jedi.” He has made ongoing contributions to industry technical awareness and education with his short film demos, papers and seminars. Yedlin has been a member of the ASC since 2015, a guest lecturer at the American Film Institute since 2011, and a member of the Academy’s Cinematographers Branch since 2016.

The returning Council co-chairs for 2017–2018 are two members of the Academy’s Visual Effects Branch: Academy governor Craig Barron, an Oscar-winning visual effects supervisor; and Paul Debevec, a senior staff engineer at Google VR, adjunct professor at the USC Institute for Creative Technologies and a lead developer of the Light Stage image capture and rendering technology, for which he received a Scientific and Engineering Award in 2009.

The Council’s 16 other returning members are Wendy Aylsworth, Academy president John Bailey, Rob Bredow, Annie Chang, Douglas Greenfield, Rob Hummel, Academy governor John Knoll, Beverly Pasterczyk, Cary Phillips, Joshua Pines, Douglas Roble, David Stump, Steve Sullivan, Bill Taylor, Academy vice president Michael Tronick and Beverly Wood.

Established in 2003 by the Academy’s Board of Governors, the Science and Technology Council provides a forum for the exchange of information, promotes cooperation among diverse technological interests within the industry, sponsors publications, fosters educational activities, and preserves the history of the science and technology of motion pictures.