
Toolbox

  • Tuesday, Jul. 17, 2018
Why is Facebook keen on robots? It's just the future of AI
This photo shows Yann LeCun, Facebook's chief AI scientist. Facebook is announcing several academic hires in artificial intelligence, including Carnegie Mellon researcher Jessica Hodgins, who's known for her work making animated figures move in more human-like ways. (Facebook via AP)
SAN FRANCISCO (AP) -- 

Facebook announced several new hires of top academics in the field of artificial intelligence Tuesday, among them a roboticist known for her work at Disney making animated figures move in more human-like ways.

The hires raise a big question — why is Facebook interested in robots, anyway?

It's not as though the social media giant is suddenly interested in developing mechanical friends, although it does use robotic arms in some of its data centers. The answer goes to the heart of how today's AI systems learn.

Today, most successful AI systems have to be exposed to millions of data points labeled by humans — like, say, photos of cats — before they can learn to recognize patterns that people take for granted. Similarly, game-playing bots like Google's computerized Go master AlphaGo Zero require tens of thousands of trials to learn the best moves from their failures.
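
To make that appetite for labeled data concrete, here is a minimal, hypothetical sketch, not anything Facebook has described: a stock scikit-learn classifier on synthetic "cat"/"not cat" feature vectors only learns to separate the two categories because a human-style label accompanies every example.

```python
# A toy illustration of label-hungry supervised learning (synthetic data only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000                                            # thousands of labeled examples per class
cats = rng.normal(loc=0.0, scale=1.0, size=(n, 64))   # stand-in "cat" feature vectors
other = rng.normal(loc=1.0, scale=1.0, size=(n, 64))  # stand-in "not cat" feature vectors
X = np.vstack([cats, other])
y = np.array([1] * n + [0] * n)                       # the human-supplied labels

clf = LogisticRegression(max_iter=1000).fit(X, y)     # the pattern emerges only from the labels
print("training accuracy:", clf.score(X, y))
```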

Creating systems that require less data and have more common sense is a key goal for making AI smarter in the future.

"Clearly we're missing something in terms of how humans can learn so fast," said Yann LeCun, Facebook's chief AI scientist, in a call with reporters last week. "So far the best ideas have come out of robotics."

Among the people Facebook is hiring are Jessica Hodgins, the former Disney researcher, and Abhinav Gupta, her colleague at Carnegie Mellon University who is known for his work teaching robot arms to grasp objects.

Pieter Abbeel, a roboticist at the University of California, Berkeley, and co-founder of Covariant.ai, says the robotics field has benefits and constraints that push progress in AI. For one, the real world is naturally complex, so robotic AI systems have to deal with unexpected, rare events. And real-world constraints like a lack of time and the cost of keeping machinery moving push researchers to solve difficult problems.

"Robotics forces you into many reality checks," Abbeel said. "How good are these algorithms, really?"

There are other, more abstract applications of lessons from robotics, says Berkeley AI professor Ken Goldberg. Just as a robot learns to escape a computerized maze, other systems change their behavior depending on whether the actions they take bring them closer to a goal. Such systems could even be adapted to serving ads, he said — which just happens to be the mainstay of Facebook's business.

"It's not a static decision, it's a dynamic one," Goldberg said.

For Facebook, planting a flag in the hot field also allows it to be competitive for AI talent emerging from universities, Facebook's LeCun said.

Bart Selman, a Cornell computer science professor and AI expert, said it's a good idea for Facebook to broaden its reach in AI and take on projects that might not be directly related to the company's business — something that's a little more "exciting" — the way Google did with self-driving cars, for example.

This attracts not just attention, but students, too. The broader the research agenda, the better the labs become, he said.

AP Technology Writer Barbara Ortutay in New York contributed to this report.

  • Tuesday, Jul. 17, 2018
Foundry unveils Nuke 11.2
Nuke 11.2
LONDON -- 

Creative software developer Foundry has launched Nuke 11.2, bringing a range of new features and updates to the compositing toolkit. The latest instalment of this cutting-edge series lets artists work faster than ever through upgraded UI features and performance capabilities, alongside a new API for deep compositing that can speed up script processing.

Jeff Ranasinghe, VFX supervisor, commented: “Faster playback performance, UI updates such as drag and drop user knobs, and masks on Smart Vectors, all come together to make the experience far more gratifying and productive. With Nuke Studio, it’s great to see what was always an inspired concept become even more complete as the workhorse for production projects.”

Key features for Nuke 11.2 include:

  • New API for deep compositing. Foundry tests have delivered 1.5x faster processing with a new API for deep compositing that manages memory efficiently; larger scripts have scope for even faster processing. Nuke 11.2 also includes updates to the DeepExpression node and the ability to use Nuke’s metadata nodes within a deep stream (a minimal scripted sketch follows this list).
  • Faster node and user parameter creation. In this instalment, the Nuke Tab menu and UI for creating user knobs have been enhanced to improve user experience for some of the most common tasks: adding nodes and creating Gizmos.
  • The updated Tab menu allows artists to find nodes using partial words, set “favorite” nodes and organize them via a weighting system. These improvements add up to substantial time savings when building scripts with a large number of nodes.
  • A new UI allows user knobs to be linked between nodes by simply dragging and dropping. Artists can add, rearrange or remove user parameters using the same interface. This replaces the drop-down menus for picking user knobs, dramatically speeding up the setup of Live Groups and Gizmos and reducing the average number of clicks required from seven to three.
  • Smart Vector live output and mask input. The Smart Vector toolset is now faster to use and more effective in shots with occluding objects. Smart Vector and Vector Distort have been optimized for the GPU, allowing users to generate Smart Vectors on the fly and preview the result without pre-rendering the vectors. A new mask input lets artists identify areas of motion to ignore when generating the Smart Vectors and warping the paint or texture. As a result, the toolset requires far less manual clean-up on shots with occluding objects, speeding up its use in more complex cases.
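
As a rough illustration of the workflow these features address, the following Nuke Python sketch wires up a simple deep stream and adds a user knob in script form. It is illustrative only: it must be run inside Nuke's Script Editor, the file paths and knob name are placeholders, and it does not touch the new C++-level deep compositing API itself, which is an under-the-hood change rather than something exposed to Python.

```python
# Minimal Nuke Python sketch: merge two deep reads, flatten them, add a user knob.
import nuke  # available only inside Nuke

fg = nuke.nodes.DeepRead(file='/shots/sh010/fg_deep.%04d.exr')  # placeholder path
bg = nuke.nodes.DeepRead(file='/shots/sh010/bg_deep.%04d.exr')  # placeholder path

merge = nuke.nodes.DeepMerge()       # combine the two deep streams
merge.setInput(0, fg)
merge.setInput(1, bg)

flat = nuke.nodes.DeepToImage()      # flatten the deep samples to a 2D image
flat.setInput(0, merge)

# A user knob of the kind the new drag-and-drop UI manages interactively
flat.addKnob(nuke.Double_Knob('review_gain', 'Review Gain'))
flat['review_gain'].setValue(1.0)
```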

Upgrades for Nuke Studio
Nuke Studio now benefits from an updated project panel UI, providing artists with new visual controls for managing and organizing complex projects. For quick visual reference, artists can assign colors to items in the project bin and the timeline based on file type and other parameters, accessible via the UI and Python API.

Artists can also set the poster frame for single or multiple clips in the bin or from the viewer: useful when working with clips that have black frames or slates at the start. These improvements make it far easier for artists to visually distinguish between different file types.

Nuke 11.2 brings improved sorting and searching to Nuke Studio, allowing artists to easily arrange project bins in custom orders. Artists can also search project items using multiple keywords, against specific file metadata or across all metadata. This improved project panel functionality will help artists manage larger projects.

Christy Anzelmo, sr. commercial product manager at Foundry, commented: “With Nuke 11.2, we’ve listened to our customers and built on the features introduced in previous Nuke 11 releases. Our focus has been on improving artists’ day-to-day experience and speeding up time-intensive tasks like deep compositing. This release will help teams tackle complex VFX work faster.”

Nuke 11.2 goes live today and will be available for purchase--alongside full release details--on Foundry’s website and via accredited resellers.

  • Tuesday, Jul. 17, 2018
Caledonia Investments acquires majority stake in Cooke Optics
Les Zellan
LEICESTER, UK -- 

Caledonia Investments plc, a self-managed investment trust, has acquired a majority stake in Cooke Optics. The current Cooke Optics management team, including chairman Les Zellan, CEO Robert Howard and COO Alan Merrills, remains in place, and day-to-day activities at the Leicester-based company will continue unchanged.

“We have been experiencing a sustained period of growth, and the time was right to look for a new investor to help us develop further,” said Zellan. “Caledonia, another historic British company, has a reputation for long-term investment and for supporting management teams to grow their businesses. With a strong financial partner in Caledonia, Cooke is in an excellent position to continue designing and making more of our coveted lenses for the film and television industry.”

The lineup of Cooke lenses is designed from the ground up, not repurposed from existing components, and every lens is handcrafted in the Leicester factory.

Cooke’s most recent developments include the acclaimed S7/i full frame lens range, which correctly anticipated the current large format trend; the Panchro/i Classic range, which replicates the beloved look of vintage Speed Panchro lenses with modern housings, mounts and glass; and the Anamorphic/i SF range, which adds more exaggerated anamorphic attributes including lens flare and oval bokeh. The company is also behind the /i Technology lens metadata standard, which captures valuable lens information for use on set and in postproduction.

The latest Cooke lens ranges will be on display on Stand 12.D10 at IBC 2018, held in Amsterdam, the Netherlands, from September 14-18, 2018.

  • Thursday, Jul. 12, 2018
Carlo Bolognesi to lead Broadcast Pix EMEA sales
Carlo Bolognesi
CHELMSFORD, Mass. -- 

Carlo Bolognesi has joined Broadcast Pix to lead sales in the Europe, Middle East, and Africa (EMEA) region. Since 1994, Bolognesi has owned IREL Engineering, a systems integration and consulting firm based in Alessandria, Italy.

“Carlo has been delivering Broadcast Pix systems and other equipment to European broadcasters and video professionals through his company for more than 20 years,” said Russell Whittaker, worldwide director of channel sales. “We are excited to have him join our team to help drive sales of our BPswitch integrated production switchers, VOX visual radio systems, and ioGates cloud-based media management solutions.”

  • Thursday, Jul. 12, 2018
25+ major summer feature films used Blackmagic Design
This image released by Twentieth Century Fox shows Ryan Reynolds in a scene from "Deadpool 2." (Twentieth Century Fox via AP, File)
FREMONT, Calif. -- 

Blackmagic Design announced that more than 25 of the 2018 summer season’s worldwide film releases used numerous Blackmagic Design products during production and post, including its digital film cameras, its DaVinci Resolve Studio editing, color correction and audio post production application, and more. These included some of the summer’s biggest blockbusters and expected blockbusters, such as “Deadpool 2” and “Jurassic World: Fallen Kingdom.”

Blackmagic Design products were used at nearly every stage of production and post production on various summer films created around the world. DaVinci Resolve continues to be the go-to application for many of the world’s leading editors, colorists and postproduction facilities, including EFILM’s Skip Kimball on “Deadpool 2,” Harbor Picture Company’s Joe Gawler on “Solo: A Star Wars Story” and Company 3’s Stefan Sonnenfeld on “Mile 22.”

The summer films that used Blackmagic Design products for production include:

  • Marvel Studios’ “Avengers: Infinity War” DIT Kyle Spicer used DaVinci Resolve, UltraStudio, SmartVideohub 40x40, DeckLink Quad 2, DeckLink Mini Monitor and a variety of converters for on-set work;
  • Marvel Studios’ “Ant-Man and the Wasp” DIT Daniele Colombera used a variety of Blackmagic Design capture and playback devices, Mini Converter, Mini Monitor, SmartScope Duo, Smart Videohub and DaVinci Resolve for on-set work;
  • “The Darkest Minds” Action Unit DP Paul Hughen, ASC, and Action Unit Director Jack Gill used four Micro Studio Camera 4Ks as small and lightweight stunt cameras;
  • “Deadpool 2” DIT Simon Jori used DaVinci Resolve and ATEM 1M/E for greenscreen/bluescreen and on-set work;
  • “Dog Days” DIT Dane Brehm used Smart Videohub CleanSwitch 12x12, SmartScope Duo and SmartView Duo for on-set work;
  • “The First Purge” DIT Lewis Rothenberg used UltraStudio 4K, HDLinks and Smart Videohub 20x20 and Smart Videohub 16x16 for on-set work;
  • “Hotel Artemis” DIT Lonny Danler used DaVinci Resolve, a Micro Videohub and DeckLink Mini Monitor for on-set work;
  • “Mile 22” DIT Urban Olsson used Teranex Mini SDI to HDMI, MultiView 4, Smart Videohub 20x20, DeckLink 4K Extreme 12G, Ultrastudio 4K, Ultrastudio and DaVinci Resolve for on-set work and for the creation of an on-set VR monitoring solution for the film’s director;
  • “Overboard” DIT Mark Allan used DaVinci Resolve, HDLinks and UltraStudio Express for on-set work;
  • “Show Dogs” DIT Thomas Patrick used DaVinci Resolve, HyperDeck Studio, SmartView monitors and Videohub routers for on-set work;
  • “Siberia” DP Eric Koretz used a Micro Studio Camera 4K with a Video Assist 4K, as well as DaVinci Resolve and a DaVinci Resolve Micro Panel for testing different looks on-set;
  • “Sicario: Day of the Soldado” DIT Ryan Nguyen used various routers, Mini Monitors and DeckLink cards, along with DaVinci Resolve on-set; and
  • “Skyscraper” 2nd Unit DIT Mark Allan used HDLink, UltraStudio Express and DaVinci Resolve for on-set work.
     

Films that used DaVinci Resolve Studio for post production include:

  • “Adrift” by Stefan Sonnenfeld of Company 3;
  • “American Animals” by Rob Pizzey of Goldcrest for The Orchard;
  • “Deadpool 2” by Skip Kimball of Deluxe’s EFILM;
  • “Hearts Beat Loud” by Mike Howell of Color Collective;
  • “Juliet, Naked” by Nat Jencks of Goldcrest;
  • “Jurassic World: Fallen Kingdom” by Adam Glasman of Goldcrest;
  • “A Kid Like Jake” by Nat Jencks of Goldcrest;
  • “Madeline’s Madeline” by Nat Jencks of Goldcrest;
  • “Mile 22” by Stefan Sonnenfeld of Company 3;
  • “The Miseducation of Cameron Post” by Nat Jencks of Goldcrest;
  • “On Chesil Beach” by Tom Poole of Company 3;
  • “Overboard” by Trent Johnson of MTI Film;
  • “RBG” by Ken Sirulnick of Glue Editing & Design, along with Fusion Studio for stabilizing work;
  • “Sicario: Day of the Soldado” by Stephen Nakamura of Company 3;
  • “Skyscraper” by Stefan Sonnenfeld of Company 3;
  • Disney and Lucasfilm’s “Solo: A Star Wars Story” by Joe Gawler of Harbor Picture Company;
  • “Sorry to Bother You” by Sam Daley at Goldcrest;
  • “Sorry to Bother You” for VFX by Darren Orr of Beast; and
  • “SuperFly” by David Hussey of Company 3.
  • Tuesday, Jul. 10, 2018
Dejero, Draganfly partner on real-time video transport
Draganfly's Commander UAV bundled with the Dejero EnGo mobile transmitter
WATERLOO, Ontario -- 

Dejero, known for cloud-managed solutions that provide video transport and Internet connectivity while mobile or in remote locations, has formed a technology partnership with Draganfly Innovations Inc., a Canada-based specialist in small Unmanned Aircraft Systems (sUAS). The collaboration sees Draganfly’s Commander UAV (unmanned aerial vehicle) quadcopter bundled with the Dejero EnGo mobile transmitter, providing real-time video transport from the air. In addition, the companies’ combined expertise will bring new solutions and services to Dejero’s broadcast customers and to Draganfly’s customers across the many industry verticals they serve.
 
This collaboration enables broadcasters to integrate live video captured with UAVs into their newsgathering, sports and event coverage, and video production for television and online audiences. It will also help Dejero reach new industries and applications, providing real-time onboard video transport over IP to the military, public safety and government sectors that Draganfly has traditionally served.
 
The Draganflyer Commander UAV is a remotely operated, unmanned miniature helicopter designed to carry wireless camera systems. The professional-quality, powerful, easy-to-fly aerial platform is designed for high-endurance applications such as public safety, search and rescue, agriculture, mapping, aerial photography and more. Dejero’s versatile EnGo mobile transmitter will be instrumental in reliably delivering high-quality live video from Draganfly’s Commander, which in turn will allow Draganfly to elevate its offering.
 
“Historically, UAV use in broadcast has been challenging, in particular when it comes to providing high-quality video with low latency and with the reliability needed for live broadcasts,” explained Kevin Fernandes, VP of sales at Dejero. “Through our collaboration with Draganfly, we can provide an effective solution for broadcast and media organizations, as well as other industries requiring the reliability and picture quality that customers require.”
 
“We are thrilled to be adding broadcast-quality live video feeds to our Commander vehicle,” said Draganfly president Zenon Dragan. “The timing couldn’t be better as we’ve recently expanded into contract engineering and custom product development. Our partnership with Dejero will greatly support this.”
 
Well-versed in the design of sophisticated multi-rotor aircraft, ground-based robots, and fixed wing aircraft, Draganfly also provides custom payloads, ground-up software design, electronics, UAV program development, and flight training.

  • Thursday, Jul. 5, 2018
Encore's Pankaj Bajpai turns to Baselight to color the story of Picasso
A scene from "Genius: Picasso" (photo courtesy of National Geographic)
LONDON -- 

The second season of National Geographic’s popular series Genius focuses on the extraordinary life of painter Pablo Picasso. Colorist Pankaj Bajpai at Encore in Hollywood was charged with creating a look that evokes the time and place as well as the art.

Bajpai was the colorist on the first season of Genius, which featured Einstein and was set in Germany. Season two inhabits the much warmer climes of Spain and Paris. For Bajpai and DP Mathias Herndl, it was the place that led them to the look: “We anchored ourselves in the quality, color and texture of the Spanish light in Málaga, the birthplace of Picasso,” Bajpai explained.

The series tells the story of Picasso (played by Antonio Banderas) and his complicated, chaotic lifestyle which was the source of his vibrant paintings. National Geographic and the show’s creator Kenneth Biller were concerned with creating the sense and style of the first half of the 20th century. “A key challenge was to maintain the authenticity of the period, and yet somehow keep a contemporary flair,” Bajpai recalled.

“We start with Picasso’s father and his memory of the bullfights--it’s all incredibly warm,” he continued. “Then Picasso as a young man, with many candlelit interiors. And towards the very end the palette becomes sparse and cold, as his life becomes more isolated.”

Herndl shot the series using the ARRI ALEXA, and Bajpai was involved from the earliest days of the production. “Mathias and I have a long working relationship, and much of our understanding is intuitive--it’s a partnership where few words are exchanged. I know Mathias’s instincts when he is shooting, and he knows how I might approach the captured image. When it all comes together, it’s wonderful.”

At Encore, Bajpai used the latest version 5.0 of the Baselight software, giving him access to Base Grade, a popular recent addition to the colorist’s toolkit. “It allowed me to approach grading using the classic zone system for the first time, which was tremendous. It is possibly one of the most practical and significant advances in grading technology in a long time.

“There are many scenes in the show where there are old European-style big and bright windows,” he said. “To be able to maintain details in the high-high-highlights and low-low-lowlights and still keep everything in between was unbelievably fast and clean.”
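
For readers unfamiliar with zone-based grading, the sketch below gives a rough sense of the idea: the tonal range is divided into smooth, overlapping zones and a gain is applied only where membership in the chosen zone is strong, so highlights can be reshaped while mid-tones stay put. It is emphatically not FilmLight’s Base Grade implementation; the weighting function and parameters are invented for illustration.

```python
# Conceptual zone-based grade (illustrative only, not Base Grade's actual math).
import numpy as np

def zone_weight(luma, center, width):
    """Smooth, bell-shaped membership of each pixel in a tonal zone."""
    return np.exp(-0.5 * ((luma - center) / width) ** 2)

def grade_zone(img, center, width, gain):
    """Scale pixels in proportion to how strongly they sit in the zone."""
    luma = img.mean(axis=-1, keepdims=True)          # crude luminance proxy
    w = zone_weight(luma, center, width)
    return img * (1.0 + (gain - 1.0) * w)

rng = np.random.default_rng(0)
frame = rng.random((4, 4, 3))                        # stand-in linear image
softened = grade_zone(frame, center=0.9, width=0.15, gain=0.7)
print(frame.max(), softened.max())                   # brightest values pulled down
```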

Executive produced by Ron Howard and Brian Grazer, Genius is a major success for National Geographic. Season one was the network’s most-watched show of 2017 and earned the network a record 10 Emmy nominations.

  • Monday, Jul. 2, 2018
Academy's Sci-Tech Awards Committee exploring 8 areas for Oscar consideration
The Motion Picture Academy's Scientific and Technical Awards ceremony in 2016.
LOS ANGELES -- 

The Academy of Motion Picture Arts and Sciences announced that eight distinct scientific and technical investigations have been launched for the 2018 Oscars®.

These investigations are made public so individuals and companies with devices or claims of innovation within these areas will have the opportunity to submit achievements for review.

The deadline to submit additional entries is Wednesday, July 18, at 5 p.m. PT. The Academy’s Scientific and Technical Awards Committee has started investigations into the following areas:

  • Systems for the creation of motion graphics for motion picture content
  • Remote, distributed, secure and collaborative review frameworks of dailies and sequences
  • 2D, multi-layer, raster, image editing and digital paint systems used in motion picture production
  • Dense-mesh accurate animated facial geometry capture
  • Texture layout toolsets
  • Efficient, sequence-based paint and rotoscoping toolsets
  • Capture of facial appearance for photorealistic rendering
  • Lavalier microphones widely used in motion picture production

For more information on the Scientific and Technical Awards and to submit similar technology, visit the Academy’s website.

After thorough investigations are conducted in each of the technology categories, the committee will meet in the fall to vote on recommendations to the Academy’s Board of Governors, which will make the final awards decisions.

The 2018 Scientific and Technical Awards Presentation will be held on Saturday, February 9, 2019.

The 91st Oscars will be held on Sunday, February 24, 2019, at the Dolby Theatre® at Hollywood & Highland Center® in Hollywood, and will be televised live on the ABC Television Network at 6:30 p.m. ET/3:30 p.m. PT. The Oscars also will be televised live in more than 225 countries and territories worldwide.

  • Thursday, Jun. 28, 2018
NEP mobile unit selects FUJINON 4K UHD zoom lenses
Shooting a Seattle Mariners’ game at Safeco Field
VALHALLA, NY -- 

Mobile production provider NEP Group has equipped its latest truck, the M-15--a 4K unit featuring advanced IP delivery capability--with a range of FUJINON 4K UHD lenses. While the M-15 build is in its final stages, the new lenses are already being used to cover Seattle Mariners baseball games for the Root Sports Northwest network. Currently, eight FUJINON UA107x8.4BESM 4K Box Field lenses, three UA24x7.8BERM 4K UHD lenses and three UA14x4.5BERD 4K UHD Wide-Angle lenses complement Grass Valley LDK Series 4K UHD cameras on NEP’s M-5 for Seattle-area baseball coverage.

In addition to being a fully 4K capable production truck, the M-15 will be NEP USMU’s second ST 2110-capable regional mobile unit. (SMPTE ST 2110 standards specify the carriage, synchronization, and description of IP streams for real-time video production and playout).
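
For context, the snippet below prints the sort of SDP description a hypothetical ST 2110-20 (uncompressed video over RTP) sender might advertise so that receivers can join its stream. It is a sketch only: every address, port and parameter value is a placeholder, and none of it is drawn from NEP’s or any vendor’s actual configuration.

```python
# Illustrative only: an example SDP for a hypothetical SMPTE ST 2110-20 video essence.
EXAMPLE_ST2110_20_SDP = """v=0
o=- 1234567890 1 IN IP4 192.0.2.10
s=Illustrative ST 2110-20 video essence
t=0 0
m=video 5004 RTP/AVP 96
c=IN IP4 239.0.0.1/64
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; exactframerate=60000/1001; depth=10; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017; TP=2110TPN
a=ts-refclk:ptp=IEEE1588-2008:traceable
a=mediaclk:direct=0
"""

if __name__ == "__main__":
    print(EXAMPLE_ST2110_20_SDP)
```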

“We appreciate our longstanding relationship with Fujifilm and are looking forward to M-15 hitting the road, complete with our new 4K UHD lenses,” said Glen Levine, president, NEP U.S. “The IP system alongside the superior production tools will propel live coverage to a whole new level. For now, the lenses are already creating outstanding images of the Mariner games on the M-5.”

“Production teams and crews like the quality of NEP’s newest set of FUJINON lenses,” Levine continued. Speaking about the UA107x, he added, “We went with these lenses because of their 900mm telephoto reach and their ability to do tight shots from long distances. Camera operators have come back from the games happy with how tight they were able to go.”

NEP’s M-15 will be fully 4K capable. “The cost/performance ratio of the entire FUJINON 4K lens range makes them ideal for this new truck,” said Levine. “We always do a thorough cost evaluation, but we will never sacrifice quality to cut costs. These lenses offer great quality while being cost-effective.”

  • Wednesday, Jun. 27, 2018
Filmotechnic USA introduces electric EVU camera car
Filmotechnic's Electric EVU
LOS ANGELES -- 

Filmotechnic USA, one of the largest camera car companies in the world, introduces the newest addition to its fleet, the Electric EVU. Filmotechnic is known worldwide for its Academy Award-winning fleet of stabilized camera car systems. Filmotechnic’s headquarters and fabrication facilities are located in Los Angeles, with support offices in Detroit, Atlanta, Dallas and Orlando.

According to Filmotechnic manager John Urso, the Electric EVU was developed to meet the need for camera car shots where emissions or noise are unwanted. “There is demand for a camera car that can be used where internal combustion vehicles are prohibited, such as on a soundstage,” he said. “The EVU also works great on noise-sensitive shoots, such as filming horses for a Western.”

The EVU operates in silence while capturing all the action with fluid acceleration. With zero emissions, the EVU expands the possibilities for getting the best shot. One rig configuration features Filmotechnic’s Telescoping U-Crane, the noted robotic crane, paired with the company’s Flight Head Mini, a fully digital gyro-stabilized flight head. “This is just one of many arm/flight head combos this vehicle can handle,” said Urso. “We can provide a variety of rig options based on the job at hand, fully customized to meet a customer’s specific need.”
