• Thursday, Apr. 25, 2019
Walmart experiments with AI to monitor stores in real time
Mike Hanrahan, CEO of Walmart's Intelligent Retail Lab, discusses a kiosk that describes to customers the high technology in use at a Walmart Neighborhood Market, Wednesday, April 24, 2019, in Levittown, N.Y. "If we know in real time everything that's happening in the store from an inventory and in stock perspective, that really helps us rethink about how we can potentially manage the store," said Hanrahan. (AP Photo/Mark Lennihan)
LEVITTOWN, NY (AP) -- 

Inside one of Walmart's busiest Neighborhood Market grocery stores, high-resolution cameras suspended from the ceiling point to a table of bananas. They can tell how ripe the bananas are from their color.

When a banana starts to bruise, the cameras send an alert to a worker. Normally, that task would rely on the subjective assessment of a human who probably doesn't have time to inspect every piece of fruit.
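Walmart hasn't published how its detection model works, but the basic idea of flagging produce from a camera's color reading can be sketched in a few lines. The thresholds and labels below are illustrative assumptions, not Walmart's actual system:

```python
# Toy illustration (NOT Walmart's system): map an average (R, G, B)
# color reading from a camera, 0-255 per channel, to a coarse ripeness
# label. Ripening shifts a banana's hue from green toward yellow, and
# bruising darkens the overall reading.

def classify_ripeness(avg_rgb):
    """Return 'unripe', 'ripe', or 'flag for removal' for an RGB reading."""
    r, g, b = avg_rgb
    if g > r * 1.2:                      # green clearly dominates red
        return "unripe"
    if max(r, g, b) < 100:               # overall dark: likely bruised
        return "flag for removal"
    return "ripe"                        # yellow-dominant and bright

# A store system might alert a worker when a table's reading goes dark:
print(classify_ripeness((80, 180, 60)))   # mostly green
print(classify_ripeness((210, 190, 70)))  # bright yellow
print(classify_ripeness((70, 60, 40)))    # dark, bruised patch
```

A production system would use a trained image classifier rather than fixed thresholds, but the alerting logic — compare a per-table score against a cutoff and notify a worker — follows the same shape.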

The thousands of cameras are a key feature of Walmart's Intelligent Retail Lab, which officially opens inside this 50,000-square-foot store on Thursday. It's the retail giant's biggest attempt so far to digitize the physical store.

Walmart envisions using the cameras, combined with other technology like sensors on shelves, to monitor the store in real time so its workers can quickly react to replenish products or fix other problems. The technology, shown first to The Associated Press, will also be able to track when shelves need to be restocked or if shopping carts are running low. It can spot spills and even detect when more cash registers need to be opened before long lines start forming.

Walmart's deep dive into artificial intelligence in its physical store comes as Amazon raised the stakes in the grocery business with its purchase of Whole Foods Market nearly two years ago.

That's put more pressure on Walmart and other traditional retailers like Kroger and Albertsons to pour money into technology in their stores. At the same time, they're trying to keep food prices down and manage expenses. Amazon has been rolling out cashier-less Amazon Go stores, which have shelf sensors that track the 1,000 products on their shelves.

Walmart's online U.S. sales are still a fraction of Amazon's online global merchandise empire, which reached $122.98 billion last year.

Walmart hopes to start scaling some of the new technology at other stores in the next six months, with an eye toward lower costs and thus lower prices. As the shopping experience improves, the retailer expects to see higher sales.

"We really like to think of this store as an artificial intelligence factory, a place where we are building these products, experiences, where we are testing and learning," said Mike Hanrahan, CEO of Walmart's Intelligent Retail Lab and co-founder of Jet.com, purchased by Walmart three years ago.

Hanrahan says the cameras are programmed to focus primarily on the products and shelves right now. They do not recognize faces, determine the ethnicity of a person picking up a product or track the movement of shoppers, he says. Some other companies have recently started experimenting with store shelf cameras that try to guess shoppers' ages, genders and moods.

There are signs throughout the Neighborhood Market educating shoppers about how it is being used as a lab. Still, the cameras could raise privacy concerns.

"Machine learning fundamentally finds and matches patterns," says Steven M. Bellovin, a computer science professor at Columbia University and a privacy expert, who hasn't seen the new Walmart AI Lab. But he says companies run into trouble when they start to match behavior to a specific customer.

Hanrahan says Walmart has made sure to protect shoppers' privacy and emphasized that there are no cameras at the pharmacy, in front of the restrooms or in employee breakrooms.

The lab is Walmart's second in a physical store. A glass-enclosed data center at the back of the store houses nine cooling towers, 100 servers and other computer equipment that processes all the data.

Last year, Walmart's Sam's Club opened a 32,000-square-foot lab store, a quarter of the size of a typical Sam's Club. The lab is testing new features for the Scan & Go app, which lets customers scan items as they shop and then buy from their phones, skipping the checkout line.

The retail lab is the third project from Walmart's new incubation arm, created after the Jet.com acquisition as a way for the discounter to shape the future of retail.

It follows the launch of Jetblack, a shopping-by-text service aimed at affluent shoppers in New York. Walmart's second incubation project was Spatial&, a VR tech company. As part of the launch, it's bringing tractor-trailers to some of Walmart's parking lots so customers can experience DreamWorks Animation's "How to Train Your Dragon" through virtual reality.

Hanrahan says the company is embracing the labs in stores to better understand the real ways that technology affects customers and workers. It also wants to educate shoppers. Walmart has made a point to not hide the technology, and small educational kiosks are set up throughout the Neighborhood Market.

Despite the signs and visible cameras, many shoppers, including Marcy Seinberg from Wantagh, New York, didn't seem to notice or care.

"I am not bothered by it," Seinberg said. "If technology saves me money, I would be interested."

 

  • Thursday, Apr. 25, 2019
RED R3D SDK for NVIDIA CUDA-accelerated workflow now available
RED R3D SDK with REDCINE-X PRO for NVIDIA CUDA
IRVINE, Calif. -- 

RED Digital Cinema® released its RED R3D® SDK and accompanying REDCINE-X PRO® software with accelerated decode and debayering on NVIDIA CUDA® platforms. By offloading the compute-intensive decoding and debayering of RED R3D files onto one or more NVIDIA GPUs, real-time playback, editing and color grading of 8K footage are now possible.
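For readers unfamiliar with the term, "debayering" (demosaicing) reconstructs full RGB pixels from a sensor where each photosite records only one of red, green, or blue. The naive NumPy sketch below shows the concept only; it is not the RED SDK's algorithm, and the GPU versions this release ships are far more sophisticated:

```python
# Minimal nearest-neighbor demosaic of an RGGB Bayer mosaic.
# Illustrative only -- not RED's debayer. Assumes even height/width.
import numpy as np

def debayer_nearest(raw):
    """raw: (H, W) single-channel Bayer frame in RGGB layout -> (H, W, 3) RGB."""
    h, w = raw.shape
    rgb = np.empty((h, w, 3), dtype=raw.dtype)
    # Red sits at even rows/even cols, blue at odd rows/odd cols;
    # replicate each sample across its 2x2 neighborhood.
    rgb[:, :, 0] = np.repeat(np.repeat(raw[0::2, 0::2], 2, axis=0), 2, axis=1)
    rgb[:, :, 2] = np.repeat(np.repeat(raw[1::2, 1::2], 2, axis=0), 2, axis=1)
    # Green exists at the other two sites; red/blue sites borrow a neighbor.
    g = raw.copy()
    g[0::2, 0::2] = raw[0::2, 1::2]   # red sites take the green to their right
    g[1::2, 1::2] = raw[1::2, 0::2]   # blue sites take the green to their left
    rgb[:, :, 1] = g
    return rgb

raw = np.arange(16, dtype=np.uint16).reshape(4, 4)  # toy 4x4 frame
print(debayer_nearest(raw).shape)                   # every pixel gets R, G and B
```

Doing this (plus wavelet decode) for every pixel of an 8K frame, 30 times a second, is why offloading to CUDA GPUs matters.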

Benefits and efficiencies of this new software-hardware combination during the postproduction process include:

  • 8K real-time 30 fps or greater playback performance
  • Up to 10x faster transcoding, depending on the format and content
  • Improved efficiencies and quality control within the content review process
  • Creative freedom using flexible R3D files instead of proxy files

 
8K performance is available with NVIDIA Quadro® RTX™ 6000 and 8000, GeForce® RTX™ 2080 Ti and TITAN RTX™ GPUs when coupled with a moderately configured PC. Creators can achieve additional performance improvements with multi-GPU configurations and may see noticeable gains even with older NVIDIA GPUs. Also, new NVIDIA RTX laptops from leading computer manufacturers, including Razer, Acer, Alienware, ASUS, Dell, Gigabyte, HP, Lenovo, MSI and Samsung, provide real-time playback at up to 8K and offer flexibility in choosing the right tools to fit a variety of budgets.

Support from major NLEs and other SDK integrators is expected soon.

  • Tuesday, Apr. 23, 2019
Cooke Optics promotes Catherine Crawley to director of marketing
Catherine Crawley
LEICESTER, UK -- 

Cooke Optics has promoted Catherine Crawley to director of marketing, following a two-year tenure as the company’s digital marketing manager.

Crawley began her career in sales, marketing and management roles across the media sector, from production to postproduction and distribution, at companies including The Hospital Club and Air Post Production. She then spent several years working in design and digital agencies building expertise on then-nascent social media platforms and e-commerce, as well as organizing events and content marketing, before building a successful freelance career as a social media strategist. Crawley joined Cooke Optics in 2017 to build on the company’s growing social media presence, driving marketing content to engage with Cooke’s global customers and the cinematography community. Most recently she has been focusing on the soft launch of #shotoncooke, a dedicated website gallery where cinematographers are invited to upload clips of their work shot with Cooke lenses, with the aim of informing others about their experiences.

In her new role, Crawley will continue to build Cooke’s digital marketing strategy as well as taking on the strategy and execution of more traditional marketing elements including events, advertising and sponsorships.

Les Zellan, chairman, Cooke Optics, said, “Catherine has done a tremendous job in building our online presence and enabling what has become a vibrant virtual dialogue with our global community. It was the obvious next step for her to bring her significant expertise to our whole marketing program.”

  • Wednesday, Apr. 17, 2019
HPA issues call for entries for Engineering Excellence Awards
2018 HPA Awards
BURBANK, Calif. -- 

For the 14th year, The Hollywood Professional Association (HPA®) will honor the companies and individuals who draw upon technical and creative ingenuity to develop breakthrough technologies with the HPA Engineering Excellence Award.  The call for entries for the Engineering Excellence Award opened today (4/16), and submissions will close on May 24, 2019.

Joachim Zell, VP of Technology for EFILM and chair of the HPA Engineering Excellence Award Committee, said, “True success in our field lies in making it possible for filmmakers to realize their artistic visions. It is that goal that drives the development of technical and engineering processes that bring that vision to life. The companies and individuals supporting creative storytellers face constant pressure to evolve to expand the creative palette. Their contribution to the entertainment industry cannot be overstated. The Engineering Excellence Award is a highly competitive honor, judged and awarded by tried and tested leaders in the field, and the past winners have changed the course of entertainment technology. We encourage the submission of your significant technological achievements.”

Entries for this peer-judged award may include products or processes and must represent a significant step forward for their industry beneficiaries. Last year’s winners were Blackmagic Design, Canon, Cinnafilm, and IBM Aspera & Telestream. Rules and procedures can be found here.

Applicants will present to a blue-ribbon industry panel on June 22 at the IMAX facility in Los Angeles; more information about the presentation dates and location will be announced soon. Winners will be announced in advance, and honors presented during the HPA Awards gala on the evening of November 21, 2019, at the Skirball Cultural Center in Los Angeles.

At the gala, HPA Awards will again honor important creative categories including Outstanding Color Grading, Editing, Sound and Visual Effects for feature film, television and commercials. The call for entries in these categories will be announced in May.

  • Tuesday, Apr. 16, 2019
Avid helps HBO to innovate postproduction for program promotions
Jeff Rosica, Avid CEO and president
BURLINGTON, Mass. -- 

Avid® (Nasdaq: AVID) is helping HBO® to re-define the promotional content finishing workflows that serve all of the network’s distribution outlets.

HBO’s innovative approach includes unlimited on-demand licenses for Avid Media Composer® nonlinear editing systems. It allows the network’s production engineering group to scale editing resources up and down on a moment’s notice to address end-user demand from marketing, sports, documentary and home entertainment to create their promotions and market their programming with greater agility and speed. HBO’s virtualized Media Composer deployment integrates with its Avid NEXIS® storage resources.

“Our production engineering group supports hundreds of clients who create promotions and packages to drive the success of HBO’s growing offerings, so we’ve established an efficient, on-demand resource that corresponds to the elastic needs of the operation,” said Stefan Petrat, SVP of media technology at HBO. “As needed, we can spin up our Media Composer seats and have hundreds of editors working on promotional pieces for all HBO distribution outlets. When that push is over, we can immediately spool down our excess systems.”

“HBO’s production engineering group is taking an inventive approach toward unlocking new gains in postproduction performance, and Avid is very pleased to support their vision with the virtualization of Media Composer,” said Jeff Rosica, CEO and president, Avid. “It’s exciting to see world-class customers like HBO successfully rethinking and reimagining the sheer scale of their workflows with Avid tools and solutions.”

  • Tuesday, Apr. 9, 2019
SMPTE reimagines Annual Technical Conference
LOS ANGELES & WHITE PLAINS, NY -- 

The SMPTE 2019 Annual Technical Conference & Exhibition (SMPTE 2019) has been reimagined with a fresh style, focus, layout, program schedule, logo, and website. The Society’s flagship event, SMPTE 2019, will run Oct. 21-24 at the Westin Bonaventure Hotel & Suites in downtown Los Angeles.

“We’ve updated and restructured our annual conference at every level so that it’s easy and enjoyable to discover, engage, and excel,” said SMPTE executive director Barbara Lange. “We’re bringing the conference and exhibition elements together to create a richer and more interactive atmosphere for attendees. In addition to making the educational experiences more engaging, we’ll be hosting various networking and social events throughout the conference.”

SMPTE’s flagship annual event is the world’s premier forum for the exploration of media and entertainment technology. Full conference registration for SMPTE 2019 will include the keynote presentation, entry to the exhibition and to all conference sessions, a rooftop lunch each day of attendance, and an opening night party.

SMPTE 2019 will provide access to the latest technology and offer top-quality education and professional development opportunities to help attendees increase their personal and professional value within the media and entertainment industry. The event is known for attracting the industry’s innovators — both creative and technical — and its business leaders, and this year’s event will provide attendees with a more intimate atmosphere for meeting and exchanging ideas.

SMPTE 2019 offers a more focused combination of submitted and programmed content to address the needs and interests of both experienced creatives and technologists as well as early career professionals. On the first day of the show, the latter group can leverage tutorials that will lead to in-depth sessions later in the week. Through a series of brief presentations, the industry’s most thought-provoking thinkers and doers will share their insights and anecdotes on motion-imaging technology and future directions in a non-commercial setting.

The exhibits and special events will create an enhanced experience for attendees, providing them with opportunities for networking and face-to-face meetings with industry experts.

Extended lunch breaks each day will take place on the iconic hotel’s spectacular rooftop venue. Attendees looking for more sunshine and fresh air can take part in the second annual 4K 4Charity Fun Run. Pop-up happy-hours and featured exhibits in the conference foyer will allow for impromptu gatherings.

“As always, SMPTE technical conference sessions will address timely, forward-looking topics like no other event in the industry does,” said SMPTE Education VP Sara Kudrle, who is also product marketing manager at Imagine Communications. “From the fundamental elements of cinema and broadcast workflows to the latest in immersive experiences, SMPTE 2019 will offer expert insights on the technologies driving the future of storytelling.”

Tickets for many SMPTE 2019 events are limited, and early registration is encouraged. Early-bird registration pricing is available now through July 27. Attendees also can save by booking the SMPTE group room rate at the Westin Bonaventure, where a limited block of reduced-rate rooms will be available through Sept. 27, or while rooms remain available. A NAB Show special — available only through April 13 at the 2019 NAB Show — gives attendees $100 off registration. Come by the SMPTE booth located in the south hall, upper level, LSU1, for the discount code.

SMPTE is seeking technical manuscript proposals for SMPTE 2019. Abstracts are due by May 3. Authors of manuscript proposals selected by the SMPTE 2019 program committee will have the opportunity to present at the event and network with the industry’s most esteemed technology thought leaders and engineering executives during the world’s premier forum for the exploration of media and entertainment technology. Following SMPTE 2019, accepted manuscripts will be published to the SMPTE digital library, hosted on the IEEE Xplore platform, and video of each paper presentation will be posted on the Society’s YouTube channel. Submitted manuscripts will also go through peer review for possible publication in the award-winning SMPTE Motion Imaging Journal. Program sessions will address advancements in current technology, plus future-looking developments in media technology, content creation, image and sound, and the allied arts and sciences.

Details on SMPTE’s call for papers, including topics and instructions on how to submit an abstract, plus additional information about SMPTE 2019, are available here.


  • Monday, Apr. 8, 2019
Deluxe, Amazon Web Services form strategic cloud collaboration
Deluxe's Andy Shenkler

Deluxe Entertainment Services Group Inc. (Deluxe) has entered into a multi-year strategic collaboration with Amazon Web Services (AWS) to offer faster and at-scale solutions for content creators and distributors. Additionally, Deluxe selected AWS as the company’s primary cloud provider, fully integrating AWS services to enable end-to-end content solutions offered via the Deluxe One platform. The agility of serverless workflows on AWS enables Deluxe to combine services such as Amazon Translate and Amazon Transcribe with Deluxe’s expert media services and capabilities to address industry challenges around localization and global distribution.

Together, Deluxe and AWS are combining their experience and offerings in the media and entertainment space to provide unique and innovative solutions across the content supply chain. Deluxe One’s unique capabilities are leading the transformational shift and completely redefining workflows as content creators and distributors make the transition to the cloud. By leveraging the extensive cloud services provided by AWS, Deluxe has the ability to offer scalable and efficient solutions for the creation, storage, processing and delivery of content, connecting the media supply chain with an open platform to all vendors and partners to meet market demands.

“As more companies adopt native cloud workflows, our combined efforts are establishing how the modern digital media supply chain functions,” said Andy Shenkler, chief product officer of Deluxe. “We’re going all-in with AWS to leverage every aspect of their services across our Deluxe One ecosystem, enabling us to jointly provide content creators and distributors with innovative solutions across the end-to-end media ecosystem, as well as expanding the automation and enhancing the efficiency of our business operations and interactions with our customers.”

Swami Sivasubramanian, VP of machine learning, Amazon Web Services, Inc., said, “Deluxe’s rich history of serving this market segment combined with AWS services, such as Amazon Translate and Amazon Transcribe, will accelerate the development of new opportunities for the industry to create, localize, transform, and deliver personalized content to viewers around the world.”

In addition to existing offerings, the first industry challenge that Deluxe and AWS are tackling with this collaboration is the need for scalability and rapid innovation within the localization business. Global reach and increasing consumer demands are leading to shifts in the industry that require faster turnarounds aligned with shrinking release windows for content delivery. Deluxe and AWS are working together to revamp the modern digital media supply chain by enabling rapid, highly accurate, automated transcreation at scale, combining Deluxe’s expertise in localization with AWS’ AI/ML services, including Amazon Translate and Amazon Transcribe. The goal is a truly automated localization service for subtitling, closed captioning, and compliance that considers regional context and transcreation requirements not possible today.

Info on and insights into the Deluxe and AWS collaboration will be available at the AWS booth at NAB located in the South Hall Upper--SU2202--of the Las Vegas Convention Center.

  • Sunday, Apr. 7, 2019
Avid introduces all-new Media Composer
Avid Media Composer 2019
LAS VEGAS -- 

Media Composer®, Avid’s flagship video editing system, has been redesigned and reimagined. Unveiled this weekend at Connect 2019, Avid Customer Association’s gathering of media and entertainment users, the all-new Media Composer 2019 will be in the spotlight starting Monday, April 8 at the NAB Show in Avid’s booth (#SU801).

With Media Composer 2019, aspiring and professional editors, freelancers and journalists will be inspired to work more creatively by taking advantage of a new user experience, a next generation Avid media engine with distributed processing, finishing and delivering capabilities, a customizable role-based user interface for large teams, and so much more.

“After receiving input from hundreds of editors and teams across the media industry, and knowing where the industry is headed, we reimagined Media Composer, the product that created the nonlinear video editing category and remains the gold standard,” said Jeff Rosica, CEO and president at Avid. “Media Composer 2019 is both evolutionary and revolutionary. It maintains what longtime users know and love while giving them more of what they need today--and what they will need tomorrow.”

Media Composer 2019
With Media Composer 2019, an editor can go from first cut to delivery without ever leaving the application. Prime features include:

  • New User Experience – makers can work at the speed of creativity with a paneled interface that reduces clutter, reimagined bins to find media faster, and task-based workspaces showing only what the user wants and needs to see.
  • Next Generation Avid Media Engine – puts more power at a user’s fingertips with features, such as native OP1A, support for more video and audio streams, Live Timeline and background rendering, and a distributed processing add-on option to shorten turnaround times and speed up postproduction.
  • New Finishing and Delivery Workflows – Now, users can create and deliver higher-quality content with editing, effects, color, audio, and finishing tools without leaving Media Composer. Whether working in 8K, 16K, or HDR, Media Composer’s new built-in 32-bit full float color pipeline can handle it. Additionally, Avid has been working with OTT content providers to help establish future industry standards.
  • Customizable Toolset – built for large production teams, the new Media Composer | Enterprise provides administrative control to customize the interface for any role in the organization, whether the user is a craft editor, assistant, logger or journalist. It also offers unparalleled security to lock down content, reducing the chances of unauthorized leaks of sensitive media. 

Media Composer | Enterprise 2019
The Media Composer family adds Media Composer | Enterprise for postproduction, broadcast, media education and other larger production teams. Media Composer | Enterprise is billed as being the industry’s first role-specific video editing and finishing solution. Large production teams now have the ability to customize the interface and tailor workspaces for different job roles, providing end users access only to the tools and functions they need. This capability gives teams better focus so they can complete jobs faster and with fewer mistakes. Media Composer | Enterprise also integrates with Editorial Management 2019 to deliver collaborative workflow innovation for postproduction and enables creative teams to stay in sync.

Media Composer | Distributed Processing
Avid also announced Media Composer | Distributed Processing, an add-on option that shortens turnaround times and accelerates postproduction by sharing the media processing load. Tasks that previously took hours can now be done in minutes, strengthening post facilities’ competitive edge while delivering high-quality programming. Media Composer | Distributed Processing also offloads complex processing tasks when working in today’s emerging high resolution and HDR media-rich worlds.

Media Composer 2019 will be available in late spring for all of its models: Media Composer | First, Media Composer, Media Composer | Ultimate and Media Composer | Enterprise. 

  • Thursday, Apr. 4, 2019
Autodesk to showcase Flame 2020 at NAB
Flame 2020's Refraction feature on display
SAN RAFAEL, Calif. -- 

Autodesk has announced Flame® 2020, the latest release of the Flame Family of integrated visual effects (VFX), color grading, look development and finishing systems for artists. A new machine learning-powered feature set along with a host of new capabilities bring Flame artists significant creative flexibility and performance boosts. This latest update will be showcased at the NAB Show in Las Vegas, April 8-11 between 9am-5pm in a demo suite at the Renaissance Hotel. 

Advancements in computer vision, photogrammetry and machine learning have made it possible to extract motion vectors, Z depth and 3D normals based on software analysis of digital stills or image sequences. The Flame 2020 release adds built-in machine learning analysis algorithms to isolate and modify common objects in moving footage, dramatically accelerating VFX and compositing workflows.

“Machine learning has enormous potential for content creators, particularly in the areas of compositing and image manipulation where AI can be used to track and isolate objects in a scene to pull rough mattes quickly,” said Steve McNeill, director of Flame Family Products, Autodesk, Media and Entertainment. “Flame has a reputation as the de facto finishing system of choice in the deadline driven world of professional production, and this latest 2020 release significantly extends creative flexibility and performance for our artists.”

Flame® Family 2020 highlights include: 

Creative Tools

  • Z Depth Map Generator— Enables Z depth map extraction analysis using machine learning for live action scene depth reclamation. This allows artists doing color grading or look development to quickly analyze a shot and apply effects accurately based on distance from camera.
  • Human Face Normal Map Generator— Since all human faces have common recognizable features (relative distance between eyes, nose, location of mouth), machine learning algorithms can be trained to find these patterns. This tool can be used to simplify accurate color adjustment, relighting and digital cosmetic/beauty retouching.
  • Refraction— With this feature, a 3D object can now refract, distorting background objects based on its surface material characteristics. To achieve convincing transparency through glass, ice, windshields and more, the index of refraction can be set to an accurate approximation of real-world material light refraction.
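The index of refraction (IOR) the Refraction feature exposes is standard optics: Snell's law determines how much a ray bends when it crosses a material boundary. The sketch below is a reminder of that physics, not Autodesk's implementation; typical IOR values are roughly 1.33 for water and 1.5 for glass:

```python
# Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
# Returns the transmitted ray's angle from the surface normal, or None
# when total internal reflection occurs (no transmitted ray).
import math

def refract_angle(theta_in_deg, n1=1.0, n2=1.5):
    """Angle (degrees) of the refracted ray; None on total internal reflection."""
    s = (n1 / n2) * math.sin(math.radians(theta_in_deg))
    if abs(s) > 1.0:
        return None  # e.g. a shallow ray inside glass bouncing back
    return math.degrees(math.asin(s))

# Light entering glass (air -> IOR 1.5) bends toward the surface normal:
print(refract_angle(45.0, n1=1.0, n2=1.5))
```

Setting a 3D object's IOR to the real material's value is what makes glass, ice, or a windshield read as convincing in a comp.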

Productivity

  • Automatic Background Reactor— Immediately after modifying a shot, this mode is triggered, sending jobs to process. Accelerated, automated background rendering allows Flame artists to keep projects moving using GPU and system capacity to its fullest. This feature is available on Linux only, and can function on a single GPU.
  • Simpler UX in Core Areas— A new expanded full-width UX layout for MasterGrade, Image surface, and several Map user interfaces is now available, allowing for easier discoverability of and access to key tools.
  • “Manager” for Action, Image, Gmask—A simplified list schematic view, Manager makes it easier to add, organize and adjust video layers and objects in the 3D environment. 
  • Open FX Support—Flame, Flare and Flame Assist 2020 now include comprehensive support for industry-standard Open FX creative plugins as Batch/BFX nodes or on the Flame timeline.
  • Cryptomatte Support—Available in Flame and Flare, support for the Cryptomatte open source advanced rendering technique offers a new way to pack alpha channels for every object in a 3D rendered scene. 

Licensing

  • Single user license offering—Linux customers can now opt for monthly, yearly and three-year single user licensing options. Customers with an existing Mac-only single user license can transfer their license to run Flame on Linux.

Availability
Flame®, Flare™, Flame® Assist and Lustre 2020 will be available on April 16, 2019 at no additional cost to customers with a current Flame Family 2019 subscription. 

Early Flame 2020 adopters on beta have responded with enthusiasm to the latest updates. Flame artist Bryan Bayley of Treehouse Edit said, “Machine learning can be used to automate a lot of processes, letting artists focus on more creative tasks. Z Depth Map Generator is a great tool for making depth-of-field adjustments but it’s also a really useful tool for speeding up selective color correction and beauty clean up too.”

VFX supervisor Craig Russo of cr2creative added, “Arthur C. Clarke once said ‘Any sufficiently advanced technology is indistinguishable from magic’. The machine learning technology inside of Flame is truly magic. I recently worked on a virtual set comp where they forgot to add depth-of-field on the backgrounds. It took me two hours per shot to roto and add motion blur; I ran the same shots through the Z Depth Map Generator and got results in two seconds.” 

And freelance London-based Flame artist Lewis Sanders said, “Loads of people have talked about machine learning for compositing work, but nobody has delivered anything in an actual product. This is really impressively fast compared to the object labelling/mask approach to getting rough mattes quickly.”

  • Tuesday, Apr. 2, 2019
Matthew Libatique, ASC Set For ICG Talk At NAB
Cinematographer Matthew Libatique (l) and director/co-writer/producer Bradley Cooper on the set of "A Star Is Born" (photo by Clay Enos/courtesy of Warner Bros. Pictures)
LOS ANGELES -- 

The International Cinematographers Guild (ICG, IATSE Local 600) is programming two events at the upcoming NAB Show at the Las Vegas Convention Center, and many ICG members will appear throughout the event on more than 15 panels and sessions covering a wide range of industry challenges.
 
Matthew Libatique, ASC (A Star is Born, Iron Man, Black Swan) will talk about his varied and innovative body of work as part of the Creative Master series in the South Hall #S222/S223 on Monday, April 8 at 2:15 pm - 3:05 pm. The two-time Oscar® nominee will explain his approach to lighting, color and a musical feel to camera movement that enhances the story and maintains narrative flow.
 
The following morning at 11:30 am, the ICG, along with the American Society of Cinematographers, will host the Birds of a Feather gathering “The Cinematographic Imaging Process: Where Does it Begin and End?” The session will bring together camera, previs, post, and VFX professionals to look at how to improve cross-departmental collaboration as well as to shine a light on the role of the cinematographer within this process. There will be featured speakers – Kees van Oostrum, ASC, president, ASC; Ryan McCoy, senior previs/postvis supervisor at Halon; Jim Berney, VFX supervisor; and Andrea Chlebak, senior colorist, Deluxe Group – as well as open discussion among all attendees. ICG production technology specialist Michael Chambliss will moderate.
 
“The Cinematographic Imaging Process: Where Does it Begin and End?” will take place April 9, 11:30 am – 12:30 pm, North Hall Upper Meeting Room N243.
 
Following is a working list of the various panels and sessions ICG members will be appearing in throughout the event: (Final schedule subject to change)
 
SATURDAY, APRIL 6, 2019
 
Skynet or Bust: How Machine Learning Can Serve Film Making – 11:00 am – 11:45 am

Location: S222/S223
Andrew Shulkind (Director of Photography, panelist)

SUNDAY, APRIL 7, 2019
 
Our Digital Selves in a Post-Reality Era – 9:55 am – 10:30 am

Location: S222/S223
Andrew Shulkind (Director of Photography, panelist)

What Comes After Movies – Is That All There Is? – 11:25 am – 11:55 am
Location: S222/S223
Andrew Shulkind (Director of Photography, panelist)

A Global View: How Diverse Crews are Making an Impact – 2:00 pm – 3:15 pm
Location: S222/S223
Robert Arnold (Camera Operator, panelist)
Kira Kelly (Director of Photography, panelist)

MONDAY, APRIL 8, 2019
 
Sculpting Images On-Set: The Cinematographer/DIT Relationship - 12:30 to 1:30 PM

Location: B&H - The Studio C10415
Rafel Montoya (DIT)
Michael Chambliss (ICG Production Technology Specialist)
 
Content Creation & Coverage in Today’s Evolving Industry – 1:15 pm – 2:05 pm
Location: Main Stage, North Hall
Sheila Smith (Camera Operator, panelist)

Matthew Libatique, ASC: Close-up -  2:15 pm - 3:05 pm
Location: S222/S223
Matthew Libatique, ASC (Director of Photography)
David Geffner (ICG Magazine, interviewer)
 
Shooting Space Exploration, From Launch to Landing – 2:20 pm
Location: N2936
Jillian Arnold (2nd AC, panelist)

Women on the Move (hosted/sponsored by Women in Media) – 2:30 pm
Location: N2936
Shanele Alvarez (Camera Operator, panelist)
Crystal Kelley (Camera Operator, panelist)
 
New Digital Workflow – 3:10 pm

Location: N2936
Jane Fleck (DIT, panelist)
 
TUESDAY, APRIL 9, 2019
 
#GALSNGEAR on NAB SHOW LIVE! Grand Lobby – 8:30 am – 10:00 am

Location: TBD
Sheila Smith (Camera Operator, panelist)
 
Birds of a Feather - The Cinematographic Imaging Process: Where Does it Begin and End? 11:30 AM – 12:30 PM
Location: North Hall Meeting Rooms, N243, Las Vegas Convention Center
Kees van Oostrum, ASC, (President, ASC)
Ryan McCoy, (senior previs/postvis supervisor, Halon)
Jim Berney, (VFX supervisor)
Andrea Chlebak, (senior colorist, Deluxe Group)
Michael Chambliss (ICG Production Technology Specialist)
 
Birds of a Feather - Solve your Grievances at the Prod/Post Festivus - 3:30 pm - 4:30 pm  
Location: North Hall Meeting Rooms, 243
Hosting Organization:   DIT-WIT
Dana Gonzalez, ASC (Director of Photography, panelist)
Chris Cavanaugh (DIT, panelist)
Michael Romano (DIT, panelist)

Join the Society of Camera Operators – Production Tips for Camera Operating – 5:00 pm – 6:30 pm
Location: S230
Eric Fletcher, SOC (Camera Operator, panelist)
David Sammons, SOC (Camera Operator, panelist)
Bill McClelland, SOC (Camera Operator, panelist)

WEDNESDAY, APRIL 10, 2019
 
Infinite Realities & Stunning Screens: The Cinematographer’s Expanding Role  - 11:00 am - 11:30 am.

Location: Adorama Booth, C4446
Steven Poster, ASC, (President, ICG)
Sheila Smith (Director of Photography)
Michael Chambliss (ICG Production Technology Specialist)
 
ASC 100th Anniversary: Full Circle – Past, Present and Future of Cinematography – 11:30 am – 12:30 pm
Location: Main Stage, North Hall
Bill Bennett, ASC (Director of Photography, panelist)
Sam Nicholson, ASC (Director of Photography, panelist)
David Stump, ASC (Director of Photography, panelist)
