  • Thursday, Mar. 29, 2018
Grass Valley, Calrec, Net Insight to discuss at-home production during NAB session
Grass Valley's Klaus Weber

Tasked with producing more live coverage with limited resources, broadcasters are turning to at-home/REMI (remote-integration model) production. At-home production reduces the movement of people and equipment, helps maximize their efficiency and utilization, and shortens on-site set-up times. During the Broadcast Engineering and Information Technology Conference (BEITC) at the 2018 NAB Show, Klaus Weber of Grass Valley, Peter Walker of Calrec Audio and Larissa Görner of Net Insight will join forces to speak about the major challenges of at-home production and how they can be solved. The session, titled “Live At-Home Production 2.0,” will take place on April 7 from 1:30-2:50 p.m. (PT) in the North Hall Meeting Room (N260).
The presentation provides a forum for each company to bring its specific industry experience to the discussion: Weber will speak about video and camera transmission, Walker about audio production, and Görner about signal transport. Together, they represent the main components of a remote production and will explain how each company’s state-of-the-art solutions provide better video, audio and transport workflows for at-home production. By combining complementary technologies, broadcasters gain a complete, proven and easy way to generate significantly more live content.
The advantages of remote production are widely acknowledged across the industry, but what has been lacking is a focused effort to bring the at-home workflow to the next level with innovative audio-visual, networking and transport solutions. Grass Valley, Calrec Audio and Net Insight will explain this new workflow, and address the remaining challenges of remote production, in the At-Home Production 2.0 session.

All three companies are also holding live at-home production demos on the show floor at NAB 2018. The demos will take place three times a day, with a production hub at the Grass Valley booth (SL106) providing live mixing for remote venues at the Net Insight booth (SU3821) and the Calrec booth (C7408).

  • Thursday, Mar. 29, 2018
Canon introduces its first full-frame cinema camera, the EOS C700 FF
Canon's EOS C700 FF

Canon U.S.A. Inc. has announced the EOS C700 FF, the company’s first full-frame cinema camera. The beauty and majesty of full-frame digital cinema is becoming a new creative reality. Since the introduction of the EOS 5D Mark II DSLR camera in 2008, Canon has been part of the full-frame video movement, and the introduction of the C700 FF reinforces Canon’s commitment to this market. At the heart of the camera is a newly developed Canon CMOS image sensor with a total of 5952 (H) x 3140 (V) photosites and a digital cinema 17:9 aspect ratio, giving it the same image circle size as the full-frame EOS 5D camera series and supporting a wide range of shooting options.

Available in both PL and EF Mount, the EOS C700 FF provides users with the same outstanding performance, operation and modular design as the EOS C700 (released in December 2016). The camera is being shown publicly for the first time at the Canon booth (C4325) at the NAB Show 2018 in Las Vegas from April 9-12.

“Since the launch of Canon’s Cinema EOS line of products in November 2011, the goal was to one day develop a cinema camera worthy of being the ‘A’ camera on major Hollywood productions, and Canon met that goal with the introduction of the EOS C700,” said Yuichi Ishizuka, president and COO, Canon U.S.A., Inc. “After listening to our customers and closely monitoring market trends, Canon set forth a new goal: to launch a full-frame cinema camera. With this introduction, we are very excited to see the C700 FF in the hands of industry professionals as they shoot their latest projects.”

Existing owners of Canon’s original EOS C700 cinema camera will be pleased to know they can have their Super 35mm sensor upgraded to the new full-frame sensor for a fee*. Authorized Canon facilities such as Canon Burbank are ready to process C700 upgrades and lens mount swaps, and offer equipment drop-off, on-site repairs and upgrades, and equipment testing and demonstration.

The Sensor
The newly developed sensor featured in the EOS C700 FF has an active image area of 38.1 x 20.1mm and supports readout at full size as well as in Super 35mm, Super 16mm and anamorphic modes. In addition to full-frame lenses, it can be used with conventional Super 35mm lenses to originate standardized 4K/UHD production formats, and with Super 16mm lenses (via an adapter) to originate 2K/HD production formats in crop modes. The sensor captures wide tonality exceeding 15 stops of dynamic range and a wide color gamut meeting the ITU-R BT.2020 standard, offering broad latitude in grading and outstanding effectiveness in HDR video production.

The EOS C700 FF offers a choice of two high-performance codecs for on-board recording: Canon XF-AVC or Apple ProRes. Like other cameras in the 4K Cinema EOS family, the EOS C700 FF uses CFast cards to capture 4K/UHD or 2K/HD. A striking feature of the C700 FF is its Oversampling 4K Processing, which processes the 5.9K image capture to produce 4K (DCI or UHD) with enhanced image sharpness, reduced moiré and lower visible noise at higher ISO settings. This is especially advantageous for on-board anamorphic image capture. Low-bit-rate 2K/HD proxy data, including metadata, can be recorded to SD cards, ideal for offline editing. The camera also supports high-frame-rate recording at up to 168 fps in 2K crop mode and Full HD, as well as relay or simultaneous recording onto both CFast cards. Additional formats are planned in future firmware updates.

To further complement the features of the EOS C700 FF, Canon has turned to its trusted partner Codex to provide a fully integrated (no cables) recording and workflow option. The combination of the optional Codex CDX-36150 recorder docked onto the back of the EOS C700 FF enables 5.9K 60 fps RAW recording, 4K RAW up to 72 fps (in 24p mode), 4K ProRes up to 60 fps and 2K ProRes up to 168 fps (in Super 16mm mode).

The C700 FF also supports the latest version (1.0) of ACESproxy, the transmission standard of the ACES (Academy Color Encoding System) color management system.

For users looking to create High Dynamic Range (HDR) imagery, the EOS C700 FF is an excellent solution, providing 15 stops of latitude (with Canon Log2 only), along with Canon’s proprietary Log gammas (Canon Log3, Canon Log2 and Canon Log) and renowned color science. Canon Log2 is recommended when originating HDR imagery containing both highlight detail and deep shadow detail. Compared with Canon Log, Canon Log3 offers a wider dynamic range while retaining performance in darker regions.

Additionally, these cameras seamlessly integrate with Canon’s latest professional 4K UHD reference displays for on-set review and color management that conforms to the SMPTE ST 2084 HDR display standard.

The look of a cinematic production begins with the lens, and the EOS C700 FF offers both PL and EF lens mount options which are interchangeable at a Canon authorized service center. For full frame imaging, the EF lens mount version of the new EOS C700 FF is compatible with Canon’s family of seven Cinema Prime lenses, including the newly announced CN-E20mm T1.5 L F lens, as well as the diverse lineup of over 70 interchangeable EF lenses. The EF mount supports Canon’s Dual Pixel CMOS AF technology and Dual Pixel Focus Guide. The Focus Guide assists operators with a precision visual indicator in the viewfinder when pulling focus. Alternatively, for certain demanding shooting situations the reliable capabilities of Dual Pixel CMOS AF can be deployed. The EOS C700 FF PL mount version is also compatible with Cooke’s /i metadata communication technology.

The EOS C700 FF EF and EOS C700 FF PL are scheduled to be available in July 2018 for an estimated retail price of $33,000.

  • Tuesday, Mar. 27, 2018
RED introduces GEMINI 5K S35 sensor
RED EPIC-W with GEMINI 5K S35 sensor
IRVINE, Calif. -- 

RED Digital Cinema® has introduced the new GEMINI™ 5K S35 sensor for its RED EPIC-W® camera. The GEMINI 5K S35 sensor leverages dual sensitivity modes to provide greater flexibility for a variety of shooting environments.  Whether creators choose to shoot in standard mode for well-lit conditions or low light mode for darker environments, the RED EPIC-W with GEMINI 5K S35 sensor delivers incredible dynamic range and produces cinematic quality images.

The GEMINI 5K S35 sensor delivers exceptional low-light performance, allowing for cleaner imagery with less noise and better shadow detail. Operators can easily switch between modes through the camera’s on-screen menu with no downtime. The GEMINI 5K S35 sensor offers an increased field of view at 2K and 4K resolutions compared to the higher-resolution HELIUM® sensor. In addition, the sensor’s 30.72 mm x 18 mm dimensions allow for greater anamorphic lens coverage than the HELIUM or RED DRAGON® sensors.

“While the GEMINI sensor was developed for low-light conditions in outer space, we quickly saw there was so much more to this sensor,” said Jarred Land, president of RED Digital Cinema. “In fact, we loved the potential of this sensor so much, we wanted to evolve it to make it have a broader appeal.  As a result, the EPIC-W GEMINI now sports dual-sensitivity modes. It still has the low-light performance mode, but also has a default, standard mode that allows you to shoot in brighter conditions.”

Built on the compact DSMC2® form factor, this new camera and sensor combination captures 5K full-format motion at up to 96 fps, along with incredibly fast data speeds of up to 275 MB/s. Additionally, it supports RED’s IPP2 enhanced image processing pipeline in camera. Like all of RED’s DSMC2 cameras, the EPIC-W can record REDCODE® RAW simultaneously with Apple ProRes or Avid DNxHD/HR, and it adheres to RED’s dedication to OBSOLESCENCE OBSOLETE®--a core operating principle that allows current RED owners to upgrade their technology as innovations are unveiled and to move between camera systems without having to purchase all-new gear.

Beginning at $24,500, the new RED EPIC-W with GEMINI 5K S35 sensor is available for purchase globally at select RED authorized dealers. Alternatively, WEAPON Carbon Fiber and RED EPIC-W 8K customers will have the option to upgrade to the GEMINI sensor at a later date.

  • Monday, Mar. 26, 2018
International Future Computing Association launches at Game Developers Conference
Neil Schneider, executive director of TIFCA

At a special launch event during the Game Developers Conference, sponsored by Intel, market leaders announced The International Future Computing Association (TIFCA), formed from the merger of the Open Gaming Alliance and the Immersive Technology Alliance. TIFCA’s mission is to lay the groundwork for, and enhance the viability of, building what’s next. TIFCA will do this through several key initiatives, including influential membership meetings, stakeholder education, international market-building events, and special initiatives.

Future computing refers to turning dreams into reality with computer technology and media. This requires a balance of three pillars: computer processing and peripherals, immersive technology, and innovative content and applications. Goals include finding new uses and audiences for computer horsepower, connecting bleeding-edge vendors with revenue-generating clients, and giving influential industry players and content makers the tools and skills needed to benefit from present and future innovations.
TIFCA is divided into three specialized groups:

  • The Computing Technologists Group (CTG) grows the computer horsepower and peripheral ecosystems and is the backbone of turning big dreams into fruitful realities.  CTG is chaired by Frank Soqui, VP and general manager of Virtual Reality for Intel Corp. Activities are geared towards technology makers and OEMs working with PC, cloud, console, and mobile devices. “The evolution of computing and its role in technology innovation requires a strong ecosystem and a vision for the future. Intel has been instrumental in driving nearly 50 years of computing innovation and we look forward to collaborating with TIFCA and our partners to define what’s next,” said Soqui.
  • The Immersive Technology Alliance (ITA) enhances the adoption and viability of immersive technologies like virtual reality, augmented reality, mixed reality, stereoscopic 3D, and more.  The ITA is chaired by Daryl Sartain, director and worldwide head of VR, AI/ML, Displays, TV & Music Ecosystems at Advanced Micro Devices. Participants include device makers, IP makers, immersive content creators and toolsets, as well as academia. “AMD has been a member of TIFCA’s base for three years. We support them because they give us the data we need to make informed decisions and everything they do is focused on improving the bottom line of their membership,” said Sartain.
  • The Alliance of Content Creators (ACC) advances generations of innovative content and applications. The ACC is chaired by Wanda Meloni, CEO of M2 Insights. Types of content that are represented include video games, entertainment, enterprise, education, healthcare, and more.  Participants include content creators and content creation toolmakers. “The Open Gaming Alliance has worked with the Immersive Technology Alliance for many years. Having the OGA and ITA join forces to create this new trade organization is extremely exciting given the market opportunities for inspired future computing applications and professionals. It is the natural progression, and the potential has been clear to us for some time, with this bringing it all now full circle,” said Meloni.

“When groups of visionary companies and innovators get together like this, bright technological futures become impending realities. It’s an honor to help build these realities,” said Neil Schneider, executive director of TIFCA. Schneider has served as executive director of The Immersive Technology Alliance since 2009 and is the founder of the Meant to be Seen community, where several leading innovations got their start.
TIFCA membership meetings will begin in April.

  • Monday, Mar. 26, 2018
David Jorba joins TVU Networks as European managing director
David Jorba

David Jorba, a 20-plus year broadcast industry veteran, has joined TVU Networks, a global technology and innovation company known for live IP video solutions, as executive VP, managing director of Europe. In his new role, Jorba will lead TVU’s European operations from its regional base in Barcelona and provide a local presence in all major European markets.

Jorba previously held a variety of senior positions during 16 years at Vizrt, a broadcast and digital media content solutions company, most recently as president and managing director of Vizrt Americas. He has also served as a board advisor to Barcelona-based digital marketing agency TMT Factory, as well as to Polygon Labs, a New York-based interactive data applications and visual branding company.

Paul Shen, CEO of TVU Networks, said, “As part of our commitment to TVU’s customers in the European market, we’ve built a significant presence to better accommodate our growing regional customer base. We could not have selected a better senior executive to lead our European operations. David brings years of industry expertise to TVU, having held senior leadership roles in the broadcast industry prior to joining TVU. David has successfully built and run business operations in Europe and the U.S. His expertise will help tremendously in accelerating our European efforts.”

TVU Networks has over 2,500 customers in more than 85 countries. The TVU Networks family of IP transmission and live production solutions gives broadcasters and organizations a powerful and reliable workflow to distribute live video content to broadcast, online and mobile platforms. TVU has become a critical part of the operations of many major media companies. The TVU Networks suite of solutions has been used to acquire, transmit, produce, manage and distribute professional-quality live IP HD footage as an integral part of news, sports and major global events.

As part of its enhanced support for the European region, on April 2 TVU is expanding its regional headquarters and moving to the heart of Barcelona’s innovation district, known as 22@Barcelona. Tripling its office space has enabled TVU to build and launch a new logistics platform to better support broadcasters’ live coverage requests and to grow its team of broadcast engineering experts focused on helping clients transition their media workflows to IP.

Jorba said, “TVU is a company that has established itself as a leader in IP workflows and highly innovative production solutions. I’m excited to be part of a company that has the products, ideas and talent needed to build up the most cost-effective, flexible and reliable live production platform of the future.”

  • Friday, Mar. 23, 2018
Baselight 5 evolves with features for HDR, VFX and 360 VR
FilmLight's Baselight 5

“Baselight 5 feels like brand-new software, improving on foundation features that merge a whole new level of incredible technical precision and control with fantastic tools.” That’s the view of Matt Watson, DI colorist at SHED London. And there will be even more for colorists and creatives to enthuse over at NAB2018 (Las Vegas Convention Center, April 9-12, booth SL4310), where FilmLight will be demonstrating the new functionality recently added to its grading platform.

Central to Baselight 5 is the Base Grade, used extensively on recent blockbuster movies and TV series. This has proven to be an invaluable productivity tool for many. “We’ve used Baselight 5 for several features, including the Netflix Original Mercury 13,” said Matthew Troughton, sr. colorist and head of picture technology at Creativity Media in London. “Base Grade has become a go-to feature for me. I can retain overall contrast while easily being able to shake out detail in the bottom end, a task that previously required keys. It’s quick and easy, and means I can spend more time creating.”

As HDR technology evolves, Baselight colorists increasingly work with HDR acquisition and delivery for movies and premium TV projects, like the recent Black Panther or BBC’s Blue Planet II. Base Grade already takes the sting out of moving between different delivery formats, and between SDR and HDR, but FilmLight has developed more revolutionary tools that give colorists the right controls to ease the transition: Boost Range and an HDR-ready Looks operator.

Boost Range expands the dynamic range of an image when converting from SDR to HDR, using a local tone mapping approach. This yields more natural-looking images with more faithful contrast reproduction. The algorithm also eliminates noise issues in the extended highlights, making the whole ‘up-conversion’ more robust while achieving greater dynamic range.

The Looks operator has been upgraded for HDR to bring preferred color rendition to digital image pipelines. The new HDR options add artistic tweaks to the pipeline to produce a cinematic look, while maintaining the highest possible image quality across all deliverables, from SDR to HDR.

“The new looks are extremely helpful,” said Fernando Rodrigues, sr. colorist at Filmmore in Amsterdam. “I was amazed by how easily I could re-obtain details that, at first glance, seemed to be clipped or crushed. The best thing is that they give a nice ‘thickness’ to the image, something a lot of my clients are looking for. The effect is very film-like.”

Munich-based DP, Matthias Fleischer, agreed. “The Looks feature produces very cinematic roll-offs, and it creates lovely skin tones. When working with dailies, it actually feels a lot like dealing with scanned film footage.”

New functionality to be introduced at NAB includes support for 360/VR, so that virtually any tool in the Baselight arsenal--including secondaries, Paint and Perspective--can be used to grade 360 footage by operating in a ‘sandwich’ of two Panorama operators, which convert from LatLong (or equirectangular) projections to normal projection and back. LatLong projections are also supported in Baselight’s powerful format system, and LatLong images with traditional perspective can be viewed via a new toggle in the Cursor View.

The Relight tool has been boosted too, with support for area lights so you can add and adjust specular highlights to make more realistic lighting changes. Relight allows you to control the location of lights and specular highlights precisely by directly clicking in the image viewer.

The initial release of Baselight 5 provided many tools for VFX, including Grid Warp and the Perspective Tracker. “Perspective tracking went into immediate use and has proven extremely useful,” added Troughton. “It cuts to a fraction the work often associated with manually compensating for, and working around, perspective windowing issues.”

In the latest version, the completely rewritten Text tool further reduces the workload on projects where Baselight is used for finishing, while allowing for greater creativity. It brings Shape-like transform operations to text, so text objects can be manipulated with tools such as Perspective Transform, or tracked throughout a shot.

There is a host of other new features too, designed to enhance efficiency and improve workflows. These include: strip locking to prevent accidental deletion; selecting strips by operator; searching for buttons and actions in Chalk; file trimming for R3D, Phantom, Sony XAVC and Sony RAW MXF files; a new Client View that shows the current frame and metadata; the ability to include the Shots Layout on the main image monitor, allowing a scrolling list of thumbnails to appear on the main display; and improvements to the Baselight keyers for more precise operation in wide-gamut scene-referred color spaces.

The Baselight 5 software platform spans FilmLight’s entire color range. The full range of FilmLight systems will be demonstrated at booth SL4310 at NAB2018, including Daylight dailies and media management, Prelight on-set pre-visualisation, Baselight Editions and a number of Baselight configurations.

  • Thursday, Mar. 22, 2018
Surf's Up For RED Digital Cinema, FUJINON
RED Digital Cinema's WEAPON outfitted with a FUJINON lens

Considered one of the premier surfing competitions, the Volcom Pipe Pro is the inaugural event of the World Surf League (WSL) season, bringing together the world’s top pipeline surfers.

Live footage from the event was captured on RED Digital Cinema’s WEAPON with MONSTRO 8K VV sensor equipped with FUJINON Premier PL 25-300mm and 19-90mm Cabrio zoom lenses for Red Bull TV, in addition to Hawaii and Australia cable channels. The event also served as a real-world demo day with shooters from Red Bull, the WSL, and Uncle Toads Media Group trying out the new camera sensor and lens combination.

Local and visiting video professionals also received hands-on time with the RED camera and FUJINON zooms during an open house at Hawaii Camera. This year’s demo day attracted more than 75 attendees.

This marks the third year that RED and the Optical Devices Division of FUJIFILM have collaborated during the Volcom Pipe Pro.

  • Thursday, Mar. 22, 2018
Feature tech meets streaming on "Altered Carbon"
"Altered Carbon" (photo courtesy of Netflix)
HOLLYWOOD, Calif. (AP) -- 

Altered Carbon (Netflix), produced by Skydance Television, envisions a grim future where technology has enabled humans to transfer their consciousness from one body to another--effectively giving eternal life to a privileged few as they move through a steady stream of cloned host bodies. The high-concept series is also pushing the boundaries of traditional TV technology.

Altered Carbon is one of the first episodic shows to be shot on the ARRI Alexa 65 and finished in Dolby Vision HDR. Deluxe’s global postproduction teams worked closely with Skydance Television’s production team to develop a highly collaborative and forward-thinking workflow to handle the massive file sizes from the Alexa 65--bringing feature-quality visuals to the small screen.

“We were in a unique position from the beginning because our DPs Martin Ahlgren and Neville Kidd loved the look of the Alexa 65, which at that point had only been used to shoot features,” said Dieter Ismagil, VP of post production at Skydance Television. “I’ve worked with Deluxe on TV projects for over a decade, so right away I went to them to brainstorm on how to handle these intimidating file sizes--how do you manage the dailies? How do you do VFX pulls? How do you get everything done on time and control costs? There was a lot to figure out initially just to make sure we could pull this off.”

Deluxe teams at Encore in Vancouver and Hollywood, EFILM, and Company 3 tackled these workflow challenges with a number of approaches, including their proprietary service, Synapse Portal. This service simplifies ingest and automates distribution of original camera plates to VFX vendors with the desired specs, dramatically reducing turnaround times and eliminating the potential for human error. 

Traditionally used for features, Synapse Portal for TV was first deployed by Encore and Company 3 on 2017’s American Gods, to great success. Knowing the speed and flexibility of the Synapse Portal would be crucial for the VFX schedule, Deluxe worked with production to develop a workflow to keep costs down using downrezzed 4K ProRes XQ files for mastering the show rather than the massive ARRI RAW files. 

Encore Vancouver handled dailies, EFILM handled VFX data management, Company 3’s Jill Bogdanowicz colored the series, and Encore Hollywood managed the finish and final delivery--all in close collaboration with Altered Carbon post producer Allen Marshall Palmer and the DPs.

  • Thursday, Mar. 22, 2018
SGO releases new version of Mistika VR
MADRID, Spain -- 

SGO has released a new version of Mistika VR, the fastest stitching software on the market, bringing more than twenty new features to its users. These contribute to a significantly more efficient workflow, improving performance and quality and accelerating collaboration across multiple computers.

The latest release is already available at no additional cost to all existing Mistika VR customers.

Highlights of the new release include: 

  • Improved workflow. Mistika VR introduces improvements such as Media Relinking, Mass Import of Media, a zoomable visual editor timeline and more, notably enhancing the VR 360 post-production workflow.
  • Better performance and quality. Users will also find better alignment for coaxial rigs and a Forward Distance parameter that acts as an experimental alternative to Convergence.
  • Faster collaboration across multiple machines. Sharing projects between different operating systems is now possible through the Path Translator.
  • Upgraded 360º photography support. Mistika VR users can now specify whether imported images form a time sequence or each image is a separate camera view. The maximum number of cameras has also been increased, allowing larger still-photography sets.
  • Greater connectivity. The NVIDIA H.264/H.265 render now supports QP mode for constant quality, which is much more appropriate for post-production workflows. Another notable addition is the KanDao timelapse import tool, which helps reorder and rename KanDao sequences via drag and drop, simplifying the import and management process.

  • Tuesday, Mar. 20, 2018
YouTube's Neal Mohan to headline NAB Show Opening
Neal Mohan

Neal Mohan, chief product officer for YouTube, will headline the 2018 NAB Show Opening on Monday, April 9 at 9 a.m. in the North Hall of the Las Vegas Convention Center. Mohan will discuss the future of TV and entertainment and how the intersection of content, hardware and software is allowing media companies to reach more audiences in new ways.

The opening event will also feature NAB president and CEO Gordon Smith’s “State of the Broadcast Industry” address and the presentation of the NAB Distinguished Service Award to “Good Morning America” co-host Robin Roberts. The event is sponsored by Blackmagic Design.

Mohan is responsible for YouTube products and user experience on all platforms and devices globally, including the company’s core mobile applications, technologies like Live and VR, subscription services YouTube Red and YouTube TV, and vertical experiences such as YouTube Kids, Music and Gaming. Products for media partners, content creators and musicians are also part of his portfolio.

Prior to his current role, Mohan served as senior VP of display and video ads at Google. He headed advertising efforts on YouTube, the Google Display Network, AdSense, AdMob, and the DoubleClick family of programmatic ad platform products. Additionally, he helped elevate the overall digital media industry by building innovative solutions for millions of Google’s advertising and media partners around the world.

At NAB Show, Google will showcase its media solutions via demonstrations at booth SU218 and will participate in more than 30 talks during the course of the show, both in conference sessions and presentations at the Google booth.
