• Monday, Oct. 5, 2020
Microsoft plans $1 billion data center venture in Greece
Microsoft President Brad Smith, left, speaks with Greek Prime Minister Kyriakos Mitsotakis during a ceremony held at the Acropolis Museum in central Athens, with the ancient Parthenon temple in the background, on Monday, Oct. 5, 2020. Microsoft has announced plans to build three data centers in greater Athens, providing a badly needed investment of up to $1 billion to the Greek economy, which has been hammered by the pandemic. (AP Photo/Petros Giannakouris)
ATHENS, Greece (AP) -- 

Microsoft has announced plans to build three data center sites in greater Athens, providing a badly needed investment of up to $1 billion to the Greek economy, which has been hammered by the pandemic.

The news was announced Monday by the U.S. tech giant and Greek Prime Minister Kyriakos Mitsotakis, and follows nine months of confidential negotiations for an agreement that also includes digital-skills training programs for some 100,000 government and private sector workers, as well as educators and students.

"This significant investment is a reflection of our confidence in the Greek economy, in the Greek people and the Greek government," Microsoft President Brad Smith said at a ceremony held in the Acropolis Museum, facing the ancient site in central Athens. "It's not something we do often and it's not something that we do lightly."

Greece recently emerged from a years-long financial crisis, but its economy has been hard hit by the pandemic, suffering a 15.2% year-on-year drop in output in the second quarter. Unemployment climbed to 18.3% in June from 16.4% at the start of the year. 

According to budget figures submitted to parliament Monday, Greece's economy is expected to contract 8.2% this year due to the effects of the coronavirus pandemic, and grow by up to 7.5% next year.

The COVID-19 pandemic has exposed the country's heavy reliance on tourism. The Mitsotakis government says it wants to shift the balance of the Greek economy during its recovery, developing the energy, tech, and defense sectors, hoping to lure back tens of thousands of graduates who left during the crisis. 

"We are starting to create the conditions for their return," Mitsotakis said. "The creation of a data center upgrades a country as an investment destination ... Greece has the sun and now it's getting a cloud."

Microsoft currently has data centers in 26 countries, including seven in the European Union. The company, based in Redmond, Washington, is already working with the Greek government on an augmented reality project on Ancient Olympia, birthplace of the Olympic Games.

Microsoft officials said the timetable for the development of the data centers in Greece was still being worked out, but added that the process in other countries typically took about two years. Greece, the officials said, would be covered by Microsoft's pledge to run all its data centers worldwide on renewable energy sources by 2025. 

  • Tuesday, Sep. 29, 2020
Amazon sees its palm recognition tech in stadiums, offices
In this March 4, 2020 file photo, people walk out of an Amazon Go store, in Seattle. Amazon is rolling out a new device for contactless transactions that will scan an individual’s palm. The Amazon One, which will initially launch in two Amazon Go stores in Seattle, is being viewed as a way for people to use their palm to make everyday activities like paying at a store easier. (AP Photo/Ted S. Warren, File)
SEATTLE (AP) -- 

Amazon has introduced new palm recognition technology in a pair of Seattle stores and sees a broader potential audience in stadiums, offices and elsewhere. 

Customers at the stores near Amazon's campus in Washington can flash a palm for entry and to buy goods. 

The company chose palm recognition, according to Dilip Kumar, vice president of Physical Retail & Technology, because it's more private than other biometric technology, and a person would be required to purposefully flash a palm at the Amazon One device to engage. 

"And it's contactless, which we think customers will appreciate, especially in current times," Kumar wrote in a blog post Tuesday. 

The company expects to roll out Amazon One as an option in other Amazon stores in the coming months, which could mean Whole Foods Market grocery stores. But Amazon believes the technology is applicable elsewhere.

"In most retail environments, Amazon One could become an alternate payment or loyalty card option with a device at the checkout counter next to a traditional point of sale system," Kumar wrote. "Or, for entering a location like a stadium or badging into work, Amazon One could be part of an existing entry point to make accessing the location quicker and easier."

People can sign up for an Amazon One account with a mobile phone number and credit card. An Amazon account isn't necessary. 

  • Wednesday, Sep. 23, 2020
ARRI to introduce Signature Zoom lenses
ARRI's new Signature Zoom lenses
MUNICH -- 

Responding to market requests for high-quality zoom lenses to match and accompany the Signature Primes, ARRI has announced four new Signature Zooms designed for universal use with any large-format or Super 35 camera. All four lenses offer a fast stop of T2.8, along with what is billed as an unrivalled focal length range, absolute image consistency, HDR compatibility, and 8K resolving power.

The four Signature Zooms, with an extender for the longest of them, cover a focal length range of 16 mm to 510 mm, which ARRI says is the largest in the industry, and maintain stable exposure and image quality through all focal lengths and iris settings. 

The Signature Zooms carefully balance the requirements of a very high optical performance, pleasing look, and lightweight mechanical design. Like the Signature Primes, they render a beautifully warm image, with smooth, flattering skin tones, natural colors, and elegant out-of-focus highlights. The Signature look is unique because it is characterful, and yet is not based on visible aberrations or imperfections. Audiences are enveloped in the images and connected to the storytelling without interference.

There will always be instances where prime lenses are the first choice on set, but as a logical extension of the Signature family, the Signature Zooms increase the number of situations where the timesaving and practical benefits of zooms can be taken advantage of without appreciable compromises. They also share the detachable magnetic rear filter holder of the Signature Primes, allowing filmmakers to change and personalize the look of the entire Signature lens system.

In addition to the inherently faster on-set workflows that zoom lenses bring, specific features of the Signature Zooms speed things up still further. Flares are shaped and honed through the best available lens coatings, reducing delays; setups are made easier and quicker by the lightweight magnesium design; and the in-built ARRI LDS-2 Lens Data System simplifies complex tasks on set and in post. Truly exceptional close-focus distances increase creative flexibility, but also save time by reducing lens changes and save money by allowing crews to carry fewer lenses.

Fundamental to the design of the Signature Zooms is their suitability for shooting not just in large format (also known as full frame or VistaVision), but also in Super 35. While the high image quality and beautiful background separation enhance the almost three-dimensional feel of large-format cinematography, the 8K resolving power and stunning bokeh also ensure pristine, immersive images in the smaller Super 35 format. Equipped with an LPL lens mount, which is an open, universal standard for cross-format shooting, the Signature Zooms can be used with any large-format or Super 35 camera, from any manufacturer.

Recognizing that high-end zoom lenses are a significant investment, ARRI has drawn on all of its experience and expertise to future-proof the Signature Zooms. Aside from the fact that they can be used for all formats up to full frame, other attributes safeguard them for a future in which a lot of current zooms won’t make the grade. The fact that the look is not based on aberrations of any sort means easier HDR and UHD workflows, which tend to exaggerate the aberrations of other lenses and create problems in post. The 8K resolution and deep shadow detail also lend themselves to future display requirements, while ARRI’s build quality and support structure assure a long product life.

ARRI Signature Zooms will be especially useful to productions that want a high-end look but rely on the versatility and time-efficiency of zooms, such as TV series, commercials, music videos, and remote applications. Feature films now also have a high-quality zoom series to accompany and complement the Signature Primes, and productions that carry both will find that they are covered for every possible shooting situation.

The 45-135 mm T2.8 and 65-300 mm T2.8 Signature Zooms will be released during Q1 2021, along with a dedicated 1.7x extender for the 65-300 mm that makes it a 110-510 mm T4.9. The 16-32 mm T2.8 and 24-75 mm T2.8 will be released later in the same year.
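
Taken at face value, the extender figures follow the usual rule of thumb that a 1.7x extender multiplies both the focal length and the T-stop by its factor; the small gap between the computed T4.8 and the quoted T4.9 presumably reflects the extender's own transmission loss, which is an inference here rather than a figure stated by ARRI:

\[
65\,\text{mm} \times 1.7 = 110.5\,\text{mm} \approx 110\,\text{mm}, \qquad 300\,\text{mm} \times 1.7 = 510\,\text{mm}
\]
\[
T2.8 \times 1.7 \approx T4.8 \quad (\text{quoted as } T4.9)
\]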

  • Wednesday, Sep. 16, 2020
EIZO, Eluvio, Moxion, Carl Zeiss SBE named recipients of HPA Awards for Engineering Excellence
Joachim Zell, chair of the HPA Awards Engineering Committee
BURBANK, Calif. -- 

The Hollywood Professional Association (HPA) Awards Committee has unveiled the winners of the 2020 HPA Awards for Engineering Excellence.  The HPA Awards, including the HPA Award for Engineering Excellence, will be bestowed on November 19 in a virtual gala.

The judging process for this year’s entries was adapted to adhere to restrictions imposed upon the industry by the pandemic, while remaining in keeping with the rigorous standards for these highly regarded honors. As always, a blue-ribbon judging panel reviewed the submissions, with the process this year split into two sections: a video presentation sent in advance to the judges, and a “live” virtual interaction with the judges after the video submissions were reviewed.

This year marks the 15th anniversary of the HPA Awards, which were founded to recognize creative artistry and innovation in the professional media content industry. A coveted and competitive honor, the Engineering Excellence Award rewards outstanding technical and creative ingenuity in media, content production, finishing, distribution, and archive.

HPA Awards Engineering Committee chair Joachim Zell said, “It is always exciting to experience first-hand the innovative brilliance of our industry. This year, we had a great turnout of entries, and in keeping with the standards of the Engineering Excellence awards, an outstanding judging panel of industry professionals reviewed the submissions.  Without exception, every entry was compelling, and judging was as inspiring as ever. We are proud of our industry in these moments where we see that even a global crisis does not break our spirit or the dedication to incredible work. Congratulations to the winners, and all the entrants, we are impressed and inspired.”

The winners of the 2020 HPA Award for Engineering Excellence are:

EIZO: Prominence CG3146 31.1” HDR monitor
EIZO’s Prominence CG3146 31.1” HDR monitor is the first in the world with a built-in calibration sensor. New technologies were developed to make this possible, including advanced brightness and color uniformity correction that allows calibration to be performed at the top of the monitor, and a stable XYZ glass-type sensor that is resistant to temperature changes. New algorithms were needed so that a single sensor could handle both low-brightness areas, which have higher noise, and very bright areas, which have more light saturation. Machine-learning display stabilization and custom ASIC technologies were implemented to improve image stability and increase the available colors of the LCD panel by over 22%.

Eluvio: Eluvio Content Fabric
Eluvio has pioneered a novel video distribution platform to service content providers around the globe. The Eluvio Content Fabric provides an efficient, secure, and cost-effective means to deliver ultra-low latency video globally just-in-time, without CDNs, cloud transcoding, or databases.  As a single backend for distribution, the platform links media, metadata, and programmable code; and serves high quality streaming video to the viewer without intermediaries, separate file copies, or static versions. Content is personalized dynamically and rights-controlled via software blockchain contracts. This submission includes a description, deployments with Tier 1 video content companies, and supportive quotes from MGM.  

Moxion: Immediates
Moxion Immediates are the result of multiple innovations enabling post and production to securely share and review each other’s footage within minutes of its creation. Immediates footage, complete with Camera, Action, and VFX Wranglers’ metadata, flows from the QTAKE video assist into the Moxion cloud, where it can be reviewed and logged before flowing into Editorial or VFX workflows. Conversely, footage created in post (for example, VFX previs cuts) can be sent securely via the Moxion Cloud back to set, to the Video Assist for playback on the on-set monitors, or to any on-set iOS device.

Carl Zeiss SBE: eXtended Data - Lens Metadata Technology
ZEISS eXtended Data simplifies and increases the accuracy of the image capture and processing workflow. It unifies two data sets: key lens data based on Cooke /i Technology, and ZEISS distortion and shading lens data. After being recorded on set together with the video files, the data can be applied in post-production for more accurate compositing and editing, using the suite of ZEISS-developed lens plugins.

Honorable mention
Frame.io and Sohonet received honorable mention for their Camera-to-Cloud and ClearView Flex technologies, respectively.

In addition to the Engineering Excellence Award, the virtual awards gala on November 19 will recognize excellence in craft categories including color grading, editing, sound and visual effects.  The Judges Award for Creativity and Innovation will be announced in advance of the awards and presented during the program.

HPA, and its HPA Awards committee, remains dedicated to continuing to support the artists, technical innovators, and companies in the industry that continuously raise the bar for content. In light of this year’s unique challenges, a number of changes were put in place to encourage participation and engagement. Tickets are free with registration, and entry fees were reduced. The awards will be presented in an engaging, interactive platform which encourages a large audience to join in the evening’s activities.

  • Monday, Sep. 14, 2020
Nvidia to buy UK's Arm, sparking fears of chip dominance
In this Tuesday, May 30, 2017 file photo, Nvidia CEO Jensen Huang delivers a speech about AI and gaming during the Computex Taipei exhibition at the world trade center in Taipei, Taiwan. Computer graphics chip company Nvidia said it plans to buy Britain's Arm Holdings for $40 billion, in a merger of two leading chipmakers. Santa Clara, California-based Nvidia and Arm's parent company, Japanese technology giant SoftBank, announced the deal Sunday, Sept. 13, 2020. (AP Photo/Chiang Ying-ying, File)
LONDON (AP) -- 

U.S. graphics chip maker Nvidia said it plans to buy U.K.-based Arm Holdings in a deal worth up to $40 billion, in a move that would create a global powerhouse in the industry.

The deal, which was announced late Sunday by Nvidia and Arm's parent company, Japanese technology giant SoftBank, raises concerns about the independence of Arm, one of Europe's most important tech companies.

Arm's chip designs power the vast majority of the world's smartphones and the company is renowned as an innovator in technology for connected devices, known as the "Internet of Things." Arm centers its business on designing chips and licensing the intellectual property, especially in mobile computing, rather than chip manufacturing, for which it relies on partners. 

Being owned by a U.S. company could mean Arm is exposed to U.S. government export bans at a time when Washington is in a battle for tech supremacy with China.

Under the terms of the deal, Santa Clara, California-based Nvidia will pay SoftBank $21.5 billion in stock and $12 billion in cash. SoftBank could earn a further $5 billion if Arm hits performance targets while Arm employees will get $1.5 billion worth of Nvidia shares.

SoftBank bought Arm for about $32 billion in 2016, in an effort to cement the Japanese company's ambitions in advancing how various devices, including security cameras and household appliances, connect online and work together. That earlier deal sparked fears that one of Britain's most successful tech companies was falling into foreign hands, so as concessions the British government got SoftBank to agree to keep Arm's headquarters in the U.K. and double its British staff over five years. 

Nvidia CEO Jensen Huang said the U.S. company still plans to keep Arm based at its headquarters in Cambridge, England, where it will also build an artificial intelligence research center. 

"Together we're going to create the world's premier computing company for the age of AI," Huang told reporters. 

"We want more great engineers not fewer, we want more R&D not less. And we want that work to be done in the U.K., in Cambridge," Huang said, adding that it wasn't about consolidation or cost savings.

However, Hermann Hauser, who helped set up Arm, called the deal an "absolute disaster for Cambridge, the U.K. and Europe." 

Hauser, now a technology investor, told the BBC's Radio 4 that his biggest concern was that it would degrade what he called the U.K.'s "economic sovereignty" because Arm would end up falling under the jurisdiction of U.S. export controls. 

That means "if hundreds of U.K. companies that incorporate Arm's (technology) in their products, want to sell it, and export it to anywhere in the world including China, which is a major market, the decision on whether they will be allowed to export it will be made in the White House and not in Downing Street."

The deal would also destroy Arm's "neutral" business model, which has made it the "Switzerland of the semiconductor industry," Hauser said. Arm has become successful by licensing its technology to more than 500 companies, many of which are rivals to Nvidia, and the sale would create a "monopoly problem," he said. 

Regulators in the U.S., U.K., China and the European Union will need to approve the deal, which is expected to take about 18 months to complete. 

The British government said it could intervene in the deal because of the "vital role" Arm plays in the U.K.'s tech sector.

"The government monitors acquisitions and mergers closely and when a takeover may have a significant impact on the U.K. we will not hesitate to investigate further and take appropriate action," said Prime Minister Boris Johnson's spokesman, James Slack.

"We are investigating this deal further and ministers are speaking to the relevant companies."

Jill Lawless contributed to this report. 

  • Thursday, Sep. 10, 2020
ILM to expand virtual production resources with 3 new StageCraft stages; diversity initiative launched
Janet Lewin, SVP, GM of ILM
SAN FRANCISCO -- 

Industrial Light & Magic has unveiled the next phase of its global expansion plan for the company’s virtual production and StageCraft LED volume services. This expansion of services is tied to a proactive initiative for increasing diversity in the industry by combining ILM’s growth in this innovative methodology with a global trainee program geared for underrepresented VFX talent. 

ILM’s existing StageCraft volume set at Manhattan Beach Studios (MBS) was used for the Emmy-nominated series The Mandalorian (Disney+) and will soon be joined by a second permanent StageCraft volume set at the studio, servicing a variety of clients in the greater Los Angeles area. In addition, ILM is building a third permanent StageCraft volume at Pinewood Studios in London, and a fourth large-scale custom volume at Fox Studios Australia to be used for Marvel’s highly anticipated feature Thor: Love and Thunder directed by Taika Waititi. ILM will also continue to provide “pop up” custom volumes for clients as the company recently did for the Netflix production The Midnight Sky, directed by George Clooney.

An end-to-end virtual production solution, ILM StageCraft is a production-hardened technology that provides a continuous pipeline from initial exploration, scouting, and art direction, through traditional and technical previsualization and lighting, to real-time production filming itself on the innovative StageCraft LED volumes. Lucasfilm’s hit Disney+ series, The Mandalorian, and a highly anticipated feature film took advantage of the full complement of ILM StageCraft virtual production services. Other projects, such as Avengers: Endgame, Aquaman, Jurassic World: Fallen Kingdom, Battle at Big Rock, Rogue One: A Star Wars Story, Kong: Skull Island, Solo: A Star Wars Story, Ready Player One, and Rango, have utilized aspects of the toolset as well. 

By every measure, the new stages are vast improvements over the original ground-breaking LED volume developed for the first season of The Mandalorian in 2018. Physically, the new stages are larger, utilizing substantially more LED panels than ILM’s original stage and offering both higher resolution and smooth wall-to-ceiling transitions, which directly results in better lighting on set as well as many more in-camera finals. ILM’s proprietary solutions for achieving groundbreaking fidelity on the LED walls at scale allow for higher color fidelity, higher scene complexity, and greater control and reliability.

“With StageCraft, we have built an end-to-end virtual production service for key creatives.  Directors, production designers, cinematographers, producers and visual effects supervisors can creatively collaborate, each bringing their collective expertise to the virtual aspects of production just as they do with traditional production,” explained Janet Lewin, SVP, GM ILM.  

Rob Bredow, CCO, ILM, added “Over the past five years, we have made substantial investments in both our rendering technology and our virtual production toolset. When combined with Industrial Light & Magic’s expert visual effects talent, motion capture experience, facial capture via Medusa, Anyma, and Flux, and the innovative production technology developed by ILM’s newly integrated Technoprops team, we believe we have a unique offering for the industry.”

Alongside the new stages, ILM is rolling out a global talent development initiative through the company’s long-standing Jedi Academy training program. The program, which is part of the company’s larger Global Diversity & Inclusion efforts, offers paid internships and apprenticeships on productions with seasoned ILM supervisors and producers who serve as mentors. The program is intended to fill roles across the virtual production and VFX pipeline with those from traditionally underrepresented backgrounds; ILM has posted expressions of interest for jobs across the spectrum, from virtual art department teams and production management to engineering and artist roles. The goal with this initiative is to attract diverse junior talent and create a pipeline for them to become future visual effects artists, technicians and producers who will be “ILM trained” and uniquely qualified to work in this new, innovative way of filmmaking.

“There is a widespread lack of diversity in the industry, and we are excited to leverage our global expansion in this game-changing workflow to hire and train new talent, providing viable, exciting, and rewarding jobs across many of our locations,” noted ILM VP, Operations, Jessica Teach, who oversees the company’s Diversity and Inclusion initiatives. “We believe this program can have a multiplier effect, attracting even more diverse talent to the industry and creating a pipeline for visual effects careers. We know that bringing more diversity into the industry is a critical part of strengthening and expanding our storytelling potential.”

ILM expects to have the new stages up and running for production in London in February of 2021 and in Los Angeles in March, with a mix of projects from features to commercials in line to take advantage of them. The company is currently fielding inquiries for future bookings by studios and filmmakers.

  • Wednesday, Sep. 9, 2020
Single-take feature "Last Call" colors with DaVinci Resolve Studio
A scene from "Last Call"
FREMONT, Calif. -- 

Blackmagic Design announced that the feature film “Last Call,” which was shot in two simultaneous 80-minute takes, was colored in DaVinci Resolve Studio by colorist and cinematographer Seth Wessel-Estes, who also used the DaVinci Resolve Micro Panel in his workflow.

“Last Call” follows a suicidal alcoholic on the anniversary of his son’s death. When he attempts to call a crisis hotline, a misdial connects him with a single mother working as the night janitor at a local community college. The split screen feature showcases both characters in real time as they navigate a life changing conversation.

Director Gavin Booth was no stranger to single take projects, having experience directing a number of one act plays and single shot music videos. He was also director on the Blumhouse project “Fifteen,” which was the world’s very first movie broadcast live, also in a single take. “Last Call” would be a unique approach to filmmaking, in the vein of films such as “Timecode” and “Russian Ark”, both early inspirations for Booth.

Both Booth and cinematographer/colorist Wessel-Estes knew there would be significant challenges in coloring a movie without edits, and with two constantly moving shots around a cityscape. “Figuring out the approach of how to color the film was a pretty daunting task,” said Wessel-Estes. “We hadn’t ever had to color anything longer than a three or four minute shot in the past.”

In preparation, both Booth and Wessel-Estes shot a single take music video for the band Bleu before starting on “Last Call,” working up not only a workflow for production but also a process for handling a constantly changing image.

“Last Call” would present a myriad of challenges to the team, all of which would be reflected in the final image that would need to be colored. Each of the characters would be filmed simultaneously, connected only by the cell phone call that was the link in the movie. Booth would follow one character, and Wessel-Estes the other.

“We needed to reduce the amount of crew for both camera and sound,” reflected Booth. “There couldn’t be a camera assistant or a boom operator. There was nowhere to hide everyone or avoid boom shadows throughout the full single take on either side of the movie. We had to work with our sound mixers to double lav the actors and rely solely on that.”

Booth and Wessel-Estes were also the camera operators, and would need to pull their own focus, while the gaffer would have to figure out how to use a mixture of practical lighting as well as hide every single cable and film light since the single take on either side would move 360 degrees through the space. Moving from inside to outside or even room to room meant that the camera’s exposure needed to seamlessly adjust at the same time. Every element would either help or hinder the final look, and the team knew they had limited time and budget to get it right.

For the look of the project, Booth wanted to maintain a realistic style, and avoid an exaggerated or otherwise over colored look that might have distracted from the character driven drama. “With the use of the long take to show ‘reality’ we wanted to keep it feeling as truthful as possible. For us, having too much of a look on a piece like this can take away from the rawness. We wanted the audience to feel like they were there with the two characters, hanging on every tense word they exchange.”

Once in post, Wessel-Estes knew that a single long take with hundreds of node corrections on one shot would be unmanageable. Instead, he imported the project into DaVinci Resolve Studio and used the Edit page to place cuts throughout the film to give him transition points. “I was able to go in and make cuts along the timeline which acted as scene markers. That way I could individually color each ‘scene’ or area by itself. Luckily with the Resolve editing panel I was able to go in and create edits and use cross dissolves to smoothly and dynamically transition between graded sections.”
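
Resolve Studio also exposes a Python scripting API that can drop markers or otherwise annotate section boundaries of this kind. The sketch below is purely illustrative and is not the manual Edit-page workflow Wessel-Estes describes: the frame numbers are invented, and it assumes the script runs where Resolve’s DaVinciResolveScript module is importable (for example, from Resolve’s built-in console).

```python
# Hypothetical sketch: mark section boundaries on the current timeline so a long
# single take can be graded "scene" by "scene". Frame numbers are placeholders.
import DaVinciResolveScript as dvr  # importable once Resolve's scripting environment is set up

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
timeline = project.GetCurrentTimeline()

# Timeline-relative frames where each graded section begins (invented values).
section_starts = [0, 4320, 9800, 15250]

for i, frame in enumerate(section_starts, start=1):
    # AddMarker(frameId, color, name, note, duration, customData) returns True on success.
    timeline.AddMarker(frame, "Blue", f"Section {i}", "grade boundary", 1, "")
```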

Wessel-Estes also used the DaVinci Resolve Micro Panel for color grading. “It enabled me to not only grade more efficiently but also to have more fine tuned tactile control of individual parameters. Since our look was so naturalistic, I ended up doing tons of very small, subtle corrections which would have been a hassle without having a control surface.”

With a split screen showing each character’s part of the story, Booth and Wessel-Estes had a host of challenges to manage. As the camera moved from location to location, both interior and exterior, they knew they would be faced with completely different lighting and exposures throughout the film, requiring careful grading to both marry segments while respecting each new look. “Since we were working with some uncontrolled lighting situations we had to go in and do dynamic exposure and color adjustments as the camera and character move from one space to another.”

Matching the two sides of the film so they felt cohesive throughout was also very important. Each half of the screen had its own ‘look’ but also needed to blend effectively so that the audience wasn’t distracted by the visual contrast of the split screen. “Using the built in comparison tools to bring up images side by side as well as the various scopes enabled us to ensure the look would remain cohesive throughout.”

Wessel-Estes also used a fair amount of vignetting that needed to be keyframed in order to track with the moving camera. “The built in smart tracking inside of Resolve was hugely helpful for a lot of this work, and I love how simple yet effective it is.” Other DaVinci Resolve Studio tools, such as HSL keys, sharpening masks and advanced keyframing controls came in handy when coloring a single take film. “Having all of these tools at my fingertips enabled me to add a degree of finesse to the look of the finished movie which just wouldn’t have been possible on set.”

With “Last Call” taking awards at 25 international film festivals and release slated in theaters on September 18th, Booth is proud of what they have accomplished, achieving a style he has long admired in other projects. “As a filmmaker I have forever been obsessed with long take storytelling; and with audience and critics response, I am thrilled our film’s story rises above the ‘gimmick’ of a long take. For me, ‘Last Call’ felt like the next evolution of a filmmaking challenge.”

  • Wednesday, Sep. 2, 2020
New FUJINON Premista lens in development; PL mount box lens previewed, MK Lens Mount showcased
The FUJINON Premista lens family
VALHALLA, NY -- 

FUJIFILM North America Corporation has announced the development of the FUJINON Premista 19-45mm T2.9 lightweight wide cinema zoom lens (“Premista 19-45mm”) for large format sensor cameras, and previewed the SK35-700mm telephoto PL mount box lens. In addition to these items, an MK lens mount developed by Duclos Lenses for the highly anticipated RED Komodo cinema camera was introduced during a virtual press conference.

The short, lightweight wide-angle Premista 19-45mm T2.9 expands the Premista family of zooms to three lenses. Joining the 28-100mm T2.9 and the 80-250mm T2.9-3.5, the Premista 19-45mm produces images with natural and beautiful bokeh, outstanding high resolution, accurate color rendition, and controllable flare with minimal ghosting for capturing high dynamic range. The lens shows very little distortion throughout the entire zoom range, lightening the burden of correcting footage after shooting, and allowing high quality cinematic images to be created more efficiently.

“The response we’ve seen to the Premista lenses since their 2019 launch has been tremendous both in terms of excitement and usage across feature film and high-end TV productions,” said Thomas Fletcher, director of marketing, Optical Devices Division, FUJIFILM North America Corporation. “Now, with stricter safety and efficiency needs on set, there is a growing demand for high quality zoom lenses that match the quality and ‘look’ of prime lenses, and efficiently capture images without the hassle of having to frequently change lenses.”

The Premista 19-45mm is scheduled for release in early 2021.

SK35-700mm Telephoto PL Mount Box Lens
Fujifilm developed the FUJINON SK35-700mm PL Mount Telephoto Box Lens (SK35-700) for 8K television applications, but the company will now be doing extensive market research, exploring the possibility of repurposing the lens in response to the emerging needs of the multi-camera cinema style production market. The lens features a 20x high magnification zoom, covering a focal range of 35mm-700mm at F2.8 (35-315mm) and F4.8 (at 700mm). The SK35-700 also features a 1.4x extender, which brings the range to 49mm-980mm on S35 cameras while also offering significant coverage on many large format cameras. It is 28” long and weighs 69 lbs.
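
The post-extender range quoted above is simply the 1.4x multiplier applied to both ends of the zoom; Fujifilm's announcement does not state the resulting aperture, so none is inferred here:

\[
35\,\text{mm} \times 1.4 = 49\,\text{mm}, \qquad 700\,\text{mm} \times 1.4 = 980\,\text{mm}
\]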

The SK35-700 is currently the only telephoto PL mount box lens on the market. Its design provides for unparalleled cinematic imaging in various multi-cam productions.

“We believe the SK35-700 will deliver on the growing desire of more producers to create a cinematic look,” said Fletcher. “The lens range creates the ability to shoot in immersive environments without obstructing views or otherwise interrupting the viewers’ experience.”

Duclos Lenses MK-R Mount
Duclos Lenses has developed the MK-R Lens Mount, an RF Mount Conversion that makes the FUJINON MK 18-55mm and 50-135mm zoom lenses compatible with a variety of RF mount camera bodies—most notably, the Super 35 format KOMODO 6K camera from RED. Paired together, the setup is extremely small, lightweight, and relatively affordable.

  • Wednesday, Sep. 2, 2020
Weta Digital advances VFX and animation in the cloud via deal with Amazon Web Services
Weta Digital CEO Prem Akkaraju
SEATTLE -- 

Weta Digital is going all-in on Amazon Web Services (AWS) to create a new, cloud-based visual effects (VFX) workflow. This workflow includes a set of technologies for VFX artists that will underpin the studio’s global expansion, accelerate key portions of film production, and expand Weta Digital’s New Zealand operations, enabling its team of artists to collaborate on visual effects remotely. Over the past 25 years, Weta Digital has brought to life some of the most memorable worlds and characters in film, including Middle-earth and Gollum in New Line’s The Lord of the Rings trilogy and the Na’vi and beautiful landscapes of Pandora in Avatar.

Over the course of this multi-year deal, Weta Digital will migrate the vast majority of its IT infrastructure to AWS to support a pipeline that includes 100 proprietary tools and its LED-stage virtual production service, which creates immersive new worlds on set. In addition, Weta Digital will use AWS to produce and render original content from the newly announced “Weta Animated” and deliver on its multi-year movie slate.

Visual effects artists use a wide range of animation and compositing software to integrate computer-generated imagery with live action footage, creating scenes that go beyond what can easily be captured with film alone. This imagery generates a massive volume of video and image files that can put a strain on IT resources and requires delicate load balancing efforts to keep production facilities operating at peak capacity. Leveraging AWS’ global infrastructure and AWS services, including compute, storage, security, machine learning (ML) and analytics, Weta Digital can spread its workloads more efficiently around the world, freeing up talent and resources to continue to create groundbreaking visual effects, and gain the flexibility to render VFX scenes remotely wherever its creative staff is based. With Amazon Elastic Compute Cloud (Amazon EC2), Weta Digital will have expanded access to a broad range of specialized Graphics Processing Unit (GPU) instances for better integration of ML into the VFX creation process, enabling artists to create more life-like, detailed movie creations.

“Weta Digital has been an innovator in the visual effects industry for decades. By adopting AWS’ ultra-scale infrastructure, we can implement a proprietary cloud pipeline and globally scale our production to greater levels than ever before,” said Prem Akkaraju, CEO of Weta Digital. “Weta established a remote collaborative workflow in March due to the pandemic to seamlessly continue work on the Avatar sequels and other films. With the power of AWS, we can now take that success to a global scale. Drawing on AWS’ virtually unlimited compute capacity, we can empower our artists to work safely and securely where they want without technical limitations. In addition, using the breadth and depth of AWS services we can more easily test new ideas and technologies as we continue to push the boundaries of what is possible in visual effects today. AWS services, such as machine learning and data analytics, will help Weta deliver projects faster and more cost effectively, and our customers will enjoy the fruits of Weta Digital’s continuous innovation.”

AWS CEO Andy Jassy added, “Weta Digital has earned fans around the world through its innovative approach of combining technology and creativity to push the boundaries of visual effects in the movie industry while bringing some of cinema’s most memorable characters to life. Weta Digital will rely on AWS’ unmatched portfolio of services to continue redefining what is possible on screen and at a scale that was not previously possible. Through its collaboration with AWS, Weta Digital is reducing technology barriers for those in the filmmaking industry, strengthening its operations in New Zealand and globally, and paving the way for immersive, new experiences for moviegoers.”

  • Wednesday, Sep. 2, 2020
Sony introduces Cinema Line; FX6 slated for release by end of the year
SAN DIEGO, Calif. -- 

Sony Electronics Inc. has launched Cinema Line, a series of camera products for a wide range of content creators. 

Cinema Line will deliver not only the coveted cinematographic look cultivated through extensive experience in digital cinema production, but also the enhanced operability and reliability that meet discerning creators’ various needs. The new series will extend beyond traditional cinema camera and professional camcorder form factors.

In 2000, Sony released the ground-breaking HDW-F900. The HDW-F900 made digital cinema history as the world’s first 24p digital cinema camera. Many Sony cameras followed in response to countless dialogues with cinematographers and image creators--including VENICE, which was released in 2018.

Existing cameras that will form part of the Sony Cinema Line include VENICE and FX9. VENICE has become a go-to choice for digital movie production, and FX9 has an outstanding track record in documentary production. The next camera will appeal to a wider spectrum of visual creators. Sony will be releasing and shipping this next addition to the Cinema Line, FX6, by the end of 2020.

Each of the Cinema Line cameras will evolve with user feedback: The FX9 Version 3.0 firmware upgrade, available in 2021, will add S700PTP (a protocol that carries S700 camera control over TCP/IP) to enable remote control, and a Center Scan mode for Super 16mm lens and B4 lens support with an adaptor, as well as other features. In parallel, in November 2020, VENICE will gain additional features in its Version 6.0 firmware, which will improve its operability in broadcast and live environments.

“The voice of our customer is critical to everything we do,” said Neal Manowitz, deputy president of Imaging Products and Solutions Americas at Sony Electronics Inc. “We have the deepest respect for filmmakers, cinematographers and storytellers, and will continue to evolve our product line to meet and exceed their demands. Just as our VENICE camera was designed to capture the emotion in every frame, our new Cinema Line expands that vision to allow a broader range of creators to push their boundaries further, and capture and create like they’ve never been able to before.”
