• Wednesday, Sep. 16, 2020
EIZO, Eluvio, Moxion, Carl Zeiss SBE named recipients of HPA Awards for Engineering Excellence
Joachim Zell, chair of the HPA Awards Engineering Committee
BURBANK, Calif. -- 

The Hollywood Professional Association (HPA) Awards Committee has unveiled the winners of the 2020 HPA Awards for Engineering Excellence.  The HPA Awards, including the HPA Award for Engineering Excellence, will be bestowed on November 19 in a virtual gala.

The judging process for this year’s entries was adapted to adhere to restrictions imposed on the industry by the pandemic, while remaining in keeping with the rigorous standards for these highly regarded honors.  As always, a blue-ribbon judging panel reviewed the submissions, with this year’s process split into two parts:  a video presentation sent to the judges in advance, and a “live” virtual interaction with the judges after the video submissions were reviewed.

This year marks the 15th anniversary of the HPA Awards, which were founded to recognize creative artistry and innovation in the professional media content industry. A coveted and competitive honor, the Engineering Excellence Award rewards outstanding technical and creative ingenuity in media, content production, finishing, distribution, and archive.

HPA Awards Engineering Committee chair Joachim Zell said, “It is always exciting to experience first-hand the innovative brilliance of our industry. This year, we had a great turnout of entries, and in keeping with the standards of the Engineering Excellence awards, an outstanding judging panel of industry professionals reviewed the submissions.  Without exception, every entry was compelling, and judging was as inspiring as ever. We are proud of our industry in these moments where we see that even a global crisis does not break our spirit or the dedication to incredible work. Congratulations to the winners, and all the entrants, we are impressed and inspired.”

The winners of the 2020 HPA Award for Engineering Excellence are:

EIZO: Prominence CG3146 31.1” HDR Monitor
EIZO’s Prominence CG3146 31.1” HDR monitor is the world’s first HDR reference monitor with a built-in calibration sensor. New technologies invented for the monitor include advanced brightness and color uniformity correction, which allows calibration to occur at the top of the screen, and a stable XYZ glass-type sensor that resists temperature changes. New algorithms were needed so a single sensor could measure both low-brightness areas, where noise is higher, and high-brightness areas, where light saturation is greater. Machine learning display stabilization and custom ASIC technologies were implemented to improve image stability and increase the available colors of the LCD panel by over 22%.

Eluvio: Eluvio Content Fabric
Eluvio has pioneered a novel video distribution platform to serve content providers around the globe. The Eluvio Content Fabric provides an efficient, secure, and cost-effective means of delivering ultra-low-latency video globally, just in time, without CDNs, cloud transcoding, or databases.  As a single backend for distribution, the platform links media, metadata, and programmable code, and serves high quality streaming video to the viewer without intermediaries, separate file copies, or static versions. Content is personalized dynamically and rights-controlled via software blockchain contracts. The submission included a platform description, deployments with Tier 1 video content companies, and supportive quotes from MGM.

Moxion: Immediates
Moxion Immediates are the result of multiple innovations enabling post and production to securely share and review each other’s footage within minutes of creation.  Immediates footage, complete with camera, action, and VFX wranglers’ metadata, flows from the QTAKE video assist into the Moxion cloud, where it can be reviewed and logged before flowing into editorial or VFX workflows.  Conversely, footage created in post (for example, VFX previs cuts) can be sent securely via the Moxion cloud back to set, to the video assist for playback on the on-set monitors, or to any on-set iOS device.

Carl Zeiss SBE: eXtended Data - Lens Metadata Technology
ZEISS eXtended Data simplifies and increases the accuracy of the image capture and processing workflow. It unifies two data sets: key lens data based on Cooke’s /i Technology, and ZEISS’ own distortion and shading lens data.  After being recorded on set together with the video files, the data can be applied in post-production for more accurate compositing and editing using the suite of ZEISS-developed lens plugins.
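
For context, distortion correction of this kind conceptually works as sketched below. This is a generic, hypothetical illustration using OpenCV, not the ZEISS plugins or the eXtended Data format; the intrinsics and coefficients are placeholders, whereas in an eXtended Data workflow the equivalent values would come from the per-frame lens metadata recorded on set.

    # Hypothetical sketch: applying lens distortion data to a frame in post.
    # Not the ZEISS plugin API; a generic OpenCV illustration of the idea.
    import cv2
    import numpy as np

    frame = cv2.imread("frame_0001.png")  # placeholder plate
    h, w = frame.shape[:2]

    # Placeholder intrinsics and distortion coefficients (k1, k2, p1, p2, k3).
    # In an eXtended Data workflow these would come from recorded lens
    # metadata rather than being guessed.
    K = np.array([[1800.0, 0.0, w / 2.0],
                  [0.0, 1800.0, h / 2.0],
                  [0.0, 0.0, 1.0]])
    dist = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])

    undistorted = cv2.undistort(frame, K, dist)
    cv2.imwrite("frame_0001_undistorted.png", undistorted)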

Honorable mention
Frame.io and Sohonet received honorable mention for their Camera-to-Cloud and ClearView Flex technologies, respectively.

In addition to the Engineering Excellence Award, the virtual awards gala on November 19 will recognize excellence in craft categories including color grading, editing, sound and visual effects.  The Judges Award for Creativity and Innovation will be announced in advance of the awards and presented during the program.

HPA and its Awards Committee remain dedicated to supporting the artists, technical innovators, and companies in the industry that continuously raise the bar for content. In light of this year’s unique challenges, a number of changes were put in place to encourage participation and engagement. Tickets are free with registration, and entry fees were reduced. The awards will be presented on an engaging, interactive platform that encourages a large audience to join in the evening’s activities.

  • Monday, Sep. 14, 2020
Nvidia to buy UK's Arm, sparking fears of chip dominance
In this Tuesday, May 30, 2017 file photo, Nvidia CEO Jensen Huang delivers a speech about AI and gaming during the Computex Taipei exhibition at the world trade center in Taipei, Taiwan. Computer graphics chip company Nvidia said it plans to buy Britain's Arm Holdings for $40 billion, in a merger of two leading chipmakers. Santa Clara, California-based Nvidia and Arm's parent company, Japanese technology giant SoftBank, announced the deal Sunday, Sept. 13, 2020. (AP Photo/Chiang Ying-ying, File)
LONDON (AP) -- 

U.S. graphics chip maker Nvidia said it plans to buy U.K.-based Arm Holdings in a deal worth up to $40 billion, in a move that would create a global powerhouse in the industry.

The deal, which was announced late Sunday by Nvidia and Arm's parent company, Japanese technology giant SoftBank, raises concerns about the independence of Arm, one of Europe's most important tech companies.

Arm's chip designs power the vast majority of the world's smartphones and the company is renowned as an innovator in technology for connected devices, known as the "Internet of Things." Arm centers its business on designing chips and licensing the intellectual property, especially in mobile computing, rather than chip manufacturing, for which it relies on partners. 

Being owned by a U.S. company could mean Arm is exposed to U.S. government export bans at a time when Washington is in a battle for tech supremacy with China.

Under the terms of the deal, Santa Clara, California-based Nvidia will pay SoftBank $21.5 billion in stock and $12 billion in cash. SoftBank could earn a further $5 billion if Arm hits performance targets, while Arm employees will get $1.5 billion worth of Nvidia shares.

SoftBank bought Arm for about $32 billion in 2016, in an effort to cement the Japanese company's ambitions in advancing how various devices, including security cameras and household appliances, connect online and work together. That earlier deal sparked fears over the foreign takeover of one of Britain's most successful tech companies, so the British government got SoftBank to agree to keep Arm's headquarters in the U.K. and double its British staff over five years as concessions.

Nvidia CEO Jensen Huang said the U.S. company still plans to keep Arm based at its headquarters in Cambridge, England, where it will also build an artificial intelligence research center. 

"Together we're going to create the world's premier computing company for the age of AI," Huang told reporters. 

"We want more great engineers not fewer, we want more R&D not less. And we want that work to be done in the U.K., in Cambridge," Huang said, adding that it wasn't about consolidation or cost savings.

However, Hermann Hauser, who helped set up Arm, called the deal an "absolute disaster for Cambridge, the U.K. and Europe." 

Hauser, now a technology investor, told the BBC's Radio 4 that his biggest concern was that it would degrade what he called the U.K.'s "economic sovereignty" because Arm would end up falling under the jurisdiction of U.S. export controls. 

That means "if hundreds of U.K. companies that incorporate Arm's (technology) in their products, want to sell it, and export it to anywhere in the world including China, which is a major market, the decision on whether they will be allowed to export it will be made in the White House and not in Downing Street."

The deal would also destroy Arm's "neutral" business model, which has made it the "Switzerland of the semiconductor industry," Hauser said. Arm has become successful by licensing its technology to more than 500 companies, many of which are rivals to Nvidia, and the sale would create a "monopoly problem," he said. 

Regulators in the U.S., U.K., China and the European Union will need to approve the deal, which is expected to take about 18 months to complete.

The British government said it could intervene in the deal because of the "vital role" Arm plays in the U.K.'s tech sector.

"The government monitors acquisitions and mergers closely and when a takeover may have a significant impact on the U.K. we will not hesitate to investigate further and take appropriate action," said Prime Minister Boris Johnson's spokesman, James Slack.

"We are investigating this deal further and ministers are speaking to the relevant companies."

Jill Lawless contributed to this report. 

  • Thursday, Sep. 10, 2020
ILM to expand virtual production resources with 3 new StageCraft stages; diversity initiative launched
Janet Lewin, SVP, GM of ILM
SAN FRANCISCO -- 

Industrial Light & Magic has unveiled the next phase of its global expansion plan for the company’s virtual production and StageCraft LED volume services. This expansion of services is tied to a proactive initiative for increasing diversity in the industry by combining ILM’s growth in this innovative methodology with a global trainee program geared for underrepresented VFX talent. 

ILM’s existing StageCraft volume set at Manhattan Beach Studios (MBS) was used for the Emmy-nominated series The Mandalorian (Disney+) and will soon be joined by a second permanent StageCraft volume set at the studio, servicing a variety of clients in the greater Los Angeles area. In addition, ILM is building a third permanent StageCraft volume at Pinewood Studios in London, and a fourth large-scale custom volume at Fox Studios Australia to be used for Marvel’s highly anticipated feature Thor: Love and Thunder directed by Taika Waititi. ILM will also continue to provide “pop up” custom volumes for clients as the company recently did for the Netflix production The Midnight Sky, directed by George Clooney.

An end-to-end virtual production solution, ILM StageCraft is a production-hardened technology that provides a continuous pipeline from initial exploration, scouting, and art direction through traditional and technical previsualization and lighting, to real-time production filming itself on the innovative StageCraft LED volumes. Lucasfilm’s hit Disney+ series, The Mandalorian, and a highly anticipated feature film took advantage of the full complement of ILM StageCraft virtual production services. Other projects, such as Avengers: Endgame, Aquaman, Jurassic World: Fallen Kingdom, Battle at Big Rock, Rogue One: A Star Wars Story, Kong: Skull Island, Solo: A Star Wars Story, Ready Player One, and Rango, have utilized aspects of the toolset as well.

By every measure, the new stages are vast improvements over the original ground-breaking LED volume developed for the first season of The Mandalorian in 2018. Physically, the new stages are larger, utilizing substantially more LED panels than ILM’s original stage while offering both higher resolution and smooth wall-to-ceiling transitions, which directly results in better lighting on set as well as many more in-camera finals. ILM’s proprietary solutions for achieving groundbreaking fidelity on the LED walls at scale allow for higher color fidelity, higher scene complexity, and greater control and reliability.

“With StageCraft, we have built an end-to-end virtual production service for key creatives.  Directors, production designers, cinematographers, producers and visual effects supervisors can creatively collaborate, each bringing their collective expertise to the virtual aspects of production just as they do with traditional production,” explained Janet Lewin, SVP and GM of ILM.

Rob Bredow, CCO, ILM, added, “Over the past five years, we have made substantial investments in both our rendering technology and our virtual production toolset. When combined with Industrial Light & Magic’s expert visual effects talent, motion capture experience, facial capture via Medusa, Anyma, and Flux, and the innovative production technology developed by ILM’s newly integrated Technoprops team, we believe we have a unique offering for the industry.”

Alongside the new stages, ILM is rolling out a global talent development initiative through the company’s long-standing Jedi Academy training program. The program, which is part of the company’s larger Global Diversity & Inclusion efforts, offers paid internships and apprenticeships on productions with seasoned ILM supervisors and producers who serve as mentors. The program is intended to fill roles across the virtual production and VFX pipeline with those from traditionally underrepresented backgrounds; ILM has posted expressions of interest for jobs across the spectrum, from virtual art department teams and production management to engineering and artist roles. The goal with this initiative is to attract diverse junior talent and create a pipeline for them to become future visual effects artists, technicians and producers who will be “ILM trained” and uniquely qualified to work in this new, innovative way of filmmaking.

“There is a widespread lack of diversity in the industry, and we are excited to leverage our global expansion in this game-changing workflow to hire and train new talent, providing viable, exciting, and rewarding jobs across many of our locations,” noted ILM VP, Operations, Jessica Teach, who oversees the company’s Diversity and Inclusion initiatives. “We believe this program can have a multiplier effect, attracting even more diverse talent to the industry and creating a pipeline for visual effects careers. We know that bringing more diversity into the industry is a critical part of strengthening and expanding our storytelling potential.”

ILM expects to have the new stages up and running for production in London in February of 2021 and in Los Angeles in March, with a mix of projects from features to commercials in line to take advantage of them. The company is currently fielding inquiries for future bookings by studios and filmmakers.

  • Wednesday, Sep. 9, 2020
Single-take feature "Last Call" colors with DaVinci Resolve Studio
A scene from "Last Call"
FREMONT, Calif. -- 

Blackmagic Design announced that the feature film “Last Call,” which was shot in two simultaneous 80-minute takes, was colored in DaVinci Resolve Studio by colorist and cinematographer Seth Wessel-Estes, who also used the DaVinci Resolve Micro Panel in his workflow.

“Last Call” follows a suicidal alcoholic on the anniversary of his son’s death. When he attempts to call a crisis hotline, a misdial connects him with a single mother working as the night janitor at a local community college. The split screen feature showcases both characters in real time as they navigate a life changing conversation.

Director Gavin Booth was no stranger to single take projects, having experience directing a number of one act plays and single shot music videos. He was also director on the Blumhouse project “Fifteen,” which was the world’s very first movie broadcast live, also in a single take. “Last Call” would be a unique approach to filmmaking, in the vein of films such as “Timecode” and “Russian Ark”, both early inspirations for Booth.

Both Booth and cinematographer/colorist Wessel-Estes knew there would be significant challenges in coloring a movie without edits, and with two constantly moving shots around a cityscape. “Figuring out the approach of how to color the film was a pretty daunting task,” said Wessel-Estes. “We hadn’t ever had to color anything longer than three or four minute shots in the past.”

In preparation, both Booth and Wessel-Estes shot a single take music video for the band Bleu before starting on “Last Call,” working up not only a workflow for production but also a process for handling a constantly changing image.

“Last Call” would present a myriad of challenges to the team, all of which would be reflected in the final image that would need to be colored. Each of the characters would be filmed simultaneously, connected only by the cell phone call at the heart of the movie. Booth would follow one character, and Wessel-Estes the other.

“We needed to reduce the amount of crew for both camera and sound,” reflected Booth. “There couldn’t be a camera assistant or a boom operator. There was nowhere to hide everyone or avoid boom shadows throughout the full single take on either side of the movie. We had to work with our sound mixers to double lav the actors and rely solely on that.”

Booth and Wessel-Estes were also the camera operators and needed to pull their own focus. The gaffer, meanwhile, had to figure out how to use a mixture of practical lighting while hiding every cable and film light, since the single take on either side would move 360 degrees through the space. Moving from inside to outside, or even room to room, meant that the camera’s exposure needed to adjust seamlessly at the same time. Every element would either help or hinder the final look, and the team knew they had limited time and budget to get it right.

For the look of the project, Booth wanted to maintain a realistic style and avoid an exaggerated or otherwise over-colored look that might have distracted from the character-driven drama. “With the use of the long take to show ‘reality’ we wanted to keep it feeling as truthful as possible. For us, having too much of a look on a piece like this can take away from the rawness. We wanted the audience to feel like they were there with the two characters, hanging on every tense word they exchange.”

Once in post, Wessel-Estes knew that hundreds of node corrections on a single long take would be unmanageable. Instead, he imported the project into DaVinci Resolve Studio and used the Edit page to place cuts throughout the film to give him transition points. “I was able to go in and make cuts along the timeline which acted as scene markers. That way I could individually color each ‘scene’ or area by itself. Luckily with the Resolve editing panel I was able to go in and create edits and use cross dissolves to smoothly and dynamically transition between graded sections.”
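
As an aside, DaVinci Resolve Studio also exposes a built-in Python scripting API through which section boundaries can be marked programmatically. The sketch below is a minimal, hypothetical illustration of that API (the frame numbers are placeholders), not the manual Edit-page workflow Wessel-Estes describes.

    # Minimal sketch using DaVinci Resolve's built-in scripting API to drop
    # markers at hypothetical "scene" boundaries along a single-take timeline.
    import DaVinciResolveScript as dvr

    resolve = dvr.scriptapp("Resolve")
    project = resolve.GetProjectManager().GetCurrentProject()
    timeline = project.GetCurrentTimeline()

    # Placeholder frame numbers chosen while reviewing the take.
    scene_starts = [0, 4120, 9875, 16340]

    for i, frame in enumerate(scene_starts, start=1):
        # AddMarker(frameId, color, name, note, duration, customData)
        timeline.AddMarker(frame, "Blue", "Scene %d" % i,
                           "grade section boundary", 1, "")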

Wessel-Estes also used the DaVinci Resolve Micro Panel for color grading. “It enabled me to not only grade more efficiently but also to have more fine-tuned tactile control of individual parameters. Since our look was so naturalistic, I ended up doing tons of very small, subtle corrections which would have been a hassle without having a control surface.”

With a split screen showing each character’s part of the story, Booth and Wessel-Estes had a host of challenges to manage. As the camera moved from location to location, both interior and exterior, they knew they would be faced with completely different lighting and exposures throughout the film, requiring careful grading to marry segments while respecting each new look. “Since we were working with some uncontrolled lighting situations we had to go in and do dynamic exposure and color adjustments as the camera and character move from one space to another.”

Matching the two sides of the film so they felt cohesive throughout was also very important. Each half of the screen had its own ‘look’ but also needed to blend effectively so that the audience wasn’t distracted by the visual contrast of the split screen. “Using the built in comparison tools to bring up images side by side as well as the various scopes enabled us to ensure the look would remain cohesive throughout.”

Wessel-Estes also used a fair amount of vignetting that needed to be keyframed in order to track with the moving camera. “The built in smart tracking inside of Resolve was hugely helpful for a lot of this work, and I love how simple yet effective it is.” Other DaVinci Resolve Studio tools, such as HSL keys, sharpening masks and advanced keyframing controls came in handy when coloring a single take film. “Having all of these tools at my fingertips enabled me to add a degree of finesse to the look of the finished movie which just wouldn’t have been possible on set.”

With “Last Call” having taken awards at 25 international film festivals and a theatrical release slated for September 18th, Booth is proud of what the team accomplished, achieving a style he has long admired in other projects. “As a filmmaker I have forever been obsessed with long take storytelling; and with the audience and critics’ response, I am thrilled our film’s story rises above the ‘gimmick’ of a long take. For me, ‘Last Call’ felt like the next evolution of a filmmaking challenge.”

  • Wednesday, Sep. 2, 2020
New FUJINON Premista lens in development; PL mount box lens previewed, MK Lens Mount showcased
The FUJINON Premista lens family
VALHALLA, NY -- 

FUJIFILM North America Corporation has announced the development of the FUJINON Premista 19-45mm T2.9 lightweight wide cinema zoom lens (“Premista 19-45mm”) for large format sensor cameras, and previewed the SK35-700mm telephoto PL mount box lens. In addition to these items, an MK lens mount developed by Duclos Lenses for the highly anticipated RED KOMODO cinema camera was introduced during a virtual press conference.

The short, lightweight wide-angle Premista 19-45mm T2.9 expands the Premista family of zooms to three lenses. Joining the 28-100mm T2.9 and the 80-250mm T2.9-3.5, the Premista 19-45mm produces images with natural and beautiful bokeh, outstanding high resolution, accurate color rendition, and controllable flare with minimal ghosting for capturing high dynamic range. The lens shows very little distortion throughout the entire zoom range, lightening the burden of correcting footage after shooting, and allowing high quality cinematic images to be created more efficiently.

“The response we’ve seen to the Premista lenses since their 2019 launch has been tremendous both in terms of excitement and usage across feature film and high-end TV productions,” said Thomas Fletcher, director of marketing, Optical Devices Division, FUJIFILM North America Corporation. “Now, with stricter safety and efficiency needs on set, there is a growing demand for high quality zoom lenses that match the quality and ‘look’ of prime lenses, and efficiently capture images without the hassle of having to frequently change lenses.”

The Premista 19-45mm is scheduled for release in early 2021.

SK35-700mm Telephoto PL Mount Box Lens
Fujifilm developed the FUJINON SK35-700mm PL Mount Telephoto Box Lens (SK35-700) for 8K television applications, but the company will now be doing extensive market research, exploring the possibility of repurposing the lens in response to the emerging needs of the multi-camera cinema style production market. The lens features a 20x high magnification zoom, covering a focal range of 35mm-700mm at F2.8 (35-315mm) and F4.8 (at 700mm). The SK35-700 also features a 1.4x extender, which brings the range to 49mm-980mm on S35 cameras while also offering significant coverage on many large format cameras. It is 28” long and weighs 69 lbs.

The SK35-700 is currently the only telephoto PL mount box lens on the market. Its design provides for unparalleled cinematic imaging in various multi-cam productions.

“We believe the SK35-700 will deliver on the growing desire of more producers to create a cinematic look,” said Fletcher. “The lens range creates the ability to shoot in immersive environments without obstructing views or otherwise interrupting the viewers’ experience.”

Duclos Lenses MK-R Mount
Duclos Lenses has developed the MK-R Lens Mount, an RF mount conversion that makes the FUJINON MK 18-55mm and 50-135mm zoom lenses compatible with a variety of RF mount camera bodies, most notably the Super 35 format KOMODO 6K camera from RED. Paired together, the setup is extremely small, lightweight, and relatively affordable.

  • Wednesday, Sep. 2, 2020
Weta Digital advances VFX and animation in the cloud via deal with Amazon Web Services
Weta Digital CEO Prem Akkaraju
SEATTLE -- 

Weta Digital is going all-in on Amazon Web Services (AWS) to create a new, cloud-based visual effects (VFX) workflow. This workflow includes a set of technologies for VFX artists that will underpin the studio’s global expansion, accelerate key portions of film production, and expand Weta Digital’s New Zealand operations, enabling its team of artists to collaborate on visual effects remotely. In the past 25 years, Weta Digital has brought to life some of the most memorable worlds and characters in film, including Middle-earth and Gollum in New Line’s The Lord of the Rings trilogy and the Na’vi and beautiful landscapes of Pandora in Avatar.

Over the course of this multi-year deal, Weta Digital will migrate the vast majority of its IT infrastructure to AWS to support a pipeline that includes 100 proprietary tools and its LED-stage virtual production service, which creates immersive new worlds on set. In addition, Weta Digital will use AWS to produce and render original content from the newly announced “Weta Animated” and deliver on its multi-year movie slate.

Visual effects artists use a wide range of animation and compositing software to integrate computer-generated imagery with live action footage, creating scenes that go beyond what can easily be captured with film alone. This imagery generates a massive volume of video and image files that can put a strain on IT resources and requires delicate load balancing efforts to keep production facilities operating at peak capacity. Leveraging AWS’ global infrastructure and AWS services, including compute, storage, security, machine learning (ML) and analytics, Weta Digital can spread its workloads more efficiently around the world, freeing up talent and resources to continue to create groundbreaking visual effects, and gain the flexibility to render VFX scenes remotely wherever its creative staff is based. With Amazon Elastic Compute Cloud (Amazon EC2), Weta Digital will have expanded access to a broad range of specialized Graphics Processing Unit (GPU) instances for better integration of ML into the VFX creation process, enabling artists to create more life-like, detailed movie creations.
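
As a rough illustration of what expanded access to GPU instances means in practice, the hedged sketch below uses AWS’ boto3 SDK to request a handful of GPU render nodes on Amazon EC2. The AMI ID, key pair name, instance counts, and region are hypothetical placeholders, and this is not Weta Digital’s actual pipeline code.

    # Hypothetical sketch: requesting GPU render nodes on Amazon EC2 via boto3.
    # Placeholder AMI/key names; not Weta Digital's production tooling.
    import boto3

    ec2 = boto3.client("ec2", region_name="ap-southeast-2")  # e.g., Sydney

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder render-node image
        InstanceType="g4dn.xlarge",       # NVIDIA GPU instance type
        MinCount=1,
        MaxCount=4,                       # burst out several nodes at once
        KeyName="render-farm-key",        # placeholder key pair
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "vfx-render"}],
        }],
    )

    for inst in response["Instances"]:
        print(inst["InstanceId"], inst["State"]["Name"])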

“Weta Digital has been an innovator in the visual effects industry for decades. By adopting AWS’ ultra-scale infrastructure, we can implement a proprietary cloud pipeline and globally scale our production to greater levels than ever before,” said Prem Akkaraju, CEO of Weta Digital. “Weta established a remote collaborative workflow in March due to the pandemic to seamlessly continue work on the Avatar sequels and other films. With the power of AWS, we can now take that success to a global scale. Drawing on AWS’ virtually unlimited compute capacity, we can empower our artists to work safely and securely where they want without technical limitations. In addition, using the breadth and depth of AWS services we can more easily test new ideas and technologies as we continue to push the boundaries of what is possible in visual effects today. AWS services, such as machine learning and data analytics, will help Weta deliver projects faster and more cost effectively, and our customers will enjoy the fruits of Weta Digital’s continuous innovation.”

AWS CEO Andy Jassy added, “Weta Digital has earned fans around the world through its innovative approach of combining technology and creativity to push the boundaries of visual effects in the movie industry while bringing some of cinema’s most memorable characters to life. Weta Digital will rely on AWS’ unmatched portfolio of services to continue redefining what is possible on screen and at a scale that was not previously possible. Through its collaboration with AWS, Weta Digital is reducing technology barriers for those in the filmmaking industry, strengthening its operations in New Zealand and globally, and paving the way for immersive, new experiences for moviegoers.”

  • Wednesday, Sep. 2, 2020
Sony introduces Cinema Line; FX6 slated for release by end of the year
SAN DIEGO, Calif. -- 

Sony Electronics Inc. has launched Cinema Line, a series of camera products for a wide range of content creators. 

Cinema Line will deliver not only the coveted cinematographic look cultivated through extensive experience in digital cinema production, but also the enhanced operability and reliability that meet discerning creators’ various needs. The new series will extend beyond traditional cinema camera and professional camcorder form factors.

In 2000, Sony released the ground-breaking HDW-F900, which made digital cinema history as the world’s first 24p digital cinema camera. Many Sony cameras followed in response to countless dialogues with cinematographers and image creators, including VENICE, which was released in 2018.

Existing cameras that will form part of the Sony Cinema Line include VENICE and FX9. VENICE has become a go-to choice for digital movie production, and FX9 has an outstanding track record in documentary production. The next camera will appeal to a wider spectrum of visual creators. Sony will be releasing and shipping this next addition to the Cinema Line, FX6, by the end of 2020.

Each of the Cinema Line cameras will evolve with user feedback: The FX9 Version 3.0 firmware upgrade, available in 2021, will see the addition of the S700PTP (a protocol that realizes S700P over TCP/IP) to enable remote control, and a Center Scan mode for Super 16mm lens and B4 lens support with its adaptor, as well as other features. In parallel, in November 2020, VENICE will see additional features in Version 6.0 firmware, which will improve its operability in broadcast and live environments.

“The voice of our customer is critical to everything we do,” said Neal Manowitz, deputy president of Imaging Products and Solutions Americas at Sony Electronics Inc. “We have the deepest respect for filmmakers, cinematographers and storytellers, and will continue to evolve our product line to meet and exceed their demands. Just as our VENICE camera was designed to capture the emotion in every frame, our new Cinema Line expands that vision to allow a broader range of creators to push their boundaries further, and capture and create like they’ve never been able to before.”

  • Tuesday, Sep. 1, 2020
Report: Algorithm question complicates TikTok sale
Employees exit the ByteDance headquarters in Beijing, China on Friday, Aug. 7, 2020. President Donald Trump on Thursday ordered a sweeping but unspecified ban on dealings with the Chinese owners of consumer apps TikTok and WeChat, although it remains unclear if he has the legal authority to actually ban the apps from the U.S. TikTok is owned by Chinese company ByteDance. (AP Photo/Ng Han Guan)
NEW YORK (AP) -- 

Sale talks for TikTok's U.S. operations have been complicated by the key question of whether the app's core algorithms can be included as part of a deal, according to a report in The Wall Street Journal that cited unidentified people familiar with the matter. 

Those algorithms decide what videos users see without first requiring them to follow other users or specify their preferences. The Journal report stated the algorithms were considered part of the deal negotiations up until Friday. 

That's when the Chinese government  introduced export restrictions on artificial intelligence technology that appear to cover content-recommendation algorithms such as the one powering TikTok. The move followed President Donald Trump's effort to force a sale of TikTok's U.S. operations by Sept. 20.

Those export restrictions mean that TikTok's Chinese owner, ByteDance Ltd., would have to obtain a license to export any restricted technologies to a foreign company. The question is whether its algorithms would need Chinese government approval for transfer, and if so, whether Beijing would sign off.

The Journal report said both the prospective buyers and the seller, ByteDance, are trying to figure that out. Prospective buyers for U.S. TikTok assets include a Microsoft-Walmart team-up and, reportedly, Oracle.

Representatives for TikTok did not immediately respond to a request for comment on Tuesday. Microsoft, Oracle and Walmart declined to comment. 

  • Saturday, Aug. 29, 2020
Elon Musk wants his Neuralink to build a brain-computer interface
In this Sunday, Jan. 19, 2020, file photo, Elon Musk, founder, CEO, and chief engineer/designer of SpaceX, speaks during a news conference after a Falcon 9 SpaceX rocket test flight to demonstrate the capsule's emergency escape system at the Kennedy Space Center in Cape Canaveral, Fla. Elon Musk is not content with just electric cars, populating Mars and building underground tunnels to solve traffic problems. He also wants to get inside your brain. His startup, Neuralink, wants to one day implant computer chips inside people's brains. (AP Photo/John Raoux, File)

Elon Musk isn't content with electric cars, shooting people into orbit, populating Mars and building underground tunnels to solve traffic problems. He also wants to get inside your brain.

His startup, Neuralink, wants to one day implant computer chips inside the human brain. The goal is to develop implants that can treat neural disorders — and that may one day be powerful enough to put humanity on a more even footing with possible future superintelligent computers. 

Not that it's anywhere close to that yet. 

In a video demonstration Friday explicitly aimed at recruiting new employees, Musk showed off a prototype of the device. About the size of a large coin, it's designed to be implanted in a person's skull. Ultra-thin wires hanging from the device would go directly into the brain. An earlier version of the device would have been placed behind an ear like a hearing aid.

But the startup is far from having a commercial product, which would require complex human trials and FDA approval, among many other things. Friday's demonstration featured three pigs. One, named Gertrude, had a Neuralink implant.

Musk, a founder of both the electric car company Tesla Motors and the private space-exploration firm SpaceX, has become an outspoken doomsayer about the threat artificial intelligence might one day pose to the human race. Continued growth in AI cognitive capabilities, he and like-minded critics suggest, could lead to machines that can outthink and outmaneuver humans with whom they might have little in common. The proposed solution? Link computers to our brains so we can keep up. 

Musk urged coders, engineers and especially people with experience having "shipped" (that is, actually created) a product to apply. "You don't need to have brain experience," he said, adding that this is something that can be learned on the job. 

Hooking a brain up directly to electronics is not new. Doctors implant electrodes in brains to deliver stimulation for treating such conditions as Parkinson's disease, epilepsy and chronic pain. In experiments, implanted sensors have let paralyzed people use brain signals to operate computers and move robotic arms. In 2016, researchers reported that a man regained some movement in his own hand with a brain implant.

But Musk's proposal goes beyond this. Neuralink wants to build on those existing medical treatments as well as one day work on surgeries that could improve cognitive functioning, according to a Wall Street Journal article on the company's launch.

While there are endless, outlandish applications for brain-computer interfaces (gaming, or, as someone on Twitter asked Musk, summoning your Tesla), Neuralink wants to first use the device with people who have severe spinal cord injuries to help them talk, type and move using their brain waves.

"I am confident that long term it would be possible to restore someone's full-body motion," said Musk, who's also famously said that he wants to "die on Mars, just not on impact." 

Neuralink is not the only company working on artificial intelligence for the brain. Entrepreneur Bryan Johnson, who sold his previous payments startup Braintree to PayPal for $800 million, started Kernel, a company working on "advanced neural interfaces" to treat disease and extend cognition, in 2016. Facebook CEO Mark Zuckerberg is also interested in the space. Facebook bought CTRL-labs, a startup developing non-invasive neural interfaces, in 2019 and folded it into Facebook's Reality Labs, whose goal is to "fundamentally transform the way we interact with devices."

That might be an easier sell than the Neuralink device, which would require recipients to agree to have the device implanted in their brain, possibly by a robot surgeon. Neuralink did not respond to requests for comment on Friday. 

  • Friday, Aug. 28, 2020
Sony upgrades BRC camera series to simplify VR/AR production workflows
Sony's BRC series of cameras

As part of ongoing efforts to help productions work safely and remotely, Sony is upgrading its BRC camera range with a new set of features.

Designed for remote production and efficient operations, the V2.1 firmware upgrade to the BRC-X1000/1, BRC-X1000/WPW and BRC-H800/1, BRC-H800/WPW will allow producers and operators to simplify their VR/AR production workflows.

Through this update, the BRC cameras will output tracking data over IP using the industry standard Free-D protocol. This enables the cameras to feed pan, tilt, zoom, focus and iris data, as well as the camera's position, directly and in real time, making simple and cost-effective VR/AR production possible without additional tracking devices or systems.
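
For readers unfamiliar with Free-D, tracking data is typically carried as small fixed-size packets over UDP. The sketch below is a hypothetical receiver based on the commonly published layout of the Free-D "type D1" message; the field scaling and port are assumptions for illustration, not Sony-supplied code.

    # Hypothetical Free-D "type D1" receiver. Assumed 29-byte layout:
    # 0xD1, camera ID, then big-endian 24-bit fields for pan/tilt/roll
    # (signed, 1/32768 degree), X/Y/Z (signed, 1/64 mm), zoom and focus
    # (unsigned counts), two user bytes, and a checksum byte.
    import socket

    def s24(b):
        # Decode a signed 24-bit big-endian integer.
        v = int.from_bytes(b, "big")
        return v - (1 << 24) if v & 0x800000 else v

    def parse_d1(pkt):
        assert len(pkt) == 29 and pkt[0] == 0xD1, "not a type D1 packet"
        return {
            "camera": pkt[1],
            "pan_deg": s24(pkt[2:5]) / 32768.0,
            "tilt_deg": s24(pkt[5:8]) / 32768.0,
            "roll_deg": s24(pkt[8:11]) / 32768.0,
            "x_mm": s24(pkt[11:14]) / 64.0,
            "y_mm": s24(pkt[14:17]) / 64.0,
            "z_mm": s24(pkt[17:20]) / 64.0,
            "zoom": int.from_bytes(pkt[20:23], "big"),
            "focus": int.from_bytes(pkt[23:26], "big"),
        }

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 40000))  # placeholder port
    while True:
        data, _ = sock.recvfrom(64)
        print(parse_d1(data))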

This new feature will allow productions to easily incorporate VR/AR into their live content, such as expanded sets or scenery, live animations, e-sports and graphic overlays, enriching their production.

The Free-D protocol is an industry standard supported by major AR/VR solutions providers. The BRC-X1000 and BRC-H800 are currently under verification with The Future Group (Pixotope), Reckeen, Vizrt and Zero Density, and Sony plans to support integration with other partners implementing Free-D data.

Enhanced operations
As social distancing reduces operational crew sizes, the update will also improve the pan/tilt/zoom operations of the BRC-X1000 and BRC-H800. A reduced minimum speed allows the cameras to more accurately track an object on the set and facilitates shot framing. The output will be more realistic and smoother, even with non-professional operators. Additionally, the cameras will now focus as soon as a preset is recalled, at the same time as the PTZ function, to enable more natural camera movements. Control via a physical remote controller, such as a pan-bar controller, will also be supported.

The firmware V2.1 upgrade for BRC-X1000 and BRC-H800 is planned to be available for free on August 31.
