  • Wednesday, Aug. 3, 2022
Blackmagic Pocket Cinema Camera Deployed On "Three Headed Beast"
"Three Headed Beast"
FREMONT, Calif. -- 

Having recently premiered at the Tribeca Film Festival, the feature Three Headed Beast was shot with a Blackmagic Pocket Cinema Camera 4K digital film camera. The camera allowed the film’s minimal crew to create a sense of closeness for the nearly wordless film that relies on intimacy, body language and music to tell its story.

One of only 10 films selected for Tribeca’s U.S. Narrative Competition, Three Headed Beast tells the story of Peter and Nina, a loving long-term couple navigating an open relationship, and Alex, who has formed a deep connection with Peter. Their relationships and individual desires collide over a hot Texas summer.

Shot entirely with natural light and practicals and with a small crew of only four, Three Headed Beast needed a camera that allowed the team to be adaptable while delivering a high quality, cinematic image. According to director and DP Fernando Andres, the Pocket Cinema Camera 4K’s dual native ISO helped beautifully capture the natural light, and its portable design allowed the crew to easily shoot on the fly in public locations....

  • Thursday, Jul. 28, 2022
TV Academy unveils recipients of 74th Engineering, Science & Technology Emmy Awards
The Television Academy's John Leverence
LOS ANGELES -- 

The Television Academy has revealed the recipients of the 74th Engineering, Science & Technology Emmy® Awards honoring an individual, company or organization for developments in broadcast technology.

Kirsten Vangsness, who starred for 15 seasons on the critically acclaimed CBS drama Criminal Minds and is starring in the upcoming 16th season of the series for Paramount+, returns to host the awards for the seventh consecutive year on Wednesday, Sept. 28, 2022.

“Innovation is a vital part of television production; and the talented engineers, scientists and technologists we have recognized are essential to the growth of our industry,” said Frank Scherma, chairman and CEO of the Television Academy. “These pioneering companies and visionaries have leveraged the power of technology to elevate television and storytelling in fundamental ways.”

“Earlier this year the Academy formed the Science & Technology Peer Group representing members who are involved in the strategy and development of technologies that enable or advance the storytelling process for the television industry,” said Committee chair John Leverence. “Under the leadership of the new peer group’s governors and co-chairs Wendy Aylsworth and Barry Zegel, this year’s newly constituted Engineering Emmy Awards Committee honors a wide range of innovative solutions to once seemingly intractable technical problems.”

The following is a list of awards and recipients to be recognized:

The Charles F. Jenkins Lifetime Achievement Award
Honors a living individual whose ongoing contributions have significantly affected the state of television technology and engineering.

Recipient: Dr. Paul E. Debevec
Paul Debevec is awarded the 2022 Charles F. Jenkins Lifetime Achievement Award for his groundbreaking work in high dynamic range imaging, image-based lighting and photogrammetry, essential techniques used in computer graphics for VFX and Virtual Production.

Debevec’s pioneering work makes it possible to record and reproduce the light of real scenes to illuminate virtual scenes and vice versa.

High dynamic range imagery is a mainstay of computer graphics and, combined with image-based lighting, has enabled realistic integration of existing live-action lighting in computer-rendered images. These tools and concepts are now a standard within the VFX industry for rendering. The concepts and the innovative use of LED lighting Paul pioneered with the Light Stage have further laid the groundwork for the use of LED lighting in virtual production, which has seen rapid growth as a tool for lighting actors on virtual stages.

The Philo T. Farnsworth Corporate Achievement Award
Honors an agency, company or institution whose contributions over time have significantly impacted television technology and engineering.

Recipient: ARRI
ARRI is awarded the Philo T. Farnsworth Corporate Achievement Award for more than a century of designing and manufacturing camera and lighting systems, as well as systemic technological solutions and service networks, for a worldwide complex of film, broadcast and media industries.

Industry professionals have long relied on the stability and versatility of ARRI equipment in a portfolio that includes digital cameras, lenses, camera accessories, archive technologies, lamp heads and lighting accessories. Along with offering exclusive technologies, ARRI Rental’s services and equipment provide camera, lighting and grip packages to professional productions here and abroad.

ARRI cameras have connected the creativity and technology that have made filmed entertainment the premier medium of our time. Dedicated to maintaining its place in the forefront of the development of future technologies for the capture of moving images, ARRI has been at it for 100+ years … and counting.

Engineering Emmys
Presented to an individual, company or organization for developments in engineering that are either so extensive an improvement on existing methods or so innovative in nature that they materially affect the production, recording, transmission or reception of television.

This year’s seven (7) Engineering Emmy recipients are:

Recipients: Mark Hills and Marc Bakos for the Cleanfeed remote audio review/recording system
Cleanfeed is a high-fidelity “conference call” software as a service with a focus on audio production. It enables collaboration with audio quality equivalent to all participants being together in the same studio, and with latencies low enough for smooth interaction between talent. Cleanfeed’s innovative technology has advanced workflows in the industry, including being accessible to at-home engineers and talent, who use a straightforward link in their web browser to enable full-scale TV and film post-production.

 
Recipient: disguise Systems Ltd. for the disguise platform
The disguise platform is an advancement in image processing that incorporates elements of video playback and real-time technology to improve interaction between computer graphic elements, digital images and environments, physical actors, props, and practical sets. The combination of the disguise platform with LED walls and camera tracking enables real lighting information on actors (and real objects), support for reflective and refractive props, more natural shot lineups, and a production environment where creative decisions can be made quickly and with improved collaboration. Utilizing real-time 3D visualization-based software and robust hardware, the tools in the disguise platform seamlessly integrate and direct an array of technologies including camera tracking and real-time content engines that incorporate analog physical space into a virtual digital world.

Recipient: Industrial Light & Magic for the StageCraft virtual production tool suite
ILM StageCraft is an end-to-end virtual production tool suite that bridges the gap between practical physical production methodologies and traditional digital post-production visual effects by providing the ability to design, scout and light environments in advance of the shoot and then capture that vision in camera during principal photography. StageCraft brings together a real-time engine, a real-time renderer, high-quality color management, physical camera equipment, LED displays, motion-capture technologies, synchronization methodologies and tailored on-set user interfaces to digitally create the illusion of 3D backgrounds for live-action sets.

Recipients: Geoffrey Crawshaw and William Brinkley for the Leostream remote access software
Leostream’s remote access and desktop connection management software enables news and entertainment organizations to create security-conscious remote production environments that are sustainable, performant and cost-effective. Its ability to mix and manage on-premises and cloud-based hosting platforms and support for multiple high-performance display protocols ensures the productivity of editors and production engineers, while simplifying IT. The ability to manage disparate technologies from a single management and access platform is a uniquely Leostream construct that enables organizations to advance the state of the art of their entire hosted desktop environment with an eye on integrating new technologies as they come to market.

Recipient: Shure Incorporated for the Axient® Digital wireless audio system
The Shure Axient® Digital Wireless System equips audio production teams with the wireless capabilities necessary to deliver transparent, true, artifact-free audio for television broadcasts, with the highest-performance RF (radio frequency), exceptional audio quality, command and control, and hardware scalability necessary to tell stories seamlessly on the most demanding sets in the world.

Recipient: Sohonet for the ClearView Pivot remote collaboration tool
Sohonet’s ClearView Pivot is a real-time remote collaboration tool with the flexibility to connect creative users at the click of a button, allowing the user to stream color and frame-accurate footage in 4K HDR, 12-bit color depth and 4:4:4 chroma sampling in real time with ultra-low latency. From a single web interface (and without having to set up new firewall configurations for each usage), ClearView is able to efficiently connect numerous participants for multi-point review sessions.
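
For a sense of scale, a rough back-of-envelope calculation of the uncompressed data rate behind those specs is sketched below; the resolution and frame rate used here are assumptions for illustration, since the announcement does not state them.

```python
# Rough, illustrative arithmetic only: uncompressed data rate for a 4K,
# 12-bit, 4:4:4 stream. DCI 4K resolution and 24 fps are assumptions;
# the article does not specify either.
width, height = 4096, 2160       # assumed DCI 4K frame size
fps = 24                         # assumed frame rate
bits_per_sample = 12             # 12-bit color depth (from the article)
samples_per_pixel = 3            # 4:4:4 chroma sampling keeps all three channels per pixel

bits_per_second = width * height * samples_per_pixel * bits_per_sample * fps
print(f"Uncompressed payload: {bits_per_second / 1e9:.1f} Gbit/s")  # roughly 7.6 Gbit/s
```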

Recipients: Stype Cajic, Andrija Cajic, Daniel Kruselj and Ivica Antolkovic for the stYpe suite of optical/camera tracking tools
The stYpe suite of optical/camera tracking tools includes StypeKit, the first bolt-on mechanical tracking kit for camera cranes. Used to retrofit existing cranes into virtual production cranes (complete with lens data delivered to an ethernet interface), StypeKit simplified setup procedures and implemented lens calibration procedures that eliminated graphics drift, while its auto-aiming functionality made VR shots smoother. Additionally, the suite's optical camera tracking system (RedSpy) produces a point cloud marker system used to calculate the position of the camera, which, in conjunction with StypeKit's crane tracking, satisfies the most demanding needs of live productions.

  • Thursday, Jul. 7, 2022
Twitter says it removes 1 million spam accounts a day
This Nov. 4, 2013, file photo shows the icon for the Twitter app on an iPhone in San Jose, Calif. Twitter, in a call with executives Thursday, July 7, 2022, said it removes 1 million spam accounts each day. The briefing aimed to shed more light on the company's fake and bot accounts as it tussles with Elon Musk over “spam bots.” (AP Photo/Marcio Jose Sanchez, File)

In a call with executives Thursday, Twitter said it removes 1 million spam accounts each day. The briefing aimed to shed more light on the company's fake and bot accounts as it tussles with Elon Musk over "spam bots."

The Tesla CEO, who has offered to buy Twitter for $44 billion, has threatened to walk away from the deal if the company can't show that less than 5% of its daily active users are automated spam accounts.

Musk has argued, without presenting evidence, that Twitter has significantly underestimated the number of these "spam bots" — automated accounts that typically promote scams and misinformation — on its service.

Twitter said on the call that the spam accounts represent well below 5% of its active user base each quarter.

Fake social media accounts have been problematic for years. Advertisers rely on the number of users provided by social media platforms to determine where they will spend money. Spam bots are also used to amplify messages and spread disinformation.

The problem of fake accounts is well-known to Twitter and its investors. The company has disclosed its bot estimates to the U.S. Securities and Exchange Commission for years, while also cautioning that its estimate might be too low.

Last month, Twitter offered Musk access to its "firehose" of raw data on hundreds of millions of daily tweets, according to multiple reports at the time, though neither the company nor Musk confirmed this.

Barbara Ortutay is an AP technology writer.

  • Tuesday, Jun. 21, 2022
Hayden5 deploys virtual camera crew in the Metaverse
Hayden5's Metaverse Capture Service
LOS ANGELES & NEW YORK -- 

Video agency Hayden5 is launching a Metaverse Capture Service, a video production service geared toward brands and experiential marketers producing virtual events. The new service deploys “virtual” videographers and camera crews to capture Metaverse experiences on leading virtual platforms such as Roblox, Minecraft, Horizon Worlds, and Decentraland. 

To R&D its new offering, Hayden5 recently entered the Metaverse to document the Grammy Week experiences hosted on Roblox. Hayden5 also headed to the red carpet to capture Fashion Week 2022 in Decentraland and demonstrated how you can interview an avatar with a handheld-camera feel from thousands of miles away – all in real-time. 

“We deploy professional camera crews to capture, stream, and recap IRL events, so why not do the same in the Metaverse?” said Todd Wiseman Jr, co-founder and creative director at Hayden5. “Our Metaverse Capture Service is a game-changer for brands and entertainment providers creating experiences there. From specific shot selection to multi-cam coverage to scripted and live content, we can now extend what happens in the virtual world to more consumer-friendly video formats.” 

Metaverse Capture combines traditional production infrastructure and state-of-the-art technology. All of the capture methods are platform-dependent and customizable, with built-in cinema tools for the highest quality capture. The shoots can be conducted entirely virtually, either in real-time, or using pre-programmed moves. Most captures are software or controller-based, which is the primary application of the service. Clients also have the option of deploying professional camera operators, equipped with VR hardware, for a natural hand-held look. 

Features of the service include:

  • Mimic traditional camera moves – crane, dolly, or Steadicam – in 3D space
  • Customize capture for emerging platforms like Minecraft, Horizon Worlds, Roblox, Decentraland, and VRChat
  • Conduct shoots in real time or with pre-programmed moves (see the sketch after this list)
  • Choose capture services that range from hands-off and algorithmic to manually controlled by anything from a video game controller to a real camera rig outfitted with virtual reality equipment
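
As a loose illustration of what a pre-programmed virtual camera move can amount to in practice, the sketch below interpolates a camera position frame by frame along a straight dolly path; the coordinates, duration and function name are hypothetical and not drawn from Hayden5's tooling.

```python
# Minimal, hypothetical sketch of a pre-programmed virtual dolly move:
# linearly interpolate the camera position from a start point to an end
# point over a fixed number of frames. Not Hayden5's actual tooling.
def dolly_move(start, end, frames):
    """Yield one (x, y, z) camera position per frame along a straight dolly track."""
    for f in range(frames):
        t = f / (frames - 1)
        yield tuple(s + t * (e - s) for s, e in zip(start, end))

# Example: a 5-second push-in at 30 fps, moving 4 units closer along the z axis.
for position in dolly_move(start=(0.0, 1.7, 10.0), end=(0.0, 1.7, 6.0), frames=150):
    pass  # in practice, each position would drive the engine's virtual camera
```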

“Just as we’ve scaled the ability to produce video content globally with remote production services and products, such as Drop Kits, Crew+, and Cloud Cuts, Metaverse Capture is infinitely scalable, and a powerful tool for bringing Metaverse experiences beyond the fragmented ecosystems they live in,” concluded Wiseman Jr.

  • Wednesday, Jun. 8, 2022
DP Glen Keenan Cookes Up A New "Star Trek" Look
Glen Keenan, CSC
LEICESTER, UK -- 

Using three sets of Cooke Optics Anamorphic/i Full Frame Plus Special Flare lenses, cinematographer Glen Keenan, CSC, achieved his desire for the most organic, non-studio look to convince the audience they were seeing a real location in Star Trek: Strange New Worlds (Paramount+).

A spin-off from Star Trek: Discovery and a prequel to Star Trek: The Original Series, Strange New Worlds follows Captain Christopher Pike and the crew of the USS Enterprise. The Star Trek Universe’s journey to anamorphic full frame with special flare for its streaming series started with Star Trek: Discovery (Disco), for which Keenan served as cinematographer for seasons one through three.

“Season two of Disco was our move to anamorphic primes [for 2.39:1 for streaming], and that won me over,” said Keenan. “For Star Trek, there’s a studio, but no reality. I want to convince the audience that we are in a real space with a lens that would add more organic qualities to the image. The Cooke anamorphic special flares have the right amount of aberrations and flare for the signature Star Trek blue streak flare. Two things really help with reality: the expected inconsistencies between lenses help to ground the story like we were really there and the anamorphic falloff. Both of those features help to deliberately frame the action to where I want the audience to focus on.”

The offer for Keenan to move to Star Trek: Strange New Worlds and develop the look came from Alex Kurtzman, creator and EP of Star Trek: Discovery.

“There was a moment when shooting Disco in anamorphic that I knew I really wanted full frame special flare for 4K for Strange New Worlds. My supplier got on the horn with Cooke. And Cooke built a custom set for Strange New Worlds, delivering two sets before episode one, then the third set once they were made. Cooke stepped up with full frame anamorphic special flare for day one. It was remarkable. Three sets in two months.”

  • Wednesday, Jun. 8, 2022
Resolve Needed For "Writing With Fire"
A scene from "Writing With Fire"
FREMONT, Calif. -- 

Writing With Fire, nominated for this year’s Best Documentary Feature Oscar, completed postproduction using Blackmagic Design's DaVinci Resolve Studio editing, color grading, visual effects (VFX) and audio postproduction software and DaVinci Resolve Advanced Panels. This included color correction, online editing and managing the film’s various delivery needs.

Directed by Sushmit Ghosh and Rintu Thomas, Writing With Fire tells the story of Khabar Lahariya, India’s only newspaper run by Dalit women. The film shows how, in a cluttered news landscape dominated by men, Khabar Lahariya’s chief reporter and her two fellow journalists broke traditions and redefined what it means to be powerful.

Mumbai-based Bridge PostWorks provided postproduction for the film. Industry vet Sidharth Meer graded and provided online editing and conforming.

The film was primarily shot over four years in Uttar Pradesh, an Indian state with extreme climates and incredibly colorful paint on the interiors and exteriors of homes. Shots for the film were a mix of talking head interviews and footage of the three reporters in action. With DaVinci Resolve Studio, Meer adjusted each shot to maintain a look that focused on the subjects and not the intense color and lighting of the backgrounds.

Beyond color correction, Meer relied heavily on DaVinci Resolve Studio’s editing and delivery tools. The film required a lot of conform work, including superimposed titles and graphics, separate looks for material that originated on smartphones, and clips that had to look like YouTube webpages with video playing within them, complete with their own subtitles.

Writing With Fire was shown in theaters and at dozens of online film festivals, in addition to being delivered to the Motion Picture Academy for Oscar consideration. Resolve accommodated delivery of all these versions.

  • Monday, Jun. 6, 2022
Apple offers glimpse at upcoming changes to iPhone software
Apple CEO Tim Cook poses for photos as he holds one of the new Apple MacBook Air computers with an M2 processor, Monday, June 6, 2022, following the keynote presentation of Apple's World Wide Developer Conference on the campus of Apple's headquarters in Cupertino, Calif. (AP Photo/Noah Berger)

Apple on Monday provided a peek at upcoming tweaks to the software that powers more than 1 billion iPhones and rolled out two laptops that will be the first available with the next generation of a company-designed microprocessor.

As usual, Apple spent most of the opening day of its annual developers conference touting the next versions of software for the iPhone, iPad, Apple Watch and Mac computers instead of the sleek devices that established it as a technology trendsetter and the world's most valuable company.

The iPhone's next operating system, called iOS 16, will revamp the look of the device's lock screen and make mostly minor improvements to the current software. The software updates have become increasingly important in recent years as iPhone owners have started to hold on to their existing devices for longer than they once did.

One of iOS 16's most noticeable differences will occur on the iPhone lock screen. The new software, which will be released this fall as a free download, will allow users to anchor their favorite apps as small widgets on the lock screen.

The new software also will enable the lock screen to display live notifications, such as the status of an Uber ride on its way to pick up a passenger. Other authorized notifications will come in from the bottom of the screen instead of the top, as they do now, in an effort to avoid clutter on the display.

The iPhone's messaging system will be revamped so texts can be edited after they are sent or even rescinded in their entirety if the sender has a change of heart. Those options will only be available when both users are using Apple's messaging app for texting.

The Apple Pay service that's part of the iPhone's digital wallet is adding a new financing feature likely to be popular as soaring inflation rates squeeze more household budgets. The option will allow consumers to stagger the cost of any purchase made through Apple Pay over four installments completed within a six-week period with no additional fees. Similar financing is already offered through digital services such as Affirm, whose stock price sank by more than 5% Monday after the news about Apple Pay came out.
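
As illustrative arithmetic only (the exact payment schedule is not spelled out in the announcement), splitting a purchase into four equal, fee-free installments over six weeks works out roughly as follows.

```python
# Illustrative arithmetic only: a hypothetical $200 purchase split into four
# equal, fee-free installments, with one payment every two weeks over six weeks.
# The specific schedule is an assumption, not Apple's published terms.
purchase_total = 200.00
installments = 4
payment = round(purchase_total / installments, 2)
schedule = [(week, payment) for week in range(0, 8, 2)]  # weeks 0, 2, 4, 6
print(schedule)  # [(0, 50.0), (2, 50.0), (4, 50.0), (6, 50.0)]
```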

Several of the new features for Apple's Macs and iPads are designed to make it easier to sync with the iPhone for things like making video calls. Other tools will enable more apps to run side by side to perform multiple tasks on the same screen.

Helping people toggle from one Apple device to another is one of the main reasons that the company began making Macs that run on the same kind of chips that power the iPhone and iPad in late 2020.

Now Apple is putting the next generation of its Mac chip in its two most popular laptops, the MacBook Air and MacBook Pro, which the company said will be available in stores at some point next month. The MacBook Air will sell for $1,200 and the MacBook Pro will sell for $1,300.

The event was held at Apple headquarters in Cupertino, California.

Michael Liedtke is an AP technology writer.

  • Wednesday, May. 25, 2022
Autodesk invests in RADiCAL, AI-powered 3D motion capture 
RADiCAL reproduces skeletal joint rotations in 3D from a single conventional video feed for use with animated characters
NEW YORK -- 

Autodesk announced its investment in RADiCAL, the New York-based developer of a born-in-the-cloud, AI-powered 3D motion capture solution. This investment marks the latest move by Autodesk toward democratizing end-to-end production in the cloud for content creators, and builds on the recent acquisitions of Moxion and LoUPE.  

Through its collaborative real-time platform, RADiCAL facilitates 3D motion capture, human virtualization and analysis at massive scale. RADiCAL’s proprietary AI combines modern deep learning strategies, human biomechanics, and computer graphics to estimate, track, and reproduce skeletal joint rotations in 3D from a single conventional video feed. From videos to metaverses, this data can be used to automate the animation of 3D characters and avatars.  
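
RADiCAL's AI is proprietary, but as a rough, general illustration of estimating 3D joint positions from a single conventional video feed, the sketch below uses the open-source MediaPipe Pose library as a stand-in; it is not RADiCAL's technology or API, and the input file name is hypothetical.

```python
# General single-camera 3D pose estimation sketch using MediaPipe Pose as a
# stand-in for illustration; this is not RADiCAL's proprietary system or API.
import cv2
import mediapipe as mp

pose = mp.solutions.pose.Pose(static_image_mode=False)
cap = cv2.VideoCapture("performance.mp4")  # hypothetical input clip

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.pose_world_landmarks:
        # 33 estimated 3D joint positions per frame, which downstream tools
        # could retarget onto an animated character or avatar.
        joints = [(lm.x, lm.y, lm.z) for lm in results.pose_world_landmarks.landmark]

cap.release()
pose.close()
```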

Requiring no special hardware, training or custom coding, RADiCAL’s cloud-based solution removes the barrier to entry typically associated with 3D motion capture and character animation, making the technology accessible to everyone, everywhere, at any time. Aspiring professionals can use its powerful AI to start creating engaging animated characters and hone their skills for more advanced 3D animation work. 

Content creators around the world, including facilities and educational institutions such as Aircards, Surreal Events, Savannah College of Art & Design, Full Sail University and others, are already using RADiCAL to incorporate motion capture into their projects at scale. This investment from Autodesk will fuel RADiCAL’s efforts to further advance its AI to serve a growing community, and support professional animation and VFX workflows.  

“RADiCAL’s accessible, easy to use solution appeals to up-and-coming artists and next gen content creators,” said Diana Colella, SVP, Autodesk Media & Entertainment. “We look forward to working with RADiCAL to extend the sophisticated AI engine powering its solution to put collaborative real time motion capture capabilities in the hands of additional professional creators.  As we continue to build out our vision for cloud-based content-creation, partners are key to driving innovation.” 

“We have deep roots in media and entertainment technology development, and many of us are long-time users of tools like Maya and 3ds Max,” said Gavan Gravesen, founder & CEO, RADiCAL. “We’re thrilled Autodesk is supporting our mission to become the market standard for fast, accessible, and advanced 3D motion capture. This investment will help us expand the capabilities of our AI, scale our cloud infrastructure, and uplevel collaborative editorial features for the film, television and games markets.” 

Surreal, a web-based virtual event platform, incorporates real-time avatars into its projects using RADiCAL.  

“RADiCAL’s innovative technology is a game-changer for Surreal, enabling platform presenters to engage with a live audience as if they were physically onsite,” said Nick Grant, co-founder and chief product officer, Surreal. “In addition, RADiCAL gives Surreal users dynamic control of avatar movement using only a single web camera. Likewise, capturing animation data at a user’s location enables real-time user-controlled animations, and optimizes interactions that enhance the overall Surreal experience.”

  • Wednesday, May. 11, 2022
Blackmagic deployed for HBO period drama "The Gilded Age"
Christine Baranski (l-r), Cynthia Nixon and Louisa Jacobson in a scene from "The Gilded Age" (photo by Alison Cohen Rosa/courtesy of HBO)
FREMONT, Calif. -- 

 A Blackmagic URSA Mini Pro 12K digital film camera was used to capture visual effects (VFX) plates for the HBO drama series “The Gilded Age.” VFX supervisor Lesley Robson-Foster also used a Blackmagic Pocket Cinema Camera 6K as a witness camera for the period drama that transports New York City back to 1882.

“The Gilded Age” pits old money versus new as American society goes through an era of incredible change and opulent wealth. Starring Christine Baranski, Cynthia Nixon, Carrie Coon, Louisa Jacobson and more, “The Gilded Age” follows the Van Rhijn and Russell households who are at the center of the societal battle.

In total, the VFX team delivered 1,500 VFX shots over nine episodes for the series, covering everything from period-correct street scenes to a train crash, a full music hall concert scene and a ferry boat terminal from the 1880s. According to Robson-Foster, “This project was unique because so much of it was virtual. There were many big scenes we filmed entirely in a greenscreen set with just a practical doorway or a small section of physical set.”

“The main event was the construction of 61st Street between Fifth Avenue and Madison Avenue in New York City, where the Van Rhijns and Russells lived. The street was partially physically built and then completed virtually as a computer generated model. To help build the scene, we needed to shoot plates for the view that faced Central Park, so we relied on the URSA Mini Pro 12K in Blackmagic RAW,” said Robson-Foster. “Due to COVID, we shot a lot of the show in the winter when it was meant to be summer, so there was a lot of tree plate shooting. We also used CG trees, but without question it was better to use the plates for composite.”

When shooting, Robson-Foster relied on the URSA Mini Pro 12K’s usability to be quick on her feet. “When we went off as our own VFX unit to shoot the plates, we needed to be as flexible and compact as possible. The URSA Mini Pro 12K helped us be self sufficient,” she said. “The camera is very user friendly, and the interface is well thought out. Under time pressure and when chasing the light, it was nice not to get lost in menus and just easily get what we needed.”

Similarly, the Pocket Cinema Camera 6K’s intuitive Blackmagic OS and compact design made it ideal as a witness camera. Robson-Foster added, “We needed to be able to set up our VFX cameras quickly and reliably whilst the main unit was shooting. The Pocket Cinema Camera 6K’s small design was compact enough to use in the shot as a witness camera. We used it to shoot toward the main unit camera to provide footage we could then use as a reflection on windows and horse carriages.”

“For a period drama of this grandeur, every little detail counts. Even in a heavily virtual set, we’re able to add those tastes of realism into our VFX through incorporating VFX plates and witness footage whenever we can. The Blackmagic Design cameras made it easy to capture the assets we needed to not only turn winter into summer but also the clock back 140 years to the 1880s,” Robson-Foster concluded.

  • Monday, May. 9, 2022
Meta opens first physical store
A man experiences the Quest 2 virtual headset during a preview of the Meta Store in Burlingame, Calif., Wednesday, May 4, 2022. (AP Photo/Eric Risberg)
BURLINGAME, Calif. (AP) -- 

Facebook parent Meta has opened its first physical store — in Burlingame, California — to showcase its hardware products like virtual and augmented reality goggles and glasses.

The store, which is open to the public as of Monday, is made for people who want to test out products like Ray-Ban Stories, Meta's AR glasses and sunglasses, along with the Portal video calling gadget and Oculus virtual reality headsets.

Shoppers still have to order the glasses from Ray-Ban but can buy the other products at the store.

"It's a very concrete step from moving away from social media and ads that mislead people and elections and spying and data and all those things to a very physical representation of clean, classy, well-designed, cool hardware that makes you go, ah," said Omar Akhtar, research director at Altimeter, a technology investment firm.

Akhtar said he "didn't believe in virtual reality" until he sat and tried on the Oculus headset for the first time — and believes this will be the same for others who are able to put on the goggles and try it out. Apple pioneered physical retail stores in Silicon Valley and Meta, which owns Instagram and Facebook, is likely hoping it'll replicate at least some of that success.

"The truth of it is that physical things never went away and they're never going to go away," Akhtar said. "Everybody realizes that even if we are going to step into the virtual world, we're going to need to access it with hardware."
