
Toolbox

  • Tuesday, May. 21, 2019
Submissions open for 71st Engineering Emmy Awards
NOHO ARTS DISTRICT, Calif. -- 

Submissions for the 71st Engineering Emmy® Awards are now open through Friday, June 7. The Engineering Emmy Award honors an individual, company or organization for engineering developments that either considerably improve existing methods or are so innovative that they materially affect the transmission, production, recording or reception of television.

The 2019 Engineering Emmy Awards entry form can be downloaded from the Television Academy’s website at TelevisionAcademy.com/downloads.

Recipients of the Engineering Emmy, the Charles F. Jenkins Lifetime Achievement Award and the Philo T. Farnsworth Corporate Achievement Award will be selected by the Engineering Awards Committee, composed of highly qualified Academy members appointed from technically oriented Peer Groups. Winners will be presented with their Engineering Emmy at a ceremony on October 23, 2019.

Previous Engineering Emmy Award winners include AVID, Canon, Dolby Laboratories, Disney, FUJI, Netflix, NASA, Sony Corporation and YouTube.

  • Thursday, May. 16, 2019
DigitalFilm Tree deploys DaVinci Resolve to color “The 100”
Bob Morley (l) and Eliza Taylor in season 6 of "The 100"
HOLLYWOOD, Calif. -- 

DigitalFilm Tree (DFT) has brought color to the post-apocalyptic sci-fi drama The 100 with a team that includes sr. colorist Dan Judy and VP of post Chad Gunderson. Just prior to the premiere of its sixth season on April 30, The CW renewed The 100 for a seventh season.

Crafting environments like ice worlds, desolate desertscape cities and acid cloud storms, Judy is enthusiastic about the support he and DFT have received from Grant Petty and Blackmagic Design. Judy gave a presentation on his approach to color correction with Resolve at this year’s NAB Show, where Blackmagic announced the latest version of the software, DaVinci Resolve 16.

“I worked with the fourth DaVinci machine ever built, and hand-in-hand with the people down at the facility that invented DaVinci back in the day,” he said. 

During his talk, Judy cited several Resolve features that have proven particularly useful on The 100: OpenFX plug-in support, which brings tools for everything from beauty work and Boris and Sapphire effects to the repair of damaged images directly into the Resolve color page, and unlimited Power Windows with strong tracking. Working clip-based gives proper access to the client’s camera-original material, maintained and managed on DFT’s object-storage SAN, and its nondestructive approach preserves the full dynamic range, protecting the show’s color design. Judy also has the freedom to use a LUT from production, applying it at his discretion when it benefits the scene at hand.
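That clip-based grading workflow is also scriptable. As a rough illustration only: the sketch below uses Resolve’s Python scripting API to apply a production LUT across a timeline, assuming a Resolve 16-era build whose API exposes TimelineItem.SetLUT; the LUT path is hypothetical, not DFT’s actual setup.

```python
# Minimal sketch using DaVinci Resolve's Python scripting API.
# Assumes a build that exposes TimelineItem.SetLUT (Resolve 16 era);
# the LUT path is a placeholder, not DFT's actual configuration.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
timeline = project.GetCurrentTimeline()

# Apply the production LUT to node 1 of every clip on video track 1.
for clip in timeline.GetItemListInTrack("video", 1):
    clip.SetLUT(1, "/luts/production_day_ext.cube")
```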
 
Shooting on the ARRI Alexa provides extra dynamic range for matching footage shot at different times of day, and for matching to the other cameras used in production: GoPro, RED and Canon. With The 100’s color design constantly evolving across locations and seasons, Resolve’s own evolving technology helps the DFT team meet those challenges. The previous release, version 15, incorporated the Fusion platform, which allows edit, color and VFX to contribute to the project’s final outcome and gives the client real-time reviews of that work, even remotely.

Co-executive producer and director Tim Scanlan, Judy’s liaison on The 100 and his creative partner for more than two decades, can monitor and provide feedback via DFT’s remote services from the production offices for The 100 in Santa Monica or from his home office in Newport Beach, while Judy works on a dedicated Resolve workstation at DFT’s facilities in Hollywood. The duo’s lineage in color can be traced back to wildly popular broadcast shows, including ABC’s Charlie’s Angels and The CW’s Smallville, the latter of which they collaborated on for more than a decade.

“Over the years, Tim has pushed me very hard to utilize my tools to their fullest,” Judy continued, adding his thanks to DP Michael C. Blundell for his contributions to The 100. “Michael has been a real feather in our cap helping us out throughout the past five seasons.” Judy also expressed appreciation for associate producer Emanuel “E” Fidalgo. Gunderson thanked post supervisor Mark Knoob, whom he communicates with on a daily basis, and sr. editor Thomas Galyon.

“Dan Judy has a massive skill set, and can bend this software, somehow, to do magical things,” added Gunderson, who joined the post team for The 100 four seasons ago. “I help facilitate everything that the production needs around here from the time it hits online to finish. As this show is being managed primarily in Resolve’s remote workflows, it’s imperative that you have a strong management team, strong editorial team, and a strong color team, all working simultaneously, to see all of this through.”

  • Tuesday, May. 14, 2019
Christie partners with Cannes Film Festival for 13th year
CANNES, France -- 

Christie® is once again the projection technology partner for the Cannes Film Festival. For the 2019 festival, which runs May 14-25, Christie is supplying 36 digital cinema projectors from across its Solaria™ Series, along with accessories. It is the 13th year that Christie has supported the event.

The ongoing partnership with the Cannes Film Festival reflects Christie’s reputation in the cinema industry as a provider of market-leading digital cinema solutions and its commitment to offering the best image quality for one of cinema’s most prestigious events.

“It’s an honor for Christie to be involved in such an important and globally renowned festival, now in its 72nd year,” said Francis Zee, consultant, Christie. “The Cannes Festival is all about the celebration of cinema and delivering the director’s vision to an audience of their peers. Our technical approach is to deliver rock-solid reliability of image quality with our projectors, and let the movie speak for itself.”

Technical specification for the projection technology in the festival and market screening rooms is overseen by the CST (Commission Supérieure Technique de l’Image et du Son). “The technicians that make up CST are very experienced, and the operators inside all the projection rooms are hand-picked by the Festival,” added Zee. “It is a pleasure to work alongside them.”

In the festival’s five rooms--also known as Competition rooms--a combination of Christie’s CP4230, CP2230 and CP4220 projectors will be used. The main competition rooms will have both 2K and 4K projectors to match the content on show. Celebrating its 60th anniversary this year, the Marché du Film is the largest international gathering of professionals in the sector and will feature the Christie CP2215, a popular, compact 2K DCI Xenon projector. 

This year’s curtain raiser is The Dead Don’t Die, directed by Jim Jarmusch and starring Tilda Swinton, Adam Driver, Bill Murray, Selena Gomez, Chloe Sevigny and Iggy Pop. 
 

  • Wednesday, May. 8, 2019
AWS Cloud empowers Tangent Animation for Netflix’s “Next Gen” 
Tangent Animation studio at work in Toronto
TORONTO -- 

Animated sci-fi feature “Next Gen” tells the story of a lonely, rebellious teenage girl who teams up with a top-secret robot to stop a madman’s plot for world domination. Featuring the voice talent of John Krasinski, Charlyne Yi, Jason Sudeikis, Michael Peña, David Cross and Constance Wu, the CG action-adventure was created as a joint effort by Baozou and Tangent Animation, a subsidiary of Tangent Studios.

Released globally by Netflix, “Next Gen” marks the largest project that Tangent Animation has tackled to date both in size and scope, requiring four 2K resolution deliverables, including mono and stereo versions in English and Mandarin. With only 25 percent of the project’s rendering completed and three months until delivery, Tangent looked to AWS Thinkbox to help them scale compute resources with Amazon Web Services (AWS) Elastic Compute Cloud (EC2), ultimately completing more than 65 percent of the film’s rendering with AWS.

“AWS Cloud was a godsend for us on ‘Next Gen;’ it allowed us to render about two and half versions of the movie in just 36 days and far outstripped our on-premises capabilities. Without it, lighting and rendering would have to have started nearly eight months ahead of time and that would have required an entirely different creative strategy,” said Jeff Bell, Tangent Studios COO and co-founder. “We’ve always used Deadline for managing resources on premises, but our per hour compute costs are so low, we weren’t sure if the cloud was the right option for us until it became clear we needed a lot more CPU to get the job done. We couldn’t have hit our deadlines without AWS.”

Ken Zorniak, CEO and president, Tangent Studios, added, “Removing the limitations of a physical farm allows creatives to make changes to the story and look until the last minute, then let the computers do the heavy lifting. Since artists receive shots back faster using AWS, they can work more effectively. It’s definitely improved our studio’s flow and throughput and that helps keep everyone motivated and engaged.”

Tangent deploys open source software for 3D production, primarily using Blender for content creation. Tangent’s local 600-node farm is housed at the studio’s Winnipeg headquarters, where a staff of about 60 typically focuses on asset creation, lighting and VFX. Much of Tangent’s animation is done out of its Toronto studio, which has roughly twice the staff and is designed to be highly flexible.
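For context on what nodes in a Blender-based farm actually execute, render nodes typically run Blender headless from the command line. A minimal sketch with a hypothetical shot file and frame range (generic Blender usage, not Tangent’s actual job configuration):

```python
import subprocess

# Render frames 1-24 of a hypothetical shot with no UI, the way a
# render node (local or cloud) would be instructed to by a queue
# manager such as Deadline.
subprocess.run([
    "blender", "-b", "shot_010.blend",     # -b: run headless (no UI)
    "-o", "/renders/shot_010/frame_####",  # output path pattern
    "-s", "1", "-e", "24",                 # start and end frames
    "-a",                                  # render the animation range
], check=True)
```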

“Our Toronto office is run on a data center so looking to the cloud wasn’t a foreign concept,” Bell explained. “After just two months of setup, configuration and testing--which AWS Thinkbox helped us with--we went from being inexperienced to spinning up 3,000 AWS instances, five times the capacity of our local resources.”

Already well into production when it decided to leverage the AWS cloud, Tangent was able to use AWS Snowball, a data transport appliance that can be shipped to a studio’s location, to quickly load upwards of 100TB of data onto AWS servers. AWS Thinkbox helped the studio determine where to locate the data, and how to balance machine power and RAM needs against pricing and core availability, making use of economical Spot Instances where possible.
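As a rough sketch of what “making use of economical Spot Instances” looks like programmatically, the snippet below requests interruptible Spot capacity with boto3, AWS’s Python SDK. The AMI ID, instance type and counts are placeholders, not Tangent’s actual configuration:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

# Launch up to 100 render nodes as interruptible Spot capacity.
# A real render-node AMI (the ID below is a placeholder) would
# bundle Blender, the Deadline client and site configuration.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="c5.24xlarge",
    MinCount=1,
    MaxCount=100,
    InstanceMarketOptions={
        "MarketType": "spot",
        "SpotOptions": {"SpotInstanceType": "one-time"},
    },
)
print(f"launched {len(response['Instances'])} Spot instances")
```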

Looking to the future, Bell envisions broadening Tangent’s relationship with AWS. He shared, “I can see us moving our whole production pipeline to AWS: disk tiering for cold storage, remote users, backups, virtual workstations, and beyond. With AWS, we see a partner that we can rely on long term, not just to bring more cores online to finish a project but also a resource for the work we do beyond the animation studio in developing SaaS technology.”

Currently, Tangent Animation is working on a new project expected to be announced in the coming weeks.

  • Wednesday, May. 8, 2019
Cooke Optics to showcase wares at Cine Gear Expo
Cooke 50mm anamorphic full frame lens
LEICESTER, UK -- 

Cooke Optics will demonstrate its strength in the full frame arena at Cine Gear Expo 2019, with lenses from its S7/i and Anamorphic/i Full Frame Plus lens sets on Stand 67. The lens manufacturer will also present the latest developments for the /i Technology metadata system that provides detailed lens data to VFX and postproduction teams, and Cooke Optics TV will be live broadcasting interviews from the stand throughout the show. All nine of Cooke’s lens families will be represented on the stand.

The new Anamorphic/i Full Frame Plus range has been designed to meet the growing appetite for large format production, while offering the popular anamorphic characteristics including flare and oval bokeh. This range is also available with Cooke’s SF ‘Special Flair’ coating, which enables an exaggerated flare that gives yet more choice to cinematographers.

The 18mm and 180mm lenses from the S7/i full frame spherical range will also be featured on the Cooke stand. These, together with the 27mm, are going into production over the coming months to round out the range.

The Panchro/i Classic lenses, which emulate the look of vintage Cooke Speed Panchros, are rapidly growing in popularity for their painterly vintage look paired with the conveniences of modern housings and the ability to match lenses across the Panchro/i Classic range. Visitors to Stand 67 will see the recently announced 65mm Macro lens--a 2:1 macro--which also covers the full frame sensor.

Cooke will also present /i3 (/i Cubed), the latest version of its /i Technology metadata system. /i3 firmware now provides distortion mapping--not just a theoretical measurement for spherical lenses of a given focal length, but a map of the specific lens in use.
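To illustrate why per-lens distortion data matters in post, here is a toy example of the kind of correction such metadata enables. Cooke has not published /i3’s model; this sketch assumes a simple Brown-Conrady radial model whose per-lens coefficients k1 and k2 stand in for whatever the metadata actually carries:

```python
import numpy as np

def undistort(points, k1, k2, iterations=5):
    """Invert simple radial (Brown-Conrady) lens distortion.

    `points` is an Nx2 array of normalized image coordinates. k1 and
    k2 are per-lens radial coefficients of the kind lens metadata
    could supply, sparing post a grid-chart calibration pass.
    """
    undistorted = points.copy()
    for _ in range(iterations):  # fixed-point iteration; converges for mild distortion
        r2 = np.sum(undistorted ** 2, axis=1, keepdims=True)
        undistorted = points / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return undistorted

# Example: pull a corner point back from mild barrel distortion.
print(undistort(np.array([[0.8, 0.6]]), k1=-0.05, k2=0.01))
```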

“We have been pushing /i for a very long time as a standard for the industry, and we believe this latest version represents a sea-change for postproduction and producers to really understand the value of lens metadata to reduce time and costs in post,” said Les Zellan, chairman, Cooke Optics. “When we can literally show how lens data collected on set reduces tasks in post from hours to seconds--why wouldn’t you use it?”

In addition, the team from Cooke Optics TV will be on the stand shooting and broadcasting live to the Cooke Optics Facebook and Cooke Optics TV YouTube channels throughout the show, interviewing cinematographers, camera department and film production professionals. Cooke Optics TV is a lens-agnostic educational content channel for the film industry.

  • Tuesday, Apr. 30, 2019
Drone used to aid 3D re-creation of Japanese internment camp
In this Nov. 16, 2007, file photo, Bob Fuchigami looks through one of the albums of photographs that he has collected on Camp Amache during an interview at his home near Evergreen, Colo. Fuchigami was 12 years old when he and his family were forced to leave their 20-acre farm in Northern California for the Japanese-American internment camp in Granada, Colo. A University of Denver team is using a drone to create a 3D reconstruction of the camp in southern Colorado. The Amache effort is part of a growing movement to identify and preserve historical sites connected to people of color in the U.S. (AP Photo/Ed Andrieski, File)
DENVER (AP) -- 

A University of Denver team is using drone images to create a 3D reconstruction of a World War II-era Japanese internment camp in southern Colorado, joining a growing movement to restore U.S. historical sites linked to people of color.

Researchers last week dispatched the drone from the Switzerland-based company senseFly as part of a mapping project to help future restoration work at Camp Amache in Granada, Colorado.

The senseFly eBee X drone flew over the 1-square-mile (2.6-square-kilometer) site and took more than 4,000 images as part of a project to document where barracks, schools and other buildings once stood, said Adam Zylka, the senseFly pilot who flew the drone.

Currently, the site contains only concrete foundations, artifacts, a handful of restored buildings and a cemetery for internees who died at the camp.

But Zylka said researchers can use the information gathered by the drone to create virtual reality and augmented reality apps so that visitors can experience what life was like at the internment camp with almost precisely reconstructed images.

"This is a game changer," Jim Casey, geographic information system specialist with the University of Denver who has been working to create digital maps of Amache. "You could be standing at the site, looking at nothing for sagebrush and weeds. Then, you can point your smartphone at the view and see what was once there."

Casey said people who cannot go to the isolated location around 230 miles (370 kilometers) southeast of Denver will be able to visit the site virtually after researchers process the new drone data.

From 1942 to 1945, more than 7,000 Japanese-Americans and Japanese immigrants were forcibly relocated to Camp Amache. They were among the more than 110,000 Japanese-Americans ordered to camps in California, Colorado, Idaho, Arizona, Wyoming, Utah, Arkansas, New Mexico and other sites.

Executive Order 9066, signed by President Franklin D. Roosevelt, forced Japanese-Americans, regardless of loyalty or citizenship, to leave the West Coast and other areas for the camps surrounded by barbed wire and military police. Half of those detainees were children.

At Amache, internees lived in an area next to poor Mexican-American farm workers. They produced a newspaper, tried farming and formed football and baseball teams.

Casey said the recreation of the camp is important for the U.S. to come to terms with this dark period in history.

"Children and grandchildren of internees also are trying to learn about what their parents went through," he said. "That's because they rarely talked about it."

The Amache drone project is the latest example of preservation advocates working to save and restore historical sites connected to black, Latino and Asian American history.

A digital project headed up by Brown University professor Monica Martinez seeks to locate sites connected to racial violence along the Texas border with Mexico. Some of the sites she and other researchers have identified have resulted in historic markers documenting acts of violence against Mexican Americans from 1900 to 1930.

Advocates also are working to restore the birthplace of civil rights leader Dolores Huerta in Dawson, New Mexico. The old mining community in northern New Mexico is now a ghost town and there is no marker commemorating Huerta's connection to the area.

  • Monday, Apr. 29, 2019
Tim Burton's "Dumbo" delivered in Dolby Vision with Blackmagic Design
A scene from "Dumbo"
FREMONT, Calif. -- 

Blackmagic Design has announced that DaVinci Resolve Studio was used throughout the full color pipeline on Disney’s live action remake of Dumbo. Directed by Tim Burton with a screenplay by Ehren Kruger, the DI was delivered by Goldcrest Post’s Adam Glasman, who collaborated with DP Ben Davis, BSC.

Using an ACES workflow in DaVinci Resolve, Glasman and Davis began preproduction by defining a warm, golden-hour period look inspired by the layered colors of the original cel animation’s minimalist production design. The film was finished in 2K to enhance the soft, filmic quality of the rushes, and the team had to cater to a variety of different deliveries, including Dolby Vision 2D/3D, SDR 2D/3D, and both HDR and SDR Rec 709.

“While a lot of Dumbo was built and shot in-camera with amazing sets and many extras, a decision was made early on that the animals and all the skies would be CG,” said Glasman, explaining that purpose-built sets constructed against blue screen backgrounds would be used to film Dumbo. The integration of fully CG skies was crucial to reflect the expressionist, dramatic painted backdrops of the original animation.

Using an ATEM Television Studio HD switcher as part of the DIT workflow, the team was able to key several dramatic sky reference images, shot by Davis during preproduction, together with a live feed from the camera. With feedback from Burton, these were then used to inform the lighting and mood of the entire set.

“Tim was keen on keeping a good level of contrast in everything to help integrate the computer generated assets with the background,” Glasman continued. “The VFX vendors (MPC) were given references for how a scene would probably look and lit their CG accordingly, so I had to be very careful not to spoil that.”

This was especially important for the Dolby Vision deliveries, said Glasman. “The CG skies, for instance, look amazing. If you compare a traditional DLP projection at 48 nits to the Dolby Vision version at 1,000 nits, you instantly notice that you get a far wider color gamut with more added dimension with Dolby Vision. The sky is just as bright as it would be in the real world, so you have to treat it very sensitively.”
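For readers curious about the numbers behind that comparison: Dolby Vision masters are encoded with the SMPTE ST 2084 “PQ” transfer function, which maps absolute luminance in nits (cd/m²) to normalized code values. A minimal sketch evaluating the published PQ curve at both luminance levels (the constants come from the ST 2084 specification):

```python
def pq_encode(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> [0, 1] signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(round(pq_encode(48), 3))    # ~0.436 -- traditional cinema peak white
print(round(pq_encode(1000), 3))  # ~0.751 -- Dolby Cinema peak luminance
```

Roughly speaking, a 48-nit peak occupies under half of the PQ signal range, which is why highlights such as those CG skies gain so much headroom at 1,000 nits.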

The DI wasn’t just about maintaining the integrity of the picture, however. Working with Tim Burton also meant there were plenty of opportunities to experiment with color too.

“Tim’s genius came to light in a scene with Dumbo’s mother in a cage, with a strong red light on her,” Glasman concluded. “There are all these animals dressed up as monsters in the cages surrounding Dumbo’s mother, and Tim just decided we should give those other cages strong colors too. I had a lot of fun making each monster a different hue, from bright green to ultraviolet. It adds to the scene. Between the production design, cinematography, and Tim’s vision, the whole film is visually stunning.”

  • Sunday, Apr. 28, 2019
Far from glitzy tech hubs, Chinese city bets big on VR
In this April 2, 2019, photo, Liu Zixing, a mining ore businessman, right, rides a virtual reality "gyroscope" in a VR theme park in Nanchang, China. One of the largest virtual reality theme parks in the world has opened its doors in southwestern China, sporting 42 rides and exhibits from VR bumper cars to VR shoot-em-ups. It's part of an effort by Beijing to get ordinary people excited about the technology — part of a long-term bet that VR will come into widespread use. (AP Photo/Dake Kang)
NANCHANG, China (AP) -- 

Liu Zixing craned his neck forward for help with fastening the goggles for his first ever taste of virtual reality. He took a break from the mining ore business to travel to a VR theme park in this Chinese provincial capital not known for high technology.

"It feels like reality," Liu said after shooting down robots in a virtual fighter jet, strapped to a spinning gyroscope lit in purple. "It's just like you're riding in a plane."

Enthusiasm for VR has cooled somewhat after years of hype, but China's leaders are trying to drum up excitement, hoping to take the lead in a technology they expect will eventually gain wide use.

Hoping to coax homegrown entrepreneurs to take the plunge, the government is educating students, subsidizing office spaces, and sponsoring conferences and competitions.

Nanchang's VR Star park offers 42 rides and exhibits, including VR bumper cars and VR shoot-'em-ups. It's the highlight of Nanchang's "VR base," a sprawling complex of mostly still empty, futuristic glass-and-steel offices.

The city of 5.5 million is the capital of Jiangxi province, a relatively impoverished region nestled in the mountains of south-central China, where the regional industries are copper mining and rice.

Officials hope that one day it will be a world-class hub for virtual reality.

"Frankly, VR isn't 100% necessary in the Chinese market at the moment," said Xiong Zongming, CEO of IN-UP Technology, one of dozens of firms being incubated by the VR base. "But with the government's push, many other companies, departments and agencies are more willing to try it out."

Xiong was born in Nanchang but studied and worked in Japan for nearly a decade before returning to China, where he settled in Shanghai. Nanchang officials enticed him back home with offers of free rent and 150,000 RMB ($22,340) in startup funds, part of an effort to lure back local talent from richer coastal cities to help lift the local economy.

Beijing began its VR drive a few years ago, when slick headsets from Samsung, Oculus, HTC and Sony were making a big splash at electronics shows in the U.S.

Chinese leaders were worried they might miss out on a boom.

VR is included in Beijing's "Made in China 2025", an ambitious plan to develop global competitors in cutting edge technologies including electric cars, solar and wind power, and robotics. Nanchang is one of several VR hubs across the country.

So far, VR is mostly a niche product used in gaming and business training, held back by expensive, clunky headsets, a lack of appealing software and other shortcomings. Analysts say it could be many years, perhaps decades, before the technology goes mainstream.

Last year, just 5.8 million VR headsets were sold globally, according to market research firm Ovum. That compares with sales of more than 1.5 billion smartphones and is far fewer than expected when VR fever was at its peak a few years back.

"My experience wasn't good," said Xu Xiao, a PC gamer who bought VR goggles over a year ago after graduating from college. "When I wore them, my eyes got dry and uncomfortable, and I got dizzy. I barely use them anymore."

Stopping by the Nanchang VR park, he was still unimpressed.

"The image quality isn't refined, and it's hard to operate," he said after a virtual flume ride.

Even if it's a gamble, analysts say China's state-led push into VR could pay off in the future. Nanchang's VR developers are marching on despite a wave of layoffs across the industry in the past few years. Thousands attended Nanchang's first VR conference last October.

"It's kind of a good move to be there now," says George Jijiashvili, a senior analyst at Ovum. "It's a long game, and I don't think it's going away anytime soon."

Beijing still lags behind: most VR headsets are designed by companies based outside mainland China, like Samsung, HTC and Oculus, and the major VR content platforms are run by giants like Facebook and Google.

China's Ministry of Industry and Information Technology aims to change that by encouraging banks to finance VR startups and directing local governments to invest in VR products for public projects such as schools and tourist sites.

The government has provided subsidies and purchased VR software, mostly focused on education, training and health care. Nanchang has a 1 billion RMB ($149 million) VR startup investment fund, and is setting up another fund to attract established VR companies.

Entrepreneurs and experts believe VR will get a boost from next-generation, or 5G, technology, where Chinese companies like Huawei Technologies are industry leaders. 5G promises blazing-fast connection speeds that could smooth lags and optimize multiplayer games and livestreaming, so VR users might not end up with the headaches some get with today's technology.

"VR e-sports, broadcasting concerts in VR format, remote surgery — all of this is only realistic in the 5G era," said Chenyu Cui, a senior analyst at IHS Markit. "It'll make VR better for a mass audience."

Since the main commercial market for VR is entertainment, many of China's VR content makers are game developers in Shenzhen or Beijing. They're subject to booms and busts and recently, business has been flagging.

The state support is helping to protect Nanchang's developers from the cycles of feast and famine, but for now the industry is in a lull, and Xiong, the VR entrepreneur, is focused on keeping his startup afloat.

His dream is that one day, China's bet on VR will turn his thirteen-person company into an industry giant.

"I look forward to the day we can go public," Xiong said, "and become a role model for the whole province."

Associated Press writer Yanan Wang contributed to this report.

  • Thursday, Apr. 25, 2019
Walmart experiments with AI to monitor stores in real time
Mike Hanrahan, CEO of Walmart's Intelligent Retail Lab, discusses a kiosk that describes to customers the high technology in use at a Walmart Neighborhood Market, Wednesday, April 24, 2019, in Levittown, N.Y. "If we know in real time everything that's happening in the store from an inventory and in stock perspective, that really helps us rethink about how we can potentially manage the store," said Hanrahan. (AP Photo/Mark Lennihan)
LEVITTOWN, N.Y. (AP) -- 

Inside one of Walmart's busiest Neighborhood Market grocery stores, high-resolution cameras suspended from the ceiling point to a table of bananas. They can tell how ripe the bananas are from their color.

When a banana starts to bruise, the cameras send an alert to a worker. Normally, that task would rely on the subjective assessment of a human who probably doesn't have time to inspect every piece of fruit.
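Walmart hasn't published how its vision model works, but color-based ripeness checks are commonly prototyped with simple HSV thresholding. A toy sketch with OpenCV, in which every threshold is an assumption rather than Walmart's values:

```python
import cv2

def bruise_fraction(frame_bgr):
    """Rough share of banana-colored pixels that have browned.

    Toy illustration only: fixed HSV thresholds stand in for whatever
    trained model Walmart's cameras actually run.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    yellow = cv2.inRange(hsv, (20, 80, 80), (35, 255, 255))  # ripe peel
    brown = cv2.inRange(hsv, (5, 50, 20), (20, 255, 150))    # bruising
    total = cv2.countNonZero(yellow) + cv2.countNonZero(brown)
    return cv2.countNonZero(brown) / total if total else 0.0

frame = cv2.imread("banana_table.jpg")  # hypothetical camera frame
if bruise_fraction(frame) > 0.15:       # alert threshold is made up
    print("alert: rotate or pull bruised bananas")
```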

The thousands of cameras are a key feature of Walmart's Intelligent Retail Lab, which officially opens inside this 50,000-square-foot store on Thursday. It's the retail giant's biggest attempt so far to digitize the physical store.

Walmart envisions using the cameras, combined with other technology like sensors on shelves, to monitor the store in real time so its workers can quickly react to replenish products or fix other problems. The technology, shown first to The Associated Press, will also be able to track when shelves need to be restocked or if shopping carts are running low. It can spot spills and even detect when more cash registers need to be opened before long lines start forming.

Walmart's deep dive into artificial intelligence in its physical store comes as Amazon raised the stakes in the grocery business with its purchase of Whole Foods Market nearly two years ago.

That's put more pressure on Walmart and other traditional retailers like Kroger and Albertsons to pour money into technology in their stores. At the same time, they're trying to keep food prices down and manage expenses. Amazon has been rolling out cashier-less Amazon Go stores, which have shelf sensors that track the 1,000 products on their shelves.

Walmart's online U.S. sales are still a fraction of Amazon's online global merchandise empire, which reached $122.98 billion last year.

Walmart hopes to start scaling some of the new technology at other stores in the next six months, with an eye toward lower costs and thus lower prices. As the shopping experience improves, the retailer expects to see higher sales.

"We really like to think of this store as an artificial intelligence factory, a place where we are building these products, experiences, where we are testing and learning," said Mike Hanrahan, CEO of Walmart's Intelligent Retail Lab and co-founder of Jet.com, purchased by Walmart three years ago.

Hanrahan says the cameras are programmed to focus primarily on the products and shelves right now. They do not recognize faces, determine the ethnicity of a person picking up a product or track the movement of shoppers, he says. Some other companies have recently started experimenting with store shelf cameras that try to guess shoppers' ages, genders and moods.

There are signs throughout the Neighborhood Market educating shoppers about how it is being used as a lab. Still, the cameras could raise privacy concerns.

"Machine learning fundamentally finds and matches patterns," says Steven M. Bellovin, a computer science professor at Columbia University and a privacy expert, who hasn't seen the new Walmart AI Lab. But he says companies run into trouble when they start to match behavior to a specific customer.

Hanrahan says Walmart has made sure to protect shoppers' privacy and emphasized that there are no cameras at the pharmacy, in front of the rest rooms or in employee breakrooms.

The lab is Walmart's second in a physical store. A glass-enclosed data center at the back of the store houses nine cooling towers, 100 servers and other computer equipment that processes all the data.

Last year, Walmart's Sam's Club opened a 32,000-square-foot lab store, a quarter of the size of a typical Sam's Club. The lab is testing new features surrounding the Scan & Go app, which lets customers scan items as they shop and then buy from their phones, skipping the checkout line.

The retail lab is the third project from Walmart's new incubation arm, created after the Jet.com acquisition as a way for the discounter to shape the future of retail.

It follows the launch of Jetblack, a shopping-by-text service aimed at affluent shoppers in New York. Walmart's second incubation project was Spatial&, a VR tech company. As part of that launch, Walmart is bringing tractor-trailers to some of its parking lots so customers can experience DreamWorks Animation's "How to Train Your Dragon" through virtual reality.

Hanrahan says the company is embracing the labs in stores to better understand the real ways that technology affects customers and workers. It also wants to educate shoppers. Walmart has made a point to not hide the technology, and small educational kiosks are set up throughout the Neighborhood Market.

Despite the signs and visible cameras, many shoppers, including Marcy Seinberg from Wantagh, New York, didn't seem to notice or care.

"I am not bothered by it," Seinberg said. "If technology saves me money, I would be interested."

 

  • Thursday, Apr. 25, 2019
RED R3D SDK for NVIDIA CUDA-accelerated workflow now available
RED R3D SDK with REDCINE-X PRO for NVIDIA CUDA
IRVINE, Calif. -- 

RED Digital Cinema® released its RED R3D® SDK and accompanying REDCINE-X PRO® software with accelerated decode and debayering on NVIDIA CUDA® platforms. By offloading the compute-intensive decoding and debayering of RED R3D files onto one or more NVIDIA GPUs, real-time playback, editing and color grading of 8K footage are now possible.
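For readers unfamiliar with the term, “debayering” (demosaicing) reconstructs full RGB images from the single-channel Bayer mosaic a sensor records, and this per-pixel work parallelizes naturally onto GPUs. Below is a generic bilinear demosaic in NumPy for intuition only; it is not RED’s proprietary algorithm, which the SDK executes on CUDA devices:

```python
import numpy as np
from scipy.ndimage import convolve

def debayer_bilinear(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic (H x W float array).

    Textbook method for intuition only -- not RED's proprietary
    debayer, which the R3D SDK offloads to NVIDIA GPUs.
    """
    r, g, b = (np.zeros_like(raw) for _ in range(3))
    r[0::2, 0::2] = raw[0::2, 0::2]  # red photosites
    g[0::2, 1::2] = raw[0::2, 1::2]  # green photosites on red rows
    g[1::2, 0::2] = raw[1::2, 0::2]  # green photosites on blue rows
    b[1::2, 1::2] = raw[1::2, 1::2]  # blue photosites
    # Bilinear interpolation kernels fill in the missing samples.
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    return np.dstack([convolve(r, k_rb), convolve(g, k_g), convolve(b, k_rb)])
```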

Benefits and efficiencies of this new software-hardware combination during the postproduction process include:

  • 8K real-time 30 fps or greater playback performance
  • Up to 10x faster transcoding, depending on the format and content
  • Improved efficiencies and quality control within the content review process
  • Creative freedom using flexible R3D files instead of proxy files

 
8K performance is available with NVIDIA Quadro® RTX™ 6000 and 8000, GeForce® RTX™ 2080 Ti and TITAN RTX™ GPUs when coupled with a moderately configured PC. Creators can achieve additional performance improvements with multi-GPU configurations and may see noticeable gains even with older NVIDIA GPUs. Also, new NVIDIA RTX laptops from leading computer manufacturers, including Razer, Acer, Alienware, ASUS, Dell, Gigabyte, HP, Lenovo, MSI and Samsung, provide real-time playback at up to 8K and offer flexibility in choosing the right tools to fit a variety of budgets.

Support from major NLEs and other SDK integrators is expected soon.
