• Monday, Jul. 8, 2019
Cooke S4/i prime lenses help convey the horrors of war in Hulu’s "Catch-22"
On the set of "Catch-22" with George Clooney (standing behind podium) and cinematographer Martin Ruhe (r). (Photo courtesy of Hulu)

For the Hulu six-episode miniseries Catch-22, directed by Grant Heslov, Ellen Kuras and George Clooney, cinematographer Martin Ruhe relied on two sets of Cooke Optics’ S4/i prime lenses matched with two ARRI ALEXA Mini cameras to capture this latest on-screen version, which premiered on May 17.

Based on the acclaimed Joseph Heller novel, Catch-22 is set during World War II and revolves around a military rule which holds that any pilot who keeps flying missions must be crazy, and all he has to do to be grounded is ask. But asking proves he is sane, so he has to keep flying. The title coined a term that has entered the common lexicon since the novel was first published in 1961.

One thing made clear from the start was that the miniseries would stand on its own, not as a remake of the 1970 film. “We all looked at the original film, and the two projects have a different nature,” said Ruhe. “Ours is a dark comedy with a strong look for a strong visual story, as compared to the original which was more of a straight comedy. The aerial scenes had to show the intense horror of being up in those small tin boxes. It had to be about life and death.”

Ruhe’s goal was to contrast the horror of the aerial scenes and the absurdity of the ground scenes. To do that, he made use of two identical sets of Cooke S4/i prime lenses--14mm, 18mm, 21mm, 25mm, 32mm, 40mm, 50mm, 65mm, 75mm, 100mm and 135mm focal lengths--shooting with the ARRI ALEXA Mini’s Super 35mm (2.8K) sensor in ARRI Raw 16:9, which would later be finished in 4K HDR.

“We had two sets of camera/lens combinations as we were cross shooting as well as having some days with splinter [second] unit shooting,” explained Ruhe. “While I used all of the lenses, the 32mm was my all-time favourite for close-ups inside the planes. Although, to be honest, I did have to move to the 50mm at times due to the limited space within those planes.”

In fact, one of the main benefits of using Cooke S4/i primes for Ruhe was their size. “I had to be very fast and versatile in tight places. I didn’t want to get stuck fighting minimal focus, and thanks to the S4/i’s, I didn’t,” he added.

To help the team understand the period, production designer David Gropman provided a large number of stills from Heller’s own regiment to show what life in those camps was like, and the team also viewed historical newsreel footage. Then, during camera tests, stills were taken into Photoshop to match the old postcard look of the era. Company 3, which would handle the digital intermediate, then created LUTs for the cameras to match the required looks.

With more than 20 years of experience with Cooke lenses, Ruhe knew from the start that he wanted the S4/i primes. “I first grew attached to Cookes on commercials, and I shot The American with S4s as well as The Keeping Room, where I also used original Cooke Speed Panchros,” he said. “They are just beautiful in the way that they fall off, how they flare and the texture you get from them. This is especially important when shooting in digital, as the lenses give you a nice organic feel. There’s just something so beautiful about the Cookes, and I go back to them time and time again. And the close-up with the 32mm is just the perfect tool.” 

For both the ground and aerial scenes, Ruhe went for a natural look. This was especially important on board the planes, as he didn’t want them to be too perfectly lit. For ground interiors, a 120’x75’ soft sail and grey screen were used, with a 20K standing in for sunlight.

“You want people to feel the heat of the day,” explained Ruhe. “We worked with hard contrast; blow out when inside the tent looking out. I think this looked quite natural, as I wanted to convey the feeling of heat.”

For Ruhe, one of the standout scenes for the S4/i was in episode six. “I don’t want to give away any spoilers, but there’s a scene that was entirely shot with the 32. It’s so close to the faces and so intimate, which I love. You’ll have to see it to understand it, but every DP out there will know what I’m talking about when they watch that episode. It just looks great.”

  • Thursday, Jul. 4, 2019
Technology allows NBC to add new elements to Tour de France
In this July 29, 2018 file photo, Tour de France winner Britain's Geraint Thomas, wearing the overall leader's yellow jersey, passes the Arc de Triomphe during the twenty-first stage of the Tour de France cycling race over 116 kilometers (72.1 miles) with start in Houilles and finish on Champs-Elysees avenue in Paris, France. Chris Froome's absence, coupled with the withdrawal of last year's runner-up Tom Dumoulin, has reshuffled the game and produced a long list of top contenders. (AP Photo/Christophe Ena, File)

Phil Liggett remembers the early days of Tour de France coverage in the United States, which would involve him traveling to Paris at the end of a stage, recording voiceovers all night and then rejoining the circuit for the next stage.

Those days, though, are ancient history. The Tour has been aired live in the U.S. since 2001 with Liggett providing play-by-play. The coverage has also evolved to include pre-race and nightly highlight shows.

The NBC Sports Group will air more than 250 hours of coverage across NBC, NBCSN and the NBC Sports Gold online streaming package. Despite the challenges of live coverage, Liggett said it is far easier than the weekend highlight shows that were once the only way to view the race.

"There is nothing that beats doing it live," said Liggett, who will be covering his 47th Tour when it begins Saturday. "Sitting in the commentary box is like being in a 727. You can't wait to take off and see where things land at the end of a stage."

As technology has evolved, broadcasters have found it easier to add new elements. This year's Tour will include cameras on the bikes of up to eight riders that can transmit live. Bike cameras have been tested over the past four years, but their footage could previously be shown only after a stage was completed.

Cameras can be mounted under the rider's saddle and on the front under the handlebar. The cameras could provide additional insight into late-race moves or crashes.

Steve Porino will also have a camera focused on him as he reports during each stage while traveling aboard a motorcycle on the course.

Commentator Christian Vande Velde will ride several key stages in advance, wearing special Raptor sunglasses to preview critical course points. The sunglasses will also show Vande Velde's speed, how much energy he is using and his pedaling cadence. They utilize the same telemetry technology U.S. fighter pilots have in the visors of their helmets.

NBC will also use a virtual graphics Telestrator, which will produce augmented reality graphics that will allow commentators to move around and analyze cyclists. Depending on its use, it is the type of technology that could be extended to coverage of other sports.

Joel Felicio, who is the coordinating producer for NBC, said planning for the Tour begins in October. Most of the video that Felicio uses comes from France TV Sport, which provides the main feed, but his challenge each year is figuring out how to introduce new elements to the broadcast.

Felicio also has a challenge that few others have, which is producing live broadcasts from 21 different locations.

"There's trying to figure out commercials, when to go to commentary and using different elements but not missing the key move. There's also trying to keep everyone interested for six hours," he said.

This will be the first Tour since 1985 that Liggett has not done with Paul Sherwen by his side. Sherwen died Dec. 2 at age 62 due to heart failure. Bob Roll moves into the commentary box along with Jens Voigt, who competed in the Tour from 1998 through 2014.

Chris Horner, who competed in the Tour seven times, will also debut as an analyst. Horner is the most recent American Grand Tour champion after capturing the Tour of Spain in 2013.

"I think this is the most open race in years," Liggett said. "There are a lot of young riders that have the potential of this being one of the best Tours in recent memory."

  • Monday, Jul. 1, 2019
Vision Research introduces Phantom S640
Phantom S640

Vision Research, manufacturer of Phantom High-Speed cameras, has introduced the Phantom S640, which leverages the Phantom CMOS sensor used in the popular VEO640 camera and provides up to 6 Gpx/sec (75 Gbps) of streaming data and images for machine vision applications.

The S640 gives demanding machine vision applications the frame rates they need, reaching 1,480 fps at its full 4 Mpx resolution of 2560 x 1600 and up to 200,715 fps at lower resolutions. The S640 shares many characteristics with the VEO640, including high image quality and high frame rates at 4 Mpx resolution, but its output is not confined to the camera’s RAM.

“We’re excited to bring the benefits of the Phantom S640 to a wide variety of Machine Vision applications, especially those that need to access and analyze the high-speed data immediately and can’t wait for downloading from the camera’s memory, as well as those events and processes that run longer than standard camera RAM can manage,” said Dan Hafen, director of business development for Machine Vision Cameras at Vision Research.

The Phantom S640 joins a growing family of Phantom Machine Vision Cameras as Vision Research continues to bring the benefits of its traditional Phantom High-Speed cameras to the Machine Vision platform. It joins the Phantom S990, based on the cinema quality Phantom 4K sensor, as well as the S210 and S200, based on the Phantom Miro C210, which is used in automotive, industrial and scientific applications. The S640 sensor, the same used in the VEO640, has a 10 micron pixel size, larger than many Machine Vision cameras. Larger pixels are typically more light sensitive than smaller pixels, an important factor in high speed imaging. The S640 has an ISO rating of 6500 for monochrome and 1250 for color.

To support high quality imaging, it also offers 12-bit depth, a noise level of 20.5e- and a dynamic range of 55.9 dB. The camera also features general-purpose input/output (GPIO) for fast, flexible signaling and synchronization. It includes signals beneficial in standard high-speed applications, such as Time Code In and Out, as well as signals commonly found in streaming applications.

The S640 streams at high frame rates using the CXP6 protocol. Like the Phantom S990 released last year, the S640 has four banks of four CXP6 connections each. Each bank connects to a 4-port frame grabber, or two banks can connect to an 8-port frame grabber. Once the frame grabbers receive the data, the image is stitched together and processed along with any customer algorithms. The data can be analyzed immediately or stored in a long-record DVR for post analysis. The S640 is compatible with any PCIe3 CXP6 frame grabber. However, frame grabber manufacturer Euresys has made the stitching function easier by incorporating it into the software of its 8-port Octo board.

For applications that do not require the full 6 Gpx/sec throughput, the S640 can also provide 2 Gpx/sec of data by using only one bank of ports, or 4 Gpx/sec with two banks. It can also reduce data transfer rates by switching to 8-bit mode.
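The quoted figures hang together arithmetically. A quick back-of-the-envelope check, assuming the nominal 6.25 Gbps per-link rate of the CXP-6 standard (a CoaXPress spec figure, not stated in the article):

```python
# Back-of-the-envelope throughput math for the S640 figures quoted above.
# The 6.25 Gbps CXP-6 link rate is an assumption from the CoaXPress spec.

CXP6_LINK_GBPS = 6.25  # nominal CXP-6 bit rate per coax connection

def pixel_rate_gpx(width, height, fps):
    """Pixel throughput in gigapixels per second."""
    return width * height * fps / 1e9

def stream_gbps(width, height, fps, bits_per_pixel):
    """Raw streaming bandwidth in gigabits per second."""
    return pixel_rate_gpx(width, height, fps) * bits_per_pixel

# Full resolution, full speed, 12-bit, as described in the article:
px = pixel_rate_gpx(2560, 1600, 1480)        # ~6.06 Gpx/s -> quoted "6 Gpx/sec"
full = stream_gbps(2560, 1600, 1480, 12)     # ~72.7 Gbps, close to the quoted 75 Gbps

# Aggregate raw capacity of the 4 banks x 4 CXP6 ports:
ports_total = 4 * 4 * CXP6_LINK_GBPS         # 100 Gbps of link bandwidth

# Using one bank (a quarter of the ports) cuts usable throughput to roughly
# a quarter, matching the article's ~2 Gpx/sec figure; 8-bit mode trims
# the bit rate by a further third.
```

The small gap between the computed 72.7 Gbps and the quoted 75 Gbps is consistent with protocol overhead and rounding in the published spec.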

Key specifications of the Phantom S640 include:

  • Up to 6 Gpx/second (75 Gbps) of streaming capability
  • 1,480 fps at 2,560 x 1,600 and 2,340 fps at 1,920 x 1,280
  • 4-megapixel CMOS sensor in color or monochrome
  • 10-µm pixel size
  • 12-bit or 8-bit data transfer
  • Up to 4 banks of 4 CXP ports
  • CXP6 and GenICam compliant
  • Thursday, Jun. 27, 2019
Glassbox Technologies launches virtual camera toolset

Virtual production software company Glassbox Technologies has released its powerful and accessible virtual camera plugin DragonFly from private beta for public use. Teased during the company’s spring 2019 launch, DragonFly brings professional virtual cinematography tools to filmmakers and content creators everywhere, allowing users to view character performances and scenes within computer-generated virtual environments in real time, through the camera’s viewfinder, an external monitor or an iPad.

Available for Unreal Engine, Unity 3D and Autodesk Maya, DragonFly delivers an inclusive virtual cinematography workflow that allows filmmakers and content creators to make and test creative decisions faster and earlier in the process, whittling down production cost on projects of all scopes and sizes. 

The powerful off-the-shelf toolkit takes creators from pre-viz to post-viz without the need for large teams of operators, costly hardware or proprietary tools. It is platform agnostic and fits seamlessly into any workflow out of the box. With DragonFly, users can visualize and explore a CG virtual environment, then record, bookmark, create snapshots and replicate real camera movement as seamlessly as conducting a live-action shoot.

“Virtual production poses great potential for creators, but there were no off-the-shelf filming solutions available that worked out of the box,” noted Glassbox co-founder and CPO Mariana Acuña. “In response, we made DragonFly: a virtual window that allows users to visualize complex sets, environments and performances through a viewfinder. Without the need for a big stage or mocap crew, it brings greater flexibility to the production and post pipeline for films, animation, immersive content, games and real-time VFX.” 

The product was developed in collaboration with top Hollywood visualization and production studios including The Third Floor for best-in-class results. 

“Prior to DragonFly, each studio has created their own bespoke virtual production workflow, which is costly and time-consuming per project. DragonFly makes real-time virtual production usable for all creators,” said Evelyn Cover, global R&D manager for The Third Floor. “We’re excited to collaborate with the Glassbox team to develop and test DragonFly in all kinds of production scenarios from previs to post, with astounding success.”

Glassbox’s second in-beta virtual production software solution, BeeHive--the multi-platform, multi-user collaborative virtual scene syncing, editing and review solution--is slated to launch later this summer.

DragonFly is now available for purchase or can be downloaded for free as a 15-day trial on the Glassbox website. Pricing and licensing includes a permanent license option costing $750 USD (including $250 for the first year of support and updates) and an annual rental option costing $420/annum.

  • Thursday, Jun. 27, 2019
Avid introduces updated MediaCentral platform

Avid® (Nasdaq: AVID) has released MediaCentral® 2019, the next generation of its media workflow platform for TV news, sports and postproduction operations.

“We’ve taken the core business capabilities of MediaCentral, the richest media platform available, and redesigned it so our customers--whether they are a team of two or an organization of thousands--have easy access to information, assets and apps so they can turn around their content faster than ever before,” said Raymond Thompson, director of broadcast and media solutions marketing at Avid.

MediaCentral 2019 scales to meet the needs of today’s and tomorrow’s journalists, who collaborate in increasingly dispersed teams as they strive to create engaging shows and stories faster and to be first to break news on air and on social media. MediaCentral’s unified platform offers a customizable suite of creative tools and media management, enabling teams to work simply within the modern user interface, create and collaborate from anywhere on any device, and deploy the platform with seamless cloud integration, either as a full cloud solution or in an on-prem/cloud hybrid approach.

With MediaCentral 2019, teams--across multiple geographic locations--can quickly ingest, log, search, edit, distribute, and publish video content to any number of outlets, giving them the agility to create better content faster and maximize its value. All-new MediaCentral functionality includes:

  • Collaboration across multiple sites: Up to six production sites can connect, enabling sharing and powerful searching of content in different locations for greater accessibility and collaboration. Content creators can search, browse, and play back media remotely with the same performance as if stored locally.
  • Faster and more intuitive search: Find media faster by using the new query builder, and expanded filtering (metadata, dates, and favorites). Users get the right results faster and are better able to leverage and monetize their media assets. The Phonetic Index option allows users to find all clips that contain the words that they’re looking for in a matter of seconds.
  • New logging capabilities: Users can log assets with meaningful details quickly. Ideal for sports, news, reality TV, and post production, the new Log app allows loggers to tag information as it happens, and mark in/out points quickly, enabling other team members to easily search large amounts of media and find the right clips faster.
  • Automated file ingest: Ingest high volumes of media through a web browser or volume ingest with full Avid Media Access (AMA) support in the Ingest desktop app.
  • Flexible deployment: Users can set up the platform however they want and transition it as business needs evolve. MediaCentral can be deployed on premises in a facility, in a private data center, or with a hybrid model.

These new functions enhance and extend the industry-standard platform’s modular, scalable design and full suite of apps, services, and connectors that accelerate every part of the media creation and publishing workflow.

  • Tuesday, Jun. 25, 2019
Testronic Labs adds Archion EditStor Omni systems to facilities in Burbank, Warsaw
EditStor Omni storage system

Testronic Labs, which specializes in quality assurance (QA), localization services, and compliance for the film, television and games industries, has invested in Archion Technologies’ EditStor Omni storage systems. These Archion solutions are in use at two of Testronic’s locations: Burbank, Calif., and Warsaw, Poland.

Testronic is a global provider of QA across film, TV, games, and platforms, with facilities in London, Warsaw, Burbank, and Santiago. Established over 20 years ago, Testronic has teams of quality assurance experts who test content at all levels, from masters to deliverables, and test the consumer experience on a range of international devices, ensuring OTT content and services deliver the best possible experience to end users.

Testronic Labs was in need of a high-performance, cost-effective network storage solution for its Burbank facility. During a meeting and live demonstration at NAB, Jason Gish, Testronic’s president of film and TV, was able to see the EditStor Omni perform first-hand. Subsequently, Archion deployed an EditStor Omni at Testronic Labs’ Burbank facility for testing and proof of concept. After a successful evaluation, Gish approved the purchase of an Omni.

Archion was then informed that Testronic Labs required a new media storage solution for its expanding Poland location. Gish approved the purchase of a new EditStor Omni for Testronic’s operation in Warsaw, while also expanding the Omni at the rapidly growing Burbank facility.

Testronic currently has a mix of over 20 Macs, PCs, and Linux workstations connected to the EditStor Omni in its Burbank facility, with a similar number of workstations connected to the Omni in its Warsaw location.

Gish said, “We chose Archion’s EditStor Omni for our Burbank and Warsaw locations because speed is such a critical part of our storage need. We needed screaming fast storage to play files that contain a lot of data, like UHD/4K, HDR 10, DolbyVision, and Dolby Atmos. The Archions are installed in both our Warsaw and Burbank locations as our main system of file storage and play out. The benefits these solutions offer our clients include high performance playback for high level content. Clean and smooth playback is a necessity for QC. Other systems we’ve tried have introduced playback issues. We chose Archion to avoid those issues, and to provide us the best possible view of the files to be QC’d. The Archion Omni lives up to our expectations, and our experience with the systems in both locations has been positive and beneficial. We’ve had no issues with uptime, and the speed and throughput are everything they are claimed to be. The Archion team has been very helpful and responsive whenever we have questions or need to add storage. We continue to grow with Archion, as our offices add more and more clients, and our workflows also continue to grow.” 

EditStor Omni, delivering speeds of over 15,000 MB/second, provides production, postproduction, creative agencies and other creative facilities a single media storage system for all their high-performance collaboration needs. From 8K video to 4K playback of raw media files, the Omni NAS storage server has the performance, scalability and functionality to handle the most challenging media workflows.

EditStor Omni is an intelligent 24-drive storage system with single-server capacities of up to 336TB per chassis, hot-pluggable expansion nodes, and total expansion into multiple petabytes. It is a complete turnkey collaborative storage solution that requires no third-party software or drivers to ensure compatibility with the prominent editing, finishing and visual effects applications, including those from Adobe®, Apple®, Avid®, Autodesk® and Blackmagic Resolve®.
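For a sense of scale, the chassis figures above break down to roughly 14TB per drive slot at maximum capacity. A quick sketch, with the caveat that the even split across drives is purely illustrative; the article does not describe the system's internal RAID layout:

```python
# Rough per-drive arithmetic for the Omni chassis figures quoted above.
# The even split across all 24 drives is an illustrative assumption.

DRIVES_PER_CHASSIS = 24
MAX_CAPACITY_TB = 336
SYSTEM_THROUGHPUT_MBPS = 15_000  # MB/s quoted for the Omni

capacity_per_drive_tb = MAX_CAPACITY_TB / DRIVES_PER_CHASSIS         # 14.0 TB
throughput_share_mbps = SYSTEM_THROUGHPUT_MBPS / DRIVES_PER_CHASSIS  # 625 MB/s
```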

The EditStor Omni was recommended to Testronic by the consultancy firm File Based Workflows, while Cutting Edge, a media systems integrator and now a division of ALT Systems, was instrumental in the successful deployment and system integration at Testronic.

  • Monday, Jun. 24, 2019
WarnerMedia Innovation Lab plans NY facility, enters into partnerships to advance future tech
A look at what's envisioned for Warner Media Innovation Lab in NYC

The WarnerMedia Innovation Lab made several key announcements, deepening its commitment to cutting-edge technology and the incubation of new consumer-facing products, services and experiences. These include the location of its physical space in New York City, which will be powered by AT&T’s 5G network, making it one of AT&T’s first permanent 5G experience centers. The Lab also announced further details on its partnerships with WarnerMedia Ad Sales and with Xandr, AT&T’s advanced advertising and analytics company. Finally, the Lab announced the appointment of an architecture firm to lead the design of the physical space.

The WarnerMedia Innovation Lab will be a newly constructed 20,000 square foot facility located in the Chelsea neighborhood of Manhattan, featuring an immersive zone for showcasing consumer-ready experiences visible to the public, flexible indoor and outdoor event spaces, dedicated R&D environments and an open and collaborative modern work space.

“The Lab is more than a technology incubator, but also a dream factory for us to create the wonderment that fans have come to love and expect from WarnerMedia,” said Jesse Redniss, GM, WarnerMedia Innovation Lab. “Here we’ll flex the best of WarnerMedia’s creative storytelling capabilities combined with cutting edge technology from AT&T and our partners to deliver experiences that will be talked about for a lifetime.”

The Lab, which is slated to open its doors to strategic partners and WarnerMedia cross-business unit teams in early 2020, will bring 5G experiences to life through exploration and development initiatives, enabling a real-time virtualized collaboration ecosystem across WarnerMedia and the AT&T offerings. This builds on AT&T’s commitment to offer 5G across WarnerMedia properties. From the Innovation Lab in New York City to Warner Bros. in Los Angeles, to The Lounge by AT&T in Seattle and WarnerMedia’s Atlanta studios, 5G has the power to transform how WarnerMedia’s content is created and consumed.

“By working across AT&T, we’re able to combine the latest in 5G technology with immersive content experiences and cutting-edge advertising capabilities,” said David Christopher, president of AT&T Mobility and Entertainment. “The WarnerMedia Innovation Lab will be a space where developers, creators and visitors will be inspired to push the boundaries of entertainment, all powered by the company that first introduced the U.S. to the power of mobile 5G.”

WarnerMedia’s commitment to the Lab’s ability to revolutionize the overall fan experience includes advertising as well. Dan Riess, head of advanced advertising and branded content, WarnerMedia Ad Sales, commented, “Storytelling is in our company’s DNA and part of that experience is how the content is enjoyed, including advertising. The Lab is a critical part of our testing and learning on the new experiences in advertising that we will be rolling out to market.”

The WarnerMedia Innovation Lab will also be backed by consumer insights and technology from Xandr, AT&T’s advanced advertising and analytics company, as it continues to test and develop new advertising capabilities that make brand messages more relevant and engaging to consumers. This powerful combination will deliver innovative consumer experiences and foster an entrepreneurial spirit across the full WarnerMedia portfolio of brands.

Kirk McDonald, CMO of Xandr, said, “Working with our colleagues at AT&T Communications and WarnerMedia, we are uniquely positioned to develop new advertising innovations that engage consumers and provide integral feedback for marketers and brands. The WarnerMedia Innovation Lab will accelerate the adoption of new advertising formats and provide an environment to showcase our collaborative work.”

The new advertising advancements also allow marketers to learn how their messages resonate with viewers, as the Lab unveils a new balance in the relationship between advertising, technology and content. The innovations will include MR/VR applications, 5G uses that enhance new advertising capabilities, and better UI/UX experiences to “Make Advertising Matter.”

Architectural design firm Design Republic was awarded the services project for the Lab’s physical space in NYC, with work beginning this summer. Design Republic is an award-winning firm specializing in corporate workplace, retail, and media technology design, and was chosen based on its cultural fit with WarnerMedia and its innovative approach and design concepts, as evidenced by recent work with clients including Nasdaq, Bvlgari and WarnerMedia’s own Bleacher Report.

WarnerMedia is part of AT&T Inc.

  • Thursday, Jun. 20, 2019
Elton John biopic "Rocketman" graded with DaVinci Resolve Studio
A scene from "Rocketman"
FREMONT, Calif. -- 

Blackmagic Design has announced that its DaVinci Resolve Studio was used throughout the color pipeline on Paramount’s new musical biopic about Elton John’s breakthrough years, produced by Marv Films and Rocket Pictures and directed by Dexter Fletcher.

Lensed by DP George Richmond, BSC, Rocketman had an on-set DIT workflow developed and managed by Onset Tech’s Joshua Callis-Smith, with Goldcrest Post handling the final DI and online. Senior colorist Rob Pizzey handled the final grade, delivering in Dolby Vision domestic and theatrical HDR as well as SDR Rec 709, with Russ White and Daniel Tomlinson completing the online.

Pizzey and Richmond began pre-production by defining the main show LUT in early testing, where different lenses, exposure ranges and lighting setups were tested against elements from the film’s production design. “Three variations of the LUT were then created at multiple exposures as the film’s base look throughout the on-set DIT, where I helped create a subtle color arc for the film’s dailies to inform the shifting narrative,” said Callis-Smith, who added, “Our dailies lab relied on a Blackmagic eGPU setup for the first time, allowing them to achieve faster processing speeds using the 5K iMac via Thunderbolt 3, even when it came to H.264 encoding.”
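The idea of generating exposure variants of a base look can be sketched in a few lines. This is a minimal illustration, not the production workflow: the one-stop offsets, the simple 1D ramp and the file names are all assumptions, and the show's actual LUT variants would have been built in proper grading tools.

```python
# Minimal sketch: write exposure-offset variants of a simple 1D .cube LUT.
# A one-stop offset corresponds to a gain of 2**stops on linear values.

def write_exposure_lut(path, stops, size=33):
    """Write a 1D .cube LUT that applies a 2**stops exposure gain, clipped at 1.0."""
    gain = 2.0 ** stops
    with open(path, "w") as f:
        f.write(f"LUT_1D_SIZE {size}\n")
        for i in range(size):
            v = min(i / (size - 1) * gain, 1.0)  # identity ramp scaled by the gain
            f.write(f"{v:.6f} {v:.6f} {v:.6f}\n")

# Under-, base- and over-exposed variants of the same look:
for stops in (-1, 0, 1):
    write_exposure_lut(f"show_lut_{stops:+d}stop.cube", stops)
```

The resulting files follow the widely supported .cube format, so they could be loaded by most dailies and grading software.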

Once production was complete, Richmond and Pizzey selected 350 still frames from the film to create a color bible for the final grade. They then spent two days picking different looks for each scene before a two-week unattended grading process, during which Pizzey matched the different looks to the aesthetics in the color bible while handling subtle recuts and dropping in visual effects sequences where necessary.

“We wanted to reflect the narrative with a slightly muted, desaturated world for young Reggie,” said Pizzey. “Then, as the film progresses and Elton’s world goes out of control, the color pops more and vintage lenses were used to obtain a vibrant feel with exaggerated flares.”

“I aimed to maintain the soft look achieved with the vintage lenses, making sure not to go too hard on the bottom end of the curve,” added Pizzey. “For one scene, we wanted to create a look where everything was muted except for Elton and Reggie. There was a lot of color in the scene’s rushes. The houses were red brick, there was lots of green in the gardens, and the dancers wore colorful outfits. To achieve the required look, we asked VFX to create mattes to help us to make sure the grade didn’t bleed into the other characters throughout the dance routine.”

  • Tuesday, Jun. 18, 2019
A look at Libra, Facebook's new digital currency
In this Jan. 9, 2019, file photo, media and guests tour Facebook's new 130,000-square-foot offices, which occupy the top three floors of a 10-story Cambridge, Mass., building. Facebook unveiled a broad plan Tuesday, June 18, to create a new digital currency. (AP Photo/Elise Amendola, File)

Facebook is unveiling a digital currency called Libra as the company seeks to make its ads more valuable by enabling smoother transactions and payments online, particularly among those without credit cards or bank accounts.

Libra will use the same security and record-keeping principles as Bitcoin, the most popular digital currency system today. But unlike Bitcoin, Libra is backed by several traditional financial companies, including PayPal, Visa and Mastercard, and will base its value on multiple real-world currencies such as the U.S. dollar and the euro. Libra also faces additional scrutiny over privacy, given Facebook's poor record on the matter.

Here's a look at Libra and other cryptocurrencies.

A cryptocurrency is a form of digital cash that uses encryption technology to make it secure. Cryptocurrencies exist not as physical bills or coins but rather as lines of digitally signed computer code. Records are typically kept on ledgers known as blockchains.
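The core idea of such a ledger can be sketched in a few lines: each record carries a hash of the one before it, so tampering with any earlier entry breaks the chain. This toy example only shows the chaining; real systems like Bitcoin or Libra add digital signatures, consensus rules and much more on top, and the transactions here are made up for illustration.

```python
# Toy hash-chained ledger illustrating the basic blockchain idea.
import hashlib

def entry_hash(prev_hash, data):
    """Hash a record together with the hash of the previous record."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

ledger = []
prev = "0" * 64  # placeholder hash for the first ("genesis") entry
for tx in ["alice pays bob 5", "bob pays carol 2"]:
    prev = entry_hash(prev, tx)
    ledger.append({"tx": tx, "hash": prev})

def chain_valid(ledger):
    """Recompute every hash; any altered entry makes the chain invalid."""
    prev = "0" * 64
    for rec in ledger:
        prev = entry_hash(prev, rec["tx"])
        if prev != rec["hash"]:
            return False
    return True
```

Changing any recorded transaction changes its hash, which no longer matches the hash stored in the next entry, so the tampering is immediately detectable.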

People can store their cryptocurrency stashes in virtual wallets that resemble online bank accounts. Facebook is developing a wallet app for Libra; others will be able to as well.

As with other cryptocurrencies, people will be able to buy and sell libras on exchanges for traditional currencies. It's not clear what fees, if any, consumers will have to pay for such transfers, although Facebook says they should be low.

Although Bitcoin has gotten a lot of attention, it isn't widely used. For one thing, its value fluctuates wildly, meaning that $100 in bitcoins today might be worth $300 a month from now — or $2.50. Only a handful of merchants accept bitcoins as payments.

Facebook is hoping to keep the libra's value stable by tying it closely to established currencies. Unlike most other cryptocurrencies, the Libra will be backed by real-world bank deposits and government securities in a number of leading currencies.

Facebook is also recruiting partners ahead of time. Lyft, Uber and Spotify already have joined the Libra group. They will likely accept libras when the system launches. They'll also help fund, build and govern the system. That'll make Libra less of a free-for-all than Bitcoin. Facebook says Libra will embrace regulation, but it isn't providing many details on how.

With most cryptocurrencies, including Bitcoin, anyone can lend computing power to verify transactions and to prevent anyone from spending the same digital coin twice. With Libra, the verifications will initially be managed by its founding companies, such as Facebook and PayPal. Facebook believes the closed approach will mean better security.
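The double-spend check that verifiers perform can be pictured as a ledger that rejects any transfer exceeding the sender's current balance. This is a conceptual sketch with made-up account names, not how Libra's validators actually work:

```python
class Ledger:
    """Toy account-balance ledger: a transfer is valid only if the
    sender still holds the coins it is trying to spend."""

    def __init__(self, balances):
        self.balances = dict(balances)

    def transfer(self, sender, receiver, amount):
        if self.balances.get(sender, 0) < amount:
            return False  # rejected: these coins were already spent
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        return True

ledger = Ledger({"alice": 10})
print(ledger.transfer("alice", "bob", 10))    # True: first spend succeeds
print(ledger.transfer("alice", "carol", 10))  # False: same coins, spent twice
```

In an open system like Bitcoin, many independent machines run this kind of check and must agree before a transaction is recorded; under Libra's initial design, that job falls to the founding companies' nodes.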

Although it's possible to trace bitcoins and some other cryptocurrencies as they are spent, owners of accounts behind the transactions aren't necessarily known. That makes such currencies a favorite among certain cybercriminals. But it is sometimes possible to tie cryptocurrency transactions to a real person who has cashed out digital coinage into a traditional currency.

And if someone spends libras while logged onto Facebook, it's theoretically possible Facebook could tie it back to a real person.

Facebook says it won't use Libra data to target ads, but may share data "to keep people safe, comply with the law, and provide basic functionality." Facebook is creating a subsidiary, Calibra, to try to keep the operations separate.

Libra is scheduled to launch publicly in the first half of next year. Whether consumers will embrace it is another matter. Discounts potentially offered by Uber and other partners might be enough to get people to at least try the system. But many people find it easy enough to pay for goods and services online with credit and debit cards.

There could be greater appeal among people who don't have bank accounts. Libra could open up e-commerce to them.

Though Libra could be a way for Facebook to drive spending when people interact with Facebook ads, the company says the currency will be independent and won't require a Facebook account to use.

  • Thursday, Jun. 13, 2019
Simian boosts analytics capabilities with features tracking reel viewership, emotional responses
Simian MoodReactions measurement
LAGUNA NIGUEL, Calif. -- 

Simian, the video sharing and collaboration service used by advertisers, agencies, media companies, production houses, post studios and music providers, has introduced two features that provide a boost to its analytics capabilities.

The company has just launched Engagement Graphs and debuted a new tool called Simian MoodReactions. The former provides a visual, at-a-glance representation of how reels and presentations have been viewed by their recipients, while the latter lets viewers tag spots or scenes with a range of emoticons that convey how they feel about the content they’ve just viewed.

According to the company, this form of analytic metric has never before been available in any video sharing or collaboration service. “When used together, Engagement Graphs and MoodReactions can provide users with a deep level of insight as to how their work is being received in the marketplace,” said Brian Atton, Simian’s chief operating officer, “while providing members of ad agency creative and production groups with a means of sharing attitudes and opinions about work they’re reviewing without having to screen the work in a group setting.”

Simian’s Engagement Graphs, which were rolled out in late May, offer a visual representation that reports on what portions of a spot or reel have been watched, re-watched or skipped over. Replacing Simian’s Heatmaps feature, they’re part of an analytics redesign that includes a refreshed dashboard that more clearly displays important reel performance data, such as when the reel was opened, total number of views, what percentage of the video was viewed, total downloads and more.

The net result is a broader picture of how the video’s been received, which helps senders such as sales reps and EPs make more strategic decisions when following up with clients or building their next reel.

“Analyzing reel performance and identifying standout spots, or scenes within spots, helps you make smarter choices,” said Atton. “That was our thinking behind the creation of Engagement Graphs – they equip creative companies with actionable reel viewing insight, empowering them to send better-targeted follow-ups and win more jobs.”

MoodReactions lets users show how they feel
Tracking how prospects interact with your work gives Simian users a better understanding of what they’re looking for on each project, but what hasn’t been measurable until now is how prospects feel about what they’re watching. Simian MoodReactions addresses that by measuring viewers’ emotional responses to what they’re seeing on screen.

“Now people watching can register their response to video content by tagging the video at any point during its running time with a set of corresponding emojis,” explained Jay Brooks, Simian’s chief technology officer. “They run the gamut of how someone might respond to work they’re seeing, with icons that convey feelings like love, like, sad, wtf, boring, even angry.”

Simian MoodReactions allows creative groups or production teams to bookmark scenes and share those reactions with the rest of the group when selecting a director, editor, DP, artist or other creative to bid on a project. “This is especially useful when location or schedules make it impossible to assemble the team and screen reels together,” added Atton.

“We spent a lot of time developing these features, and like everything else we do here, it’s in response to the needs of our users,” he continued. “On the production and post side, they’re looking for meaningful ways they can use data to help inform and guide their sales efforts. And on the agency side, they’re looking for ways they can share and collaborate efficiently. Engagement Graphs and MoodReactions satisfy both needs.”
