Toolbox

  • Thursday, Mar. 28, 2019
Ikegami rolls out HDR support for monitors
Ikegami HLM-2460W monitor
NEUSS, Germany -- 

Ikegami has made high dynamic range (HDR) support available as an option for its HLM-60 monitor series. 

The new option includes EOTF tables for Hybrid Log-Gamma (HLG) and S-Log3 in addition to conventional gamma. Existing monitors in the HLM-60 series can be upgraded with the new option retrospectively. The HLM-60 series comprises the HLM-2460W, HLM-1760WR and HLM-960WR.

Ikegami’s HLM-2460W is a 24-inch Full-HD monitor with a 1920 x 1200 pixel 10-bit LCD panel. It offers 400 candela per square meter brightness, very narrow front-to-back dimensions, light weight and low power consumption. Multi-format SDI, 3G-SDI, HDMI, Ethernet and VBS inputs are provided as standard. The panel’s high brightness, high contrast, wide viewing angle, fast motion response and high-quality color reproduction allow it to display full 1920 x 1080 pixel pictures with real pixel allocation, without resizing. The monitor’s gradation characteristics make it ideal for a wide range of broadcast applications.
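
For context on what such a table encodes, here is a minimal sketch of the HLG transfer curve as specified in ITU-R BT.2100, tabulated over a 10-bit code range. It is purely illustrative rather than Ikegami's implementation, and a full HLG display EOTF additionally applies the BT.2100 system gamma on top of the inverse OETF shown here.

```python
import math

# Constants from ITU-R BT.2100 for the Hybrid Log-Gamma (HLG) transfer function.
A = 0.17883277
B = 1.0 - 4.0 * A
C = 0.5 - A * math.log(4.0 * A)

def hlg_oetf(e: float) -> float:
    """Map normalized scene-linear light E in [0, 1] to an HLG signal value E' in [0, 1]."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

def hlg_inverse_oetf(ep: float) -> float:
    """Invert the OETF: recover normalized scene-linear light from an HLG signal value."""
    if ep <= 0.5:
        return (ep * ep) / 3.0
    return (math.exp((ep - C) / A) + B) / 12.0

# A 1024-entry lookup table, roughly how a 10-bit monitor option might tabulate the curve.
table = [hlg_inverse_oetf(code / 1023.0) for code in range(1024)]
print(f"code 512 -> linear {table[512]:.6f}")
```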

The HLM-1760WR monitor has a 17-inch Full-HD 1920×1080 pixel 450 candela per square meter 10-bit resolution LCD panel. The HLM-960WR is a highly compact multi-format LCD monitor with a 9-inch Full-HD 1920 x 1080 pixel 400 candela per square meter 8-bit resolution LCD panel. 

  • Wednesday, Mar. 27, 2019
Artificial intelligence pioneers win tech's "Nobel Prize"
This undated photo provided by Mila shows Yoshua Bengio, a professor at the University of Montreal and scientific director at the Artificial Intelligence Institute in Quebec. Bengio was among a trio of computer scientists whose insights and persistence were rewarded Wednesday, March 27, 2019, with the Turing Award, an honor that has become known as the technology industry’s version of the Nobel Prize. It comes with a $1 million prize funded by Google, a company where AI has become part of its DNA. (Maryse Boyce/Mila via AP)
SAN FRANCISCO (AP) -- 

Computers have become so smart during the past 20 years that people don't think twice about chatting with digital assistants like Alexa and Siri or seeing their friends automatically tagged in Facebook pictures.

But making those quantum leaps from science fiction to reality required hard work from computer scientists like Yoshua Bengio, Geoffrey Hinton and Yann LeCun. The trio tapped into their own brainpower to make it possible for machines to learn like humans, a breakthrough now commonly known as "artificial intelligence," or AI.

Their insights and persistence were rewarded Wednesday with the Turing Award, an honor that has become known as the technology industry's version of the Nobel Prize. It comes with a $1 million prize funded by Google, a company where AI has become part of its DNA.

The award marks the latest recognition of the instrumental role that artificial intelligence will likely play in redefining the relationship between humanity and technology in the decades ahead.

"Artificial intelligence is now one of the fastest-growing areas in all of science and one of the most talked-about topics in society," said Cherri Pancake, president of the Association for Computing Machinery, the group behind the Turing Award.

Although they have known each other for more than 30 years, Bengio, Hinton and LeCun have mostly worked separately on technology known as neural networks. These are the electronic engines that power tasks such as facial and speech recognition, areas where computers have made enormous strides over the past decade. Such neural networks also are a critical component of robotic systems that are automating a wide range of other human activity, including driving.

Their belief in the power of neural networks was once mocked by their peers, Hinton said. No more. He now works at Google as a vice president and senior fellow while LeCun is chief AI scientist at Facebook. Bengio remains immersed in academia as a University of Montreal professor in addition to serving as scientific director at the Artificial Intelligence Institute in Quebec.

"For a long time, people thought what the three of us were doing was nonsense," Hinton said in an interview with The Associated Press. "They thought we were very misguided and what we were doing was a very surprising thing for apparently intelligent people to waste their time on. My message to young researchers is, don't be put off if everyone tells you what are doing is silly."

Now, some people are worried that the results of the researchers' efforts might spiral out of control.

While the AI revolution is raising hopes that computers will make most people's lives more convenient and enjoyable, it's also stoking fears that humanity eventually will be living at the mercy of machines.

Bengio, Hinton and LeCun share some of those concerns — especially the doomsday scenarios that envision AI technology developed into weapons systems that wipe out humanity.

But they are far more optimistic about the other prospects of AI — empowering computers to deliver more accurate warnings about floods and earthquakes, for instance, or detecting health risks, such as cancer and heart attacks, far earlier than human doctors.

"One thing is very clear, the techniques that we developed can be used for an enormous amount of good affecting hundreds of millions of people," Hinton said.

  • Tuesday, Mar. 26, 2019
Cooke Anamorphic/i Lenses prove to be a force for Disney "Star Wars" spots
On the set of a "Star Wars" merchandise/toys spot, for which Cooke lenses were deployed.
LEICESTER, UK -- 

Two VFX-heavy commercials for Disney highlighting the latest Star Wars toys and merchandise benefited from Cooke Optics’ Anamorphic/i prime lenses to recreate the cinematic look of the films, and Cooke’s /i Technology metadata system to aid smooth shoots and complex post-production processes.

Both spots play to the firing of children’s imagination: “Choose Your Path” focuses on The Last Jedi merchandise, featuring three children playing in an attic bedroom; a boy puts down a Kylo Ren toy, which then comes to life to fight Lego starships, while two of the children duck as a ship speeds past them on the red salt flats of Crait – which then seamlessly turn back into the bedroom with a classic Star Wars wipe. “Galaxy Of Adventures” features the original Star Wars trilogy and Solo: A Star Wars Story, with more children playing in an attic room, interacting with the toys and merchandise in a series of tableaux reminiscent of scenes from the films.

The core production team behind the spots – director Steven Hore, director of photography Alex Macdonald and DIT James Marsden – have worked together for over 15 years, mainly on commercials. They chose to shoot with Cooke Anamorphic/i lenses in a deliberate nod to the cinematic look of the first and third Star Wars trilogies.

“Cooke lenses were famously used on the first trilogy, and Cooke Anamorphic/i’s were the closest modern lenses we could find to replicate that look,” said Macdonald. “We loved the glass - it gives so much to skin tones, and the way it works with light really encapsulates the cinematic look. We then realized the bonus was that we could get all this telemetry from the lenses as well, through the /i Technology sensor.”

Hore concurred: “We knew from the start that we wanted to shoot anamorphically – it was a quick win in terms of transforming a small interior space into something approximating the Star Wars world. However, when you only have a few seconds of screen time for a commercial you can’t go mad with flare or it risks overwhelming the story and products. The Cooke Anamorphic/i’s have bags of character, they make everything feel really creamy and they have lovely flare and focal characteristics, but they don’t bombard you.

“We also knew there would be a huge amount of postproduction so we needed lenses that would allow reliable replication of shots in post. The added bonus of /i Technology to increase the information we could provide to the VFX team made the Cooke Anamorphic/i lenses an easy choice.”

The team has shot a few previous projects with various Cooke lenses, and had seen the benefits of recording the lens metadata. Marsden said, “We used Cookes on the last short film I worked on, shooting RAW with an Alexa SXT - if the camera department came back and wanted to know what the stop was or which lens we used, we had all that information. People don’t realize you have this amount of power and annotation in the lens interface. It’s not hard to implement, and gives tremendous time-saving and cost-saving benefits on set and in post. The level of data you can pull out as a frame by frame record, like aperture, position and which specific lens you’re using, is fantastic.”
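
As an illustration of the kind of frame-by-frame record Marsden describes (aperture, focus position, which specific lens), here is a minimal sketch; the record fields, tolerances and serial format are hypothetical and do not reflect the actual /i Technology data layout.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LensFrameRecord:
    """Hypothetical per-frame lens metadata record; field names are illustrative only."""
    frame: int
    lens_serial: str        # which specific lens body was used
    focal_length_mm: float
    t_stop: float           # aperture
    focus_distance_m: float

def matches_original(original: LensFrameRecord, reshoot: LensFrameRecord,
                     stop_tol: float = 0.1, focus_tol_m: float = 0.05) -> bool:
    """Check that a pickup frame was captured with the same lens and settings as the
    original plate, within small tolerances."""
    return (original.lens_serial == reshoot.lens_serial
            and original.focal_length_mm == reshoot.focal_length_mm
            and abs(original.t_stop - reshoot.t_stop) <= stop_tol
            and abs(original.focus_distance_m - reshoot.focus_distance_m) <= focus_tol_m)

plate = LensFrameRecord(1001, "ANAM-75-0042", 75.0, 2.8, 1.50)
pickup = LensFrameRecord(1, "ANAM-75-0042", 75.0, 2.8, 1.52)
print(matches_original(plate, pickup))  # True: safe to drop the new shot into the original ad
```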

The team’s regular camera of choice, including for these spots, is the Sony F55. “We generally now use two F55s and two RAW recorders. The F55 is the first RAW system we could get into as a modular system - it’s small and light enough to go on a crane, use handheld, you can do anything with it – and the /i system works natively with RAW data,” said Macdonald.

For the Star Wars commercial shoots, the team shot lens grids, but the additional camera required for this can be a tough sell for production budgets. “Having the lens data eliminates the need for this, plus it’s very helpful to double check that you’re using the same lens and at the same settings when returning after a break or shooting additional plates for a scene,” said Macdonald.

This proved to be the case when, a few months later, the Disney team wanted to substitute the products shown in one of the spots for a different set of merchandise. As Macdonald explained: “We did another day’s shoot with the same children against a blue screen with the new toys, and simply dropped the shot into the original ad. We were able to match it quickly and easily because we had the original information about which lens we had used and all the settings.”

There were also several shots where the lens data was crucial for post. “We were shooting a tracking shot behind the kids’ heads against a blue screen which, in the final version, would place them into a battle zone scene,” said Hore. “We had to do the shot hand-held, so every take would have been slightly different. With the /i data to help with tracking, the kids were composited seamlessly into the film scene, which gave the spot great production value.”

Another example of a complex VFX shot saw a Yoda mini figure transformed into a full sized character. “You can imagine the compositing that went into it - taking a 75mm shot of a tiny figure and then selling it as a 24mm wide shot of a full size Yoda - that’s quite a Jedi mind trick to pull off,” added Hore. “It was a lot of work - a combination of compositing and referencing and setting up equivalent lenses in post to ensure the handover between shots was seamless. It was that much simpler thanks to the /i Technology lens information.”
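
The 75mm-to-24mm match Hore describes is ultimately a field-of-view problem: knowing the recorded focal length lets the post team set up an equivalent virtual lens. Below is a rough sketch of that arithmetic, assuming a Super 35-sized capture width and a 2x anamorphic squeeze (neither figure is stated in the article).

```python
import math

def horizontal_fov_deg(focal_mm: float, sensor_width_mm: float, squeeze: float = 1.0) -> float:
    """Horizontal angle of view; an anamorphic squeeze widens the effective capture width."""
    return math.degrees(2.0 * math.atan((sensor_width_mm * squeeze) / (2.0 * focal_mm)))

SENSOR_WIDTH_MM = 24.9  # assumed Super 35-sized capture width; not stated in the article
SQUEEZE = 2.0           # assumed 2x anamorphic squeeze

# Matching a 75 mm plate into a 24 mm wide shot means reconciling roughly a 37 degree
# view with a 92 degree one, which is why the lens data mattered so much in post.
for focal in (24.0, 75.0):
    fov = horizontal_fov_deg(focal, SENSOR_WIDTH_MM, SQUEEZE)
    print(f"{focal:>4.0f} mm lens -> ~{fov:.0f} degrees horizontal")
```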

The anamorphic flare plays a big role in many of the Star Wars films, and the team wanted to capture that for the spots by playing with the lighting. “Star Wars is set in a make believe world where a planet might have two suns, so we felt freed from the idea that the sun has to come through one window at a particular time…the spots were all about the kids’ imaginations and we caught something of that ourselves,” said Macdonald. “We used a mixture of old fashioned, big tungsten light sources and daylight on both spots, and punched holes through the sets at strange angles to shine lamps straight through into the lenses, just to get that anamorphic flare. We also had a smoke machine and shook a dusty blanket around to get lots of dust motes in the air.”

Hore sums up the appeal of the Cooke Anamorphic/i lenses. “In these kinds of spots, you have to cover off a lot of plot in a short space of time. The cinematic look and anamorphic character of these lenses not only give a beautiful image but also help to tell this story really economically – the audience instantly recognizes the environment and understands what it represents, so we can tell the story quickly and elegantly. With the bonus of the /i Technology lens data, Cooke Anamorphic/i lenses were perfect for these projects.”  

Cooke Anamorphic/i lenses and /i Technology will be available for demonstration on Stand C6333 at NAB 2019.

  • Monday, Mar. 25, 2019
Shortlist Set For IABM BaM Awards at NAB Show 2019
IABM's John Ive
LAS VEGAS -- 

IABM, the international trade association for suppliers of broadcast and media technology, has announced the shortlisted entries for the NAB Show 2019 edition of its BaM Awards®. With more than 160 entries--a record number--the judges have shortlisted a total of 40 entrants across the nine BaM™ Content Chain categories that accurately model the structure of the industry today, together with a 10th award recognizing an outstanding project, event or collaboration.

The panel of 40+ non-affiliated, expert judges is now scrutinizing the shortlisted entries. Visits to the stands of shortlisted companies will take place once NAB Show 2019 opens to complete the judging process. The eventual winners will be announced at the IABM BaM Awards® Party on Tuesday, April 9, which is being held in Ballroom B at the Westgate hotel adjacent to the Convention Center from 6-8pm. 

“Once again, we have had a difficult job paring down so many high quality entries to produce this shortlist,” said John Ive, IABM director of strategic insight and chair of the judging panel. “The BaM Content Chain model has given us an excellent framework to assess the potential impact of entries across the flow of the new content factory and it is heartening that innovation continues to drive our industry forward in every part of the content chain. The shortlisted entries are all of the highest quality--now it is down to the judges to select the very best of the best.”

The shortlisted companies (and product/service names where they are not embargoed until the show opens) are:

Create

  • LEDGO Technology Limited - Dyno D600C RGB LED Panel Light
  • Opus Digitas, Inc. - User-Generated Video (UGV) management platform
  • Ross Video
  • Shure
  • Teradek – Bolt 4K

Produce

  • Adobe
  • Grass Valley
  • Marquis Broadcast - Postflux for Premiere Pro
  • Lawo AG - A__UHD Core

Manage

  • GB Labs - Mosaic Automatic Asset Organiser
  • Piksel
  • VoiceInteraction
  • Yella Umbrella - Stellar - Timed Text - In a Browser

Publish

  • AWS - Secure Packager and Encoder Key Exchange (SPEKE)
  • Broadpeak - CDN Diversity™ technology with Priority feature
  • Red Bee Media - World’s First Software-Only Playout Deployment
  • Telestream - OptiQ

Monetize

  • Amagi - THUNDERSTORM DAI-as-a-Managed Service platform
  • Paywizard – Singula™
  • Qligent - Vision-Analytics
  • Veritone

Consume

  • Broadpeak - nanoCDN™ with ultra low latency and device synchronization
  • Verimatrix – nTitleMe
  • Vista Studios - User Experience

Connect

  • Alteros - GTX Series L.A.W.N. Direct-to-Fiber venue-wide wireless mic system
  • Cerberus Tech - Livelink Platform
  • DVEO - Windows® Application for Reliable Live Video Transfers over Public Internet -- PC DOZER™: APP
  • Embrionix

Store

  • GB Labs - InFlight Data Acceleration (IDA)
  • OWC - ThunderBlade™
  • Rohde & Schwarz - SpycerNode
  • Symply - SymplyWORKSPACE

Support

  • Microsoft - Avere vFXT for Azure
  • PHABRIX - Qx IP V3.0
  • Skyline Communications - DataMiner Precision Time Protocol (PTP) Management and ST2110 Media Flow Tracking
  • Touchstream

Project

  • GrayMeta - Videofashion - Monetising archives with GrayMeta
  • MediaKind - Enabling a world-first: 6K tiled 360-degree live sports streaming success
  • Vista Studios - User Experience
  • Zhejiang Radio and Television Group - 32 Camera 4K IP Flagship OBVAN

The winning entries will automatically be submitted for IABM’s prestigious Peter Wayne Golden BaM Award®, with the winner announced at the IABM Annual International Business Conference and Awards in December 2019.

  • Thursday, Mar. 21, 2019
First artificial intelligence Google Doodle features Bach
This image provided by Google shows the animated Google Doodle on Thursday, March 21, 2019. Google is celebrating composer Johann Sebastian Bach with its first artificial intelligence-powered Doodle. Google says the Doodle uses machine learning to "harmonize the custom melody into Bach's signature music style." (Google via AP)
MOUNTAIN VIEW, Calif. (AP) -- 

Google is celebrating composer Johann Sebastian Bach with its first artificial intelligence-powered Doodle.

Thursday's animated Google Doodle shows the composer playing an organ in celebration of his March 21, 1685, birthday under the old Julian calendar. It encourages users to compose their own two-measure melody.

Google says the Doodle uses machine learning to "harmonize the custom melody into Bach's signature music style." Bach's chorales were known for having four voices carrying their own melodic line.

To develop the AI Doodle, Google teams created a machine-learning model that was trained on 306 of Bach's chorale harmonizations. Another team worked to allow machine learning to occur within the web browser instead of on its servers.

The Doodle will prompt users who are unsure of how to interact with the animated graphic.

  • Wednesday, Mar. 20, 2019
Roper Technologies to acquire Foundry
Craig Rodgerson
LONDON -- 

Foundry, a developer of software for the media, entertainment and digital design industries, will be acquired by Roper Technologies, Inc, a diversified technology company and a constituent of the S&P 500, Fortune 1000, and the Russell 1000 indices. The transaction is expected to close in April 2019, subject to regulatory approval and customary closing conditions.

It’s a move that enables Foundry to remain an independent company, with Roper assuming ownership from Hg. Roper has a successful history of acquiring well-run technology companies in niche markets that have strong, sustainable growth potential. Foundry’s track record of profitable growth and leading position within its core markets make it an excellent fit for Roper’s long-term strategy.

Craig Rodgerson, Foundry’s CEO, said: “Roper understands our strategy and chose to invest in us to help us realize our ambitious growth plans. This move will enable us to continue investing in what really matters to our customers: continued product improvement, R&D and technology innovation, and partnerships with global leaders in the industry.”

Neil Hunn, Roper’s CEO, said: “Foundry brings over two decades worth of experience in the digital visualization industry and a strong core management team, and this is what excited us about this partnership.”

Nic Humphries, sr. partner at Hg, said: “Foundry is a business with a history of growth and innovation. Hg has built on the company’s strong development capabilities to drive enterprise adoption of the digital design offering and pioneer revolutionary new products such as Athera. We’ve enjoyed working with the team over the last four years and wish the team well as they continue their growth journey as part of Roper Technologies.”

  • Tuesday, Mar. 19, 2019
Goodbye console? Google launches game-streaming platform
This image provided by Google shows the controller for a video-game streaming platform called Stadia, positioning itself to take on the traditional video-game business. The platform will store a game-playing session in the cloud and lets players jump across devices operating on Google's Chrome browser and Chrome OS, such as Pixel phones and Chromebooks. (Google via AP)
NEW YORK (AP) -- 

Google on Tuesday unveiled a video-game streaming platform called Stadia, positioning itself to take on the traditional video-game business.

The platform will store a game-playing session in the cloud and lets players jump across devices operating on Google's Chrome browser and Chrome OS, such as Pixel phones and Chromebooks.

Google didn't say how much its new service will cost, whether it will offer subscriptions or other options, or what games will be available at launch — all key elements to the success of a new video-game platform. It said only that Stadia will be available in late 2019.

Google made the announcement at the Game Developers Conference in San Francisco. Some industry watchers were expecting a streaming console, but Google's platform centers squarely on the company's cloud infrastructure.

"The new generation of gaming is not a box," said Google Vice President Phil Harrison. "The data center is your platform."

Much like movies and music, the traditional video-game industry has been shifting from physical hardware and games to digital downloads and streaming. Video-game streaming typically requires a strong connection and more computing power than simply streaming video, since there is real-time interaction between player and game. Google says it is leveraging its data centers to power the system.

Alphabet Inc.'s Google said playing video games will be as simple as pressing a "Play Now" button, with nothing to download or install. An optional dedicated Stadia controller will be available. The WiFi-enabled controller has a button that lets players launch a microphone and use Google Assistant to ask questions about the games being played. Another button lets users share gameplay directly to Google's video streaming service, YouTube.

Harrison said he expects all gaming will eventually take place outside consoles, in cloud-powered streaming platforms similar to what Google announced. But not right away.

"It won't replace traditional games devices overnight," he said in an interview after the announcement. "And we wouldn't be here if not for the existing traditional platforms."

CFRA Research analyst Scott Kessler said Google's approach of tying YouTube sharing to video-game playing is unique.

"It is not necessarily at this point the easiest thing for people to livestream their games and now you can do it with the push of a button," he said. "What they've done with Stadia is to connect and unify both the gaming platform and the streaming platform which obviously is new."

The company said Stadia will be available in late 2019 in the U.S., Canada, the U.K. and parts of Europe. Google showed demos of "Assassin's Creed Odyssey" and "Doom Eternal." More information about games and pricing is due this summer.

The U.S. video game industry raked in revenue of $43.4 billion in 2018, up 18 percent from 2017, according to research firm NPD Group.

BTIG Managing Director Brandon Ross said Stadia will be a positive for game publishers "assuming that it works and works at scale, which is a big assumption."

That's because the platform could bring in players not willing to spend the money upfront for a gaming PC or a console.

"What they're presenting is a feasible way to play videogames in the cloud, and utilizing the cloud so you can play anytime, anyplace and anywhere," he said. "There's no friction, including the friction of upfront hardware costs."

Ross added that Google's platform could set up a distribution battle between Microsoft, which owns the Xbox, Sony, which owns the PlayStation, Google and perhaps Amazon, which reportedly is working on its own video-game service, as they race to lock down distribution of the most in-demand games.

To that end, Google launched Stadia Games and Entertainment which will develop Stadia-exclusive games.

"The differentiator for any of the distributors on a console or in the cloud is going to be available content," he said.

Harrison said Google will rely on outside publishers and game developers to provide many of the games available on the platform. But having its own inside studio will also allow the company to fully test and make use of new features.

"We can be the advance party, so to speak, and we can be testing out the latest technology," he said. "Once we've proven it we can help bring that up to speed on the platform even more quickly with our third-party partners."

Harrison acknowledged Google faces stiff competition from longtime rivals Microsoft, Sony and others. Google has been working on Stadia for more than four years, he said, and has been working with game developers through Android and Play Store for longer.

The others have more than a decade of experience. But Google believes it brings something new.

"We are not a historical console or PC platform," he said. "We are built specifically for this new generation."

AP technology writer Rachel Lerman in San Francisco contributed to this story.

  • Tuesday, Mar. 19, 2019
RED RANGER camera system becomes available
RED RANGER
IRVINE, Calif. -- 

RED Digital Cinema announced that its new RED RANGER™ all-in-one camera system, designed to meet the needs of high-end productions, is now available at select RED authorized rental houses.

RANGER combines the benefits of RED’s cinematic full frame 8K sensor, Monstro, with a camera system that includes three SDI outputs (two mirrored and one independent), allowing two different looks to be output simultaneously; wide-input voltage (11.5V to 32V); 24V and 12V power outs (two of each); one 12V P-Tap port; integrated 5-pin XLR stereo audio input (Line/Mic/+48V Selectable); as well as genlock, timecode, USB and control.

Ideal for studio configurations, RANGER is capable of handling heavy-duty power sources and boasts a larger fan for quieter and more efficient temperature management. The system is currently shipping in a Gold Mount configuration, with a V-Lock option available next month.

“There have been a lot of changes over the years in the rental business and we hope this offers our authorized rental houses something unique,” said Jarred Land, president of RED Digital Cinema. “We’re in this to make the best products possible and to enable all our customers to succeed in their own way. RANGER is what the Rental Houses asked for--I’m excited to see the results.”

RANGER captures 8K REDCODE RAW up to 60 fps full format, as well as Apple ProRes or AVID DNxHR formats at 4K up to 30 fps and 2K up to 120 fps. It can simultaneously record REDCODE RAW plus Apple ProRes or AVID DNxHD or DNxHR at up to 300 MB/s write speeds. To enable a robust end-to-end color management and post workflow, RED’s enhanced image processing pipeline (IPP2) is also included in the system.
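
For a rough sense of what the quoted 300 MB/s peak write speed means in practice, the back-of-the-envelope sketch below estimates recording time; the media capacity used is an assumed figure for illustration, not a specification quoted by RED here.

```python
# Quick recording-time estimate at the quoted peak write speed.
PEAK_WRITE_MB_S = 300        # sustained rate assumed equal to the quoted 300 MB/s peak
MEDIA_CAPACITY_GB = 960      # assumed media size for illustration only

capacity_mb = MEDIA_CAPACITY_GB * 1000          # decimal gigabytes, as storage vendors rate media
seconds = capacity_mb / PEAK_WRITE_MB_S
print(f"~{seconds / 60:.0f} minutes of recording at {PEAK_WRITE_MB_S} MB/s")  # ~53 minutes
```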

RANGER ships complete, including:

  • Production Top Handle
  • PL Mount with supporting shims
  • Two 15mm LWS rod brackets
  • RED Pro Touch 7.0” LCD with 9” ARM and LCD/EVF cable
  • LCD/EVF Adaptor A and LCD/EVF Adaptor D
  • 24V AC power adaptor with 3-pin 24V XLR power cable
  • Compatible Hex and Torx tools

  • Friday, Mar. 15, 2019
Jury rules Apple owes Qualcomm $31M for patent infringement
In this Jan. 3, 2019, file photo the Apple logo is displayed at the Apple store in the Brooklyn borough of New York. A jury announced the verdict Friday, March 15, that Apple should pay $31 million in damages for infringing on patents for technology owned by mobile chip maker Qualcomm that helps iPhones quickly connect to the internet and extend their battery life. (AP Photo/Mary Altaffer, File)
SAN DIEGO (AP) -- 

A jury has decided Apple should pay $31 million in damages for infringing on patents for technology owned by mobile chip maker Qualcomm that helps iPhones quickly connect to the internet and extend their battery life.

The verdict Friday in a San Diego federal court follows a two-week trial that pitted two former allies turned bitter adversaries against each other. The trial is one fragment of a broader legal battle between Apple and Qualcomm, which are sparring over who invented some of the technology used for key features in smartphones and other mobile devices.

The stakes will be much larger in another federal trial next month that will determine whether Apple should be required to pay Qualcomm for licensing other technology used in iPhones.

Apple had been paying the licensing fees until it stopped in 2017 and filed a lawsuit alleging that Qualcomm was abusing its dominance of the mobile chip market to gouge smartphone makers for technology that it hadn't even invented. That trial is scheduled to start April 15.

In the trial that just concluded, the jury unanimously agreed with Qualcomm's contention that it should be paid $1.41 for each iPhone relying on three of its patents. The damages date back to July 6, 2017, when Qualcomm filed its lawsuit, and cover technology used in the iPhone 7, iPhone 7 Plus, iPhone 8, iPhone 8 Plus and iPhone X.
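
For scale, the per-unit royalty and the total damages together imply roughly how many iPhones the verdict covers; the unit count below is back-calculated for illustration only and is not stated in the article or the verdict.

```python
# Rough check of the scale implied by the verdict: per-unit royalty x units ≈ total damages.
damages_usd = 31_000_000
royalty_per_iphone = 1.41
implied_units = damages_usd / royalty_per_iphone
print(f"~{implied_units / 1e6:.0f} million iPhones covered")  # ~22 million units
```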

San Diego-based Qualcomm hailed the verdict as a validation of its technology's importance to iPhones. "The technologies invented by Qualcomm and others are what made it possible for Apple to enter the market and become so successful so quickly," said Don Rosenberg, Qualcomm's general counsel.

Apple expressed disappointment with the decision. "Qualcomm's ongoing campaign of patent infringement claims is nothing more than an attempt to distract from the larger issues they face with investigations into their business practices in US federal court, and around the world," the Cupertino, California, company said.

The dispute between Apple and Qualcomm is also part of an antitrust lawsuit that the U.S. Federal Trade Commission filed in 2017. In that case, the FTC alleges that Qualcomm had been abusing its market power in mobile chips for years. The trial concluded in San Jose, California, earlier this year, but the judge still hasn't ruled.

  • Tuesday, Mar. 12, 2019
Shotgun boosts game studios’ creative pipelines with new integrations for Jira and After Effects
LOS ANGELES -- 

Shotgun will be at the Game Developers Conference (GDC) in San Francisco (March 18-22) to announce two new integrations: Jira Bridge and Adobe After Effects.

The new Jira Bridge takes Shotgun further on its mission to connect game development art and engineering workflows to foster better collaboration, communication and faster iteration.

Meanwhile, Shotgun’s After Effects integration will join its suite of supported creative tools that help artists stay focused in their environments of choice. Shotgun gives artists the freedom to do what they do best: make incredible art.

These new additions follow the recent announcements of Unreal Engine and Unity integrations (with Unity’s coming soon). Together, these integrations put Shotgun in a unique position to help game development studios move new ideas forward much faster in an increasingly dynamic, content-driven market by bridging the creative and engineering sides of the studio. Both Jira Bridge and the new After Effects integration will be available from March 18, with Jira Bridge in public beta.

Shotgun will showcase its Jira, After Effects, Unreal Engine and Unity integrations at GDC in San Francisco through a Developer Day session at Moscone Center, in room 3020, West Hall, on March 19th at 1:20pm.

“More than ever before, games studios are driven to quickly drop new assets and builds that keep players engaged,” said Don Parker, VP & GM, Shotgun Software, Autodesk. “Shotgun is giving studios the connected agility they need to move faster. We’re really excited about the new Jira and After Effects integrations providing yet more ways for the asset side of a game development studio to stay connected to, and in sync with, the engineering side of the house.”

Epic Games runs in-house asset creation with Shotgun. “At any given time, there are up to 40 people using Shotgun as a management tool from a production standpoint and anywhere from 400-500 or more active users who receive tasks from Shotgun,” said Brian Brecht, art manager, Epic Games. “When we construct assets, they go through a series of pipeline steps that are all scheduled, tracked and assigned via Shotgun. All in-house reviews are handled through RV or the embedded Shotgun tools via the web UI as well. There is no way we could handle the overall volume we produce without the production tracking tools that Shotgun brings to the table.”

Teams of all sizes, from small-to-mid-sized shops up to leading AAA games studios, are benefitting from running their asset production pipelines in Shotgun, including Electronic Arts, Bungie, Epic Games, Ubisoft, Blizzard, Sony Computer Entertainment, Rockstar, Treyarch, Square Enix, and many more.

Jira Bridge – in public beta March 18
Having your studio’s data split between two tracking applications can create workflow bottlenecks and inefficiencies, causing confusion between teams. The new Jira Bridge for Shotgun overcomes these challenges by allowing two-way synchronization of data between Shotgun and Jira, while simplifying the creation of custom mappings when they’re needed. The new Jira framework automates the tedious aspects of synchronization, and lets artists and engineers keep using the tracking tools they prefer.
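
For a flavor of what such a bridge automates, below is a minimal one-way sketch built on the publicly available shotgun_api3 and jira Python packages; it is not the Jira Bridge itself, and the site URLs, credentials, the sg_jira_key field and the status-to-transition mapping are hypothetical placeholders.

```python
# Push Shotgun task statuses into linked Jira issues (one direction only, for illustration).
import shotgun_api3
from jira import JIRA

sg = shotgun_api3.Shotgun("https://yourstudio.shotgunstudio.com",
                          script_name="jira_sync", api_key="SECRET")
jira = JIRA(server="https://yourstudio.atlassian.net",
            basic_auth=("bot@yourstudio.com", "API_TOKEN"))

# Studio-specific custom mapping from Shotgun task statuses to Jira workflow transitions.
STATUS_TO_TRANSITION = {"ip": "In Progress", "fin": "Done", "hld": "On Hold"}

# sg_jira_key is a hypothetical custom field holding the linked Jira issue key.
tasks = sg.find("Task",
                filters=[["sg_jira_key", "is_not", None]],
                fields=["content", "sg_status_list", "sg_jira_key"])

for task in tasks:
    transition = STATUS_TO_TRANSITION.get(task["sg_status_list"])
    if not transition:
        continue
    issue = jira.issue(task["sg_jira_key"])
    jira.transition_issue(issue, transition)   # move the Jira issue to the mapped state
    print(f"Synced '{task['content']}' -> {task['sg_jira_key']}: {transition}")
```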

After Effects Integration - available March 18
The After Effects (AE) integration for Shotgun is the latest in our series of creative tool integrations, joining Autodesk 3ds Max, Autodesk Maya and Adobe Photoshop, among many others, in helping artists stay in creative flow, removing the need for context switching and letting creative teams make better decisions, in fewer steps. These powerful tools make it easy for artists to load assets in a visual browser, see project and task information in-app and publish out of AE in one click, without worrying about filenames or version numbers. Repetitive processes can be automated, taking the hassle out of things like standardizing render formats and submitting them for review. Like our other integrations, Shotgun customers can access and contribute to the integration via GitHub.
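
As a small illustration of the "no filenames or version numbers" idea, the sketch below shows the kind of housekeeping such a publish step automates; the naming template is hypothetical and is not Shotgun's actual publish convention.

```python
# Pick the next version number for a publish so artists never hand-edit filenames.
import re
from pathlib import Path

def next_versioned_path(publish_dir: str, shot: str, task: str, ext: str = "mov") -> Path:
    """Return e.g. <publish_dir>/SH010_comp_v003.mov, one higher than the latest version on disk."""
    directory = Path(publish_dir)
    pattern = re.compile(rf"{re.escape(shot)}_{re.escape(task)}_v(\d+)\.{re.escape(ext)}$")
    existing = directory.iterdir() if directory.is_dir() else []
    versions = [int(m.group(1)) for p in existing if (m := pattern.match(p.name))]
    return directory / f"{shot}_{task}_v{max(versions, default=0) + 1:03d}.{ext}"

print(next_versioned_path("publish", "SH010", "comp"))  # publish/SH010_comp_v001.mov on a fresh tree
```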

Unreal Engine Integration – available now
Artists can submit in-engine work for review faster than ever, and right inside Unreal, using the Shotgun panel, loader, and publisher. Tasks in Shotgun are linked to Unreal Engine assets, making production tracking and review much easier for artists and supervisors – and without losing much-needed context. The integration is even more powerful and customizable with Epic Games’ addition of the Extended Python API to the Unreal Engine in 4.21.
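
For a taste of the editor scripting surface the integration builds on, the snippet below uses Unreal's Python API to list project assets from inside the editor; it illustrates the Python API only, not the Shotgun panel, loader or publisher, and the content path is a placeholder.

```python
# Run inside the Unreal Editor's Python console (UE 4.21+ with the Python plugin enabled).
import unreal

asset_paths = unreal.EditorAssetLibrary.list_assets("/Game", recursive=True)
for asset_path in asset_paths[:20]:                       # inspect the first 20 assets only
    asset_data = unreal.EditorAssetLibrary.find_asset_data(asset_path)
    unreal.log("{}: {}".format(asset_data.asset_class, asset_path))
```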

Unity integration – coming soon
Unity is planning to release its Shotgun integration this spring. The collaboration between Unity and Autodesk will boost productivity and empower artists to focus on creative work with new tools for viewing tasks and feedback directly within Unity. Artists will be able to control the Unity Editor with Python, load assets, publish playblasts, and track the resulting analytics through Shotgun, all from within the Unity Editor.
