• Monday, Aug. 18, 2014
A Closer Look: Do more with companion devices
This April 2, 2014 image provided by Amazon shows the Amazon Fire TV system during a news conference in New York. (Photo by Diane Bondareff/Invision for Amazon/AP Images, File)

Apple has ways of encouraging you to buy more of its products: It offers bonus features on devices like the iPhone and iPad that work only when paired with other Apple gadgets.

Amazon does it, too.

It's understandable. The devices need to communicate with each other at a deep level, and that's more easily done when a company controls the software on both ends. And these features are more like extras and don't affect the products' basic functionality.

Take streaming TV devices, for example: These gadgets cost about $100 and let you watch Netflix, Hulu and other online services on a big-screen TV. Though you don't need any other device for basic streaming, some advanced features in Amazon's Fire TV require a Kindle Fire tablet or a Fire phone, while some Apple TV features work only with iPhones, iPads and Mac computers.

As a result, you're wise to consider the devices you already own when you buy a new gadget, as these devices become more powerful when combined. Here's a closer look at how the Apple TV and the Fire TV work with other gadgets from Apple and Amazon.

— AMAZON'S FIRE TV:

With a Fire phone or tablet, you can start a movie or TV show on one device and continue on another, at least when you're using the company's own streaming service, Amazon Instant Video. When watching on the TV, you can have the phone or tablet display trivia, cast information and character summaries — culled from IMDb and other sources. Information on the mobile device changes from scene to scene.

Consider these scenarios:

— You're near the end of a movie on the Fire TV at home, but need to head out. Just pull down the Fire phone's notification center and switch the movie to the phone. You won't miss a scene. It works the other way if you start on the phone on the way home.

— An actor seems familiar, but you can't remember his name or what else he's been in. With video playing on the Fire TV, check the phone or tablet to see headshots of actors in a given scene. Tap a headshot for more information. The feature also works with songs playing during certain scenes — and you can buy the song through Amazon, of course.

— You can also use the phone or tablet to pause, rewind, forward or go directly to a specific scene on the TV. Or you can check Facebook and email on the device once you've had your fill of cast information.

What about streaming services beyond Amazon's? By turning on a screen-mirroring function, anything appearing on the phone or tablet will appear on the TV. You have to dig through the settings on the phone or tablet to turn it on, though. Beyond video, I had a lot of fun pointing the phone's camera at my cousins' kids so they could see themselves on the TV live.

Unfortunately, audio and lips had a tendency to be out of sync when I mirrored using my home Wi-Fi network. It's better to work with an app that directly supports dual-screen use. Only Amazon Instant Video does so for now, though Amazon says it's working with selected partners to expand that.

— APPLE TV:

You can start video on an iPhone or an iPad and continue on the TV using a feature called AirPlay. You're not limited to Apple's own iTunes service, so this is a way to get Amazon Instant Video on the big screen.

However, services have the option to disable this capability. Showtime and ABC Family, for instance, have done this. Their apps lack the AirPlay button and aren't available on the Apple TV.

Apple doesn't offer supplementary information on cast and characters, as Amazon does. And while you can start something on a phone or tablet, switch to the TV and switch back, this doesn't work if you start the video on the TV. Amazon's devices work both ways.

Apple does offer screen mirroring, and it's easier to get to than Amazon's version. Just swipe up from the bottom for the Control Center. It's not true screen mirroring, as video disappears from the mobile device when it shows up on the TV. This actually improves video quality because you're not wasting Internet bandwidth duplicating the stream on a device you're not watching.

You can also do screen mirroring from a Mac or use the Apple TV as a second monitor to extend your Mac's desktop space. But it doesn't always work well if your Wi-Fi network isn't pristine.

Unfortunately, mirroring is sometimes blocked for copyright reasons. I'm not able to fling DVDs from the Mac to the Apple TV, for instance. In trying to fling Showtime and ABC Family from the phone or tablet, I can get only audio on the TV. I haven't run into that with any of the video apps I've tried on the Fire.

Beyond streaming TV, Apple devices will soon work together even more extensively. The upcoming Yosemite operating system for the Mac and iOS 8 for iPhones and iPads will have a set of features called Continuity. You can start an e-mail on one device and finish on another. Or you can answer phone calls on the Mac. Stay tuned.

  • Saturday, Aug. 16, 2014
21 scientific & technical achievements under consideration for Academy Awards
An Oscar statue stands on the red carpet of the 2010 Scientific and Technical Awards. (AP photo)

The Scientific and Technical Awards Committee of the Academy of Motion Picture Arts and Sciences announced that 21 scientific and technical achievements, representing 16 distinct investigations, have been selected for further awards consideration.

The list is made public to allow individuals and companies with similar devices or claims of prior art the opportunity to submit achievements for review.

The deadline to submit additional entries is Tuesday, August 26, at 11:59 p.m. PT.

The committee has selected the following technologies for further consideration:

    Portable, remote-controlled telescoping camera columns
    Prompted by MAT-TOWERCAM TWIN PEEK (MAT - Mad About Technology)

    Drivable, high-speed vehicle platforms
    Prompted by THE BISCUIT JR. (Allan Padelford Camera Cars)

    Neutral density filters that remove infrared contamination

    Lightweight, prime lens sets for high-resolution cameras
    Prompted by LEICA SUMMILUX-C PRIME LENS SERIES (CW Sonderoptic)

    Optical audio transfer processes

    Enabling technology of digital cinema projectors

    Interactive blend shape modeling and manufacturing

    Measurement toolsets for quality control of cinematic experience
    Prompted by LSS-100P (Ultra-Stereo Labs)

    Displays providing suitable visual reference for feature film review

    Collaborative, enhanceable image playback and review systems
    Prompted by RV MEDIA PLAYER (Tweak Software)

    High-resolution motion capture techniques for deforming objects
    Prompted by MOVA (MOVA) and GEOMETRY TRACKER (ILM)

    Systems for interactive grooming and direct-manipulation of digital hair
    Prompted by BARBERSHOP (Weta Digital)

    Systems for placing, grooming and resolving collisions of digital feathers
    Prompted by DREAMWORKS FEATHER SYSTEM (DreamWorks Animation)

    Systems for modeling, animation and rendering of digital vegetation
    Prompted by SPEEDTREE (IDV)

    Digital technologies for high-density physical destruction simulation

    Efficient volumetric data formats
    Prompted by FIELD 3D (Sony Pictures Imageworks) and VDB: HIGH-RESOLUTION SPARSE VOLUMES WITH DYNAMIC TOPOLOGY (DreamWorks Animation)

After thorough investigations are conducted in each of the technology categories, the committee will meet in early December to vote on recommendations to the Academy’s Board of Governors, which will make the final awards decisions.

The 2014 Scientific and Technical Awards will be presented on Saturday, February 7, 2015.

Claims of prior art or similar technology must be submitted on the Academy’s website at www.oscars.org/awards/scitech/apply.html. 

The Oscars will be held on Sunday, February 22, 2015, at the Dolby Theatre at Hollywood & Highland Center in Hollywood, and will be televised live by the ABC Television Network.  The Oscars presentation also will be televised live in more than 225 countries and territories worldwide.

  • Thursday, Aug. 14, 2014
Canon Introduces New RC-V100 Remote Controller
RC-V100 Remote Controller
Lake Success, NY -- 

Compatible with the Canon Cinema EOS line of cameras (C500, C300, and C100) as well as Canon's XF-series of professional camcorders, the new RC-V100 Remote Controller is designed to respond to a diverse array of production needs requiring remote camera operation. The RC-V100 enables users to remotely control a wide variety of functions built into the cameras, as well as adjust and set various controls, such as exposure and white balance. The RC-V100 is powered from the connected camcorder via a 15-foot (five-meter) cable, and a USB port enables the unit's firmware to be updated to support future remote-control functions.

The remote is ideal when capturing footage at sporting events, in houses of worship, or in any application that requires a camera mounted on a crane. In addition to basic controls - start/stop, shutter/gain adjustments, zoom/focus/iris parameters, custom picture values, white balance, black gamma and more - the new remote can be used to adjust various menu settings. It also features illuminated push buttons with audible feedback to confirm each setting change. A SETUP mode lets users customize a number of key rotary controls as well as user procedures and operating modes.

  • Wednesday, Aug. 13, 2014
Camera Corps chooses IBC 2014 for first European showing of Q3 and MeerCat
Camera Corps' Q3 pan/tilt/zoom focus head.

Camera Corps will demonstrate the latest additions to its range of robotic camera systems at IBC 2014 in Amsterdam, September 12-16. Taking center stage will be the Q3 pan/tilt/zoom/focus head and MeerCat miniature camera, both introduced at NAB in April and now fully deliverable. Q3 and MeerCat can be integrated easily with all Camera Corps’ current control systems. Up to 96 cameras of various types can be joystick-controlled by up to four operators and four vision engineers.

Camera Corps’ Q3 robotic pan/tilt/zoom/focus camera allows highly efficient broadcast television coverage from practically anywhere. Fully IP45 rated, it is designed for use in coverage of events such as outdoor and indoor sports, reality shows and live stage performances. Q3 retains the unobtrusive compact spherical housing of its Q-Ball predecessor, allowing easy placement within view of other cameras while retaining full control of vertical and horizontal shooting angles, focal length and focus.

Just 104 mm high, 125 mm in diameter and weighing 2.5 kg, Q3 incorporates a high-quality 1920 x 1080-native camera with a 2.1-megapixel 1/3-inch RGB Bayer progressive CMOS imager plus high-precision motorized pan/tilt/zoom/focus. An enhanced motor drive matches the precise acceleration and deceleration of much larger robotic heads when tracking moving performers on-air. The camera's integral 20x optical zoom lens can be adjusted from maximum wide (59.4 degrees) to full telephoto (3 degrees) in just 3 seconds. Maximum focus time is 10 seconds.
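The quoted zoom range can be sanity-checked with a little trigonometry: for a rectilinear lens, the optical zoom ratio is approximately the ratio of half-angle tangents at the wide and telephoto ends. A minimal sketch, assuming the quoted angles are horizontal fields of view of a rectilinear lens:

```python
import math

def zoom_ratio(wide_fov_deg, tele_fov_deg):
    """Approximate optical zoom ratio from the fields of view at the
    wide and telephoto ends of a rectilinear lens."""
    wide = math.radians(wide_fov_deg) / 2
    tele = math.radians(tele_fov_deg) / 2
    return math.tan(wide) / math.tan(tele)

# Q3's quoted range: 59.4 degrees at maximum wide, 3 degrees at full telephoto
print(round(zoom_ratio(59.4, 3.0), 1))  # about 21.8, consistent with a 20x lens
```

The small discrepancy from a nominal 20x is expected, since quoted zoom ratios are usually focal-length ratios and quoted angles may be rounded or measured diagonally.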

Q3 can deliver live video in all commonly used HD formats, interlaced and progressive, at frame rates up to 60 hertz. Output signal format can be selected from the operator's control panel. The motorized head can perform an unlimited number of 360-degree lateral rotations. Video is transferred at 3 gigabits per second over high-quality slip rings to ensure complete freedom from cable-snagging. Pan and tilt speeds are adjustable from an ultra-slow 360 degrees in 90 minutes to 90 degrees per second. Motion-control sequences of up to 25 seconds can be stored to internal non-volatile memory.
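That 3 gigabit-per-second figure matches the 3G-SDI link rate needed to carry 1080p60. A rough back-of-envelope check, assuming standard 10-bit 4:2:2 sampling (10 bits of luma plus an average of 10 bits of chroma per pixel), shows the active video alone comes close to filling the link:

```python
# Rough active-video bandwidth for 1080p at 60 frames per second
width, height, fps = 1920, 1080, 60
bits_per_pixel = 10 + 10  # 10-bit luma plus 10 bits of chroma per pixel (4:2:2)
active_gbps = width * height * fps * bits_per_pixel / 1e9
print(round(active_gbps, 2))  # about 2.49 Gb/s of active picture data
```

The remaining headroom up to 3 Gb/s is consumed by SDI blanking intervals and ancillary data, which is why 1080p60 needs a 3G-SDI link rather than the older 1.5 Gb/s HD-SDI.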

Additional features of Q3 include genlock input with remote timing adjustment, smooth on-air operation of iris and master black, manual and automatic white balance adjustment, an integral colour bar test signal generator and negative/positive/monochrome effects.

MeerCat miniature remote broadcast camera

Developed in response to demand from sports, reality-television and stage-show producers, MeerCat was first used on air at the April 2014 Thames Boat Race. One camera was mounted inside a thin metal pole at the rear of each boat, providing a forward view. Another was positioned at floor level, looking towards the cox.

The MeerCat head is housed in a metal case with a very small footprint, 30 x 30 mm, and is only 93 mm in height. It can be attached to a quarter-inch mount for easy integration into narrow-profile locations. Lens protrusion is just 25 mm. High-quality NF-mount lenses are available.

MeerCat can also be used as a wearable camera with a full high-definition live wireless link. Full control facilities including manual iris setting with adjustable electronic exposure can be performed remotely using the existing range of Camera Corps joysticks and remote panels.

MeerCat incorporates a high-quality 1/3-inch MOS sensor with 1944 x 1092 effective pixels. This can be switched to deliver 1080p, 1080i or 720p video at 50, 59.94 or 60 hertz frame rates. Video is output as HD-SDI, which can be converted to an optical feed using a Camera Corps optical fiber interface.

  • Tuesday, Aug. 12, 2014
Vicon introduces Pegasus products at SIGGRAPH
Pegasus Advanced

Vicon, the motion capture technology specialist for the entertainment, engineering and life science industries, today at SIGGRAPH announced the launch of two products, Pegasus and Pegasus Advanced, a new set of retargeting and solving tools developed with IKinema, as well as Blade 3, the newest version of the company’s capture and data processing software.  The products offer a new level of flexibility, accessibility and increased value for money in motion capture.

Pegasus is the industry’s first off-the-shelf retargeter that simplifies the process of streaming real-time motion capture data from Vicon Blade onto game assets in engines like Unity or Unreal Engine (UE4). Previously, users would have to spend time and money on developing their own bespoke software to access the dynamic environment a game engine provides.

“Pegasus represents an unprecedented step forward for motion capture in enabling customers of all sizes to import their data into game engines for a more dynamic experience,” said Phil Elderfield, entertainment product manager, Vicon.

Pegasus Advanced
With all the benefits of Pegasus, Pegasus Advanced solves joints from rigid-body data and streams from Vicon Tracker. Users can take advantage of the Pegasus Advanced solver and stream data into ubiquitous ergonomic packages such as Siemens' Jack software or Dassault Systèmes' DELMIA solutions, and the product also integrates seamlessly with Unreal and Unity.

“Pegasus Advanced is the first product of its kind to allow tracking data from rigid bodies to drive custom avatars or manikins in such a wide variety of platforms,” said Warren Lester, engineering product manager, Vicon. “This allows customers to easily drive their own models--in applications that they are familiar with--and place markers where the task demands, rather than being hamstrung by fixed marker sets or having to suffer poor data quality due to occlusion.”

Alexandre Pechev, CEO at IKinema, an animation technology company that developed Pegasus and Pegasus Advanced in conjunction with Vicon, said: “IKinema goes back to fundamental biomechanics to deliver realistic motion based on skeletal analysis of humanoid and creature forms. Our highly optimized heuristic algorithms bring the highest degree of realism and believability to the Pegasus solving pipeline.”

Blade 3
The new version of Vicon’s motion capture software, Blade 3, builds upon the power and performance of Blade 2 and Axiom--Blade’s live engine that reproduces fast, clean performances for real-time visualization. With the Axiom engine now also available offline, it can be used in a full, interactive post-processing pipeline. Other features of Blade 3 include the ability to calibrate Vicon’s Bonita video reference footage into the capture volume and overlay its images onto CG environments for easier character setup and solve quality assessment. Blade 3 also introduces compatibility with Python for the first time, allowing users to run scripts with offline Blade 3 data.

“The launch of Pegasus and Blade 3 continues our ongoing commitment to delivering innovative solutions to the motion capture community,” said Imogen Moorhouse, CEO, Vicon. “Pegasus enables direct connection to game engines for all customers, while the new features in Blade 3 continue to build on the advances made in Blade 2, making Vicon motion capture more accessible, efficient and cost effective.”

  • Tuesday, Aug. 12, 2014
Autodesk unveils extensions for Maya 2015, 3Ds Max 2015
Maya 2015

At this week’s SIGGRAPH confab, Autodesk has unveiled its latest extensions for Maya 2015 and 3ds Max 2015.

The newest update for Autodesk Maya 2015 delivers powerful new production tools to help studios large and small build more efficient and productive pipelines. Features include: a sophisticated new color management system that offers a color-safe workflow throughout the lighting, rendering and compositing process; new tools that help make it easier to share, transfer, and collaborate on data; and extended customization options for Viewport 2.0. Maya 2015 Extension 1 helps provide studios with new opportunities to create differentiated, future-proofed pipelines.

-- Color Management: A completely new color management system preserves artistic creative intent throughout the lighting and rendering process; a simple linear workflow is provided out of the box. Studios can customize the system to closely match their color pipeline, enabling artists to work across The Foundry’s NUKE software, Adobe Photoshop software, Autodesk Creative Finishing solutions, and certain other applications to help allow for color compatibility throughout the studio pipeline.

-- Performance Profiler: Technical directors can now gauge, measure, and debug the performance of individual nodes, character rigs, and Maya scenes. Performance Profiler offers a graphical interface that enables performance measurement and graphing for dirty propagation, evaluation, deformations, rendering, Qt events, XGen instancing and Bifrost simulations. Developers can now use Performance Profiler in other custom tools and plug-ins using a robust API (Application Programming Interface).

-- Modeling and Workflow Productivity Enhancements: Requested by artists, the update now includes workflow enhancements to make daily tasks easier and more efficient: expanded wireframe color choices; a per-menu keyboard shortcut to repeat the last command; object visibility toggling; color coding in the channel box to reflect various key states; performance improvements and multi-UV tiling support for OpenSubdiv; enhancements to the Multi-Cut tool; further integration of the Modeling Toolkit; and new custom pivot workflows.

-- XGen Performance Enhancements: Artists can now create and groom hair, fur, and feathers on characters, and populate large landscapes with grass, foliage, trees, rocks, and debris trails faster and more efficiently, with targeted performance enhancements for the XGen Arbitrary Primitive Generator.

-- Python API for Viewport 2.0: Maya 2015 Extension 1 introduces a Python API for the high-performance Viewport 2.0 hardware-accelerated display, offering increased flexibility for viewport customization using this popular and easy-to-learn scripting language.

Availability: Autodesk Maya 2015 Extension 1 will be available September 10, 2014.

Autodesk 3ds Max 2015 Extension 1
Autodesk 3ds Max 2015 software helps increase overall productivity for artists and designers working with the high-resolution assets required by today’s demanding entertainment and design visualization projects. The newest update adds new features that help artists easily create and exchange complex assets across multiple workflows:

-- OpenSubdiv Support: Extension 1 adds the top feature requested by the community. Open-sourced by Pixar, OpenSubdiv allows artists to represent subdivision surfaces and visualize their models interactively without the need to render, offering greater overall productivity.

-- Alembic Support: Alembic makes processing complex animated and simulated data more memory efficient and allows artists to view large datasets in the Nitrous Viewport and transfer them to other programs.

-- Enhanced ShaderFX: Enhancements to the ShaderFX real-time visual shader editor offer expanded shading options and better shader interoperability between 3ds Max, Maya, and Maya LT. With new node patterns -- wavelines, voronoi, simplex noise, and brick -- as well as a new bump utility node and a searchable node browser, game artists and programmers can create and exchange advanced shaders more easily.

Availability: Extension 1 for 3ds Max 2015 is now available.

  • Tuesday, Aug. 12, 2014
Shotgun Launches iPhone Review App, Shotgun Desktop, and MARI Integration at SIGGRAPH
Shotgun's mobile app.

Shotgun Software, developer of cloud-based production tracking, review & approval, and asset management tools for film, TV and games, delivered three major releases today at SIGGRAPH to help creative companies streamline processes and go faster. The company--recently acquired by Autodesk--launched a mobile app that brings the power of Shotgun Review to the iPhone; Shotgun Desktop, a native app shelf that gives artists fast access to productivity tools; and integration with The Foundry’s MARI 3D paint software, which helps speed up the workflow for texture artists.

With Shotgun’s first mobile app, supervisors can take their projects anywhere they go with a full set of tools to review and give their artists clear, visual feedback on work in progress. They’re able to stay connected without slowing down while they’re on set or on the run, and unblock artists for faster iteration. Now, on their iPhones, supervisors can:

-- Browse media and playlists in all projects
-- Play back movies
-- Annotate on one or more frames in the movie
-- Give feedback with notes and attach camera images or movies
-- See history on related versions and their notes

VFX supervisor Joshua Saeta, who recently worked on films that include “Sin City 2” and “Earth to Echo,” has been beta testing Shotgun Review for iPhone. “Having Shotgun on my iPhone is a complete life saver,” he said. “I could be in the middle of a shoot in New Mexico and have the director take a quick look at a digital matte painting and get feedback right there on set, on the fly, and get it back to my artists. Having the flexibility to get basic feedback instantly on a device that everyone carries in their back pockets is invaluable.”

Shotgun Desktop
Shotgun Desktop is a simple, visual interface that gives artists fast access to key productivity tools directly from the menu bar. Artists can quickly launch tools like Maya, Nuke or Photoshop, pre-configured with integrated apps that help automate key tasks -- like loading files created by other artists or publishing their work -- without having to go back to a browser or remember naming conventions and directory structures. Any app that the studio chooses to make available can be accessed, whether it’s a 3rd-party product, an in-house pipeline tool, or something from the growing list of apps that Shotgun is building and delivering.

Shotgun Desktop also makes developers’ jobs easier and more efficient, supplying a high-quality, productized app framework with a polished, artist-friendly UI that works across Linux, Mac and Windows. They can build Python apps quickly and deploy them right out to artists’ desktops.

Scott Ballard, pipeline supervisor at Encore Hollywood (which works on post and VFX for TV episodics including “House of Cards,” “Under the Dome” and “Extant”), has been working with an early version of Shotgun Desktop. He said, “This tool is going to save me a ton of time on the development side when we roll out pipelines for new shows; a lot of the initial setup and project configuration are nearly automatic. When our artists get their hands on it they’re going to love having an easy, consistent place to launch things. With the kind of timetables we face in TV work, anything that saves steps and gets us up and running faster is a huge help.”

MARI Integration
Shotgun has integrated The Foundry’s MARI 3D paint tool, including the full suite of Pipeline Toolkit apps. Now artists using MARI can work faster with the Shotgun loader, publisher and other apps, right inside the MARI interface, saving steps related to file management and connecting to other artists on the project. The integration also includes a Maya/MARI round-trip that connects modeling, textures, and lookdev.

Jack Greasley, MARI product manager at The Foundry, said, “The Foundry and Shotgun share a strong commitment to improving the experience of all of our clients, and it’s great to be able to make processes more efficient for them by integrating the tools they use every day. With Shotgun’s native support for MARI we’re streamlining a process that’s very common for texture artists in film and commercial work by enabling them to snapshot and publish files directly from within MARI.”

  • Monday, Aug. 11, 2014
ftrack rolls out version 3.0 at SIGGRAPH
An overview of the ftrack dashboard.

ftrack, a project management platform for the creative, visual effects and animation industries, announced version 3.0 and is showing previews at the SIGGRAPH conference in Vancouver, August 11-14.

Working closely with existing customers including The Mill, Cinesite, MPC Commercials and ZeroVFX, ftrack has focused on enhancing an already intuitive workflow while expanding the tools for multi-location pipelines and deeper integration with key VFX software packages.

The 3.0 release will provide numerous new features and improvements, including a further refined user interface and a focus on collaborative workflows with seamless asset syncing between multiple locations and the introduction of global filters to quickly narrow scope.

Deeper pipeline integration includes a more robust API and a standalone publishing application that has been designed to allow quick and easy publishing outside of any host application.

Additional features include improved time tracking tools for artists and review and approval tools to allow outside clients to annotate and communicate efficiently with production.

ftrack is also working with key software partners on deeper forthcoming integration with Thinkbox Software’s Deadline, Imagineer Systems’ Mocha and Cospective’s Cinesync. In addition, ftrack is collaborating with The Mill on development of plug-ins for Nuke and Hiero to provide a smoother, more intuitive experience for artists using ftrack in these applications.

The version 3.0 announcement comes in the wake of recent major client adoptions; The Mill has selected ftrack as its worldwide production management solution across all of its studios in the UK and USA, while Germany-headquartered Mackevision has purchased a site license and migrated all of its existing production management tools over to ftrack for both its design visualization and VFX projects.

  • Monday, Aug. 11, 2014
A Closer Look: Multitasking on mobile devices
This screen shot taken from a Samsung Galaxy S5 demonstrates the device's Multi-Window function, which lets you run multiple apps side by side. (AP Photo)

Smartphones and tablets would be much more useful if they allowed us to multitask the way desktop and laptop computers do.

When I'm watching video, for instance, I have to pause it to read an email or text that comes in. When I'm composing a message to make plans, I have to leave the app to check the weather forecast. For the most part, I'm not able to do more than one thing at a time on a single screen.

That's starting to change with Android devices, though. Windows tablets do let you run multiple apps side by side, but Windows phones do not. The iPhone and iPad don't, either.

In this installment of A Closer Look, I assess some of the Android devices that offer limited multitasking. These approaches aren't as smooth as what I'm used to on Mac and Windows personal computers, but they are a start.

— SAMSUNG'S GALAXY AND NOTE DEVICES:

Samsung offers Multi-Window, which lets you see multiple apps running side by side on the screen. You're typically limited to two, though Samsung's 12.2-inch Pro tablets let you do as many as four. There's a slider you can use to control how much screen space each app takes.

Multi-Window works with only selected apps, though. You can use Samsung's Video or Google's Play Movies & TV app as one of the selections, but not Hulu or Netflix. Even so, the choices have gotten better since Samsung first made this feature available in 2012.

As much as I like this concept, I've rarely used it on my two-year-old Galaxy S III. It takes me longer to figure out which apps are supported than to simply grab another device and get what I need there.

Owners of Samsung's Note smartphones and tablets also get a second way to multitask. It's called Pen Window and gets activated when you use the stylus that comes with the device. You simply use the pen to draw a box on the screen. The box floats over the main app on the screen, and apps open inside the box. You can have several apps open at once, and you can temporarily set an app aside by minimizing it into a small dot.

Again, this only works with selected apps.


— LG's G3 PHONE:

LG's latest smartphone, the G3, has a Dual Window feature. Just hold the back button and choose two apps to open side by side. As with Multi-Window, you're limited in your choices. You can adjust a slider to determine how much on-screen real estate each app occupies.

The phone also has Qslide, which gives you easy access to three apps at once. Unlike Dual Window, these apps are in overlapping windows, similar to traditional PCs. There's a slider to make two of the apps semi-transparent while working on the third. So if you're composing a text message to make plans and need to see whether you're free, you can launch a calendar through Qslide. Unfortunately, Qslide works with even fewer apps than Dual Window.

For the most part, you're limited to messaging, Web browsing and tools such as the calendar and calculator. Dual Window has a few extras, including Maps, YouTube and the photo gallery. Neither offers weather or streaming video services.

— HUAWEI'S ASCEND MATE2:

The Mate2 has a feature called Window on Window, or WOW. When you turn it on, a small translucent circle hovers over the home screen or any app you're using. Clicking on it gives you quick access to some basic tools — a calculator, a calendar, a note pad and text messaging. You're limited to just those four.

The app hovers in a window over whatever you're doing. Hit the arrow to expand it to full screen, or hit the "x'' to make it go away.

The extent of apps available doesn't exactly wow me, but it's another effort toward making mobile devices as useful as laptops. There's more work to do, but I'm glad device makers are working on it.

  • Monday, Aug. 11, 2014
Chaos Group's V-Ray comes to The Foundry's MODO, NUKE, KATANA
Chaos Group logo

In response to artist and designer demands, Chaos Group and The Foundry have announced that V-Ray will be available for a number of The Foundry’s creative software solutions.

The two companies have been working together on the development of V-Ray for three of The Foundry’s products: V-Ray for MODO, V-Ray for NUKE and V-Ray for KATANA. The MODO and NUKE versions are being unveiled for the first time at SIGGRAPH.

Starting today, V-Ray for MODO and NUKE are available as a public beta, and the commercial version of V-Ray for KATANA is available for purchase.

“Chaos Group and The Foundry’s products have been cornerstones of Atomic Fiction’s workflow since day one,” said Kevin Baillie, co-founder and VFX supervisor at Atomic Fiction. “Both companies share an obvious passion for making amazing, production-focused tools, and are constantly looking towards the future. We’re excited that their futures are converging to unify our rendering pipeline across our favorite applications!”

Built on V-Ray’s latest 3.0 core rendering technology, the integration with The Foundry’s tools streamlines the workflow for studios with pipelines built around V-Ray, NUKE, MODO, and KATANA.

The products:

V-Ray | MODO – Provides flexibility and production-proven rendering capabilities for MODO artists creating 3D content. Now in public beta.

V-Ray | NUKE – Unifies the pipeline between NUKE artists and 3D artists for unprecedented workflow improvements at all stages of production, while providing access to V-Ray’s advanced ray tracing capabilities. Now in public beta.

V-Ray | KATANA – First used on “Captain America: The Winter Soldier” by Industrial Light & Magic, the industry-leading lighting and look development package is now coupled with industry-standard rendering technology. Available upon request.
