Before the COVID-19 crisis, the markerless motion capture market continued to evolve beyond its core entertainment/videogame sectors, with biomedical/scientific and architecture professionals experimenting with the technology. With the pandemic in full swing, software that can record motion capture from the safety of one’s home seems tailor-made for this particular moment in history, when the content production/post industry is struggling to adapt to remote productions.

Michael Nikonov, Founder & Chief Technology Officer of Moscow-based iPi Soft, developer of the leading markerless motion capture solution iPi Motion Capture, continues to stay true to the company’s motto, “Motion Capture for the Masses.” Most recently, the company added real-time tracking and live streaming capabilities for popular game engines such as Unity and Unreal.

We spoke to Nikonov about remote production, mocap in the classroom, the trends he sees in markerless motion capture overall, and what enhancements iPi Soft users can look forward to seeing in the near future. 

Due to the coronavirus pandemic, Hollywood live-action productions have been largely suspended. Is this moment perhaps an opportunity for greater adoption of iPi Mocap among filmmakers/game developers?

Michael Nikonov: From the very beginning of the worldwide pandemic lockdown, we've seen increased interest in the software from hobbyists, indie filmmakers and game developers, as well as small studios. I think that's because the motion capture can be done at home or in a small office space, which makes the system a good fit for remote work. Our software can also find its place in the complex production pipelines of big studios, where it is particularly effective for concept testing, previz and background animation of large crowds or secondary characters that don’t require a high level of detail.

Animation students in the US are now heading back to college, and for many, that will be a virtual classroom environment. What are the benefits that iPi Mocap offers to help students learn your software from home?

MN: Because iPi Mocap is a portable solution, it can be used at home. Kinect sensors allow for capture in a small space of less than 5 by 4 feet, and the Azure Kinect sensor released last year has a wide-view mode that allows users to stand even closer than 3 feet to the sensor. The system can be operated by a single person, which makes it convenient for online learning. Also, our user license can be transferred between computers without limitation, so a license owned by a college or university can be used by multiple students from their homes.

Last year iPi Motion Capture added real-time tracking. Was it the game-changer you thought it would be?

MN: Adding real-time tracking was a huge development milestone for the company. It was a significant technical achievement for us just to make it happen, and it’s something we’re tremendously proud of. That said, it was one step for us; we are now focused on other improvements, particularly the software’s overall motion-tracking user experience.

What specific workflow benefits does iPi Soft’s integration into the Unity and Unreal game engines provide iPi Mocap customers?

MN: One of the most important improvements we made for the Unity game engine was enabling live streaming from our software into Unity. We’re currently working on delivering the same capability for the Unreal game engine, which should be available by 2021. Live streaming integration with game engines is essential for animators because they need to see their character models in the game environment as quickly as possible to decide whether a scene works or needs to be redone or edited in some way. This eliminates the need for artists to constantly wait on their scenes to render and speeds up the creative workflow.

We also added an Unreal preset in iPi Mocap for working with its standard bipedal characters. In recent versions of Unreal Engine the standard skeleton is stable, so now it is as easy as selecting ‘Unreal’ from the menu of available characters and rigs.

The ‘user experience’ is the buzzword around software and apps today. How about the iPi Motion Capture user experience? Have you made improvements there as well?

MN: Yes, we recognize that for some users working with more than one camera, configuring and calibrating the system can be a challenge. Our development team is working to simplify this for a more user-friendly motion tracking experience. Examples include the ability to specify the distance between any pair of cameras to set the scene scale during multi-camera calibration, the ability to control real-time tracking and live streaming to Unity from third-party software using our Automation Add-on, and other improvements.

When you look at the motion capture industry what are the big picture trends you’re seeing?

MN: The opportunities as we see them for markerless motion capture remain primarily in the entertainment/gaming sectors, and increasingly in the biomedical/scientific research world. Architectural design firms are also using motion capture, but the majority of our users are professional and semi-professional animators and digital artists. 

I know you’re a huge fan of videogames, anything you’ve seen recently that is particularly impressive?

MN: I am amazed at how indie game developers with such small teams are able to create complex and beautiful games with great-looking animation. “The End of the Sun” is one recent example. In their Kickstarter video, the developers explain how they brought animation from the iPi Motion Capture software into the Unity game engine, and describe their first experience with motion capture.