OptiTrack’s marketing director Seth Steiling discusses how mo-cap tech is being applied beyond video games

Taking motion capture outside the studio

Thanks to film and game projects like Dawn of the Planet of the Apes, Avatar, Beyond: Two Souls, and Call of Duty – as well as consumer technologies like Kinect – motion capture has captured the popular imagination as a widely used tool for animation and virtual production.

What many mo-cap enthusiasts don’t know is how extensively the technology has been adopted outside the traditional animation space. Biomechanists, virtual reality visionaries and roboticists alike are deploying motion capture to track the world around us in ways that would have seemed unthinkable only a few years ago.

Here are some recent, ingenious uses for mo-cap that have left us shaking our heads in disbelief.

PUSHING THE LIMITS

At this year’s Indianapolis 500, we saw Arrow Electronics’ SAM project utilise motion capture in a real-time, outdoor application to enable quadriplegic Sam Schmidt to drive an Indy 500 pace car at 172 km/h – using only his head. Similar to a PC gamer using TrackIR in iRacing or rFactor, Schmidt controlled both acceleration and steering with subtle head movements, which were tracked with sub-mm precision using a marker-based optical system mounted on the car’s dashboard.
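The control mapping itself is conceptually simple, even if the engineering around it isn’t. As a rough, hypothetical sketch – the SAM project’s actual software and tuning aren’t public – tracked head yaw and pitch can be turned into normalised steering and throttle commands with a dead zone and clamping:

```python
# Illustrative sketch only: maps tracked head orientation to drive commands.
# Dead-zone sizes, angle limits and the yaw/pitch convention are assumptions,
# not the actual SAM project parameters.

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def head_pose_to_commands(yaw_deg, pitch_deg,
                          dead_zone_deg=2.0, max_angle_deg=15.0):
    """Convert head yaw (left/right) and pitch (forward/back) into a
    steering command in [-1, 1] and a throttle command in [0, 1]."""
    def normalise(angle):
        if abs(angle) < dead_zone_deg:          # ignore tiny head motions
            return 0.0
        sign = 1.0 if angle > 0 else -1.0
        scaled = (abs(angle) - dead_zone_deg) / (max_angle_deg - dead_zone_deg)
        return sign * clamp(scaled, 0.0, 1.0)

    steering = normalise(yaw_deg)                       # turn head to steer
    throttle = clamp(normalise(-pitch_deg), 0.0, 1.0)   # tilt forward to accelerate
    return steering, throttle

if __name__ == "__main__":
    print(head_pose_to_commands(yaw_deg=6.0, pitch_deg=-10.0))
```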

The technical wizards at KMel Robotics are also pushing the limits of mo-cap outside the studio, with an artistic short film for Lexus that captured swarms of flying quadrotors roaming city streets while the city slept. To control the custom-engineered robots, KMel used motion capture to simulate GPS. The quads’ 6DoF data was tracked by dozens of cameras and then streamed to a control computer that ran KMel’s algorithms to direct their trajectories in unison. The result was a lifelike rendering of swarm behaviour that blurred the lines between technology and sentient life.
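KMel’s real flight code is proprietary, but the basic idea of using mo-cap as an indoor stand-in for GPS can be sketched with a toy proportional controller: a tracked position streams in, and a velocity command is computed toward the next waypoint. The names and gains below are illustrative assumptions, not KMel’s implementation:

```python
# Toy illustration of mo-cap as an indoor "GPS": a rigid-body position from
# the tracking stream feeds a simple proportional controller that nudges a
# quadrotor toward a waypoint. Gains and limits are assumptions.

import numpy as np

def velocity_command(position, waypoint, k_p=1.2, v_max=2.0):
    """Return a velocity setpoint (m/s) pointing from the tracked position
    toward the waypoint, proportional to the remaining error."""
    error = np.asarray(waypoint, dtype=float) - np.asarray(position, dtype=float)
    cmd = k_p * error
    speed = np.linalg.norm(cmd)
    if speed > v_max:                     # cap the commanded speed
        cmd *= v_max / speed
    return cmd

# Example: a pose (x, y, z) as it might arrive from the mo-cap stream.
tracked_position = [0.4, -0.1, 1.0]
waypoint = [2.0, 1.0, 1.5]
print(velocity_command(tracked_position, waypoint))
```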

Miley Cyrus’ “Bangerz” tour has incorporated one of the more ambitious recent applications of motion capture, combining real-time performer and prop tracking with projection mapping to visually spectacular effect.

Engineered by VYV’s team of scientists and technical artists, the solution projects kaleidoscopic video imagery onto a variety of on-stage surfaces, ranging from Cyrus’ attire to flat panels and giant inflatables. VYV’s system uses mo-cap data and 3D modelling of the entire set to seamlessly solve for projection surface position, orientation and deformation.
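At its core, that solve is a pose-and-projection problem. The sketch below is a heavily simplified, hypothetical version: given a surface’s tracked rotation and translation and an assumed projector intrinsic matrix, it computes where a point on the surface model lands in projector pixels. VYV’s production system – particularly its handling of deforming surfaces – is far more sophisticated:

```python
# Minimal sketch of the core projection-mapping math: transform a point on a
# tracked surface into projector pixel coordinates. Assumes the projector sits
# at the tracking origin looking down +z; all matrices below are made up.

import numpy as np

def project_point(point_local, R, t, K):
    """Map a point from the surface's local frame into projector pixels."""
    p_world = R @ np.asarray(point_local, dtype=float) + t   # apply mo-cap pose
    p_image = K @ p_world                                     # pinhole projection
    return p_image[:2] / p_image[2]                           # perspective divide

R = np.eye(3)                             # tracked surface orientation
t = np.array([0.2, 0.0, 3.0])             # tracked surface position (metres)
K = np.array([[1400.0, 0.0, 960.0],       # assumed projector intrinsics
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])

print(project_point([0.1, 0.05, 0.0], R, t, K))
```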

STUNT SHOW

Boutique mo-cap studio Animatrik Film Design is known for pulling off immensely challenging on-set shoots that most studios wouldn’t even attempt. Recently, they moved their mo-cap stage up into the mountains outside Vancouver to capture renowned mountain biker Brandon Semenuk performing flips and double tail whips more than 40 feet off the ground.

These challenging shots required cameras to be aimed straight into the sun, clouds and open sky – conditions that would traditionally cripple an optical mo-cap system.

Animatrik was put to the test in other ways too, as a cantankerous mother bear and her cubs paid a visit to the set mid-shoot. Fortunately, the team came away with limbs intact and impressive mo-cap data to show for it.

Motion capture has also enabled technologists to improve the accuracy of the latest immersive gaming and virtual reality control systems. Machine learning researchers at Microsoft utilised an optical mo-cap system to record thousands of movements and gestures in order to teach Kinect v2 to track body movements more accurately than its predecessor.
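The workflow is essentially mo-cap as ground truth: sensor data goes in, mo-cap-recorded poses serve as the labels, and a model learns the mapping. The toy example below uses entirely synthetic data and a generic random forest regressor purely to illustrate the shape of that pipeline; it is not Microsoft’s actual method or dataset:

```python
# Toy "mo-cap as ground truth" pipeline: synthetic sensor features are paired
# with (pretend) mo-cap-recorded joint positions, and a regressor learns the
# mapping. Everything here is synthetic and for illustration only.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_frames, n_features = 5000, 20

# Fake depth-derived features and a fake relationship to a 3D joint position,
# plus noise, standing in for real capture sessions.
features = rng.normal(size=(n_frames, n_features))
true_map = rng.normal(size=(n_features, 3))
joint_positions = features @ true_map + 0.05 * rng.normal(size=(n_frames, 3))

X_train, X_test, y_train, y_test = train_test_split(
    features, joint_positions, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X_train, y_train)
print("mean absolute error:",
      np.abs(model.predict(X_test) - y_test).mean())
```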

The bottom line: these creative uses of mo-cap demonstrate that we’ve only begun to scratch the surface of what the technology can accomplish outside of its traditional application as an animation tool.
