Motion capture can be found in the final builds of almost every triple-A video game; it has become the standard expected by today’s consumers.
But given how intensive a process motion capture can be, requiring a great deal of time and resources from multiple teams, few studios use it for anything other than final animations.
Ubisoft Montreal, however, did.
The studio utilised OpenStage, a markerless mo-cap system created by specialists Organic Motion, as part of the previs process on its latest title: the best-selling FPS Far Cry 4. Organic Motion’s senior director of sales and marketing Jonathan Beaton says the developer’s previous motion capture setup necessitated a fresh approach.
“Ubisoft had a big marker-based system at their Montreal studio and they do a ton of motion capture,” he says. “They always have quite a few game productions going on, so that studio is always fairly bottlenecked in terms of getting time in it.”
At the Montreal International Gaming Summit 2014, Ubisoft Montreal’s technical director Marc Beaudoin went into more detail: “[Before we had OpenStage], if we wanted to have a better quality than blocking, we needed to go to our main motion capture studio, which required a lot more resources, a lot more time, and gave us less time to do iteration. If we need to reshoot once, that’s fine. But we couldn’t reshoot a lot of times because [of] time constraints.
“We needed to create a new motion capture studio that was based on two principles. It has to be easy to use. We wanted the animator to use it. We didn’t want a technical operator on site that we had to call every time we wanted to use the system. We wanted to make it easy for our animators to use, and we wanted something that was fast. We didn’t want to spend a lot of time setting it up, putting markers on, calibrating a system, etc.
“If you’re spending too much time doing previs, then you’re actually missing the whole point of doing it.”
The studio tried out multiple mo-cap systems, extensively testing each one, before deciding that OpenStage was “the right choice” for Far Cry 4.
A 20 square foot system was installed, giving Ubi Montreal plenty of room to work with. OpenStage was also integrated into the studio’s existing animation pipeline; thanks to its ability to produce FBX data, captured sessions could be sent straight to MotionBuilder or into Montreal’s own tech.
“We have our own technology from our technical group that allows us to stream from MotionBuilder to our engine,” Beaudoin explained. “It’s called GameX LiveLink and it allows the actors or animators to actually see themselves in the game while we’re doing the capture.”
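The specifics of GameX LiveLink are Ubisoft’s own, but the general idea — streaming captured joint data frame by frame from a mo-cap tool into a running game engine — can be sketched in a few lines. Below is a minimal, purely illustrative Python example (not Ubisoft’s actual tooling): each frame is length-prefixed JSON sent over a socket, so the receiving end can split the stream back into frames.

```python
import json
import socket
import struct

def encode_frame(frame_id, joints):
    """Pack one capture frame (joint name -> [x, y, z] position) as a
    length-prefixed JSON message, so the receiver can split the stream."""
    payload = json.dumps({"frame": frame_id, "joints": joints}).encode("utf-8")
    return struct.pack("!I", len(payload)) + payload

def decode_frame(sock):
    """Read exactly one length-prefixed frame back off the socket."""
    header = sock.recv(4, socket.MSG_WAITALL)
    (length,) = struct.unpack("!I", header)
    payload = sock.recv(length, socket.MSG_WAITALL)
    return json.loads(payload)

# Demo: a socketpair stands in for the capture-PC-to-engine link.
capture_side, engine_side = socket.socketpair()
frame = {"hips": [0.0, 0.9, 0.0], "head": [0.0, 1.7, 0.05]}
capture_side.sendall(encode_frame(1, frame))
received = decode_frame(engine_side)
print(received["frame"], received["joints"]["head"])  # 1 [0.0, 1.7, 0.05]
```

A real pipeline would stream skeleton transforms at capture rate and apply them to an in-engine character each tick — which is what lets actors “see themselves in the game” during the shoot.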
Beaton says Ubisoft Montreal is the first triple-A studio to acquire a system solely for the purpose of doing previs, but adds that OpenStage is ideal for such a process.
“If you use our system for previs, it’s much quicker than blocking or doing hand animation,” he says. “You can literally jump in and within seconds be streaming motion capture data. While Ubisoft filmed and went over previs, they were actually exporting it in real time to a different building where the animators are, who were grabbing that and blocking it out in the scenes.”
One of the most important uses of OpenStage in the previs for Far Cry 4 was planning the E3 2014 demo, the world’s first glimpse at the critically acclaimed shooter.
The footage that was eventually shown on stage in Los Angeles featured antagonist Pagan Min intercepting a bus heading to Kyrat and interacting with the terrified passengers. However, the original vision was very different.
“For E3, previs was very, very useful because it was quite a complex scene to capture,” Beaudoin said. “One of the things that previs told us, doing the capture with OpenStage, was the script was actually too long. We were above 10 minutes, and that didn’t work out for an E3 trailer. Even though the script seems short, when we were acting it out and we captured it and put all the motions together, [we realised] it was way above what we were looking for.
“Also, Pagan Min had to do a lot of things inside the bus, like he would stab one of the soldiers and stuff like that. With previs we were able to find out that all the action he had to do couldn’t be done in the bus. So we transposed part of the action outside the bus.”
In fact, the previs for this crucial scene was actually captured by a single person: Beaudoin himself.
“It was the first capture we’d done [with OpenStage],” he said. “What I did was I captured myself playing the main bad guy walking in the bus. That was me acting out. I then played it back in MotionBuilder with the record option on. I could actually see this character while I was recording the second character. Then I recorded a third character, and then a fourth character. One by one, I populated the whole scene. I did it in only a few hours.
“Before lunch time I had all this motion captured. Then in the afternoon I would assemble it in story, like I showed you, and then I pushed it in the engine. By the end of the day we had a full scene, fully animated, with eight characters in the engine. There’s no other way to do it as fast as this, I’m pretty sure. That was pretty good. That was one single guy in one day that was able to do all of this work. It’s, again, way better than blocking as well.”
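Purely as an illustration of the layering idea Beaudoin describes — record one performance, play it back, record the next against it — here is a short Python sketch (hypothetical, not Ubisoft’s actual tooling) of how a single performer’s takes compose into a multi-character scene:

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    """A previs scene built up one performance at a time, mimicking
    the record-while-playing-back workflow described above."""
    takes: dict = field(default_factory=dict)  # character name -> list of poses

    def record(self, character, poses):
        # In MotionBuilder the new take is performed against playback of the
        # earlier ones; here we simply store it alongside them.
        self.takes[character] = poses

    def frame(self, t):
        # Composite view: every recorded character's pose at time t
        # (characters hold their last pose once their take ends).
        return {c: p[min(t, len(p) - 1)] for c, p in self.takes.items()}

scene = Scene()
scene.record("pagan_min", ["enter_bus", "walk_aisle", "threaten"])
scene.record("soldier_1", ["sit", "sit", "fall"])
scene.record("passenger_1", ["sit", "cower", "cower"])
print(scene.frame(2))  # {'pagan_min': 'threaten', 'soldier_1': 'fall', 'passenger_1': 'cower'}
```

Repeating the `record` step eight times is, in miniature, how one person populated the full bus scene in a single morning.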
Beaton says the OpenStage system Ubisoft Montreal purchased paid for itself after just one project. Of even more value is the partnership between the studio and the motion capture firm, with the former’s feedback assisting with development of the latter’s products.
“They have an open line of communication with our engineering team,” Beaton explains, “so we’re constantly taking their feedback and pushing it into the product, as well as helping them with their pipeline. It’s not unusual for them to give us feedback, and for us to then add those adjustments and additions in our next release.”
Ubisoft Montreal, meanwhile, is already planning how it can use OpenStage to assist with previs for multiple future projects.
“We’re the first one to use this system at Ubisoft, and other main triple-A production teams are looking at what we’re doing,” said Beaudoin. “It was kind of proof of concept, and it worked pretty well. We have now other triple-As that want to try out the system and integrate that into their pipeline.
“We also want to use it for quick-time events and in-game animation, which we unfortunately didn’t do on Far Cry 4. We ran out of time.
“Just to give you a few numbers,” he concluded, “we did more than two hours with the system for cinematics, so we had more than two hours of cinematics in our game. That was spread out over more than 100 shots. We did all that in less than 20 days, which is awesome.
“We couldn’t have done it with any other system than OpenStage. It was shot very rapidly and it was very efficient. We didn’t have a big team behind it, and we were able to produce a lot of stuff with it.”