From its debut, Faceware Live has presented developers with a tempting option for their markerless facial animation efforts.
The solution lets games makers produce facial animation in realtime, streaming capture data into middleware or an engine, where it can be applied to a character immediately. Furthermore, the data can originate from any video source, whether it's Faceware's ambitious Pro HD Headcam System hardware or an ordinary webcam.
As of this month, the technology is available to UE4’s legion of users.
“The Unreal Engine has long had a corner on the high end of realtime rendering,” says Peter Busch, vice president of business development. “This high-quality capability will allow our customers to do even more ambitious projects. The support we have received from the entire Epic team has been absolutely fantastic. Our teams share a common understanding of the power of a realtime production tool.”
As a result, Unreal users can now take full advantage of Faceware Live, and it's something Busch and his team expect to have an impact well beyond the realm of games.
“In the short term, we see widespread usage in previz and virtual production,” he predicts. “With the wave of upcoming rapid production needs beyond gaming, we see uses for episodic and online content. The most exciting market is broadcast television, for which we are already underway on our first live action and realtime character television series here in the States.”
When it came to building the plug-in, Faceware took the step of turning to an external company for assistance: the team behind the Kinect 4 Unreal plug-in.
“This plug-in was co-produced by Opaque Multimedia out of Melbourne, Australia,” Busch explains. “Opaque’s strength is the ability to create a plug-in that is native to Unreal’s Blueprint system and straightforward for UE4’s user base. Having Opaque work on the integration also ensures a more rapid roadmap, since their business is designing outstanding plug-ins and keeping them up to date.”
According to Opaque, one of the plug-in's most significant strengths is the ease with which developers can access Faceware's data through the native Blueprint interface; those already familiar with Unreal should need minimal time to adapt.
And for Busch, there’s another reason working with Opaque should enable Faceware users to do more with their titles.
“Their connection in the VR and AR space is particularly interesting for some of the more ambitious projects we want to do,” Busch later adds, though he’s remaining tight-lipped on more details.
Busch and his team are confident the plug-in will serve teams well, regardless of their size and scope.
“Individuals and studios are always looking for ways to cut costs and speed up production,” Busch offers.
“Users can expect that the power of Unreal and the cutting edge tech of Faceware will combine to create the facial content creation pipelines of the future.”
Faceware Live's arrival on UE4 should only contribute to the democratisation of mo-cap technology. For those who use Unreal, having a markerless, camera-agnostic realtime facial animation system at their disposal should allow them to embrace that very trend. And whatever else Faceware and Opaque have in store looks ripe with potential.