Epic Games, VFX and post-production outfit The Mill and car manufacturer Chevrolet have come together to revolutionise digital filmmaking. This has started with "The Human Race", a short film and AR presentation shown as part of Tim Sweeney’s opening presentation, which merges real-time visual effects and live-action storytelling using Unreal Engine and The Mill’s virtual production toolkit, Mill Cyclops.
The film features the 2017 Chevrolet Camaro ZL1 in a race with Chevrolet’s FNR autonomous concept car. However, only one vehicle was actually filmed: The Mill’s Blackbird, a fully adjustable car rig. Previously, CG cars were added by visual effects artists in post-production, a time-consuming process. On this shoot, live video feeds and positional data from the Blackbird’s array tracking system were fed directly into Unreal Engine, and the Camaro was rendered and composited near-instantly using real-time augmented reality, allowing the directors to see the final look and composition of each shot immediately.
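The per-frame idea described above can be sketched in miniature: take the live plate, use the tracked pose to render the CG car, then layer it over the plate with the standard "over" operator. This is a minimal illustrative sketch only; all function names are hypothetical, and the real Mill Cyclops / Unreal Engine pipeline is proprietary and GPU-based.

```python
# Hypothetical sketch of one frame of real-time AR compositing.
# Names (render_cg_car, composite_over) are illustrative, not a real API.

def render_cg_car(pose, width, height):
    """Stand-in 'renderer': returns an RGBA image (nested lists) with the
    CG car drawn as an opaque block centred on the tracked 2D position."""
    x, y = pose
    image = [[(0, 0, 0, 0.0) for _ in range(width)] for _ in range(height)]
    for row in range(max(0, y - 1), min(height, y + 2)):
        for col in range(max(0, x - 1), min(width, x + 2)):
            image[row][col] = (200, 30, 30, 1.0)  # opaque red "car" pixels
    return image

def composite_over(cg, plate):
    """Standard 'over' operator: RGBA CG layer composited onto the RGB plate."""
    out = []
    for cg_row, plate_row in zip(cg, plate):
        out_row = []
        for (r, g, b, a), (pr, pg, pb) in zip(cg_row, plate_row):
            out_row.append((
                round(r * a + pr * (1 - a)),
                round(g * a + pg * (1 - a)),
                round(b * a + pb * (1 - a)),
            ))
        out.append(out_row)
    return out

# One "frame" of the loop: a 5x5 grey live plate, tracked pose at (2, 2).
plate = [[(128, 128, 128) for _ in range(5)] for _ in range(5)]
frame = composite_over(render_cg_car((2, 2), 5, 5), plate)
print(frame[2][2])  # a CG car pixel replaces the plate here
print(frame[0][4])  # an untouched live-plate pixel
```

In the real production this happens per video frame at interactive rates, with the pose coming from the Blackbird’s tracking array rather than a fixed tuple.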
“‘The Human Race’ blends cinematic storytelling and real-time visual effects to define a new era of narrative possibilities,” said Angus Kneale, Chief Creative Officer at The Mill in New York, via a press release. “This is a pivotal moment for film VFX and the coming era of augmented reality production. Using Unreal’s cutting-edge game engine technology, filmmakers are able to see their photoreal digital assets on location in real time. It also means the audience can effect change in films in ways previously unimagined, giving interactive control over vehicles, characters and environments within a live action cinema experience. With Mill Cyclops, The Mill’s proprietary virtual production toolkit, we are able to render and integrate digital assets into the real world to a level never seen before.”
We asked Kim Libreri, Epic’s CTO, how this will help game developers who are also using the engine: "The engine is going to be better than ever for rendering. These features don’t only work on a high-end NVIDIA graphics card, they’ll work on any graphics card to a certain level. The lighting algorithms are much more tuned nowadays.
"AR is coming. We don’t want to present a glass ceiling to any of our developers making games, so we feel that the work that we’re doing in terms of this very, very high end professional-level AR will prepare the engine ready for whatever game developers and interactive content makers that are going to use the engine for AR.
"Even sequencer. Sequencer got better and better throughout the project, so any cutscene cinematics you’re doing, or sequenced cinematic events in a game… If you’re making an Uncharted style game in Unreal Engine, sequencer would be the core of that and sequencer is much much more capable than it was even a year ago. If I do a close up on your face, it looks just like real camera so this benefits everybody who’s using the engine.
"The stuff that we’ve done to enhance the rendering, that will be available [to developers] pretty shortly, and a lot of the stuff is already in the engine as it ships today. The more exotic compositing stuff, we need to refactor it a little bit until it’s ready for everybody, but by GDC next year almost all of that stuff will be available to everybody. And right now, if you’re a classic game developer you can implement the same stuff yourselves. We ship the source code! It’s easy to expand. It’s what we do, we take the source code and enhance and modify it. We have so many customers now that are doing virtual production, that are basically doing live previews of computer graphics on live action stages. We have to get that stuff in in a general purpose way."