Time is of the essence in Quantum Break, which mixes a video game with a live-action TV show featuring a parallel story in which hero Jack has to survive ‘Stutters’ – time breaking down and making different game-world objects jump backwards and forwards. He can also make everything slow down around him and use his powers to stop enemies and certain objects in their tracks.
All rich pickings for sound design, which played a key narrative role. Remedy’s audio lead Richard Lapington says he wanted players to recognise instantly what ‘state’ the game-world is in – even with their eyes shut.
“We needed the ‘Stutter’ to contrast aurally with the normal game-world, so we decided to run two sets of audio side-by-side – a ‘normal’ set and a ‘stutter’ set – and switch between them,” he explains. “Aesthetically, this was quite tough – trying to concoct sounds that resembled what you see yet were ‘fractured in time’.
“Music also plays a huge part in communicating time breaking – music and rhythm are based on repeated patterns, and by breaking those patterns in sync with the gameplay we can emphasise time breaking and create unease. Music stretches and filters, as do dialogue and most other sounds, when time shifts.”
Lapington went through many variations and concepts along the way – the audio team would make assets and discuss which ones worked and which didn’t. Over time, they created an audio reference guide of descriptors for Stutters: ‘violent’, ‘unpredictable’ and so on. Even more important were words that described what Stutters were not, with the team avoiding ‘sci-fi’ or ‘digital’. This was vital given that several of the audio team were working in different locations, as far-flung as Microsoft’s Redmond HQ and later Soundcuts in London.
Lapington’s creative direction cross-pollinated to the live action show’s audio team too, helping ensure consistency in environment and time-related audio treatments.
Delivering his audio ambitions in-game entailed some custom augmentations to Wwise to assist with the time manipulation aspects and audio/visual sync.
“We created Q-grain, a real-time granular synth plug-in,” says Lapington. “We wanted to closely link animation timing with sound. Right from the outset we knew we had this crazy challenge of matching sound to objects moving backwards and forwards in all sorts of different timescales, all at once. Granular synthesis was the obvious choice for a plug-in, our criteria being that it should sound ‘natural’ and that we wanted to control every parameter in real-time using RTPCs.
“We also needed to run multiple versions simultaneously in any scene, using compressed sound files not PCM to conserve memory. Designing sounds for use in the synth to get the outcome you wanted proved something of an art form – especially considering you’re matching some mad animations, such as a ship crashing into a bridge. Certain ‘shapes’ of sound just don’t work, and you have to be pretty clever with frequency movement-blending to make it sound convincing.”
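Remedy’s actual Q-grain plug-in is proprietary C++ running inside Wwise, but the core technique Lapington describes – granular synthesis with a real-time-controllable playhead – can be sketched. The snippet below is a minimal, illustrative NumPy version (all names hypothetical, not Remedy’s API): a sequence of playhead positions stands in for the RTPC-driven control, so an ascending sequence plays the source forwards and a descending one plays it backwards, at any speed.

```python
import numpy as np

def granular_resynthesize(source, positions, grain_len=2048, hop=512):
    """Resynthesize `source` by overlap-adding short windowed grains read
    from arbitrary playhead positions. Because each grain's read position
    is independent, playback can run forwards, backwards, or scrub at any
    speed -- the property needed to track objects moving back and forth
    in time.

    source    : 1-D float array of samples
    positions : playhead position (in samples) for each successive grain;
                in-engine this would be driven per-frame by an RTPC
    """
    out = np.zeros(hop * len(positions) + grain_len)
    window = np.hanning(grain_len)  # smooth grain edges to avoid clicks
    for i, pos in enumerate(positions):
        start = int(np.clip(pos, 0, len(source) - grain_len))
        grain = source[start:start + grain_len] * window
        out[i * hop:i * hop + grain_len] += grain  # overlap-add
    # approximate gain compensation for the overlap factor
    return out / (grain_len / hop / 2)
```

Feeding `positions[::-1]` instead of `positions` reverses playback without any pitch artefacts beyond the granular texture itself – which is why, as Lapington notes, the character of the source material matters so much: sounds with strong transients or rapid spectral movement can smear unconvincingly when granulated.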
The team also created a second plug-in called the Q-analyzer. This analyses audio in real-time and passes a resulting signal out of the audio system for use in the game.
“Visual manipulation of an audio signal is something our art director Janne Pulkkinen has been working with for a while,” Lapington explains. “From an audio perspective it’s really simple – we add a plug-in to a sound in Wwise, and with a line of script in our game engine we can drive visual effects, such as an animation timeline from a sound’s RMS value.
“Driving visual effects from audio forms the cornerstone of the game’s Stutters. All the wavy visual distortions you see in the environment are driven by sound effects – and audio and visuals are always in sync, which makes the game feel really holistic and connected.”
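Q-analyzer itself isn’t public, but the value Lapington says it exports – a sound’s RMS level – is simple to illustrate. The hypothetical function below computes per-block RMS the way a real-time analyser would per audio frame; in-engine, each value would then drive an animation timeline or distortion amount.

```python
import numpy as np

def rms_per_block(signal, block_size=1024):
    """Compute the RMS (root-mean-square) level of each block of samples --
    the kind of per-frame envelope an analyser plug-in can pass out of the
    audio system so the engine can drive visuals from a sound's amplitude."""
    n_blocks = len(signal) // block_size
    blocks = signal[:n_blocks * block_size].reshape(n_blocks, block_size)
    return np.sqrt(np.mean(blocks ** 2, axis=1))

# Example: normalise the envelope into a 0..1 control signal, as you might
# when scrubbing an animation timeline from a sound's loudness.
def to_control_signal(levels):
    peak = np.max(levels)
    return levels / peak if peak > 0 else levels
```

Because the control value is derived from the very sound being played, audio and visuals cannot drift out of sync – the “holistic and connected” quality Lapington describes falls out of the architecture for free.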
Article originally published in Develop: March 2016 issue.