Tetsuya Mizuguchi has a unique perspective on the role of music in video games. Many of his titles, like Lumines and Child of Eden, have a strong emphasis on interactive soundscapes, with players affecting the music through their actions.
With his games, Mizuguchi strives to create a deeper level of synchronicity between the player and the soundtrack: experiences that are a step removed from the more traditional rhythm-based titles you find in the music genre.
“The majority of music games fall into the category of ‘rhythm based’ or ‘time based’,” Mizuguchi says. “That’s not where my interest is. Rather, it’s more about how the layering of music with visuals changes the world that you’re in. And that becomes an experience in and of itself.”
The industry has come a long way in terms of games' ability to deliver music to the player. It's not rare for triple-A experiences to have a full orchestral soundtrack or to include contemporary popular music, mere decades after sound design in games was considerably more lo-fi.
“In the early days we were talking about beeps and bleeps,” Mizuguchi says. “That was the extent of what we could do with sound and music in games. But as the years went by, we were gifted new ways to advance that from a technological standpoint. The expression that music brought to the game experience was heightened in a way that you were able to produce sound and music that were more organic and really heightened the emotional element.”
It’s this emotional connection that Mizuguchi is chasing when he makes his games. “Music has this power,” he says. “We talk about the power of music and how it affects us emotionally and how it moves us and influences us. So for me, it’s about using games to really maximise the power that music holds. That’s the role that music plays in my creations.”
For me, it’s about using games to really maximise the power that music holds
The interactive nature of games gives them a unique ability to create that connection with the player, especially when they have a hand in creating the audio they are hearing. This differs greatly from other media, such as film.
“You’re either experiencing music in a third-person or first-person perspective,” Mizuguchi says. “What I mean by that is that in non-game experiences you’re basically being fed the music that is coming at you, so you’re receiving it in a third-person manner. But when you’re playing a game, the interaction with music is that you’re part of it. Not only are you a part of it, but you may be affecting or creating part of the musicality. And that’s a huge difference, between hearing music versus being a part of it. So for me, I’m constantly thinking ‘what is this bridge that could make it possible to go back and forth between those two experiences?’
“Maybe there’s a new type of experience that can be designed where we blend the two. Maybe there’s a chemical reaction where something magical happens and you’re able to blend those two perspectives.”
This experience of playing with music is not something that necessarily suits a realistic game scenario, however. The effect is more akin to a trippy dream. “Music helps the player experience something that feels like your imagination is being played out,” Mizuguchi says. “It’s more of a dreamy world scenario that is happening as you play the game. So for game ideas that I have, the role of music is to be used in that manner and to not be just played as your typical rhythm-based game.
“It’s something that is a newer level of fantastical experience. I feel like the industry that we’re in, the interactive entertainment industry, brings new possibilities for the role of music. When I think about that, it excites me to think ‘what is that scenario’ or ‘how do I illustrate that possibility’ or ‘what is it that’s going to really help me get to that point of creating or designing a new experience?’.”
For Mizuguchi’s latest game, Rez Infinite, his team created a new level for Rez while modernising the entire experience. This new level, Area X, is the closest he feels he has come to achieving that goal of an emotional, synaesthetic fantasy.
“Area X is a brand new level which was made from the ground up for Rez Infinite. As you work your way towards the ending of that level, especially when you play it in VR and you’re in 3D, that whole experience is a display of how powerful music is being expressed in a way that really brings out the emotional side of it. As you play, the sound effects that you are creating through your actions turn into music and it’s almost opera like.
“That’s the most succinct example of what I’ve always wanted to do. When people play it, I think they ‘get it’. When I compare the reactions of people playing Lumines or Child of Eden for the first time with those of people playing Area X for the first time, there is a very obvious and clear difference. The emotional element, people being moved by playing and clearing that level, is very obvious to me.
“We’ve actually seen people being so moved by it that, to me, that is a demonstration of what I have been wanting to do with whatever new creation I’ve worked on in my entire career. So it’s a great display of what we are able to do and Area X has really helped me see that through by seeing the reactions of people who play it.”
This emotional connection is helped by Rez Infinite’s use of VR technology, which gives developers a new playground in which to experiment, beyond the techniques that have already been perfected in traditional games.
“I feel like in terms of ‘producing music and integrating it into the game’, it feels like we may have reached its peak because we’re so there in terms of the quality,” Mizuguchi says. “It’s so high. I don’t know how much more there is to push. But moving into the future, if you bring in the idea of 3D, whether it’s in VR, AR or MR, that audio and visual experience is not just your simple calculation of ‘how much more can we add’, but the true power then becomes more of a multiplier.
“There’s so much more to come. It’s only the beginning for us and we still have more to go in terms of how we express through the power of music.
“Up until now, so let’s say pre-VR, but also just in a very general sense, there were compartments. There was a visual component (that goes here), an audio component (that goes there), and maybe a dialogue component (that goes somewhere over there). It was basically about trying to integrate all of these elements and communicate that to the end user, but you had to do it within a given frame. The delivery method was that 2D frame experience. So whether you’re talking to film directors, artists, sound designers or cinematic directors, any expression they came up with had to fit in that frame.
I feel like VR is going to actually go back to a more pure version of what our imagination is
“But when you think about your imagination, a human’s imagination, it isn’t divided into those buckets. It’s very pure and multi-modal. It’s a mixture of all of the senses that you have as an idea in your head, but in order to try and express that you have to break it down into those categories. But now, I feel like VR is going to actually go back to a more pure version of what our imagination is.
“With Rez and a lot of our creations, we use the words ‘synaesthetic experiences’, where it’s the sensory experience of multiple senses triggering each other. Where we say ‘you can almost see the music’ or ‘hear what’s in front of you’. In essence we are saying that the audio and visual components are now one and they’re not separated at all. That integration and that blended experience is something that is going to be even more possible to execute thanks to the technology that has brought us VR today. I am continuing to think about and have ideas about how to really bring those experiences alive in VR.”