Develop discusses animation, LA Noire tech, Steam data and the power of Source

Valve on Art

This Q&A comes as part of Develop’s package of five interviews with Valve Software. An index of each interview can be found here.

Interview with
Randy Lundeen (left) – Environmental artist
Ted Backman (centre) – The Ted Backman
Jason Mitchell (right) – 3D graphics hardware

In terms of creating emotionally eloquent characters, do you feel Valve has made significant progress since the release of Half-Life 2 in 2004?

TB: Well, we got a lot of feedback from our customers about the ending of Half-Life 2: Episode Two, when [that thing happens]. People told us it was one of the most emotional moments they’ve had in our games so far.

Judging by that, I think in terms of stirring emotion, we have come a step forward since Half-Life 2.

Actually, I think we stepped forward with Portal as well. People really started to have emotional reactions to inanimate objects and disembodied voices, and I can’t really think of other games with so much emotional involvement in things you don’t see.

It’s all down to the writing. We had a few new people come onto the Portal project who really enhanced the writing that went into it.

JM: In terms of technologies, I think the eye-shading in particular is essential to our purpose. Algorithmically the process is very simple, but it’s something that most other studios haven’t paid enough attention to.

I still don’t think people realise how important shading a character’s eyes is. I mean, these are the windows to people’s souls.

I still don’t think I’ve seen eyes in a game as good as those in the Half-Life series. If you pay attention to that when building your character, it can have a big impact on how human they look.
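
To give a sense of what “algorithmically simple” eye shading can mean, here is a toy sketch of a diffuse iris term plus a tight Blinn-Phong corneal highlight. It is a generic illustration under assumed inputs, not Valve’s published Source eye-rendering technique, and all of the names in it are hypothetical.

```python
# Toy illustration of a simple eye-shading term: a matte iris plus a tight
# Blinn-Phong specular glint on the cornea. Generic sketch only; not Valve's
# actual Source shader. All names and parameters are hypothetical.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def shade_eye(normal, light_dir, view_dir, iris_albedo,
              light_color=np.array([1.0, 1.0, 1.0]), gloss=128.0):
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    h = normalize(l + v)                                   # Blinn-Phong half vector

    diffuse = max(np.dot(n, l), 0.0) * iris_albedo         # matte iris/sclera term
    specular = max(np.dot(n, h), 0.0) ** gloss             # tight corneal glint
    return diffuse * light_color + specular * light_color

# A light slightly above and in front of the eye produces the small bright
# highlight that makes eyes read as wet and alive.
color = shade_eye(normal=np.array([0.0, 0.0, 1.0]),
                  light_dir=np.array([0.3, 0.5, 1.0]),
                  view_dir=np.array([0.0, 0.0, 1.0]),
                  iris_albedo=np.array([0.25, 0.35, 0.45]))
print(color)
```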

It is very satisfying how far we have come, but we certainly have a direction for the next-gen things we’re doing as well.

What do you make of the mocap effects seen in LA Noire?

JM: It’s impressive technically, for sure. And I think it’s an important inflection point in the continuum of ever-increasing fidelity in game characters.

There are pros and cons to that LA Noire approach, and we’re keeping our eyes on that piece of tech, but it’s not clear how we would integrate it.

I’ve heard it’s brilliant of course, though very CPU-intensive.

JM: I guess I’m not really familiar with the performance characteristics of that technology on the playback side. One of my first impressions was the incredibly high fidelity of it.

But the system is based on a playback of a performance, which can go against how we like to think about characters interacting with our players.

We like our performances to be far more reactive to what the player does, and not to something pre-acted on a sound stage. It’s not completely obvious how this tech would integrate into our work.

But since I haven’t played that game, I’m not incredibly familiar with that technology. It’s super interesting, regardless. I’m sure it’s something people will be referring to for years to come in the history of interactive facial tech.

TB: I think we try to do our storytelling with the player still active in the scene. We have to put marks on the set and have look-at targets for all our characters.

It’s relatively unexplored how Steam data from players can help Valve artists in particular. Is there any use in the data?

RL: From an environmental artist point of view, we can use it to tell when players are lost, or if they’re stuck in places. We can go back, adjust the game world and make things a more appropriate challenge.

JM: Or in multiplayer maps. We can see if one level has an inherent bias towards one team over another. We can tweak the maps so that bias goes away.

We can look at a certain section in a map where a lot of people die and decide whether it needs more shelter.
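
As a rough illustration of how that kind of aggregate telemetry might be queried, the hypothetical sketch below tallies per-map team win rates and bins player deaths into grid cells to surface hotspots. The record fields and function names are assumptions made for illustration, not Valve’s actual Steam data pipeline.

```python
# Hypothetical sketch: mining aggregate playtest telemetry for map balance
# and death hotspots. Field names and structure are illustrative assumptions,
# not Valve's actual Steam data schema.
from collections import Counter, defaultdict

def team_win_bias(match_results):
    """match_results: list of (map_name, winning_team) tuples."""
    wins = defaultdict(Counter)
    for map_name, winner in match_results:
        wins[map_name][winner] += 1
    bias = {}
    for map_name, counts in wins.items():
        total = sum(counts.values())
        # Fraction of rounds won by the most successful team on this map.
        bias[map_name] = max(counts.values()) / total
    return bias  # values near 0.5 suggest balance; near 1.0 suggest bias

def death_hotspots(deaths, cell_size=256.0, top_n=5):
    """deaths: list of (x, y) world coordinates where players died."""
    cells = Counter((int(x // cell_size), int(y // cell_size)) for x, y in deaths)
    return cells.most_common(top_n)  # grid cells that may need more shelter

# Example usage with made-up data:
results = [("cp_example", "RED"), ("cp_example", "RED"), ("cp_example", "BLU")]
print(team_win_bias(results))
print(death_hotspots([(100, 200), (120, 210), (900, 40)]))
```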

Isn’t it inherently a massive challenge to update a game if you don’t get things right the first time?

RL: Yeah, it’s harder after you ship, and it’s a lot harder on the consoles, because there’s a long lag in submitting the changes and getting an update out.

We’ll be changing stuff right up until we burn it to disc. Things constantly pop up when we play-test, and one of our goals is to build really flexible environments that we can easily change.

Post-ship, we can still make changes, but there are more costs involved, in terms of people’s time and the size of the download.

JM: The aggregate data can tell us a different story from user feedback, though both are very useful. We did a lot of internal testing on Half-Life 2: Episode One, and while we didn’t see any problems ourselves, the aggregate data showed there was a big problem with one section where too many people were dying. So Steam data can be an invaluable tool.


Going from Steam to Source, how much does that engine constrain your creative endeavours?

RL: I actually think limitation is good for creativity. If we had an engine that could do everything, we would be in trouble. It gives us focus.

TB: And people don’t realise how many times we’ve updated the Source engine over the last seven years. With every release we’ve added some kind of new rendering technology, or new modules to the engine.

My experience with Portal 2 was working mostly on world materials, and I might as well have been working on a brand new engine; it was that different from what I had worked with last time.
