With most people switching from slow dial-up modems to high-speed broadband, the opportunity and demand to play games online is growing. This is great for the industry, but it does present challenges. In particular, ensuring that the online player has a good experience is paramount.
After all, in our ‘connected’ world, word quickly spreads about how good a game is, and a few reported bad user experiences can soon harden into accepted fact. Therefore, games developers and publishers need to test how well their latest product is going to perform in the online environment.
It isn’t just the hardware any more
In the good old days when games were simply played in stand-alone mode, you had complete control of the gaming environment. The specifications of the dedicated games console were clearly defined and games software was developed to these specifications.
While PC performance was more variable due to different processor speeds or available RAM etc., it was still possible to determine the acceptable machine parameters required to ensure a good gaming experience. The online game presents a much greater challenge.
We can essentially separate online games into two types: the peer-to-peer game and the massively multiplayer online game (MMOG).
With peer-to-peer type games, the players will play in smallish groups of between two and twelve players. In this situation, one player’s console or PC acts as the host server and the others play each other using that server. With the MMOG, the games publisher provides the host server(s) and all of the game players connect into these servers in order to come together in some kind of virtual world.
Now, in addition to the hardware running the game, you need to start considering the effect of the networks the players are using to join the gameplay.
While the hardware is controllable, the unknown, or uncontrollable, part of a game played online is the network.
So, the answer is simple: let’s create a network in our lab in order to test online game playability. The problem is that real-life networks, whether they are ADSL, cable or, looking further into the future, mobile phone networks such as GPRS and 3G, are far less perfect than any internal network we would typically build in a lab or test environment.
The Characteristics of Live Networks
Online gaming takes place over geographically dispersed networks, typically described as wide-area networks (WANs). Sending data over WANs throws a number of challenges at any software writer, and the games industry is no exception. Let’s look at some of these.
Delay or Latency: Lag
It takes a finite amount of time for any traffic to travel across any network. Data travels across networks in ‘packets’ and it’s useful to regard these packets as being similar to buses travelling up and down roads. Just like a bus, which can have a certain number of passengers on board at any one time, so a network packet holds a certain amount of data.
When a player clicks a button on their PC or console, for example to move their character within the game, packets indicating the new position of the character are sent over the network to either the peer’s or the central games server. Relaying this information over the internal local area network (LAN) of the test lab is virtually instantaneous, owing to the short distances travelled and because it is being transmitted over a super-fast highway, which in road terms would be the equivalent of having a five-lane highway between you and your neighbour’s house.
However, in real WAN networks, including ADSL, it takes much longer for these packets to reach their destination. This is known as delay or latency and occurs because the distances travelled are much greater and the ‘highway’ is more likely to resemble a single-track lane or a congested dual carriageway.
In game industry terms this delay is commonly described as ‘lag’. A small amount of lag is acceptable. Within the UK, we can expect to encounter network transmission delays of around 30 milliseconds.
However, internationally we can be talking many hundreds of milliseconds. For example, somebody playing from New Zealand can expect in the region of 300 milliseconds of delay when playing a game hosted on a server based in the UK.
The result will be that things appear to be jerky on screen as the software catches up and implements the new instruction.
Imagine, then, playing a fighting game over a network with a 300 millisecond delay, or lag. The player with this type of lag has very little chance of winning the fight, unless the game’s designers and writers have compensated and tested for that sort of delay in the first place. The result will be a frustrated player who won’t put the delay down to the network but will instead blame the game itself, and probably tell friends that it’s not worth playing.
And, just when you think it can’t get worse, consider the fact that mobile phone networks like GPRS can have delays, at their worst, in the order of 7,000 milliseconds – that’s 7 seconds – and playing an interactive game in this environment is going to be an even less pleasant experience if proper games design and testing has not taken place.
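To put those delays in game terms, a quick sketch helps: at a given simulation tick rate, a one-way delay of so many milliseconds translates into a number of ticks by which a laggy player trails the server’s view of the world. The 60 Hz tick rate below is an illustrative assumption, not a figure from the article.

```python
# Rough arithmetic for how far "behind" a laggy player sees the game.
# The 60 Hz tick rate is an illustrative assumption.

def ticks_behind(one_way_delay_ms: float, tick_rate_hz: int = 60) -> int:
    """Number of whole simulation ticks that elapse while an update is in flight."""
    return int(one_way_delay_ms / 1000 * tick_rate_hz)

if __name__ == "__main__":
    # Delays from the article: UK, New Zealand, worst-case GPRS
    for delay in (30, 300, 7000):
        print(f"{delay:>5} ms one-way -> {ticks_behind(delay)} ticks behind at 60 Hz")
```

Even the 30 ms UK figure costs a tick or two; the 300 ms New Zealand case puts a player roughly 18 ticks behind, which is why uncompensated lag looks so jerky.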
Packet Loss, Damage and Reordering
So what other things are important? Well, the data packets can be lost, damaged or reordered, so that they arrive out of sequence – or don’t arrive at all. It’s like sending a bus down the road that either never arrives, or doesn’t arrive entirely intact and is therefore probably useless when it does.
With games this may result in positional moves being lost and that makes for very jerky game play. Again in a fight, you could lose the game because although you have fired your gun, the fire command packet got lost on the way. So, it’s game over and one very unhappy gamer.
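A small, seeded simulation makes the effect concrete: feed a stream of position updates through a link that randomly drops some packets and occasionally swaps neighbours, and see what the receiver is left with. The loss and reorder rates are hypothetical, chosen purely for illustration.

```python
import random

def send_over_lossy_link(packets, loss_rate=0.1, reorder_rate=0.05, seed=42):
    """Return the packets as the receiver sees them: some dropped, some swapped."""
    rng = random.Random(seed)          # seeded so the run is repeatable
    arrived = [p for p in packets if rng.random() >= loss_rate]
    i = 0
    while i < len(arrived) - 1:        # occasionally swap neighbours to model reordering
        if rng.random() < reorder_rate:
            arrived[i], arrived[i + 1] = arrived[i + 1], arrived[i]
            i += 2
        else:
            i += 1
    return arrived

moves = [f"pos-{n}" for n in range(20)]   # 20 position updates
received = send_over_lossy_link(moves)
print(f"sent {len(moves)}, received {len(received)}")
```

Every missing entry in `received` is a positional move the game never saw – exactly the gaps that make on-screen movement jerky.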
What to do?
If a game needs to run over less than perfect networks, which WANs typically are, and yet the customer has expectations of a fantastic experience – then how do we ensure that this is what they get? The answer is to compensate for these real-world conditions.
A crude example of this might be to send rather more positioning data than strictly needed when repositioning a character, so that if some of the network packets carrying a position are lost, the receiving end can still catch up a little later. There are bandwidth considerations – more data uses more bandwidth – but nevertheless that is a possible solution.
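The redundancy idea above can be sketched in a few lines: each packet repeats the last few moves, so a single dropped packet rarely loses a move for good. The packet format and the history depth of three are hypothetical choices for illustration.

```python
# Sketch of the redundancy idea: each packet repeats recent moves so a dropped
# packet rarely loses a move for good. HISTORY = 3 is a hypothetical choice.

HISTORY = 3  # how many recent moves each packet carries (a bandwidth trade-off)

def build_packets(moves):
    """Each packet carries the current move plus the previous HISTORY-1 moves."""
    return [moves[max(0, i - HISTORY + 1): i + 1] for i in range(len(moves))]

def receive(packets, dropped_indices):
    """Reassemble the move stream from whichever packets actually arrived."""
    seen = {}
    for i, pkt in enumerate(packets):
        if i in dropped_indices:
            continue  # this packet was lost on the network
        for move in pkt:
            seen.setdefault(move, True)
    return list(seen)

moves = [(n, n * 2) for n in range(10)]              # fake (x, y) positions
packets = build_packets(moves)
recovered = receive(packets, dropped_indices={3, 6})  # lose two packets outright
print(f"recovered {len(recovered)} of {len(moves)} moves despite 2 lost packets")
```

The trade-off is visible in `build_packets`: every update is sent up to three times, tripling the positional bandwidth in exchange for resilience to isolated losses.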
To see if this might work, you need to be able to design a prototype and test how it performs in the anticipated online conditions.
What is the best way to set up your test network? We might think of testing with real players situated all over the world, sitting in their homes. The problem is that we can’t see what’s really happening, we can’t see what errors they are getting and we have no control over them. With such a dispersed testing set-up, it isn’t practical to send someone out to take a look at what was on their console or screen. Where it is a brand new game, there is the added danger that the testers may take screen shots and forward these on to magazines, which are keen to offer sneak previews of the latest super game that is coming out.
To overcome these problems we could bring all the gamers into one place for testing in more of a laboratory environment where we know that the game is safe and secure. But now we have the problem that they are playing over an internal network, so it’s too perfect, it’s not the real world. So what do we do about that?
Creating the real network environment
Staying with the in-house testing model, we have 16 gamer testers sitting in the one room, so let’s buy 16 ADSL lines. The problem is that they are all going to have much the same ADSL experience. You won’t have the experience of one person in Manchester playing against another person in Brighton, and yet another person in the USA, as might reasonably happen.
We can’t see how the latency is affecting the game, and furthermore, because all of the ADSL links are into the same building, even if we buy them from separate companies, the chances are that they’d go to exactly the same BT exchange. Clearly, such a set-up doesn’t reflect the real world, where one player may be on a fast internet connection while another will be on a slow link – and they will have different sorts of delays depending on what cities they live in and so on.
The answer is to create a network in our test lab that behaves like a real network – and this type of network is called an emulated network. With an emulated network we can set up all of the people who are trying/testing our games as though they were in fact sitting in their homes, in different cities, with very different ADSL experiences, even though they are actually sitting in the same room.
So while two players are physically sitting side by side, one experiences the game as though they were in Australia while the other has the experience of someone playing in London. We could also ask the emulator to lose or damage a number of data packets so that when we test our game, it is a realistic experience.
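In miniature, that per-player shaping looks like the sketch below: each tester is assigned a network profile, and the emulator stamps every surviving packet with the arrival time that profile implies. The profile names, delay figures and loss rates are illustrative assumptions, not properties of any particular emulator product.

```python
import random

# Toy per-player profiles for an emulated test: two testers sit side by side,
# but one "plays from London" and one "from Australia". Figures are
# illustrative assumptions, not measurements.
PROFILES = {
    "london":    {"delay_ms": 30,  "loss_rate": 0.001},
    "australia": {"delay_ms": 300, "loss_rate": 0.02},
}

def emulate(player, packets, seed=1):
    """Stamp each surviving packet with the arrival time its profile implies."""
    profile = PROFILES[player]
    rng = random.Random(seed)  # seeded so every test run sees identical conditions
    out = []
    for sent_at_ms, payload in packets:
        if rng.random() < profile["loss_rate"]:
            continue  # packet lost in the emulated network
        out.append((sent_at_ms + profile["delay_ms"], payload))
    return out

stream = [(t * 50, f"move-{t}") for t in range(5)]  # a move every 50 ms
print(emulate("london", stream)[0])     # same packet arrives 30 ms after sending...
print(emulate("australia", stream)[0])  # ...or 300 ms later, per profile
```

Because the random source is seeded, the same ‘Australian’ conditions can be replayed identically on every test run – the property the next section turns to.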
Controllable & Repeatable Testing
Network emulators offer some additional advantages in testing over real networks. Imagine that you were using the real internet with testers working from home. It is possible that during testing, a neighbour on the same ADSL group (ADSL groups are shared across several households) may decide to download some large MP3 files. This will result in a slower network connection, which in turn affects the game’s performance. When you go to repeat the test, unless you know what the other users of the shared ADSL are doing – and in truth you wouldn’t – it is going to be extremely difficult to recreate the exact set of conditions again.
A network emulator is, in many ways, the network equivalent to a flight simulator. With a flight simulator you have complete control over the conditions you expect the pilot to fly in. Using the flight simulator you can reproduce exactly the same conditions time and time again. Using a network emulator to test games software playability offers similar advantages of control and repeatability.
Pushing the Boundaries
Once again, using our flight simulator analogy, as the pilot goes through the flight routine you can introduce new conditions, like landing in a storm, for them to cope with but safe in the knowledge that even if they get it seriously wrong or the weather conditions become too harsh, nobody is actually going to get hurt. Likewise, you can use a network emulator to introduce extreme network conditions in order to find out just how far you can push the boundaries before a game becomes completely unplayable.
Armed with this knowledge, you can then confidently identify that the game is going to deliver a satisfactory level of online game playability in the “normal” range of network conditions. Of course, you could take this a stage further and even provide guidelines on the minimum network specifications required to ensure a good online experience, similar to specifying the PC or console requirements. Alternatively, you could incorporate a red light into the game display which flashes on when network conditions become poor. Then, the player will understand that any performance loss is not down to the software but is due to the network.
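The red-light idea can be sketched simply: smooth the last few round-trip measurements and flag when the average crosses a threshold. The window size and the 250 ms cut-off below are assumptions for illustration; a real game would tune both.

```python
from collections import deque

# Sketch of the "red light" lag indicator: smooth recent round-trip samples
# and flag when the average crosses a threshold. The 250 ms cut-off and
# 5-sample window are assumptions, not recommendations.

class LagIndicator:
    def __init__(self, window=5, threshold_ms=250):
        self.samples = deque(maxlen=window)   # keeps only the most recent samples
        self.threshold_ms = threshold_ms

    def record(self, rtt_ms):
        self.samples.append(rtt_ms)

    def red_light(self):
        """True when the smoothed round-trip time says 'blame the network'."""
        if not self.samples:
            return False
        return sum(self.samples) / len(self.samples) > self.threshold_ms

indicator = LagIndicator()
for rtt in (40, 35, 45):       # healthy UK-style link
    indicator.record(rtt)
print(indicator.red_light())
for rtt in (600, 700, 650):    # link degrades badly
    indicator.record(rtt)
print(indicator.red_light())
```

Smoothing over a window stops the light flickering on every stray slow packet, so it only comes on when conditions are genuinely poor.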
Online gaming is growing fast, and ensuring online playability can no longer be an afterthought. So, it will be the software developer or publisher who has carried out thorough pre-release testing of their game in controllable but realistic network conditions that is going to reap the benefit when gamers report that they have just found a game that plays brilliantly online.