Whatever happened to the low-ping bastard? The scourge of the early days of online gaming, the LPB would dominate Quake deathmatches by simple virtue of the fact that his connection to the server was often orders of magnitude faster than anyone else's. Spotting the LPB was easy: a quick look at the game roster placed "ping" times alongside player names, and anyone with an obvious latency advantage stood out immediately.
Fast-forward to the present day and services like Xbox Live and the PlayStation Network dominate online gaming. Even though fast broadband is now the norm - eliminating the sub-par latencies associated with the days of dial-up online gaming - the lag hasn't completely gone away. Instead, game developers are using a wide variety of technology to hide latencies from the player, leading us to wonder: is online gaming fair? And if you're playing at a disadvantage, would you even know?
While it's undoubtedly true that Xbox Live and PSN have made gaming over IP far more accessible, the latest online technologies are far from transparent: it's rare that a game will actually tell you how good the connection is before you start playing. And even then, with client-side prediction technology now the norm, you may be blissfully unaware that you're operating with sub-par latency at all. Even if you are presented with a "quality of connection" ranked from one to five bars, what does that actually mean anyway?
Think about it this way: have you ever watched a Call of Duty Killcam replay and thought to yourself that the sequence of events playing back to you is somewhat at odds with the reality you personally experienced before you were gunned down like a stinking pig?
How Online Gaming Works
There are three distinct ways in which online gaming operates. First of all, there's the traditional dedicated server set-up - as used by titles like Battlefield 3, MAG and Warhawk. Gameplay and crucial decisions are all driven from a central server to which all players are attached.
Next up there's the peer-to-peer or P2P system. This is rather more complicated to explain but essentially, game data is beamed from player to player, meaning there is a whole host of different latencies between each participant. However, there is one player who is the host - he is the most important part of the system as it is he who decides the "reality" of the game - who shot whom, essentially.
The host defines the reality of the game for everyone else. While the host still has to deal with varying levels of latency from each of the participating players, he has the advantage in that his own actions are processed locally, and so, one stage of lag is removed from the equation. How developers deal with host advantage can be controversial - but again, this process is invisible to the player. A number of racing titles, plus Uncharted 3, use P2P technology.
Finally there's client/server. This is similar to the dedicated server set-up, except that one player acts as the server for all the others. While it's generally believed that Modern Warfare 3 and Halo are P2P titles, our information from a highly respected industry professional is that they are actually client/server - and Valve games such as Left 4 Dead and its sequel are too.
The advantages of dedicated servers are two-fold. First of all, no single player has host advantage - everyone is treated equally and no player acting as host needs to have his connection "nerfed" in order to level the playing field. Secondly, whereas gameplay in a P2P scenario relies on the quality of the host's connection, the masses of bandwidth available in dedicated server datacentres mean that the experience is often more uniform too.
Game developers often prefer P2P or client/server because the lack of dedicated servers means there's no grand investment in infrastructure required. In the case of the Uncharted games - where money most likely isn't really an object - Naughty Dog says its preference for P2P comes from the fact that it will never have to turn off dedicated servers when they become financially unviable: the entirety of the game package it has created should live on.
Latency: The Unknown Enemy
We wanted to take out some of the uncertainty from the quality of the Xbox Live and PSN experience. We wanted to find out how good, or bad, latencies were for some of the top games, and we were especially curious about how international gaming worked out. Realistically, the further data has to travel geographically, the more lag will manifest in-game, but it's also the case that the journey from the ISP to your console can introduce a lot of latency. Again, the player is left completely in the dark about this - in most cases, the best he will get on-screen is a five-bar representation of the quality of the connection. This can mean very different things for different games.
To get to the bottom of matters, we put together two sessions of gameplay with a range of different players across the UK and the world, each with a different internet connection, with each player capturing his own gameplay. After this, the key was in synchronising the streams. Ideally you'd like to line up the videos according to the in-game clock, but who knows - aside from the developers - if the clock is lagged or not?
However, we came up with a methodology that makes measurements possible. The technique won't give us our preferred measurement of latency between players, but it will give us the round-trip latency from, say, player one to player two and back to player one again. So as long as the captures are real-time (they are), the results are accurate at that moment in time.
Here's how it works:
- Step One: Amass players in the same online game, and have them record their experiences at 60FPS
- Step Two: Bring all captures together on the same PC
- Step Three: Line up the captures on one participant shooting (let's call him player one), with that same shot registering on the screens of the other players. We now have player one on the "right" time and the others with the clock at various skews.
- Step Four: When player two (or whoever) shoots, count the frames until player one sees the shot being fired. Multiply the frame count by 16.67ms for total latency.
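The arithmetic in step four can be sketched in a few lines of Python - a minimal illustration assuming 60FPS captures, where each frame of video represents 1000/60 ≈ 16.67ms (the function name is ours, not part of the original methodology):

```python
# Convert frame offsets counted in synchronised 60FPS captures into
# latency figures. At 60FPS, each frame of video represents ~16.67ms.
FRAME_MS = 1000.0 / 60.0

def round_trip_ms(frame_count: int) -> float:
    """Round-trip latency implied by a frame offset in a 60FPS capture."""
    return frame_count * FRAME_MS

# A 25-frame offset - the kind of gap seen between distant players -
# works out to roughly 417ms there and back again.
print(f"{round_trip_ms(25):.0f}ms")
```

The precision here is bounded by the capture rate: anything under a frame of latency simply can't be resolved, so all figures are quantised to 16.67ms steps.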
Geographical location and the quality of internet service are paramount in testing, so here's a breakdown of the participants involved and the kind of connections they have.
- Camberley, UK: A "budget" 3.5Mbps ADSL connection with 800kbps upstream.
- Peacehaven, UK: An "up to 20Mbps" ADSL connection with 1Mbps upstream.
- Tel Aviv, Israel: 15Mbps ADSL with 800kbps upstream.
- Moscow, Russia: A 100Mbps+ symmetrical connection with a 1Gbps (!) link direct to the ISP.
- Brighton, UK: A special guest appearance for the Eurogamer 100Mbps leased line - a connection so mighty even OnLive's US servers are perfectly playable, and a bottomless pit of performance that has yet to be tapped out.
What do the numbers mean? Why is lag so high?
Hold on to your potatoes. You're about to see some monster numbers - latencies so high, you'll scarcely believe that online gaming is even playable. What we need to stress is that these numbers cannot be compared to conventional PC "ping" measurements, which encompass network traffic only.
Our measurements are based on what we observe passing through the entire game engine - and they are round-trip: there and back again. Regular readers will already know that games possess significant "input lag" - the time taken from a controller button being pressed to action occurring on-screen. While there's been some discussion among the developers we've spoken to on this subject, we reckon that one full set of input lag is effectively included in these round-trip measurements.
We've also averaged measurements. One thing to bear in mind is that game engines run to precise schedules of data processing when it comes to game logic and rendering. The nature of network traffic is that packets can be delayed, so updates to other players on-screen could be affected by this. We actually found that there can be a tremendous variance in results during our testing because of this.
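To show why averaging matters, here's a small sketch of how a handful of per-shot measurements reduces to an average and a spread - the frame counts below are invented for illustration, not taken from our captures:

```python
import statistics

FRAME_MS = 1000.0 / 60.0  # each frame of a 60FPS capture = ~16.67ms

# Hypothetical frame offsets for several shots between the same two
# players - invented numbers showing how delayed packets make
# individual measurements jitter around the average.
frame_counts = [17, 18, 16, 21, 18, 15, 19]

mean_ms = statistics.mean(frame_counts) * FRAME_MS
spread_ms = (max(frame_counts) - min(frame_counts)) * FRAME_MS
print(f"average {mean_ms:.0f}ms, spread {spread_ms:.0f}ms")
```

Even in this toy example, a late packet pushes one measurement several frames beyond the rest - which is exactly why a single sample can be so misleading.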
So, first up, Modern Warfare 3 on Xbox Live with Peacehaven, Tel Aviv, Camberley and Brighton participating. Later on we did a secondary test without Brighton involved, in order to double-check the findings we had. Here's how the "round-trip" there and back again technique works with MW3 and BF3 with the same players.
Let's look at the worst-case scenario first. In theory, the round-trip latency between Tel Aviv and Camberley should be the worst - and indeed we see an average of 25 frames, a substantial 416ms. Between the two UK ADSL players, we see a substantial drop to an average of 16 to 18 frames - around 266-300ms. Factoring in the almighty Brighton 100Mbps leased line, latencies drop to around 14 frames (233ms) against the other UK players, but remain a hefty 300ms-plus for Tel Aviv.
Remember, these are round-trip latencies. The best we can do is halve them to give an idea of actual player-to-player latency, perhaps weighting them a little according to the capability of the connection. In the case of MW3, which is client/server based, it suggests that Brighton is the server, but the extent of the host-nerfing can't really be ascertained.
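The halving itself is trivial, but it's worth spelling out the assumption baked into it - a sketch (the function name is ours):

```python
FRAME_MS = 1000.0 / 60.0  # each frame of a 60FPS capture = ~16.67ms

def one_way_estimate_ms(round_trip_frames: int) -> float:
    """Naive one-way latency estimate: half the measured round trip.
    Assumes the outbound and return paths are symmetrical - which real
    connections, with their asymmetric ADSL upstream and downstream,
    often aren't."""
    return round_trip_frames * FRAME_MS / 2.0

# A 25-frame round trip implies roughly 208ms each way - at best a
# ballpark figure given the symmetry assumption.
print(f"{one_way_estimate_ms(25):.0f}ms")
```

This is why weighting by connection capability matters: a player with an 800kbps upstream is almost certainly contributing more than half of any round trip he's part of.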
If there's one thing that we learned from this experiment, it's that latencies between players can vary quite dramatically, sometimes by a margin of 50ms. We suspect that the combination of network traffic arriving late combined with the fact we're measuring two trips rather than one exaggerates this somewhat.
Battlefield 3 and Gran Turismo 5 Analysis
The advantage of DICE's Battlefield 3 is that no single player is the host. All players connect to a central server housed in a datacentre with excellent infrastructure and enormous bandwidth. This should mean a more consistent experience for all players, with no individual participant having any kind of latency advantage (and certainly no advantage that developers would feel the need to nerf).
As these servers don't have the restricted upstream bandwidth of ADSL connections, they can also accommodate many more players. However, there is a limited number of them, so private matches aren't allowed. In our testing of the PS3 version, the server list didn't work at all, so we had to rely on matchmaking to choose the server - which almost certainly had a significant bearing on these numbers.
Round-trip latency between our UK participants and Tel Aviv was substantial - 30 to 32 frames in total, a painful 500-533ms - while traffic between Peacehaven and Camberley averaged 20 to 22 frames, a 166ms advantage.
Next up, Gran Turismo 5. It's safe to say that the online racing experience has never felt particularly robust and the nature of the client-side prediction feels rather agricultural. You can test this by yourself simply by ramming into another car in any online race. The collision itself will be corrected a split-second later, and your car repositioned with a visible jump.
For our tests, Moscow created a room/lobby for racing which was populated by Tel Aviv, Peacehaven and Camberley. We went out to race together and decided that illumination of the brake lights on each vehicle was the best way to proceed. This is a passive event that should not be subject to any kind of server or client-side prediction.
The results were intriguing. Round-trip latencies appeared to be closely tied to the sum of player-to-player latencies we found by synchronising the captures to the clock. Both sets of measurements strongly suggested that GT5 uses a client/server set-up with one player running as the server, in this case Moscow. Assuming we're right about the clock sync, here's how relative latencies look. Remember, these are just singular measurements and haven't been averaged.
Game developers typically go for P2P in racing games in order to minimise latency between players, so it is somewhat surprising to see evidence that Polyphony Digital has gone for an altogether different approach. Without the developer itself confirming what's going on, it's very difficult to tell what impact this has on gameplay and whether the host has any kind of racing advantage. Consider a photo-finish - if Moscow has an eight or nine frame advantage, that's obviously bad news. However, if the gameplay is synced to the clock, player interactions with the server could obviously be time-stamped, ensuring a fair result.
Uncharted 3: P2P and Low Latencies
With Naughty Dog's Uncharted 3, the game's Cinema Mode replays offer up a whole host of valuable data and give us new insights into the way P2P online gaming works. The developer has confirmed to us that captures from multiple players can be synced to the clock - something we suspected was true when we first lined up some recordings. We noticed that PowerPlay bonus rounds kicked off at precisely the same time for all players with not a single frame of latency between them, strongly suggesting that gameplay is synchronised around a master clock.
This makes analysis of latency between players much more specific than just the roundtrip "there and back again" techniques we have been able to use up until now - we can break down latency between players to a more precise level. In this first video presentation, we have extracted just a few seconds of gameplay between players and analysed the time it takes for actions taken by each player to arrive on the other players' screens.
The video demonstrates how P2P gaming favours players who are grouped closely together geographically. Peacehaven and Camberley enjoy the fastest player-to-player communication we've seen at one point, but at the same time Tel Aviv suffers a momentary dip in performance, resulting in a colossal 433ms lag.
The results also demonstrate just how inconsistent internet traffic is at any given point, and give us a real appreciation of the challenges facing network coders for online games, particularly fast-action titles. The fact that motion is so smooth at all, bearing in mind the inconsistency in when data is actually delivered to each player, speaks volumes.
The Uncharted 3 captures also offer up a few more tasty morsels to chew over. The basic principle of P2P internet gaming is that while comms are beamed between players, actual kill decisions are made by the host, which in theory gives him an advantage in his own in-game battles (though some developers actively nerf the host in order to present what they consider a more level playing field).
What we need to figure out host latency is some kind of kill event common to all players - something like an exploding grenade. In the video above, Moscow lobs a grenade into our latency-testing orgy of gunfire, jumping and basic arsing about, yet he is only the third player to witness its detonation - and all the signs point to Camberley, the party leader who assembled the team to begin with, as the host.
Conclusions: Is Online Gaming Fair?
If there's one thing we've learned in putting this feature together, it's that it's something of a miracle that online gaming works so well and looks so smooth, bearing in mind the ever-changing latencies we can measure at any given point. It's also clear that - by and large - the client-side prediction technologies employed in most games must be of exceptional sophistication in order to create an experience as seamless as we see it on-screen.
But the issue of transparency concerns us. There are very few clues given to the player that the session they're in is under-performing. Matchmaking will do its best to put you into a fast, low-latency game - but if it can't find one, it'll most likely bung you into a game hosted far away, where you're at an immense disadvantage compared to the other players, with no indication of just how much worse off you actually are. Client-side prediction papers over the cracks to the degree that the player can be blissfully unaware of just how poor their connection is. Not every game has a Killcam that allows you to compare what you saw with the "established version of events" as the host sees it - and nobody really knows what the difference between a three-bar and a five-bar connection actually is.
To put it in simple terms: the difference between three- and five-bar connections in MW3 is enough to see you gunned down without even getting a shot off - even though, from your perspective, you visibly returned fire.
So, whatever happened to the Low Ping Bastard? Well, the gulf we used to see between dial-up and ISDN/T1 connections in the old client/server dominated period is mostly gone - the days of being orders of magnitude faster than the competition are thankfully a thing of the past. By and large, cable and ADSL technologies are great levellers - but geographical location (or rather, the quality of infrastructure between players - not to mention the quality of their connection to their ISP) can have a substantial impact on the gameplay experience.
So, is online gaming fair, presenting a level playing field to all participants? The conclusion must be that it's a bit of a lottery unless you set up private matches with people you know you share a good connection with. There are few guarantees otherwise but, based on our experiments, if we didn't have a five-bar connection, we probably wouldn't want to be playing at all. In general, PC gaming at least makes some effort to inform the player about the quality of the connection to other players with actual metrics - something that exists only in a limited form on console.
The question is whether we can actually expect anything to change. After all, the whole nature of the issue is that today's online experiences are visibly so seamless, only the most informed and observant will know that there's any kind of unfairness at all...