
Digital Foundry vs. Console Lag: Round Two

PlayStation 3 and Arc camera put to the test.

Generally speaking, lag is becoming a matter of crucial importance to console gaming. While the old CRT TVs essentially operate with next to no lag whatsoever, the same cannot be said of LCD and plasma screens. Displaying an image on a flatscreen takes time. Processing the image - for example, scaling from 720p to 1080p - also takes time. Some sets ship with "game mode" settings designed to display frames as quickly as possible. Most, however, do not, and some screens can introduce up to five frames (84ms) of lag, making our Dell at 50ms still slow, but far from the worst.

The combination of flatscreen latency and the fact that console games run at 30FPS - or occasionally even lower - means that the gameplay experience we have these days is a lot less responsive than the old 60Hz 2D arcade titles many of us played as kids. Lag is literally being built into the games of today - a situation that can only get worse if the likes of OnLive and Gaikai gain traction - and even the very make-up of the high-tech innards of your Xbox 360 or PS3 is contributing to it.

The move to many-core architectures in these machines relies on parallelisation, which has the potential to introduce higher latencies over and above the issues caused by displays and lower target frame-rates. In a multi-threaded setup, where tasks are hived off to individual cores, threads or SPUs, code executes simultaneously, but at some point it all needs to synchronise. If one element is late to the party, the code can stall.
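
To illustrate the principle, here's a toy Python sketch - nothing to do with any real console engine, and the task names and timings are entirely invented - showing how a frame built from parallel jobs can only be presented once the slowest job has finished, so one straggler delays everything, and that delay is felt as added latency.

```python
# Toy illustration (hypothetical tasks/timings): a frame assembled from
# parallel jobs is only done when the slowest job finishes, so one late
# job pushes the whole frame past its budget.
import concurrent.futures
import time

FRAME_BUDGET_MS = 33.3  # a 30FPS frame budget


def worker(name, cost_ms):
    """Simulate one parallel task (physics, AI, render setup...)."""
    time.sleep(cost_ms / 1000.0)
    return name


def simulate_frame(task_costs_ms):
    """Run all tasks in parallel; return how long the frame took overall."""
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor() as pool:
        list(pool.map(lambda t: worker(*t), task_costs_ms.items()))
    return (time.perf_counter() - start) * 1000.0


balanced = {"physics": 8, "ai": 9, "render": 10}   # everything on time
stalled = {"physics": 8, "ai": 9, "render": 45}    # one job is late

for label, costs in (("balanced", balanced), ("stalled", stalled)):
    elapsed = simulate_frame(costs)
    verdict = "over budget" if elapsed > FRAME_BUDGET_MS else "within budget"
    print(f"{label}: frame took ~{elapsed:.0f}ms ({verdict})")
```

In the balanced case the frame comes in around 10ms; in the stalled case every other job sits waiting on the 45ms straggler, and the whole frame - and your input along with it - arrives late.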

Developers measure controller lag in the same way that Digital Foundry does. Console mod-master Benjamin Heckendorn, aka Ben Heck, created a modified Xbox 360 controller that lit up LEDs on a special controller board whenever buttons were pressed. Position the board next to the screen, record the scene with a 60FPS-capable camera and you're away. It's a ridiculously simple concept, but it works. Ben Heck's controller board is used by key game-makers including Infinity Ward and BioWare (and many more following the first DF feature on this topic).
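
The sums behind the method are trivial: count the 60FPS camera frames between the LED lighting up and the first visible on-screen response, then convert to milliseconds. Here's a minimal sketch of that conversion (my own illustration, not code that ships with the board; the eight-frame gap is purely a hypothetical example).

```python
# Convert a count of 60FPS camera frames into milliseconds of lag.
CAMERA_FPS = 60
FRAME_MS = 1000.0 / CAMERA_FPS  # each captured frame represents ~16.7ms


def controller_latency_ms(frame_gap):
    """frame_gap: frames counted between LED-on and visible response."""
    return frame_gap * FRAME_MS


# hypothetical example: an eight-frame gap on the footage
print(round(controller_latency_ms(8)))  # -> 133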

Using basic common sense and wild ingenuity, Heckendorn has come up with a new design, and the PS3 controller latency board prototype was duly sent across the oceans to its new home within the Digital Foundry lair.

Benjamin J Heckendorn's brand new PlayStation 3 latency controller monitor. The board is powered by two AA batteries on the rear of the unit, while the controller uses the built-in PS3 joypad battery.

So, what shall we do with it? In the here and now, we can take a look at how latency affects PS3 titles, plus we can expand upon the original feature and cover some new ground. In the future, having the kit on-site also means that when we talk about diminished controller response in a Face-Off article, we can physically measure it and quantify it.

The original Lag Factor feature also left us with some unfinished business. So, let's get the obvious game out of the way then: Killzone 2. We had a go at measuring its controller lag back in September, and we pegged it at a tentative 150ms, but with reservations. Gauging the "zero frame" - the point at which the fire button is depressed - is difficult going by camera footage alone, and introducing doubt into a measurement calls into question the whole point of the exercise. Our technique for confirmation was simple: repeat the test over and over. Only then were we confident enough to publish that one PS3 result.

Thanks to the work of Ben Heck, we can now revisit the game (the patched version, of course) and get the exact measurements we need. More than that, it's possible to gauge response time when the game is running both in optimum conditions and also when frame-rate is suffering. So here are a few shots of Killzone 2. Bear in mind that aside from a new controller monitor, all other elements of the test are identical to the previous DF feature. That's the same Dell monitor we used last time, while our Kodak Zi6 720p60 camera was used for recording the action.

Killzone 2 latency can finally be accurately measured, both at its optimum 30FPS and also when the engine is struggling.

With the game engine operating at 30FPS, we can assume that the response should be at its optimum. Where frame-rate is steady, and factoring out the display lag, a nine-frame gap between button press and on-screen action is confirmed, regardless of the weapon used. So yes, Killzone 2 latency is confirmed at 150ms, a full 50 per cent higher than many 30FPS first-person shooters.
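
For anyone checking the working, each frame of 60FPS camera footage represents roughly 16.7ms, so the figures quoted here fall out directly (a sketch using only the frame counts mentioned in this piece).

```python
FRAME_MS = 1000.0 / 60  # one 60FPS camera frame is ~16.7ms

killzone2_ms = 9 * FRAME_MS        # nine-frame gap -> 150ms
typical_shooter_ms = 6 * FRAME_MS  # six-frame gap -> 100ms (see Halo 3 below)

print(round(killzone2_ms), round(typical_shooter_ms))        # 150 100
print(f"{killzone2_ms / typical_shooter_ms - 1:.0%} higher")  # 50% higher
```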

But the later measurements in the video are also revealing. As frame-rate drops, response suffers. It stands to reason really - the game is missing screen refreshes. Temporal resolution is dropping, so the time taken to display the results of your input rises with it. But by how much? According to the final shot, latency in Killzone 2 rises to as high as 183ms. Combined with the lag of an LCD display, there's a strong chance it'll rise above a fifth of a second.
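
One way to read the jump from 150ms to 183ms - my own back-of-envelope interpretation rather than anything measured directly - is that a single missed 30FPS refresh adds a further 33.3ms on top of the baseline:

```python
BASELINE_MS = 150.0            # Killzone 2 response at a steady 30FPS
MISSED_REFRESH_MS = 1000 / 30  # each dropped 30FPS update costs ~33.3ms

# one missed refresh on top of the baseline lands almost exactly on 183ms
print(round(BASELINE_MS + MISSED_REFRESH_MS))  # -> 183
```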

There is a sense of inertia in Killzone 2. The feel of the game is dramatically different to the Call of Duty titles, and there is the sense that this is by design. Halo 3 runs at the same 150ms lag for actions such as jumping, but pulling a trigger is another matter: in Killzone 2 the 150ms figure holds whether you're jumping or shooting, whereas Halo 3's trigger response is timed at 100ms - which, in terms of my testing (plus that of Neversoft co-founder Mick West), appears to be the fastest response a 30FPS game is capable of achieving.