Standard Def: The Forgotten Generation

Have developers and platform holders neglected most of their userbase?

In an interview on Eurogamer this week, Epic Games' Mark Rein dropped what you might call a technological bombshell: over half of the Xbox 360 owners who played Gears of War 2 did so on a standard-definition television.

Of course, there are certain caveats attached to Rein's information, but if anything it paints an even bleaker picture of the take-up of high-definition displays. The only way Epic is able to make this determination at all is down to the information relayed back to Microsoft HQ via the Xbox Live connection, which phones home with all manner of intriguing but apparently non-personal information about how you use your console. The means of this data collection suggests that while the standard-def gamers in question may be display-challenged, they're not Luddites: they have either the nous to connect their 360 to the internet via a LAN cable, or else they've invested in Microsoft's supremely expensive wireless dongle.

While no similar information is available for the PlayStation 3, gut feeling tells me the situation is much the same, even allowing that people with the money for a premium console ought to be better placed to afford an HD display. While the platform holders are making headway in getting their consoles into the living room (where HDTVs are far more likely to be found), a great many of them actually end up in the bedroom or office. Gaming remains a mostly solitary experience that ties up a TV for hours at a time - not good when the wife/parents have pencilled in a Coronation Street/Midsomer Murders double-header for the evening.

While I'm fairly confident in suggesting that Eurogamers en masse are most likely HDed up to the max (or at least to 720p), the questions I want to ask are these: how do developers approach the SD modes in their games? Are there any performance penalties over and above the reduction in detail? More to the point, bearing in mind that only around a third of the pixels need to be rendered, are there actually any performance benefits in dropping down to SD, or indeed 480p? In short, assuming that SDTV owners really do make up around 50 per cent of the audience, are they getting the attention they deserve?
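To put some numbers on that pixel deficit, a quick back-of-the-envelope calculation (assuming the common 640x480 output for 480p) shows that SD is around a third of the 720p pixel count, not merely half:

```python
# Back-of-the-envelope pixel counts, assuming a 640x480 framebuffer
# for 480p output - the saving is bigger than "half" would imply.
hd = 1280 * 720   # 921,600 pixels per frame at 720p
sd = 640 * 480    # 307,200 pixels per frame at 480p (4:3 output)
print(f"720p: {hd:,} px  480p: {sd:,} px  ratio: {sd / hd:.0%}")
# prints: 720p: 921,600 px  480p: 307,200 px  ratio: 33%
```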

Let's start by taking a look at the game Mark Rein was talking about: Gears of War 2. In this test (as with all the others), I'm measuring performance based on progressive scan output. While "classic" 576i or 480i might in theory offer different performance, it's highly unlikely: both PS3 and Xbox 360 render the framebuffer as a progressive image before letting their TV compatibility systems interlace the signal. The same holds true for 1080i/1080p - you get the same effective frame-rate even though actual pixel throughput is halved on the interlaced signal.
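For the curious, the core idea behind this kind of frame-rate analysis is simple enough, even if my actual capture tools are rather more involved. A minimal sketch, assuming a 60Hz progressive capture already loaded as a list of numpy arrays (the function and setup here are illustrative, not my real pipeline):

```python
import numpy as np

def effective_fps(frames, capture_hz=60):
    """Estimate frame-rate by counting captured frames that differ
    from their predecessor. A duplicated frame means the game missed
    its update for that capture interval."""
    unique = 1  # the first captured frame always counts as new
    for prev, cur in zip(frames, frames[1:]):
        if not np.array_equal(prev, cur):
            unique += 1
    return unique * capture_hz / len(frames)
```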

Gears of War 2 SD vs HD. The green line and the left FPS indicator follow 720p performance, while the blue line and the right indicator measure 480p. Bottom line: there's not much in it.

As you can see, there is effectively no difference at all in terms of frame-rates, even though the Xbox 360 has fewer than half the pixels to render. The implication is that the console is scaling down the HD framebuffer to provide a "super-scaled" image for SD users. This pretty much always guarantees a superior image to a native SD render, with vastly reduced anti-aliasing issues. However, it also means that the same frame-rate issues that affect the HD version impact the SD game too. Just as with PC gaming, there's nothing to stop developers rendering to a smaller resolution and using that to provide superior or more stable frame-rates. In most of my Xbox 360 tests, this didn't happen. You will notice a disparity in torn frames, though. This may well be down to the fact that detecting torn frames programmatically on a scaled image is actually bloody hard, or it could be a small advantage in SD's favour.
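The 360's scaler chip handles that downscale in hardware, but a rough software analogue shows why "super-scaling" looks so clean: every SD pixel is averaged from several HD pixels, smoothing out jagged edges for free. A minimal sketch using Pillow (the filenames are hypothetical, and the letterbox/anamorphic handling a real console applies is ignored here):

```python
from PIL import Image

# Load a hypothetical 1280x720 frame grab and filter it down to
# 640x480. The Lanczos filter averages multiple source pixels into
# each output pixel, which is where the "free" anti-aliasing comes
# from compared with rendering natively at 640x480.
hd_frame = Image.open("gears2_720p_frame.png")
sd_frame = hd_frame.resize((640, 480), Image.LANCZOS)
sd_frame.save("gears2_480p_superscaled.png")
```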

Two more tests then. First up, some Call of Duty: World at War, followed by some Far Cry 2. The idea here was very straightforward. World at War already runs at a sub-HD resolution, so I was curious to see how much performance might improve by dropping down to SD. The answer: in this case, no benefit at all. Far Cry 2 was used to see whether torn frames could be reduced by dropping to SD. Again, no dice, but more on FC2 later, when the same game is analysed in its PS3 incarnation.
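An aside, since torn frames keep coming up: a tear is a frame stitched together from two separate renders, so in a capture its top rows still match the previous frame while everything below the tear line is new. A minimal sketch of the detection idea, assuming frames as numpy arrays (my real analysis is considerably more robust - and, as mentioned above, scaling blurs the seam and makes this far harder on SD output):

```python
import numpy as np

def find_tear(prev, cur, threshold=4.0):
    """Return the row where cur stops matching prev, or None if the
    frame is either a clean new frame or a straight duplicate."""
    # Mean absolute difference per row between consecutive captures.
    diff = np.abs(cur.astype(np.int16) - prev.astype(np.int16))
    row_changed = diff.mean(axis=(1, 2)) > threshold
    if row_changed.all() or not row_changed.any():
        return None  # whole frame new, or whole frame duplicated
    seam = int(np.argmax(row_changed))  # first row that changed
    # A genuine tear is one clean seam: everything below it is new.
    return seam if row_changed[seam:].all() else None
```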

Call of Duty: World at War and Far Cry 2 performance analysis. Once again, the left indicator and the green line measure 720p frame-rate, while the blue line and the right FPS counter follow 480p.

So far, then, we have seen no evidence whatsoever of any benefit for SDTV users. The news is actually about to get a whole lot worse, particularly for PAL PS3 owners, but before we get to that, there is one shining ray of light. Take a look at the Sacred 2: Fallen Angel performance analysis I carried out last month across the range of supported resolutions: 1080p, 720p and 480p. There you'll see a clear advantage in terms of frame-rate and v-sync in playing in standard definition, on Xbox 360 at least. If the will is there, developers can provide tangible performance advantages to gamers using older display technology, and if that market is as massive as Mark Rein's stats suggest, perhaps it's worth some thought.