Not so High Definition

Tekken 6 and the sub-HD phenomenon.

Digital Foundry's shock of the week: Tekken 6, one of the most anticipated fighting games of the year, has an ugly secret. Blocky, blurred and clearly upscaled, it is the latest example of the "not so high definition" generation: games designed for HD consoles that fall short of the visual quality we should be expecting from our hardware.

The rumours surrounding Namco's flagship fighting game - going cross-platform for the first time - have been floating around tech forums for the last couple of weeks, based on screenshots from various sources, none of them conclusive. Analysis of Xbox 360 screens released to date reveals native resolutions of anything from 1280x720 to 1360x768 to full-on 1080p. On the other hand, PS3 assets released so far reveal a disappointing 1024x576 resolution - 36 per cent of the 720p pixel count gone, just like that.
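
That figure comes straight from the pixel counts - a quick back-of-the-envelope check in Python:

```python
def pixels(width, height):
    return width * height

# 1024x576 versus native 720p (1280x720)
deficit = 1 - pixels(1024, 576) / pixels(1280, 720)
print(f"{deficit:.0%} of 720p's pixels missing")  # prints: 36%
```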

The disparity between the available Xbox 360 shots was enough for Digital Foundry to suspect that something was clearly amiss, and based on the preview code we have available, direct HDMI dumps reveal a 1024x576 framebuffer on default settings. All of which sounds like another cross-format catastrophe, but the reality is somewhat more intriguing and, at the same time, rather bizarre.

On a more general level, controversy has surrounded the issue of sub-HD gaming since around the time the Xbox 360 launched, but some eagle-eyed journos noticed that something wasn't quite right even before the console hit the streets. Working on the UK's Official Xbox Magazine at the time, UK: Resistance writer Gary Cutlack was probably the first to spot it. Way before the online press caught on, Cutlack had figured out that the HD revolution hadn't quite begun yet.

"I first noticed it while reviewing launch game Project Gotham Racing 3 a few weeks prior to the console's release," Cutlack told me. "The Xbox 360 debug grabbing software - which dumps images onto a PC direct from the console's memory as you well know - was leaving me with in-game screen grabs at 1024x600, while the menu screens were the proper 1280x720. 'That's not quite the 720p Microsoft has been endlessly banging on about,' I remarked, probably to myself then to you in an email. Little did I realise this was a downsizing output scandal that could've generated sizeable Internet traffic had I 'gone public' with it. Numerous games did it in the early days - including pretty much everything from Activision."

In short, the tools supplied by Microsoft itself to games journalists gave writers the ability to get a direct dump of the Xbox 360's framebuffer, in its original format. Simply by looking at the resulting file's dimensions, you got an accurate reading of the game's actual resolution before the scaler in the Xenos GPU got to grips with upscaling the image to proper 720p. Project Gotham Racing 3 came in at a disappointing 1024x600. Perfect Dark Zero measured up at 1152x640. Tony Hawk's Project 8 achieved a measly 1040x585.
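
The check itself required nothing clever - just reading the image's dimensions. A minimal sketch in Python using Pillow (the filename here is hypothetical, standing in for whatever the debug tool saved out):

```python
from PIL import Image

# Hypothetical filename standing in for a debug-tool framebuffer dump
grab = Image.open("pgr3_grab.bmp")
width, height = grab.size
print(f"Native framebuffer: {width}x{height}")  # e.g. 1024x600, not 1280x720
```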

At the time, Microsoft's TCRs (Technical Certification Requirements) could have effectively put all of these games back to the drawing board. Pre-launch, Microsoft had promised that native 720p would be the minimum, and that 2x multisampling anti-aliasing - used to smooth off jagged edges - would be mandatory. These requirements were put on hold during the launch period simply because game-makers only had final silicon for a few months before the system launched. Prior to that, Power Mac G5s with ATI graphics cards were used to emulate the console. One rumour has it that Need for Speed: Most Wanted ran on an overclocked version of this set-up for its Xbox 360 debut at E3 2005.

As development on the console gathered pace, Microsoft's resolution and AA requirements seemed to relax still further, to the point where, recently, Black Rock Studios' David Jefferies interpreted his NDA somewhat differently to most other 360 coders and revealed that Microsoft had dropped these particular TCRs completely.

"We are making a trade-off and saying that the screen resolution is more important to us than the quality of the anti-aliasing," Jefferies told Develop magazine. "This isn't necessarily an entirely voluntary move because, until recently, Microsoft had a TCR insisting that games run at 1280×720 - providing you weren't one of the lucky ones like Halo, who got it waived and ran at 1152×640, that is."

So what's the score with sub-HD gaming on Xbox 360? Why can't we have full 720p and 4xMSAA, as seen on very clean-looking games like DiRT 2 or Fight Night Round 4? The answer, ironically, is all down to one of the architecture's greatest strengths. The Xenos GPU is able to achieve massive throughput because 10 megabytes of so-called eDRAM is attached directly to the graphics core. An enormous amount of bandwidth is available to cope with "expensive" effects such as transparent (alpha) textures and of course anti-aliasing. It's one of the key reasons why Xbox 360 cross-format titles often have a graphical edge over their PS3 counterparts.

Unfortunately, that 10MB limit is the 360's Achilles heel. It's enough to contain a 720p image, but with no anti-aliasing. To incorporate 2x anti-aliasing simultaneously, resolution needs to drop to the 1024x600 or thereabouts seen in titles like Call of Duty 4: Modern Warfare, Project Gotham Racing 3 or Oblivion. If you want to go higher, a process called tiling kicks in, where the framebuffer is split into chunks and swapped out into normal memory, impacting performance. Geometry that spans tiles has to be processed twice, or even three times. Some developer estimates put the cost of using two tiles (enough for 720p with 2xMSAA) at around 1.4 times that of keeping everything in the eDRAM. Three tiles, as used for a 720p 4xMSAA image or a non-AA 1080p framebuffer, up that to around 1.6 times.
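
To see why the numbers fall where they do, here's a back-of-the-envelope sketch in Python. It assumes the common layout of a 32-bit colour buffer plus a 32-bit depth/stencil buffer per sample - real games vary their formats, so treat it as illustrative:

```python
import math

EDRAM_BYTES = 10 * 1024 * 1024  # the Xenos daughter die's 10MB of eDRAM

def framebuffer_bytes(width, height, msaa):
    # Assumption: 4 bytes of colour + 4 bytes of depth/stencil per sample
    samples = max(msaa, 1)
    return width * height * samples * (4 + 4)

def tiles_needed(width, height, msaa):
    return math.ceil(framebuffer_bytes(width, height, msaa) / EDRAM_BYTES)

print(tiles_needed(1280, 720, 0))  # 1 tile  -- 720p, no AA: ~7.0MB, fits
print(tiles_needed(1024, 600, 2))  # 1 tile  -- 1024x600, 2xMSAA: ~9.4MB, just fits
print(tiles_needed(1280, 720, 2))  # 2 tiles -- 720p, 2xMSAA: ~14.1MB
print(tiles_needed(1280, 720, 4))  # 3 tiles -- 720p, 4xMSAA: ~28.1MB
```

Hence the popularity of 1024x600 with 2xMSAA: it's about the largest common framebuffer that still squeezes into a single tile with anti-aliasing intact.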

Capcom's MT Framework technology, as used in most of its Japanese titles from Lost Planet through to Resident Evil 5, adopts an interesting - and unique - solution. By default, it uses three tiles for a maximum quality 720p, 4xMSAA framebuffer. However, when the engine needs that extra level of performance, it'll drop down to using two tiles (2xMSAA) or even no tiling at all (no AA). To put it more simply, Capcom adapts its engine dynamically, and very well too - it's extremely unlikely that, in a truly action-packed scene, gamers will notice "teh jaggies".
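
To illustrate the idea - and this is purely a sketch, not Capcom's actual code - imagine the engine checking the previous frame's GPU time against its frame budget and picking the cheapest acceptable AA mode, along these lines:

```python
# Illustrative only - not Capcom's actual code. Assumes a hypothetical
# engine hook reporting how long the previous frame took on the GPU.
def pick_msaa_level(prev_frame_ms, budget_ms=33.3):
    """Trade AA quality (and tiling cost) for performance under load."""
    if prev_frame_ms < budget_ms * 0.85:
        return 4  # three tiles: maximum quality
    if prev_frame_ms < budget_ms * 0.95:
        return 2  # two tiles: cheaper resolve, still smoothed
    return 0      # single tile: no AA, but no tiling overhead either
```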

In short, the processing cost of tiling can seem prohibitive to some developers, so they either lower the resolution, or drop the anti-aliasing completely. Often, computationally less expensive effects like blurring are added instead, which rarely helps overall image quality - hence the term coined by Eurogamer editor Tom Bramwell: the Vaseline effect. [Not sure I invented this, but I'll take it. - Ed]

The advent of Halo 3 brought the whole sub-HD issue back to prominence when it was discovered that Microsoft's key tentpole title of 2007 was in fact in contravention of its own technical requirements for game developers. By this time, the journalist screenshot trick no longer worked. Developers were internally scaling their lower-res framebuffers back up to 720p, then overlaying text and HUD data at proper HD resolutions before supplying the result to the video output - Halo 3 is a case in point. Text and HUD detail looks terrible upscaled, so this technique ensures readability of key information without requiring the whole framebuffer to be rendered at 720p. Curiously, Microsoft itself patched the screenshot code in its own tools too, so that even older games like PGR3 now output 720p shots.
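
The technique is easy to mock up. Here's a minimal sketch in Python with Pillow, using Halo 3's 1152x640 scene resolution - the filenames and assets are hypothetical stand-ins for the engine's real render targets:

```python
from PIL import Image

# Hypothetical stand-ins for the engine's render targets
scene = Image.new("RGB", (1152, 640))             # sub-HD scene render
hud = Image.open("hud_720p.png").convert("RGBA")  # HUD/text drawn at native 720p

# Upscale the scene internally, then composite the crisp HUD on top
frame = scene.resize((1280, 720), Image.BILINEAR)
frame.paste(hud, (0, 0), hud)   # alpha-composited: text stays sharp
frame.save("output_720p.png")   # what the video output (and screenshot tool) sees
```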