Digital Foundry vs. HDMI video • Page 3

£1.50 vs. £100 cable, and PS3/360 outputs put to the test.

Accurate screenshot exports on 360 aren't quite so common, but thankfully Microsoft kits out journalists with a tool from the XDK that allows us to FTP into debug consoles and test kits and - yes - dump video RAM for screenshot purposes. Sony, alas, isn't quite so forthcoming with its own equivalent (ProDG's Target Manager).

So here are a couple of games with the same shot dumped via the XDK tool and then captured via HDMI at all three of the reference levels: standard, intermediate and expanded. While we do see the precise RGB levels we would expect, the gamma level is nothing like it should be on any of the shots.

On Assassin's Creed: Brotherhood, the framebuffer dump was taken during the intermediate capture session; we had to re-run the capture for the other two settings, which explains the minor differences between the images.

It's worth pointing out that you are still getting the pin-point precision that HDMI offers, but getting the right colour balance is going to require some pretty nifty display-side tweaking. There is a danger that the ramped-up gamma will produce black crush-style artifacting that strips away some element of image definition. The accuracy of our frame-rate analysis tools is not affected, however.

While the change is self-evident, many games - Assassin's Creed: Brotherhood included - use dynamic lighting that is subtly altered by seemingly randomly placed world objects, so there are some tiny changes in the make-up of the image between capturing sessions. Just to be perfectly clear about the differences between the various modes, we shifted to Teenage Mutant Ninja Turtles: Reshelled, which has a very static form of lighting we can take advantage of.

Here we see similar differences and still no firm match for the original framebuffer.

The Xbox 360 does process internally at 10-bit component and you can see some dithering artifacts if you look closely enough at the pixel level. And of course the Xbox 360 started out without any support for HDMI at all - it wasn't introduced until the Zephyr model that debuted in the form of the Xbox 360 Elite. Beforehand, the best output you could get was analogue component or VGA. Could this bizarre gamma shift be a bug within the new HANA controller that replaced the previous, analogue-only ANA chip?

Well, a comparison of component up against the framebuffer suggests the issue is not related just to the 360's HDMI output. While the scale of the gamma shift is different, it's still clearly not the right colour balance, whether you're using analogue component or the HDMI interface's YCbCr digital component alternative.

Quite why this happens is something we were uncertain about before a kindly developer clued us in. From what we've learned this week, it appears that this ramping up of the gamma is actually deliberate on Microsoft's part: the reason it's in place on the Xbox 360 is that Microsoft believes it looks better on the average TV. Bearing in mind the breadth of displays available and how badly calibrated they typically are out of the box (brightness and contrast are often ramped up to make them stand out on the shop floor), we can't help but think that this is a call the developer should make.

Developers are made aware of the gamma conversion, and Microsoft provide an exact table of the transformation so it can be reversed should developers wish to factor it out while they build the framebuffer - but with the black crush we see on occasion, we can only wonder if this is a 100 per cent non-destructive process. However, the fact that this option is there means that the issue probably can't be changed with a firmware update, as the chances are that "corrected" games will then look very strange indeed. Properly calibrated games like Burnout Paradise for example will no longer be displayed correctly.
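As a rough illustration of why reversing such a conversion may not be fully non-destructive - Microsoft's actual table isn't reproduced here, so a simple power-curve stand-in is used - consider an 8-bit lookup table:

```python
# A hypothetical power-curve stand-in for the console's gamma ramp --
# Microsoft's actual transformation table is not reproduced here.
GAMMA = 1.2

# 256-entry lookup table mapping each 8-bit source value to its
# ramped output value.
lut = [min(255, round(((v / 255.0) ** GAMMA) * 255.0)) for v in range(256)]

# Reversal: for each output value, record a source value that produces
# it. Because several near-black inputs collapse onto the same output,
# the inverse is not unique -- that collapsed detail is gone for good,
# which is one plausible mechanism for the black crush described above.
inverse = {}
for src, dst in enumerate(lut):
    inverse.setdefault(dst, src)

collisions = 256 - len(set(lut))
print(f"{collisions} of 256 source values collide after the ramp")
```

Any 8-bit-to-8-bit curve that isn't a straight line has to merge some values somewhere along its length, so a game that pre-compensates in the framebuffer is trading precision in one part of the range for accuracy in another.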

Going back to our Face-Off comparison shots, the fact that people automatically assume that it is the PS3 game that is "washed out" as opposed to the 360 version having the gamma arbitrarily adjusted suggests that perhaps Microsoft has it right, that maybe the image is more pleasing to the human eye.

Update: An interesting email from a key multiplatform developer this morning corroborates the fact that 360 operates internally in component, the REC.709 standard to be precise, and this contact suggests that in theory it is the correct way to address an HDTV, though in practice sticking to RGB in the way that PS3 and PC do is more developer-friendly. He also suggests that the richer colour we see in 360 Face-Off shots might be explained in some cases because so much development is 360-led: calibration takes place on 360 and isn't corrected on PS3, resulting in the "washed out" look. However, even factoring in the REC.709 component standard, this developer also believes that 360 HDMI output does require adjustment.
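The REC.709 conversion coefficients are public, so as a sketch of what "operating internally in component" means numerically, here is the standard full-range RGB to limited-range Y'CbCr mapping (the function name and rounding choices are this sketch's own, not from any console SDK):

```python
def rgb_to_ycbcr709(r, g, b):
    """Convert full-range 8-bit RGB to limited-range REC.709 Y'CbCr.
    The coefficients come from the standard; everything else here is
    illustrative."""
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    # Luma from the REC.709 weighting of the three primaries.
    y = 0.2126 * rn + 0.7152 * gn + 0.0722 * bn
    # Colour-difference signals, normalised per the standard.
    pb = (bn - y) / 1.8556
    pr = (rn - y) / 1.5748
    # 8-bit quantisation: luma spans 16-235, chroma is centred on 128.
    return (round(16 + 219 * y),
            round(128 + 224 * pb),
            round(128 + 224 * pr))

# White and black land on the limited-range reference levels:
print(rgb_to_ycbcr709(255, 255, 255))  # (235, 128, 128)
print(rgb_to_ycbcr709(0, 0, 0))        # (16, 128, 128)
```

Note that even a perfect round-trip through this matrix involves rounding at 8-bit precision, which is one reason a component-internal pipeline can never be quite bit-identical to an RGB framebuffer.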

Regardless, there does appear to be something rather odd going on here - what is the purpose of the reference levels? Displays operate either at limited-range RGB (360's "standard") or full-range RGB ("expanded" on the dash), so you do have to wonder quite why there is an "intermediate" setting at all when, to the best of our knowledge, there is no hardware that supports it. The intermediate setting does tone back the gamma effect a little, but it definitely isn't operating with full-range RGB, so the range of available colour values is smaller.
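The two ranges in question can be sketched numerically: limited-range ("standard") places black at code 16 and white at 235, while full-range ("expanded") uses the whole 0-255 span. A minimal Python illustration (the function names are ours):

```python
def full_to_limited(v):
    """Map a full-range value (0-255) into limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Map a limited-range value back to full range. Values below 16
    or above 235 clip -- which is where shadow or highlight detail is
    permanently lost if the console and the display disagree about
    which range is in use."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255
```

This mismatch is exactly what produces the classic "washed out" (limited signal shown as full) or "crushed" (full signal shown as limited) look.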

For its part, PlayStation 3 appears to output the framebuffer in exactly the way that developers create it. However, with the advent of the PlayStation 3 Slim, Sony decided to change things around a bit with the hardware make-up of the HDMI interface and this has resulted in a curious side-effect.

DTS-HD MA and Dolby TrueHD audio bitstreaming capabilities were added to the HDMI output, and Bravia Link functionality was also included. This came about via a new HDMI controller chip supplied by Panasonic, which appears to have the unfortunate side effect of adding a slight noise to the video output - something we can only assume is a bug in the design.

Thankfully it's imperceptible to the human eye, but our HDMI cable hash check would've been a complete non-starter if we'd used the Slim to carry out the tests. It's also the reason we don't use the Slim for performance analysis tests, as unique frame-counting requires the noise to be filtered out. We implemented a fix pretty quickly, but for the sake of precision and ease of use, we simply moved back to the noise-free output of the original "fat" PS3.
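As a sketch of the principle involved - Digital Foundry's actual tools are not public, so all names here are illustrative - a noise-tolerant frame counter replaces exact hash comparison with a small per-sample threshold:

```python
# Illustrative only: a unique-frame counter that tolerates the kind of
# low-level noise the PS3 Slim's HDMI controller adds, where an exact
# hash comparison would flag every captured frame as "new".

def frames_match(a, b, tolerance=2):
    """Treat two frames (flat sequences of 8-bit samples) as identical
    if no sample differs by more than `tolerance` -- enough to absorb
    slight output noise without hiding genuine frame changes."""
    return all(abs(x - y) <= tolerance for x, y in zip(a, b))

def count_unique(frames, tolerance=2):
    """Count unique frames in a captured stream by comparing each
    frame against the previous one."""
    unique = 0
    prev = None
    for f in frames:
        if prev is None or not frames_match(f, prev, tolerance):
            unique += 1
        prev = f
    return unique

# A 30fps game captured at 60Hz: each frame appears twice, with the
# repeats carrying a one-step noise jitter.
frames = [[10, 10], [10, 11],
          [200, 200], [200, 201]]
print(count_unique(frames))  # 2
```

The trade-off is picking a tolerance large enough to mask the noise but small enough that a genuinely new, visually similar frame still registers.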

For the record, the HDMI implementations in all the AMD and NVIDIA graphics cards we've used over the last couple of years have been absolutely fine.

To conclude then, it's fair to say that the advent of HDMI has effectively made the era of stupendously expensive AV cables with dubious quality claims obsolete. That £1.50 (including delivery) cable from Amazon will do sterling work for your Xbox 360, PlayStation 3 or media PC - and if it is in some way not up to the job, you'll see it immediately in the form of obtrusive digital artifacting. Only if you're attempting some seriously long connections will a custom cable be required - and even then, the chances are there is an inexpensive version available that will do the job just fine.

As for what comes out of the HDMI port of the HD generation consoles, while the precision and quality offered by the pure digital signal is second to none, bugs or platform holder filtering mean that in many cases we're still one step away from that mythical lossless transfer from video RAM to display.
