What this demonstrates fairly clearly is that the AMD implementation isn't really good enough for use on console, where the available 720p resolution simply isn't high enough to produce visually pleasing effects. It could perhaps be deployed on certain games though - The Saboteur, for example, might not be using pure MLAA, but the visual drawbacks are very similar, and it still looks pretty decent.
However, with AMD officially supporting MLAA only on its new HD 6850 and 6870 graphics cards, it's highly unlikely that you'll be running at a resolution as low as 720p, so let's run the same series of clips at 1080p. To stream them, we've cropped the image to fit within our 720p player while retaining a 1:1 pixel match, and it's clear that while a good deal of blur is added to the image overall, the negative elements of the MLAA are far less apparent.
Sony's MLAA solution is designed for a 720p framebuffer and produces outstanding results that the AMD post-process can't really compete with. However, the fact that AMD's MLAA is so light on GPU resources means that there is nothing to stop you adding the effect on top of hardware MSAA: in theory you could deploy 2x MSAA to deal with a lot of the sub-pixel issues, then use the MLAA post-process to effectively kill off the ugly close-up jaggies that remain.
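To make the basic idea concrete, here is a deliberately simplified toy sketch of the edge-detect-and-blend concept that underpins MLAA. This is purely illustrative: real implementations (AMD's, Sony's, Jimenez's) classify edge shapes into patterns and compute coverage-based blend weights, none of which this sketch attempts.

```python
# Toy sketch of the MLAA idea: find sharp luminance discontinuities,
# then blend across them to soften the stair-step. NOT a real MLAA
# implementation - no pattern classification or coverage weights.
def mlaa_sketch(img, threshold=0.1):
    """img: 2D list of luminance values in [0, 1]."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w - 1):
            # Horizontal neighbour check: a big difference marks an edge.
            if abs(img[y][x] - img[y][x + 1]) > threshold:
                avg = (img[y][x] + img[y][x + 1]) / 2
                # Pull each side halfway towards the edge average.
                out[y][x] = (img[y][x] + avg) / 2
                out[y][x + 1] = (img[y][x + 1] + avg) / 2
    return out
```

A hard black/white edge such as `[[0.0, 1.0]]` becomes `[[0.25, 0.75]]`, while flat regions below the threshold pass through untouched, which is why MLAA-style filters soften jaggies without blurring the whole image the way a naive full-screen blur would.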
Going forward though, it wouldn't be surprising if AMD decides to add game-specific MLAA profiles. Because its tech runs at the driver level, it's difficult to get access to the rendered image without the HUD, but perhaps with developer assistance, game-specific setups could be deployed that minimise the artifacts and make it a more viable alternative to performance- and RAM-sapping MSAA.
While the AMD solution isn't really happy working at console resolutions, other developers are looking into producing a viable solution for the Xbox 360 and PC. Though AMD has hit the headlines with its implementation, this cross-platform PC/360 project actually pre-dates the AMD work by several months. Jorge Jimenez, Belen Masia, Jose Echevarria, Fernando Navarro and Diego Gutierrez have collaborated on an MLAA system that they say produces results approaching 8x MSAA.
They've also posted a movie, demonstrating their tech.
The developers also include timings on processing frames from a range of AAA titles, showing an average cost of 3.79ms on Xbox 360. This isn't insignificant, bearing in mind that a 30FPS title will want to render a frame in less than 33ms, and while it sounds like a figure in the same ballpark as the PS3's MLAA, it is worth remembering that Sony's tech operates on SPU and is designed to run in parallel with the RSX graphics chip. Any Xbox 360 solution will be GPU only. We asked Jorge Jimenez if it can really be described as a like-for-like alternative to Sony's offering.
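The frame-budget arithmetic is worth spelling out, using the figures above:

```python
# How much of a 30FPS frame budget does the reported 3.79ms cost consume?
frame_budget_ms = 1000 / 30   # ~33.3ms per frame at 30 frames per second
mlaa_cost_ms = 3.79           # average Xbox 360 cost reported by the team

share = mlaa_cost_ms / frame_budget_ms
print(f"MLAA share of a 30FPS frame: {share:.1%}")  # prints "MLAA share of a 30FPS frame: 11.4%"
```

In other words, over a tenth of the frame is spent on anti-aliasing alone, which is why an SPU-based approach that runs alongside the GPU is so attractive on PS3.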
"The RSX works in parallel with the Cell SPUs, that's true. But that also means the Cell SPUs aren't available while the MLAA algorithm is running. They moved the algorithm from the GPU to the CPU, giving extra time for the RSX to render better graphics (6ms as explained here), including improved shaders or increased geometry complexity," Jimenez says.
"But it also reduces time for physics calculations, artificial intelligence, etc. It is a wise decision, however, given that the RSX can easily be the bottleneck, given its weakness when compared with the PS3's CPU. Our case is similar, but we are exchanging GPU time for GPU time, as the CPU is not involved at all. In contrast with the PS3, Xbox 360 has a strong GPU when compared with its CPU, so we believe that moving the calculations to the CPU, in that particular case, would not make much sense."
Jimenez is keen to point out a couple of differences between the two implementations, however.
"Firstly, using a Cell-based MLAA requires a level of SPU and GPU management that not all games can afford. In contrast, our approach is totally straightforward. Secondly, our solution is universal, it can run in DX9, DX10, DX11, Xbox 360 and, in theory, even on the PS3. So, to make a long history short, if we consider the machine as a whole, yes, our technique is a like-for-like alternative."
But does the new GPU implementation address any of the image quality issues we observed in both the Intel proof of concept and AMD code?
"We haven't had the time to make a more in-depth study of our algorithm in motion, but looking at our demo video, we think it looks quite comparable to 8x MSAA at 720p. However, as we have said before, we are studying temporal coherence... In the material we tried, we didn't observe many distracting artifacts when handling graphics in motion," Jimenez says.
"But still, as using 8x MSAA has a cost of around 5.192 ms and our technique only requires 0.44 ms in a GeForce 9800 GTX+, that gives plenty of room to enhance the quality. We are currently working in quality improvements in the PC version (and considering a more in-depth study of our MLAA approach in motion), and performance improvements in the Xbox 360."
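Taking Jimenez's figures at face value, the headroom he describes is easy to quantify:

```python
# Comparing the reported GPU costs on a GeForce 9800 GTX+.
msaa8_cost_ms = 5.192   # 8x MSAA cost quoted by Jimenez
mlaa_cost_ms = 0.44     # the team's MLAA cost on the same hardware

speedup = msaa8_cost_ms / mlaa_cost_ms      # ~11.8x cheaper
headroom_ms = msaa8_cost_ms - mlaa_cost_ms  # ~4.75ms freed up per frame
```

That freed-up time each frame is GPU budget that can be reinvested in quality improvements elsewhere, which is exactly the trade Jimenez describes.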
On PC, the low cost of the team's MLAA means that a hybrid approach can make a lot of sense - Jimenez believes that an optimum configuration would be a 4x MSAA pass followed by the MLAA. The first pass would reduce the kind of sub-pixel issues that MLAA typically has problems with, while the second pass would tidy up additional edges.
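The hybrid configuration Jimenez suggests can be sketched as a simple pipeline. All function names here are hypothetical placeholders standing in for engine-side render passes, not any real API:

```python
# Hypothetical sketch of the hybrid AA pipeline Jimenez describes:
# hardware MSAA handles sub-pixel geometry, then an MLAA post-process
# cleans up the remaining visible edges on the resolved image.
def render_frame(scene, draw_msaa, resolve, mlaa_pass):
    msaa_target = draw_msaa(scene, samples=4)  # 1) render with 4x MSAA
    resolved = resolve(msaa_target)            # 2) resolve to a plain buffer
    return mlaa_pass(resolved)                 # 3) MLAA on the resolved image
```

The ordering matters: MLAA operates on a resolved, single-sample image, so it runs after the MSAA resolve rather than competing with it.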
The 360 MLAA work looks promising, and we hope to put the tech through its paces in a follow-up feature. Sony's MLAA work is interesting in that it takes the existing concept work and turns it into a fully-fledged solution that might not solve some of the underlying problems, but does an excellent job of working around them. If the 360/PC solution from Jimenez and his co-authors can match that quality, it would be quite an achievement. In the meantime, other developers are producing interesting work using alternative anti-aliasing algorithms, with LucasArts' DLAA looking really impressive...