One of the biggest assets of any games console in this day and age is its underlying graphical architecture. Many moons ago, it was purely a question of software, screenshots and format carrying the weight of the console. I bought my original Playstation on the strength of its launch titles and the fact that it was a CD-based 32-bit system - and I dare say a lot of you did as well. I had no particular interest in its graphical capabilities, nor in how they compared to rival machines. If my favourite developers and game genres weren't present, and it wasn't a next-generation console, I wasn't going to be satisfied.

The console industry has matured; gamers have matured. I no longer buy consoles based on games and graphics alone - I look at the big picture. If Console Y can pack more polygons into its punch than Console Z, I want to know what the real-world benefits of that are. And now that consoles seem to be converging, technologically, on a single point - the modern-day PC - graphical capabilities are going to carry the most weight. As such, the main consideration in purchasing a "next-generation" console should surely be its graphical processing power.

In this feature, we're going to take a look at what each of the next-generation consoles is packing, graphically, and pick apart the various terms which may be clouding your judgement. Hopefully we'll come to some form of conclusion about who has the upper hand as well, but it may be too tight to call. Who knows? Keep reading!
The Battle of the Brands
Sony and Microsoft are the principal adversaries in this contest. With Nintendo's Dolphin still shrouded in secrecy and uncertainty, and Sega's Dreamcast already well-established, these two are really where the battle is at. And at the moment, it's more a war of words than anything. Early impressions of Japanese PS2 launch software are less than favourable, but that is to be expected; games like Tekken Tag Tournament and Evergrace should do a lot to redress the balance. Meanwhile, everything we (literally) see of Microsoft's X-Box seems either fabricated or rendered!

The two key labels here are the Graphics Synthesizer and the custom NVIDIA 3D GPU. A lot of people are choosing to buy into the idea that Sony have something very special in the shape of the Emotion Engine, the main processor behind the PS2. It's backed up by a "Graphics Synthesizer", which runs at half the system clock speed (~150MHz) and has 4MB of onboard VRAM. In essence this is very similar to Microsoft's NVIDIA setup, which would doubtless utilize that company's "GPU" (Graphics Processing Unit) for separate graphical processing. Current GeForce 2 PC graphics cards run at a 200MHz core clock, with Double Data Rate memory at an effective 333MHz (or 166.5MHz, Single Data Rate). So it's fair to assume that even if the NVIDIA chipset present in the X-Box uses Single Data Rate memory, it will have a marginal edge; if it's Double Data Rate, things are looking even more in its favour. Either way, the 200MHz core clock puts it a fair way above the Playstation 2 (which, as stated, clocks at ~150MHz), and the Dreamcast is left burning up in the haze, some miles behind. Both setups include DVD decoding, and both are more labelled than featured, if you get my meaning. Another consideration is "FSAA", or "Full Screen Anti-Aliasing": the X-Box can do it, but the PS2 cannot. So, to keep track, the dazzling terms used (abused?) so far are...
Emotion Engine: Technically a buzzword. It's the name Sony have cooked up for the PS2's ~295MHz main processor, built on 128-bit system architecture.

GPU: Graphics Processing Unit. A term coined by NVIDIA to describe the onboard processing power of their GeForce line; many card manufacturers are now choosing to opt for a similar route. Both the Playstation 2 and the X-Box have GPUs in one form or another. Think of it as a separate processor exclusively for the graphics hardware.

Single Data Rate and Double Data Rate: Commonly referred to as SDR and DDR. DDR indicates that the memory transfers data at twice face value - so memory clocked at 166.5MHz would actually operate at an effective 333MHz, if DDR. The term SDR is only necessary to describe the standard: you need a term for what DDR isn't!

FSAA, or Full Screen Anti-Aliasing: A big industry catchphrase this year. Think of it in terms of a hill - the brow of a hill, with the sun beaming over it. In a game that didn't utilize FSAA, jagged edges where the hill curved would be clearly visible; I'm sure you all recognise that phenomenon. FSAA uses shades in between the two contrasting colours to create a visually "blurred" effect, removing those jagged edges almost entirely from view and making the image a lot less rough. On a PC this causes a performance hit, but in the case of consoles, which are built and tested on a level playing field, it may not matter.
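The SDR/DDR distinction above is really just a doubling of the numbers. A minimal sketch of the arithmetic, using the GeForce 2 figures quoted earlier (the function name is mine, purely for illustration):

```python
# Illustrative arithmetic only: DDR memory transfers data on both the
# rising and falling edge of each clock tick, so its effective rate is
# double the physical clock. SDR transfers once per tick.

def effective_rate(clock_mhz, ddr):
    """Effective transfer rate in MHz for a given physical memory clock."""
    return clock_mhz * 2 if ddr else clock_mhz

geforce2_memory_clock = 166.5  # MHz, physical clock quoted in the article

print(effective_rate(geforce2_memory_clock, ddr=True))   # 333.0
print(effective_rate(geforce2_memory_clock, ddr=False))  # 166.5
```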
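To make the hill analogy concrete, here is a rough sketch of one common FSAA approach, supersampling: render at a higher resolution, then average blocks of samples down to each final pixel. The "scene" here is just a diagonal edge between two greyscale shades - a stand-in for the brow of the hill - and none of this reflects how any particular console actually implements FSAA internally.

```python
# A toy supersampling FSAA sketch. The scene is a hard diagonal edge
# between bright (255) and dark (0) greyscale values.

def render_sample(x, y):
    """Colour of a single sample point: bright above the diagonal edge."""
    return 255 if y < x else 0

def render(width, height, factor=1):
    """Render at (width*factor) x (height*factor) sample density, then
    average each factor x factor block into one final pixel.
    factor=1 is plain aliased rendering; factor=2 is 4-sample FSAA."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            samples = [render_sample(px + sx / factor, py + sy / factor)
                       for sy in range(factor)
                       for sx in range(factor)]
            row.append(sum(samples) // len(samples))
        image.append(row)
    return image

aliased = render(4, 4, factor=1)   # every pixel is either 0 or 255
smoothed = render(4, 4, factor=2)  # edge pixels get in-between shades
```

The aliased version can only pick one of the two colours per pixel, which is exactly the jagged staircase you see on that hillside; the supersampled version produces intermediate greys along the edge, which is the "blurred" effect described above - at the cost of rendering four times as many samples.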
Treading Deep Water
I'm going to jump off a bridge here and admit that it's very confusing stuff. Because (unlike PCs) consoles are not built from the same core components, the figures and specifications each company provides may not make sense to all and sundry. So it boils down to the ratios again, and the "my stats are bigger than yours" arguments: you have to look at which machine can do the most, operate at the quickest speeds and, of course, produce the best image quality. On that reckoning, the Playstation 2 seems to suffer in light of the X-Box's specifications. The X-Box will feature an NVIDIA-based graphics engine with a separate GPU, DDR memory, and even FSAA. At the end of the day you'll have to wait, and probably pay a premium, for those features, but it's clear who's winning the next-generation war.