GeForce 3 Titanium
Review - the very latest graphics cards from NVIDIA undergo scrutiny at EuroGamer headquarters
- NVIDIA
- Price - £TBA
A Lick Of Paint
NVIDIA have sculpted a commanding lead in the consumer graphics card industry, but in doing so they have wielded only the bluntest of weapons. The first of these is obviously speed. The key to every gamer's heart, they reason, is the framerate and resolution at which he can run his favourite games. The second is marketing. It is debatable whether they have done enough of the right sort of marketing; they have, after all, assumed an air of dominance without really managing to outdo the brand recognition of the company they bought out, 3dfx, leaving them open to attack from competitors like ATI. But their ability to sell graphics cards on features not yet employed in games certainly commands respect. Transform & Lighting sold a lot of GeForce cards, full scene anti-aliasing sold a lot of GeForce 2 cards, and now vertex and pixel shaders are selling a lot of GeForce 3 cards.

They are starting to grasp the intricacies of all-round market domination, but is it too little too late? With the introduction this afternoon of the GeForce 2 Titanium, GeForce 3 Titanium 200 and GeForce 3 Titanium 500, there now exists a card using an NVIDIA chip to fulfil every consumer need. But will the consumer instinctively turn to NVIDIA? Too many customers are slamming NVIDIA in public for its inferior 2D image quality and its extraordinary pricing. With the Titanium line the old dog has learnt some new tricks, but are they enough? The question on my lips is whether they have sewn up the graphics card market in time, or whether they have failed. If they have failed, a whole lot of heads will roll.
Born To Serve
The Titanium line serves the mainstream, performance and enthusiast segments. Gamers fall into the performance and enthusiast camps, so it stands to reason that the GeForce 2 Titanium, whose features we are fully acquainted with anyway, will be largely ignored for the purposes of this article. The GeForce 3 Titanium 200 and 500, however, are aimed squarely at the cutting edge of gaming.

The Ti200 and Ti500 are very high performance cards. The Ti500 can handle 960 billion operations every second and pump out 3.84 billion anti-aliased sub-pixel samples per second. Coupled with 64MB of 500MHz memory, the card improves the frame rate in Quake 3 at what many consider to be its peak resolution (1600x1200 in 32-bit colour) by over 40 frames per second compared to the vanilla GeForce 3, bringing the total to almost 125fps. It adds another 1500 3DMarks at 1024x768 in 32-bit colour, giving close to 7500 in the popular gaming benchmark. Throw something like Max Payne or our preview version of Aquanox at it - games that gave even the GeForce 3 a hard time - and it barely breaks into a sweat.

Here is where the first of NVIDIA's weapons comes into play. The Ti500 is the fastest consumer graphics card on the market, with the potential to dwarf even the achievements of ATI's rival Radeon 8500, while the Ti200 gives its nearest competitor, the Radeon 7500, a run for its money. On all four cards anti-aliasing is a perfectly feasible option, with NVIDIA edging the ATI cards out ever so slightly. Their now familiar unified drivers also make things much simpler for users. The Detonator XP drivers, which have just launched, serve every NVIDIA GPU with the exception of the portable GeForce 2 Go, and build in support for DirectX 8.1, hardware-specific OpenGL revisions and Windows XP.
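Those headline figures can be sanity-checked with a little arithmetic. Assuming the Ti500's widely reported 240MHz core clock, four pixel pipelines and four sub-pixel samples per pixel - our assumptions for illustration, not NVIDIA's published breakdown - the quoted sample rate falls out directly, and the same goes for memory bandwidth if the 500MHz figure is the effective DDR rate across a 128-bit bus:

```python
# Back-of-the-envelope check on the quoted Ti500 figures.
# Clock, pipeline and sample counts are assumptions for illustration.

core_clock_hz = 240e6     # assumed 240MHz core clock
pixel_pipelines = 4       # assumed four pixel pipelines
aa_subsamples = 4         # assumed four sub-pixel samples per pixel

aa_samples_per_sec = core_clock_hz * pixel_pipelines * aa_subsamples
print(aa_samples_per_sec)  # 3.84 billion per second, as quoted

# Memory bandwidth: 500MHz effective DDR across a 128-bit (16-byte) bus.
mem_clock_hz = 500e6
bus_width_bytes = 16
bandwidth_bytes = mem_clock_hz * bus_width_bytes
print(bandwidth_bytes / 1e9)  # 8.0 GB per second
```

It is just multiplication, but it shows how the marketing numbers relate to the clocks on the box.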
In the process NVIDIA have improved their latest cards' ability to overcome bandwidth bottlenecks by optimising the drivers for the Lightspeed Memory Architecture, which you can read about in our previous GeForce 3 review. Apart from a couple of issues (including a refresh rate problem that limits punters to rates below their monitor's abilities), the unified driver programme is an excellent way of giving the customer an easy option: here is the latest driver for your NVIDIA card, and it is the only file you need.
NVIDIA cards always have something new to peddle. Effectively you are shopping for features to use in six months' time, but since the same will be true of the next card they release, you might as well bite the bullet now, six months ahead of widespread implementation. It's a little clumsy though; you can't create an installed base of cards with features that mean nothing to the consumer. You need something to wow them with, so you have to hype them up. Everybody will be using Transform & Lighting in six months, you claim. Are consumers wise to this yet? If they are, there's little evidence of it. Punters will buy into the new features even if they can't use them yet, especially when people like id Software's John Carmack come out and say that they are building their future games around the GeForce 3.

The GeForce 3 Titanium series cards introduce two major new features: the shadow buffer and the 3D texture. Shadow buffering is simply a way for the cards to display images with physically correct shadows with nice soft edges, instead of the hard outlines seen in most current games. It does this by rendering the scene from the perspective of the light source, rendering it again from the perspective of the viewer, comparing the two and then building the shadows based on this. 3D textures are more complex, heightening the realism of objects and the environment by using textures that store information in the third dimension. This means that instead of simply taking a 3D object and mapping a 2D texture over its surface, games can know what the inside of the object looks like as well, which is obviously important if you want to slice it open or break it up. Coupled with the first instance of 3D MIP mapping - the process of averaging a small and a large texture to fill in the gaps - and quadrilinear filtering, objects take on a new depth. More realistic volumetric lighting, fog, smoke and clouds, and better caustics are amongst the benefits.
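The two-pass comparison behind shadow buffering can be sketched in a few lines. This is purely an illustrative toy - a one-dimensional "scene" in plain Python, with invented names like `build_shadow_map` and `soft_lit` - not the hardware implementation, but it shows the depth comparison and the neighbour-averaging that softens the shadow edges:

```python
# Toy sketch of the shadow-buffer technique, reduced to a 1D scene.
# All names and numbers here are invented for illustration.

def build_shadow_map(scene_depths_from_light):
    # Pass 1: render from the light's point of view, keeping only the
    # nearest depth per "pixel". That grid of depths is the shadow buffer.
    return list(scene_depths_from_light)

def lit(shadow_map, light_space_x, light_space_depth, bias=0.01):
    # Pass 2: project a point seen by the viewer into light space and
    # compare its depth with the stored one. If something sits closer
    # to the light, the point is in shadow.
    return light_space_depth <= shadow_map[light_space_x] + bias

def soft_lit(shadow_map, x, depth, radius=1):
    # Soft edges: average several neighbouring comparisons instead of
    # taking one hard yes/no answer per pixel.
    taps = range(max(0, x - radius), min(len(shadow_map), x + radius + 1))
    hits = [lit(shadow_map, t, depth) for t in taps]
    return sum(hits) / len(hits)  # 0.0 = fully shadowed, 1.0 = fully lit

# An occluder at depth 0.5 under pixel 1 shadows a point at depth 0.9
# there, while the neighbouring pixels remain lit; soft_lit blends them.
shadow_map = build_shadow_map([1.0, 0.5, 1.0])
print(lit(shadow_map, 1, 0.9))       # False: in shadow
print(soft_lit(shadow_map, 1, 0.9))  # fractional: a soft edge
```

The real cards do all of this per-pixel in hardware, of course, but the principle - render twice, compare depths, average the results - is the same.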
Things are going according to plan for NVIDIA; the GeForce 3 Titanium features faster, grittier visuals that will improve the experience for the consumer and give them something to look forward to in the future as well. Thanks to the unified driver template, consumers will never have to worry about support either. Even the newest operating systems are fully supported, with special optimisations to wring the best performance out of the recently released Windows XP. By contrast ATI's driver site is muddled and difficult to understand, and some of their cards aren't even mentioned by name. The only concern for NVIDIA here is whether its resellers (companies like ELSA, Hercules and co.) will get the hang of pointing consumers to the Detonator XP driver set. Reference drivers may not suit the needs of the reseller, but widespread confusion is sure to propagate unless they defer control of their video driver contents to NVIDIA. Ultimately, NVIDIA do have a virtual monopoly on graphics cards at the moment, and the GeForce 3 Titanium cards will come in at enthusiast and performance pricing, meaning all-round accessibility to meet their main competitor ATI head on. The strategy of improving performance and adding complicated new features has come up trumps again; the Titanium is an excellent graphics card. Control of the market is, appropriately enough, now in the hands of marketing.