
GeForce 3

Preview - nVidia's latest graphical breakthrough, under the microscope


- nVidia
- Price - TBA, estimated $699

Why on earth call it the GeForce 3? That's the question I've been pondering for the last few days, since details emerged of nVidia's next GPU (Graphics Processing Unit), codename NV20, and its capabilities. As Steve Jobs of Apple so elegantly put it, "this thing is amazing".

The nVidia GeForce 3 GPU

The REAL GeForce 2

Even professionally cynical industry gossip-mongers are having difficulty speaking ill of it, with John Carmack commenting, "I haven't had such an impression of raising the performance bar since the Voodoo 2 came out". The truth of the matter is that the GeForce 3 is the true successor to the original GeForce. It's a wholly new chip architecture, unlike the GeForce 2 GTS and latterly the GeForce 2 Ultra, which are effectively padded-out versions of the original card. The third iteration offers improved speed, stability and, coming back to John Carmack again, "a ton of new features for programmers to play with".

The GeForce 3 is powered by nVidia's new nfiniteFX(TM) engine, which lets programmers write their own instructions for the GPU's Pixel Shader and Vertex Shader processors. In English, it allows them to create a virtually infinite number of special effects and custom looks, without having to rely on a preset palette of ideas.

The Vertex Shader is rather like an emotion generator - it allows even the non-focal points of a scene to show animation and personality. Facial emotion, materials stretching; it helps scenes come alive. For example, consider what humans do when we're told something and don't believe a word of it: we screw up our faces, and the skin stretches and contracts as we furrow our brows with uncertainty. Thanks to the Vertex Shader, next time someone tells QuakeGuy that he's being given a lunch break, he can furrow with us and we can watch the effect in real time.
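
To make the idea a little more concrete, here's a minimal sketch in C of the kind of per-vertex work a Vertex Shader program does - in this case blending between two facial poses (morph targets), which is roughly how that furrowed brow would be animated. The structures, names and blend factor here are purely illustrative assumptions on our part; real shader programs run on the GPU in its own instruction set, not in C.

```c
#include <stddef.h>

/* A hypothetical vertex: just a position in 3D space. */
typedef struct { float x, y, z; } Vec3;

/* Blend between two facial poses (morph targets), vertex by vertex.
 * t = 0.0 gives the neutral face, t = 1.0 the fully furrowed brow.
 * A Vertex Shader runs logic like this on the GPU for every vertex,
 * every frame - which is what makes the animation effectively free. */
void blend_pose(const Vec3 *neutral, const Vec3 *furrowed,
                Vec3 *out, size_t vertex_count, float t)
{
    for (size_t i = 0; i < vertex_count; i++) {
        out[i].x = neutral[i].x + t * (furrowed[i].x - neutral[i].x);
        out[i].y = neutral[i].y + t * (furrowed[i].y - neutral[i].y);
        out[i].z = neutral[i].z + t * (furrowed[i].z - neutral[i].z);
    }
}
```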

The GeForce 3 will debut on Apple Mac, as shown at MacWorld Tokyo

Improvements across the board

FSAA gets a boost from the GeForce 3 as well. You'll remember that 3dfx attempted to overcome the massive demands of Full-Scene Anti-Aliasing by building gigantic double-length AGP graphics cards with multiple processors and bags of memory. nVidia have overcome the bottlenecks with multi-sampling instead, one mode of which is their somewhat unfortunately named (and patented) Quincunx AA. The new technique is claimed to generate anti-aliasing samples at nearly four times the rate of the GeForce 2 Ultra, mostly thanks to the advanced memory architecture onboard.

Also getting a boost is Transform & Lighting - a buzzword long since expunged from our memories by claims and counterclaims, not to mention battered to death by uncertainty and inexplicability. The GeForce 3 delivers T&L like never before, enabling more complex, visually exciting objects and scenes. As you will no doubt remember from Eurogamer's "Noddy's Guide", Transform and Lighting are two very mathematically intensive procedures used in 3D games. The Transform phase converts 3D data from one frame of reference to another, redrawing every object with each passing frame, while the Lighting phase enhances the realism of the scene by controlling the brightness of the elements within it. Relatively few games have so far taken full advantage of hardware T&L acceleration, but you can expect more games and applications to use it once cards like the GeForce 3 are firmly embedded in everyone's PCs.
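
For the curious, here's a rough sketch in C of what those two procedures boil down to for a single vertex: a matrix transform into the camera's frame of reference, followed by a simple diffuse lighting calculation. The row-major 4x4 matrix and the single directional light are assumptions for illustration only, but this is the flavour of arithmetic the T&L hardware grinds through millions of times per second.

```c
/* A point or direction in 3D space. */
typedef struct { float x, y, z; } Vec3;

/* Transform: multiply a vertex position by a 4x4 matrix (row-major,
 * w assumed to be 1), moving it from model space into camera space. */
Vec3 transform(const float m[4][4], Vec3 v)
{
    Vec3 r;
    r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3];
    r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3];
    r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3];
    return r;
}

/* Lighting: classic diffuse shading - brightness is the cosine of the
 * angle between the surface normal and the light direction, clamped
 * so that surfaces facing away from the light stay dark. */
float diffuse(Vec3 normal, Vec3 light_dir)
{
    /* Both vectors are assumed to be unit length. */
    float d = normal.x*light_dir.x + normal.y*light_dir.y
            + normal.z*light_dir.z;
    return d > 0.0f ? d : 0.0f;
}
```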

This nice-looking fella is from Doom 3, rendered on the GeForce 3

nVictory

Compared to what little competition the industry can muster, the GeForce 3 looks like the Holy Grail. The Radeon, the GeForce 3's nearest non-nVidia competitor, is a very good effort, but as John Carmack says it has "enough shortfalls that I still generally call the GeForce 2 ultra the best card you can buy right now". The GeForce 2 Ultra, which we put under the spotlight yesterday, will be the market leader going into the release of the GeForce 3, and the jump is apparently mesmerizing. Although games that take broad advantage of the new card don't yet exist, whether to enable anti-aliasing will become a moot point - it will simply be standard. Hell, it's already pretty much standard with the GeForce 2 Ultra. The GeForce 3 with FSAA enabled is expected to mirror previous levels of performance without it.

The GeForce 3 design isn't without faults, however. It will incorporate a 128-bit DDR memory interface, and unless nVidia decide to announce a 256-bit one, or perhaps a second 128-bit interface to separate texture and T&L data from one another, the GeForce 3 will be bottlenecked by memory bandwidth yet again. The downsides of adding another bus or widening to 256-bit are power consumption, the physical space required and other low-level impacts. That said, surely nVidia have spent the last year or so working all this out? The possibility of nVidia adding a second 128-bit interface is, according to an insider, not to be precluded, but with the cost of the GeForce 3 already estimated at an astounding $700, it will be tough.
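
The arithmetic behind that bandwidth worry is simple enough to sketch. The 230 MHz DDR clock below is an assumption for illustration (final GeForce 3 memory speeds haven't been announced, and 230 MHz happens to match the GeForce 2 Ultra's memory), but it shows why the bus width matters: peak bandwidth scales linearly with it.

```c
#include <stdio.h>

/* Peak memory bandwidth in GB/s: bus width in bits / 8 gives bytes
 * per transfer, and DDR memory transfers twice per clock cycle. */
double bandwidth_gb(int bus_bits, double clock_mhz)
{
    return (bus_bits / 8.0) * clock_mhz * 2.0 * 1e6 / 1e9;
}

int main(void)
{
    /* At the same assumed clock, a 128-bit bus offers no more raw
     * bandwidth than the GeForce 2 Ultra already has - hence the
     * "bottlenecked yet again" fear. */
    printf("128-bit: %.2f GB/s\n", bandwidth_gb(128, 230.0)); /* 7.36  */
    printf("256-bit: %.2f GB/s\n", bandwidth_gb(256, 230.0)); /* 14.72 */
    return 0;
}
```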

He looks friendly...

Conclusions

What this all amounts to is that it's going to be harder to buy a graphics card this year than it was when nVidia, 3dfx and ATI were in tight competition. With the GeForce 3, everything will be quite a bit faster than it was with the GeForce 2 GTS - but for how long? And by the time people actually start taking advantage of it, won't there already be something new in the pipeline, a GeForce 3 Ultra perhaps?

nVidia have come dangerously close to doing an Intel and releasing a chipset and processing unit that is both ahead of its time and overpriced. Many would say that's exactly what they are doing. Early adopters may find themselves hung out to dry when the market wakes up to the GeForce 3's capabilities, but obviously, the smaller the uptake, the less likely any improvements are to become industry standard any time soon. Whatever the price turns out to be, the GeForce 3 will probably be the most powerful video card on the market for the next six to eight months, barring an unlikely return to fortune for any of nVidia's competition - but whether it will be worth owning in that space of time is debatable. It really is a case of waiting to see how and where it performs best before jumping in with your readies.


ELSA Gladiac Ultra Review

ELSA interview

NVIDIA interview

Noddy's Guide To Graphics Card Jargon
