
Nvidia GeForce GTX Titan X review

Power extreme.

On paper, the possibilities offered by Titan X are hugely exciting. Nvidia already has the fastest single-chip GPU on the market - its GTX 980 is comfortably ahead of the best that AMD has to offer - but its latest GPU promises a potentially massive bump to performance. The GM200 chip at the heart of Titan X offers a 50 per cent spec boost over the GM204 inside the 980 in virtually every regard: CUDA core count, ROPs, memory bandwidth - you name it, there's 50 per cent more of it here. In effect, Nvidia has combined a GTX 980 and a GTX 960 in a single piece of silicon.
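As a sanity check on that 50 per cent figure, here's a minimal sketch using the widely published specifications for the two chips - the numbers are Nvidia's quoted specs rather than anything measured for this review:

```python
# Spec ratio of GM200 (Titan X) to GM204 (GTX 980), using Nvidia's published figures.
gtx_980 = {"cuda_cores": 2048, "rops": 64, "memory_bus_bits": 256}
titan_x = {"cuda_cores": 3072, "rops": 96, "memory_bus_bits": 384}

for key in gtx_980:
    print(f"{key}: {titan_x[key] / gtx_980[key]:.2f}x")  # 1.50x across the board

# For reference, the GTX 960's GM206 (1024 cores, 32 ROPs, 128-bit bus) is exactly
# the difference between the two chips - hence the '980 plus 960' comparison above.
```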

Of course, you pay for the privilege - US pricing is set at $999 [UPDATE 18/3/15 2:10pm: UK pricing has just been announced at £879]. Nvidia never gives away its latest and greatest at bargain-basement prices, and with the Titan brand it has always charged as much money as it reasonably believes it can. The notion of paying quite so much for a graphics card is sure to rankle, but it's worth remembering that for every Titan, there's a more reasonably priced GeForce product around the corner - the discontinued GTX 780 and 780 Ti bear testament to that, both of them using the same GK110 silicon as the first Titan.

Interest in the new card centres on two specific areas. Firstly, there's the question of just how much performance can be squeezed from GM200, and how much of a leap Nvidia's ultimate iteration of its Maxwell architecture represents. And secondly, there's that gargantuan 12GB allocation of GDDR5 video RAM. Is it profligate overkill with no use whatsoever, or the ultimate future-proofing solution in a world where the current-gen consoles have redefined just how much VRAM is required? Well, it is clearly an immense amount of memory, but Titan X works best at extreme resolutions, and our 4K testing suggests that the 4GB found in the GTX 980 isn't quite enough to service 4K gaming on at least one of the games we tested. Meanwhile, other games use the memory as a vast cache - we spotted Call of Duty Advanced Warfare using up to 8.5GB of VRAM.

First impressions of the card are positive. Titan X ships with the standard high-end Nvidia casing, this time presented in a black aluminium finish. The 12GB of GDDR5 takes the form of 24 512MB memory modules encircling GM200 on both the front and the rear of the PCB, but otherwise it looks very similar indeed to other premium Nvidia cards - though the plastic backing plate found on the reference GTX 980 is gone. Power is supplied via a six-pin and an eight-pin connector - standard Titan territory.

In terms of acoustics, Titan X sounds much like its Titan Black/GTX 780 Ti predecessors, though there is some 'whine' from the board when the card is under load, despite Nvidia's use of polarised capacitors and moulded inductors. Curiously, the noise seems to come and go - during our overclocking tests, we didn't notice it at all. While the overall chassis looks very similar indeed to existing Nvidia products, we're told that airflow is improved over the last-gen Titan, resulting in better overclocks. The firm claims that a 1400MHz overclock is possible, a claim we found to be true after extensive testing (though Titan X really doesn't like to be pushed beyond this limit).

Kicking off with our performance tests, we load up our 'go-to' title for an initial assessment of the card's abilities. The GTX 980 made mincemeat of our Crysis 3 maxed 1080p60 challenge, with just a couple of performance dips, so we went one step further with Titan X, targeting a 2560x1440 resolution. The Crysis test is somewhat different from a standard benchmark run. Instead of turning off v-sync and running the GPU as fast as we possibly can, we aim for consistency, matching the resolution and refresh rate of the monitor for the smoothest experience - meaning v-sync is engaged.

Stacked up against Titan X we have its nearest competitor, the GTX 980, along with AMD's best offering, the Radeon R9 290X. For the latter, we only have a reference card, known for overheating - something the third-party models generally avoid. To take throttling out of the equation, we ramp the fan up to the 65 per cent level - adding an unwelcome mini-vac style noise to the room, but ensuring that the GPU never breaks 77 degrees Celsius.

Overall results demonstrate conclusively that while Titan X can't quite match the experience we've previously enjoyed with a multi-card Titan set-up, it's still by far the most capable card of the three, handing in a much smoother, more consistent gameplay experience. So there's a clear jump in performance between Titan X and GTX 980 that's felt in the way that even the most demanding games play - but the question is to what extent it represents a tangible leap over its stable-mate.

[Video: [60fps] Crysis 3 Titan X vs GTX 980/R9 290X 1440p Gameplay Frame-Rate Test]
In our Crysis 3 gameplay tests, the idea isn't to run at the absolute fastest frame-rates possible, but rather to synchronise as closely as possible with the display in terms of resolution and refresh - in this case, operating at 2560x1440 at 60Hz. We're using the top-end very high preset, with very high textures, SMAA T2x anti-aliasing and v-sync.
Crysis 3 1440p60/V-Sync Gameplay | GTX Titan X | GTX 980 | R9 290X
Lowest Frame-Rate | 44.0fps | 31.0fps | 32.0fps
Dropped Frames (from 18650 total) | 477 (2.56%) | 3852 (20.66%) | 5560 (29.80%)
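For clarity, the dropped-frame percentages above are simply the number of dropped frames divided by the total frames in the v-synced 60Hz capture - a quick sketch of the calculation using the counts from the table:

```python
# Dropped-frame percentages from the Crysis 3 1440p60 v-sync run (counts from the table above).
total_frames = 18650  # roughly five minutes of 60Hz capture
dropped = {"GTX Titan X": 477, "GTX 980": 3852, "R9 290X": 5560}

for card, count in dropped.items():
    print(f"{card}: {count / total_frames:.2%} dropped")
# GTX Titan X: 2.56%, GTX 980: 20.65%, R9 290X: 29.81% - matching the table to within rounding.
```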

UPDATE 18/3/15 11:06: We've made some changes to our benchmarks below in terms of comparisons with the GTX 980. Our initial text was based on performance from an MSI Gaming 4G version of the GTX 980, which is faster than the reference design. While the difference isn't vast, it does have an impact on the percentage boost you should expect to see from Titan X. We've incorporated both sets of measurements so you can see the difference, and adjusted the article to include comparisons with both versions of the card. The SLI and Crysis tests remain unchanged as they used the reference 980.

Moving into pure benchmarks, what becomes clear is that a 50 per cent boost to specification doesn't translate into a linear increase in raw performance. GPU power is increasing at a phenomenal rate year-on-year, but other hardware and software within the PC isn't. To cut to the chase, the lower the resolution you run your games at, the less pronounced Titan X's advantage is over the GTX 980.

Looking at 1080p results against the GTX 980 reference card, only Ryse breaks the 30 per cent improvement barrier, with most of the other titles lurking just below. Far Cry 4 and Call of Duty Advanced Warfare only see a 16-17 per cent increase in overall frame-rates. On average across all nine games, the Titan X is only 26 per cent faster than the GTX 980, and only 21 per cent faster than the MSI Gaming variant. Nvidia's reviewers' guide offers healthy comparisons with AMD's Radeon R9 290X and we can see why - in our tests, the average performance boost at full HD leaps to a colossal 53 per cent.

The evidence seems to suggest that we're bumping into the limits of game design and the graphics API. Most games are optimised for at least quad-core processors these days, but DirectX 11 is still built around the principle of a fast single thread feeding all of the others - it's for this reason that six- and eight-core CPUs offer little advantage over a quad, something we hope to see resolved with DX12. But in the here and now, results suggest that unless you really like MSAA or other bandwidth-intensive effects, you can achieve an experience relatively close to the Titan X in many cases by overclocking a much less expensive GTX 980.

[Video: Titan X vs GTX 980/R9 290X/GTX 970 1080p Benchmarks]
1080p is the most popular gaming resolution, but it doesn't make full use of the Titan X's resources. We record a 26 per cent boost over the GTX 980 on average (21 per cent vs the custom MSI GTX 980), but a whopping 53 per cent boost over the R9 290X. Overclocking sees us hit system limits elsewhere within the PC - on average we gain just 13 per cent over stock performance.
1920x1080 (1080p) | R9 290 | R9 290X | GTX 970 | GTX 980 | MSI GTX 980 | GTX 980 SLI | Titan X | Titan X OC
Battlefield 4, Ultra, 4x MSAA | 64.3 | 68.0 | 74.6 | 86.5 | 89.6 | 138.1 | 111.2 | 121.9
Crysis 3, Very High, SMAA | 65.4 | 70.2 | 71.4 | 81.5 | 85.9 | 113.7 | 105.0 | 120.5
Assassin's Creed Unity, Ultra High, FXAA | 38.0 | 42.6 | 51.7 | 62.4 | 62.6 | 100.0 | 77.6 | 90.4
Far Cry 4, Ultra, SMAA | 72.2 | 75.0 | 77.4 | 87.4 | 92.0 | 103.5 | 101.5 | 104.9
COD Advanced Warfare, Extra, FSMAA | 90.5 | 92.9 | 117.8 | 128.0 | 134.6 | 110.2 | 149.9 | 158.9
Ryse: Son of Rome, High, SMAA | 69.2 | 74.8 | 65.1 | 75.8 | 80.4 | 116.9 | 99.2 | 113.9
Shadow of Mordor, Ultra, High Textures, FXAA | 82.8 | 88.3 | 80.8 | 91.7 | 95.9 | 136.3 | 118.9 | 137.7
Tomb Raider, Ultimate, FXAA | 84.1 | 89.0 | 102.4 | 118.2 | 123.8 | 208.9 | 150.4 | 180.8
Metro Last Light Redux, Max, Post-AA | 72.6 | 77.5 | 79.4 | 91.3 | 96.9 | 130.3 | 117.5 | 137.8
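If you want to reproduce the averages quoted above, the calculation is just the per-game ratio of Titan X to the reference GTX 980, averaged across the nine titles - a minimal sketch using the 1080p figures from the table (the same approach applies to the 1440p and 4K tables further down):

```python
# Average Titan X uplift over the reference GTX 980 at 1080p, from the table above.
# Order: BF4, Crysis 3, ACU, Far Cry 4, COD AW, Ryse, Mordor, Tomb Raider, Metro.
gtx_980 = [86.5, 81.5, 62.4, 87.4, 128.0, 75.8, 91.7, 118.2, 91.3]
titan_x = [111.2, 105.0, 77.6, 101.5, 149.9, 99.2, 118.9, 150.4, 117.5]

gains = [(t / g - 1.0) * 100 for t, g in zip(titan_x, gtx_980)]
print(f"average uplift: {sum(gains) / len(gains):.0f}%")  # ~26%, as quoted in the text
```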

Moving up to 2560x1440 - 2.5K or 1440p, if you will - the overall average increase in performance rises to 29 per cent over the reference GTX 980, but only 23 per cent against the MSI Gaming variant. Interestingly, the differential with the R9 290X drops to 48 per cent. The higher the resolution, the more the GTX 980's bandwidth limitations come into play and, conversely, the more performance you get from the R9 290X's mammoth 512-bit memory bus.

To be honest, we had expected the GTX 980's performance to drop back rather more significantly, but what's clear is that Nvidia's compression technology really pays off, even at 1440p. Overclocking the GTX 980 would be an option for even better performance, of course, but it turns out that Titan X is a capable overclocker too. We added 230MHz to the core clock, 465MHz to RAM and ramped up the power limit to 110 per cent (unfortunately no voltage boost options were available). At 1080p, this only gave us a 13 per cent increase over stock performance, but at 1440p that rose significantly to 18.2 per cent.

And here's where things get really interesting. With the overclock in place, Titan X compares favourably with two GTX 980s operating in SLI - even beating the dual-GPU set-up in games like Crysis 3 and Call of Duty Advanced Warfare. The COD situation highlights another advantage of single-chip over SLI - the fact that some games simply don't work properly with it. In truth, in that game a single GTX 980 offers a better experience than the SLI set-up, owing to the insane stutter the multi-GPU configuration incurs. It's worth taking a look at our Titan X overclock vs GTX 970/980 video - the combination of more RAM and no need to synchronise two cards results in smoother performance overall.

[Video: Titan X vs GTX 980/R9 290X/GTX 970 1440p Benchmarks]
Moving up to 1440p, the Titan X begins to stretch its legs. The lead over the GTX 980 grows to 29 per cent (but only 23 per cent with the MSI Gaming GTX 980), and the overclock gives us an average 18 per cent boost over stock. However, the lead over the R9 290X diminishes to 48 per cent.
2560x1440 (1440p) | R9 290 | R9 290X | GTX 970 | GTX 980 | MSI GTX 980 | GTX 980 SLI | Titan X | Titan X OC
Battlefield 4, Ultra, 4x MSAA | 43.4 | 46.8 | 48.9 | 57.0 | 59.6 | 100.8 | 75.1 | 87.0
Crysis 3, Very High, SMAA | 42.0 | 45.1 | 43.0 | 50.0 | 53.7 | 73.6 | 67.0 | 78.6
Assassin's Creed Unity, Ultra High, FXAA | 26.6 | 29.5 | 32.8 | 39.4 | 41.1 | 71.0 | 51.6 | 60.7
Far Cry 4, Ultra, SMAA | 53.2 | 57.8 | 53.9 | 61.3 | 65.6 | 101.0 | 78.6 | 93.6
COD Advanced Warfare, Extra, FSMAA | 77.2 | 81.5 | 87.0 | 98.2 | 102.0 | 95.4 | 114.6 | 133.8
Ryse: Son of Rome, High, SMAA | 51.8 | 55.6 | 46.3 | 54.1 | 57.9 | 94.4 | 71.4 | 84.8
Shadow of Mordor, Ultra, High Textures, FXAA | 61.3 | 65.7 | 57.0 | 66.0 | 68.9 | 104.6 | 85.3 | 102.6
Tomb Raider, Ultimate, FXAA | 56.9 | 62.4 | 65.5 | 76.7 | 81.0 | 142.2 | 101.7 | 122.4
Metro Last Light Redux, Max, Post-AA | 45.9 | 48.9 | 48.9 | 58.3 | 60.0 | 93.0 | 74.5 | 87.9

It becomes clear that 4K gaming is the challenge that Titan X has been waiting for. We knock quality settings down a notch for each title (ultra-level settings are generally overkill from an image quality perspective and can hit frame-rate hard) in order to give us playable frame-rates, and find that the Titan X's relative performance increase over the reference GTX 980 hits 30 per cent on average across all nine games, while its lead over the R9 290X drops to 41 per cent. We also get our best overclocking results here - the average gain across titles rises to 21 per cent. Curiously, the MSI variant of the GTX 980 fails to offer any increase over the reference card here, presumably because bandwidth, rather than compute, becomes the bottleneck at this extreme pixel count.

Outside of compute, it's clear that the massive increase in resolution (4x 1080p, 2.25x 1440p) really puts the GM200 architecture through its paces, and those 96 ROPs and the 384-bit memory bus get to stretch their legs. At 4K, the benchmark comparisons with the GTX 980 SLI set-up really show the card's strengths - frame-rates are competitive but as you can see from the videos (which also track frame-times - more indicative of the actual gameplay experience), the overall consistency in performance is significantly improved. Take Assassin's Creed Unity, for instance. GTX 980 SLI frame-rates are higher than the overclocked Titan X by nine per cent, but it comes at a cost - significant stutter. In this case, we suspect that ACU at 4K is tapping out the 4GB of RAM on the GTX 980, while Titan X has no real memory limitations at all.
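Those resolution multipliers come straight from the raw pixel counts - a trivial check:

```python
# Pixel-count ratios behind the '4x 1080p, 2.25x 1440p' figures quoted above.
pixels = {name: w * h for name, (w, h) in
          {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items()}

print(pixels["4K"] / pixels["1080p"])  # 4.0
print(pixels["4K"] / pixels["1440p"])  # 2.25
```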

Overall then, Titan X - and by extension, future GM200 products - gives us something we've been looking forward to for some time: the ability to play at 4K with decent quality settings and a sporting chance of hitting and sustaining 60fps. Pair Titan X with a G-Sync monitor and you're looking at an experience that no other GPU is currently capable of achieving.

[Video: Titan X vs GTX 980/R9 290X/GTX 970 4K Benchmarks]
At 4K, the GPU becomes more of a bottleneck, so the bandwidth, ROPs and RAM of the Titan X come into play. Across our nine test games we record an average 29 per cent performance boost over the GTX 980, while the overclock yields a 21 per cent increase over stock performance. Titan X's lead over the R9 290X drops again, down to an average of 40 per cent.
3840x2160 (4K) | R9 290 | R9 290X | GTX 970 | GTX 980 | MSI GTX 980 | GTX 980 SLI | Titan X | Titan X OC
Battlefield 4, High, Post-AA | 36.9 | 39.4 | 39.5 | 46.8 | 47.0 | 76.2 | 61.4 | 73.4
Crysis 3, High, SMAA | 33.9 | 36.0 | 31.9 | 39.0 | 39.4 | 49.6 | 51.8 | 61.8
Assassin's Creed Unity, Very High, FXAA | 16.6 | 18.1 | 18.4 | 21.8 | 22.2 | 37.9 | 27.9 | 34.7
Far Cry 4, Very High, SMAA | 33.3 | 36.0 | 30.0 | 36.1 | 36.9 | 65.0 | 45.9 | 56.4
COD Advanced Warfare, Console Settings, FXAA | 60.2 | 62.1 | 59.6 | 69.0 | 69.8 | 63.3 | 85.3 | 100.7
Ryse: Son of Rome, Normal, SMAA | 31.2 | 34.0 | 25.7 | 31.5 | 31.8 | 53.2 | 41.5 | 50.0
Shadow of Mordor, High, High Textures, FXAA | 41.5 | 44.8 | 35.3 | 42.3 | 42.1 | 68.4 | 56.2 | 67.1
Tomb Raider, Ultra, FXAA | 37.6 | 41.0 | 39.0 | 47.0 | 46.8 | 78.8 | 60.7 | 75.6
Metro Last Light Redux, High, Post-AA | 30.7 | 32.4 | 29.9 | 37.3 | 36.9 | 62.0 | 48.9 | 58.5

GM200 is often referred to as 'Big Maxwell' - the largest slice of silicon Nvidia produces for its current architecture. It's a bit of a beast, featuring eight billion transistors and an overall area somewhere in the region of 600mm2. You'd expect it to suck the electricity from the wall and to give off a good degree of heat, but it retains the power efficiency the architecture has become famous for. Using the Metro Last Light Redux benchmark sequence as the basis for our like-for-like measurements, we find that peak system power draw on a Core i7 4790K system is pretty much identical to the Radeon R9 290X - not bad considering that you're getting 40 to 50 per cent more performance. It's also considerably more economical than running two GTX 980s in SLI.

Overclocking sees the card draw an additional 40W from the wall at peak load, taking us over the 400W threshold for the entire system. This also sees the fan kick in at a higher level - louder, but still not particularly intrusive. We can't help but feel that the GM200 chip has more to give - there were no overvolting options available in MSI Afterburner (not yet, at least), and while temperatures could hit 85 degrees Celsius with the system left to look after fan control itself, letting the MSI tool take over the cooling ratcheted the fans up just a touch and kept temperatures below 80 degrees.

Results from the reference card are impressive then, but we'd really like to see what a customised Titan X could do with the MSI Twin Frozr cooler or Asus' Strix design, to name two prominent examples.

 | R9 290 | R9 290X | GTX 970 | GTX 980 | GTX 980 SLI | Titan X | Titan X OC
Peak System Power Draw | 340W | 363W | 265W | 265W | 454W | 364W | 402W
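As a rough illustration of the efficiency point, here's a sketch combining the peak system power figures above with the Metro Last Light Redux results from the 1440p table - bear in mind these are whole-system readings at the wall rather than isolated GPU power, so treat the ratios as indicative only:

```python
# Rough frames-per-system-watt comparison: peak system power (table above)
# versus Metro Last Light Redux 1440p frame-rates (from the 1440p table).
peak_system_watts = {"R9 290X": 363, "GTX 980": 265, "Titan X": 364}
metro_1440p_fps = {"R9 290X": 48.9, "GTX 980": 58.3, "Titan X": 74.5}

for card, watts in peak_system_watts.items():
    print(f"{card}: {metro_1440p_fps[card] / watts:.3f} fps per system watt")
# Titan X draws essentially the same as the R9 290X here while delivering ~50% more frames.
```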

Nvidia GeForce GTX Titan X - the Digital Foundry verdict

Titan X is clearly an impressive piece of technology, but for many, the big takeaway will be that the leap from GTX 980 to Titan X in gameplay terms doesn't seem quite as pronounced as the gap between the last-gen equivalents - the original Titan was anything from 35 to 50 per cent faster than the GTX 680. This may well be down to our benchmarking tests being rather more stringent these days, plus the fact that we're using more modern, demanding games - but to get anything close to the same leap in performance, you need to be using an ultra-high-resolution display, or else downsampling from 4K. With a 25 to 30 per cent increase in performance over the GTX 980 at 1080p and 1440p (shrinking somewhat if you have a factory-overclocked card), the premium price-point for the full halo product is going to be even harder to justify than the original Titan's.

However, there's enough of a leap here to make any forthcoming, more reasonably priced GTX 980 Ti or GTX 990 (based on the same chip) look quite enticing, particularly when the results of Titan X are stacked up against the performance of GTX 970 and GTX 980 in two-card SLI configurations. The single-card set-up is preferred amongst enthusiasts owing to higher levels of compatibility and less stutter, but what we're seeing with Titan X is that with an overclock in place, performance is broadly comparable with an SLI set-up, frame delivery is smoother and, in the case of Assassin's Creed Unity at least, there are no memory limitation issues at 4K. And then there's the potential of running a GM200 card in SLI too - conceivably we could hit 4K at 60fps on even the most demanding titles.

Overall, Titan X seems to demonstrate that while GPU hardware continues to scale, the surrounding hardware and software inside our PCs isn't progressing at anything like the same rate. With modern games really pushing harder on both CPU utilisation and DirectX 11's capabilities, it seems we're hitting performance bottlenecks that possibly aren't related to the graphics hardware. The fact that Titan X's biggest gains come at 4K - where we are far more likely to hit the GPU as the limit - appears to bear this out, but we suspect that it'll require DX12 to really show us what this hardware is fully capable of.