
Nvidia GeForce GTX 960 review

The green team's 1080p 'sweet spot' card is good, but not great.

UPDATE 18/10/15 12:20pm: We've now had the opportunity to review both 2GB and 4GB versions of the GeForce GTX 960, where we compare them with AMD's updated rival, the R9 390 - also available in 2GB/4GB SKUs. Check out this new review for a more recent take on Nvidia's 1080p 'sweet spot' hardware.

Original story: Nvidia's GeForce GTX 970 took no prisoners, reshaping the high-end desktop graphics market by outperforming both AMD's R9 290 and its top-end 290X, brutally under-cutting both with an excellent price-point. Its only drawback? At around £250, the value on offer was - and is - tremendous, but that's still a hefty outlay for a graphics card, its charms remaining out of reach for the majority of PC gamers. All eyes were on Nvidia to deliver the same kind of seismic shift to the GPU market at the £150-£180 sweet spot.

The bad news? GTX 960 doesn't offer the same kind of mind-boggling value as its pricier sibling. The good news? It's keenly priced for its position in the marketplace, offering competitive - though not exactly spectacular - performance. Despite its lack of a killer edge, the GTX 960 shouldn't be written off - it has charms of its own that AMD cannot offer, particularly in terms of power efficiency. With a 120W TDP and a relatively meagre power draw, this card runs cooler and quieter than its competition, draining far less juice from the mains. Even running in concert with an overclocked Core i7 CPU, total system power consumption is still under 200W - a remarkable achievement.

The arrival of the GTX 960 sees the debut of a new mid-range graphics core based on the Maxwell architecture, dubbed GM206, fabricated on the existing, mature 28nm process and featuring eight SMM CUDA core clusters for a total of 1024 processors. That's up against 2048 cores in the top-end GTX 980, and 1664 found in the GTX 970. ROPs are pared back from 64 to 32, while the memory interface is compromised too - there's a 128-bit interface here as opposed to the 256-bit version found in the higher-end cards.

Memory bandwidth is the key concern then, owing to the constricted interface. Nvidia's solution? To begin with, it's using top-end 7gbps GDDR5 modules - pretty much the fastest RAM the firm has access to. On top of that, the second-gen Maxwell memory compression interface is in full effect, with Nvidia offering a notional 9.3gbps throughput as data between GPU and RAM is compressed and decompressed on the fly.
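
Strictly as a back-of-the-envelope sketch based on the figures quoted above (our own arithmetic, not an official spec sheet), the bandwidth sums work out like this:

```python
# Peak memory bandwidth: bytes per transfer multiplied by the data rate.
# The figures are those quoted above - a 128-bit bus, 7gbps GDDR5 modules and
# Nvidia's notional 9.3gbps 'effective' rate once colour compression is counted.

def bandwidth_gb_per_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

print(f"GTX 960, raw:       {bandwidth_gb_per_s(128, 7.0):.1f} GB/s")   # 112.0
print(f"GTX 960, effective: {bandwidth_gb_per_s(128, 9.3):.1f} GB/s")   # 148.8
print(f"GTX 970/980, raw:   {bandwidth_gb_per_s(256, 7.0):.1f} GB/s")   # 224.0
```

Even with the compression claim taken at face value, the GTX 960 still trails the 256-bit cards on paper - which is exactly why the 7gbps modules and the compression tech matter so much here.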

The GTX 960's clock-speeds are broadly comparable to its bigger brothers, with core clock running at 1126MHz, boosting up to 1178MHz if thermal headroom allows (and it almost certainly will). Nvidia reckons that the new card is a bit of an overclocking monster, and to that end, many of the 960s reaching the market are factory overclocked out of the box. Our review unit - an MSI GTX 960 Gaming 2G - actually has 100MHz of extra clock-speed added as standard, and there's additional OC headroom on top of that, with Nvidia stating that 1450MHz is achievable with ease, with no fan speed or voltage increases required.

Kicking off our performance testing, we turn to our 'go to' game for hardware stress-testing - Crytek's Crysis 3, running on our new test system featuring a Core i7 4790K running at 4.6GHz, working in combination with 16GB of DDR3 RAM operating at 1600MHz. A fully updated Windows 8.1 is our base operating system, running from a 512GB Crucial MX100 SSD. Our aim with Crysis 3 is to play the game at as close to a locked 1080p60 as possible with v-sync engaged, matching the most popular gaming PC monitor resolution and refresh rate used by gamers today. In order to do this, we need to run Crysis 3 at the high quality settings, one notch down from the maxed-out very high we used in our GTX 970 and GTX 980 testing, but otherwise identical.

Stacked up against Nvidia's new card are its two AMD competitors in the same price segment: the recent Radeon R9 285, a 2GB card with a 256-bit memory bus based on the new Tonga architecture, and the older (but arguably more desirable) Radeon R9 280 - effectively a rebadged Radeon HD 7950, a definite 'oldie but goodie' in GPU terms, based on the Tahiti design with its mammoth 384-bit memory interface and 3GB of onboard RAM. AMD's newer card is faster in some benchmarks, but not remarkably so - and in the age of PS4 and Xbox One's unified RAM set-ups, the more GDDR5 memory you have the better, making the cheaper R9 280 our preferred buy of the two AMD products tested here.

Despite its smaller memory bus and massively reduced power consumption, the GTX 960 is clearly competitive with its AMD rivals, but unlike the GTX 970, there is no conclusive 'winner' in our three-way face-off in terms of the overall quality of the gameplay experience. Looking at the metrics reveals that the 960 wins in terms of the fewest number of dropped frames overall, but the numbers between all three contenders are very close to the point where we suspect we're well within the margin of error. We'll need to go deeper to separate these offerings.

[Video] Crysis 3: GTX 960 vs R9 285/R9 280 Gameplay Frame-Rate Test (60fps)
Crysis 3 running at 1080p on high settings with very high quality textures in concert with v-sync and SMAA T2X anti-aliasing. Here we're attempting to run an extremely demanding game with resolution and refresh rate matched to the most popular monitors used for gaming today. None of our tested cards can claim outright victory here and the overall experience is much of a muchness.
Crysis 3 1080p60/V-Sync Gameplay GTX 960 R9 280 R9 285
Lowest Frame-Rate 40fps 40fps 38fps
Dropped Frames (from 18650 total) 689 (3.7%) 840 (4.5%) 724 (3.9%)
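
For reference, the dropped-frame percentages in the table are simply each card's dropped count over the 18,650 frames captured:

```python
# Dropped-frame percentages from the Crysis 3 1080p60/v-sync test above.
total_frames = 18650
dropped = {"GTX 960": 689, "R9 280": 840, "R9 285": 724}

for card, count in dropped.items():
    print(f"{card}: {count / total_frames:.1%} of frames dropped")
# GTX 960: 3.7%, R9 280: 4.5%, R9 285: 3.9%
```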

This brings us quite nicely to our revised 2015 gaming benchmark suite - a new series of tests that retains a few old favourites, but concentrates mostly on modern titles that are built from the ground up with DirectX 11 in mind. We carry out all of our testing using Nvidia's FCAT tool, individually marking every single frame displayed on-screen with a coloured border. We capture everything, using our own frame-rate analysis software to scan through the FCAT mark-up, giving definitive results. This allows us to present all of our benchmarking data to you via the videos below. Not only do you get metrics for every single frame captured, you get the context too - you see what's being analysed.
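
Purely to illustrate the principle - this is a minimal sketch of the general idea, not Nvidia's FCAT tool or our in-house analysis software - the core of that mark-up scanning boils down to spotting when consecutive 60Hz capture intervals show the same frame marker:

```python
# Sketch of FCAT-style dropped-frame counting: each rendered frame is tagged
# with a border colour, the 60Hz capture is scanned interval by interval, and
# a repeated colour means no new frame arrived in time for that refresh.

def count_dropped(border_colours):
    return sum(1 for prev, cur in zip(border_colours, border_colours[1:])
               if cur == prev)

# Toy capture data: 'blue' and 'lime' are each held for two intervals.
capture = ["red", "green", "blue", "blue", "lime", "lime", "aqua"]
print(count_dropped(capture))  # 2 repeated frames -> 2 drops
```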

Joining the three cards used for our Crysis testing, we also have another comparison point - Nvidia's outgoing GTX 760, the card the GTX 960 effectively replaces. This presents an intriguing series of data points: usually we are comparing multiple cards based on the same architecture. Here we're seeing AMD's GCN 1.0 in Tahiti take on GCN 1.2 in Tonga, while from Nvidia, the new Maxwell architecture challenges the firm's outgoing Kepler technology.

We're going to see some fascinating results here, as we kick off with 1080p testing at max settings (though we have disabled super-sampling where appropriate, favouring post-process anti-aliasing elsewhere - except in Battlefield 4). To add some further spice to the mix, we've also overclocked the GTX 960 and included our results. The MSI card already has a factory overclock, but we were still able to add 160MHz to the core, and a healthy 700MHz to the RAM. That still seems a little low compared to the claims made for the card's overclockability, but it clearly makes a difference. It's worth pointing out that the GTX 760 is a reference design, and not overclocked at all - this explains its lowly scores to a certain extent, though even if all of our tested cards were running at stock clocks, it would still be the weakest of the bunch.

[Video] GTX 960 vs R9 285/R9 280/GTX 760 1080p Benchmarks
1080p is the most popular gaming resolution - here we run a suite of demanding benchmarks at maximum settings to see what this graphics card is capable of. All benchmarks are produced using FCAT - you get the at-a-glance averages below, but the video shows you how each featured GPU performs on a frame-by-frame basis.
1920x1080 (1080p) GTX 960 GTX 960 (OC) GTX 760 R9 280 R9 285
Battlefield 4, Ultra, 4x MSAA 48.6 54.6 44.0 41.0 45.2
Crysis 3, Very High, SMAA 47.0 50.9 44.1 46.0 49.7
Assassin's Creed Unity, High, FXAA 43.0 47.7 32.3 41.7 32.9
Far Cry 4, Ultra, SMAA 50.8 56.5 41.8 56.3 53.9
COD Advanced Warfare, Max, SMAA 86.4 96.9 61.0 68.2 74.4
Ryse: Son of Rome, High, SMAA 42.2 47.0 33.4 42.4 46.8
Shadow of Mordor, Ultra, Medium Textures, no SSAA 53.2 59.1 49.8 49.6 62.2
Tomb Raider, Ultimate, FXAA 65.7 73.1 54.4 52.4 59.3
Metro Last Light Redux, Max, no SSAA 51.5 47.5 42.3 52.3 52.7

Sledgehammer's new Call of Duty engine produces very good results on Maxwell - moving performance a step beyond both of AMD's cards and wiping out the old GTX 760. However, the LithTech engine in Shadow of Mordor shows a significant performance boost on the AMD cards, at the expense of both Maxwell and Kepler - a similar state of affairs to the one seen in Far Cry 4. Meanwhile, our new star attraction - Assassin's Creed Unity - produces the best results on the GTX 960, but the old R9 280 is very competitive, while the newer R9 285 suffers badly. What's interesting to note is that in all cases, Maxwell is a clear winner over its predecessor. However, the same cannot be said for the R9 285 - its predecessor actually beats it in some tests and offers ballpark performance in others. Bearing in mind that the R9 285 has less RAM than its predecessor, which is also around £30 cheaper, that's definitely food for thought.

What's clear, though, is that while the Kepler-powered GTX 760 is left in the dust, AMD's offerings are very much in the mix. Of the nine titles tested, the GTX 960 wins just four (ACU, COD, Tomb Raider, BF4), while the R9 285 also wins four (Crysis 3, Metro Redux, Shadow of Mordor, Ryse) and the R9 280 emerges triumphant in Far Cry 4. However, there's plenty of devil in the detail, and in our videos - showing the benching process in context - you can see that in several cases, different scenes favour different architectures. Most of the other results show much of a muchness between the three cards, with only minor differences evident that are unlikely to be noticeable during play. What is curious is the appearance of obtrusive stuttering on the R9 285 in Assassin's Creed Unity - crippling performance in a way we don't see on the R9 280. We've run this test several times over with identical results.

Moving on to 2560x1440 testing on the same settings, frame-rates drop significantly - as you would expect with a roughly 78 per cent increase in pixel-count. In many cases, presets require lowering to get good, playable, consistent performance, but it's interesting to note that the distribution of 'wins' for each card varies in this second round of testing. The 384-bit memory bus of the R9 280 brute-forces its way to victory in Assassin's Creed Unity (beating the GTX 960), also besting the R9 285 in Metro Redux and Ryse. To be honest, we'd recommend moving up to an R9 290 or GTX 970 for 2560x1440 gaming - but with appropriate settings tweaks, the cards tested here can still deliver playable, very attractive results.
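
The pixel arithmetic behind that increase is simple enough - purely for illustration:

```python
# 2560x1440 versus 1920x1080: how much extra work per frame?
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
print(f"{pixels_1440p / pixels_1080p - 1:.0%} more pixels at 1440p")  # 78% more
```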

[Video] GTX 960 vs R9 285/R9 280/GTX 760 1440p Benchmarks
At 2560x1440, the demands on the GPU increase significantly over 1080p. Significant adjustments downwards on the quality presets of several titles are required to get what we'd consider good, playable performance.
2560x1440 (1440p) GTX 960 GTX 960 (OC) GTX 760 R9 280 R9 285
Battlefield 4, Ultra, 4x MSAA 29.7 31.0 29.4 28.0 30.1
Crysis 3, Very High, SMAA 28.4 31.6 26.8 29.7 30.8
Assassin's Creed Unity, High, FXAA 23.8 26.3 21.1 28.2 22.6
Far Cry 4, Ultra, SMAA 35.0 39.1 29.2 40.7 48.8
COD Advanced Warfare, Max, SMAA 62.3 69.5 49.5 61.4 57.8
Ryse: Son of Rome, High, SMAA 29.9 33.3 24.0 36.7 32.8
Shadow of Mordor, Ultra, Medium Textures, no SSAA 37.0 41.4 35.0 43.9 45.2
Tomb Raider, Ultimate, FXAA 42.7 47.6 35.7 42.0 41.0
Metro Last Light Redux, Max, no SSAA 31.0 34.6 26.5 33.2 32.3

The story so far is pretty straightforward - while AMD and Nvidia each have games that particularly suit their respective hardware architectures, resulting in benchmark 'wins', the GTX 960, R9 280 and R9 285 offer very similar capabilities overall. Crysis 3 is an interesting case in point - according to the raw benchmark runs, the Crytek game is 'better' on an R9 285, but going back to our original gameplay comparison, there's very little difference between all three cards and we'd be happy to play the game on any of them.

What the GTX 960 requires is a differentiating factor or two, and thanks to the Maxwell architecture, it has at least one - power consumption. Looking at peak draw from the wall, the GTX 960 is competitive with the Radeon cards from a performance standpoint, but absolutely annihilates them in terms of efficiency. The more powerful GTX 970 manages to draw 100W less than its AMD competitors at peak load, and remarkably, the GTX 960 almost manages to achieve the same trick at the lower end of the GPU scale.

The implications here are obvious. The GTX 960 saves you money in the long term (admittedly this is probably not the number one consideration for a PC gaming enthusiast) but in the here and now you're getting a cool, whisper-quiet, relatively potent GPU that's equally at home in a small form-factor gaming PC with a low wattage power supply as it is in a standard desktop chassis. The GTX 960 is so power efficient that, even when overclocked, it's still way ahead of the AMD cards in terms of power consumption, handily beating the GTX 760 in the process.

We use the demanding Metro Last Light Redux benchmarking sequence running on a loop to discern power consumption for our GPUs. CPU overclocking is turned off as clock-speed spikes can see huge increases in power draw.
GTX 960 GTX 960 (OC) GTX 760 R9 280 R9 285
Peak System Power Draw 178W 197W 234W 267W 255W
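
Purely as an illustration of the 'saves you money' point - the gaming hours and electricity tariff here are hypothetical assumptions on our part, not measured figures, and peak draw overstates real-world averages - the gap at the wall translates into something like this:

```python
# Illustrative annual running-cost gap based on the peak draw figures above.
# Usage hours and price per kWh are assumptions - adjust them to your own case.
peak_watts_gtx_960 = 178
peak_watts_r9_280 = 267
hours_per_year = 2 * 365        # assume two hours of gaming a day
price_per_kwh = 0.15            # assumed tariff, GBP per kWh

kwh_saved = (peak_watts_r9_280 - peak_watts_gtx_960) / 1000 * hours_per_year
print(f"~{kwh_saved:.0f} kWh saved, roughly £{kwh_saved * price_per_kwh:.0f} a year")
# ~65 kWh saved, roughly £10 a year under these assumptions
```

Not a life-changing sum - which is why we don't see power draw as a primary purchase driver at this price point - but it does underline the thermal and acoustic advantage.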

And efficiency isn't just a matter of hardware - it's about the software too, something that is all too readily overlooked. In producing this article, we've followed the basic GPU benchmarking procedure used by virtually every hardware review publication. Put simply, you set up a powerful PC with the fastest processor available, eliminating the CPU as a bottleneck while ramping up the quality settings. What is left is GPU performance in its purest form, and from there a hierarchy of 'which graphics card is better than the next' is produced. This is all well and good, but the reality is that it also eliminates the GPU driver as an overhead - and as we'll see, this appears to be a very important component for a certain level of gaming PC. We'd say that this way of assessing graphics performance isn't really a problem in scenarios where GPUs are likely to be paired with powerful quad-core Intel CPUs or better. However, in the entry-level and mainstream segments, that assumption cannot be made.

In our recent Call of Duty: Advanced Warfare PC performance testing, we noted that while the established GPU hierarchy was maintained with a Core i7 powering the show, R9 280 performance fell off a cliff when paired with a dual-core i3 - something that didn't happen with Nvidia's GTX 760. At the time, we put it down as a one-off and contacted both AMD and Sledgehammer Games with our findings in the hope that a solution would be forthcoming (right now, nothing has changed). However, we noted something very similar in our recent testing of The Crew. Pair a Core i5 quad-core CPU with an R9 270X or R9 280, and it's possible to enjoy 1080p60 gameplay. However, in complex environments, the same GPUs produce big performance drops when paired with the Core i3. Once again, the GTX 760 emerges without a similar hit to performance, and it's the same state of affairs with the GTX 960, as you'll see below.

This topic still needs to be studied in depth, but the suggestion from these results is that - in some titles at least - GPU driver overhead is significantly higher on AMD cards, meaning that a more powerful CPU is required for the Radeons to maintain their competitiveness with the GeForce cards. This has fundamental implications for a certain section of the established graphics card hierarchy, where AMD typically dominates the value-orientated end of the market. There's a sporting chance that these cards will not be paired with powerful CPUs, making the buying decision sway much more in favour of Nvidia - particularly with the GTX 960, where performance is so close to its rivals.

Overall, this is a topic that needs more exploration on a wider variety of CPUs. It's not so easy to test, because sequences need to be found that specifically target CPU load, and historically benchmarking runs concentrate very much on GPU performance. Our range of test processors is limited, but we have confirmed similar behaviour by lowering clocks and disabling two cores on an i7 in an entirely separate PC. In the here and now, we'd say that if you're an i3 owner considering a graphics card upgrade, the established GPU hierarchy might not work for you in choosing the best possible product. To give some idea of the extent of the issue, in Call of Duty: Advanced Warfare we see a GTX 750 Ti outperform a significantly more powerful R9 280 in gameplay conditions where 1080p60 at PS4 quality settings is the target.

While indications suggest that those with lower-power CPUs may be better off with the Nvidia card even if the AMD options are notionally more powerful, a big disadvantage with the GTX 960 is its limited 2GB of GDDR5 memory (and remember, the R9 280 is cheaper and has an extra gig of RAM). In the age of the new consoles with their vast pools of unified RAM, it's essential that your PC GPU has as much video memory as possible. 2GB works fine in the majority of titles out there, but releases like Ryse: Son of Rome and Call of Duty: Advanced Warfare are already recommending that extra 1GB. Meanwhile, Middle-earth: Shadow of Mordor's high quality textures look considerably better than their medium equivalents, but you'll need a 3GB GPU to use them without awful stutter creeping into the gameplay experience. There's even an optional texture pack that requires a mammoth 6GB of video RAM, though the advantages there are less apparent.

Right now, we'd say that a 2GB graphics card is fine for the majority of games. Even titles like Ryse and Call of Duty look virtually indistinguishable whether they're running on a 2GB or 3GB card, despite the recommended specs. However, Shadow of Mordor shows us the future - and it's looking somewhat blurry for GPUs with limited video RAM like the GTX 960, as you can see in the shots below. Assuming you have a decent CPU, AMD's Radeon R9 280 looks like the most future-proof card out of the quartet of products tested for this feature.

[Image comparison: PlayStation 4 / Ultra Textures (6GB) / High Textures (3GB) / Medium Textures (2GB)]
Middle-earth: Shadow of Mordor runs nicely on all of the GPUs tested in this piece. However, only the Radeon R9 280 has the 3GB of RAM necessary to equal the texture quality found in the PS4 version of the game without incurring a crippling performance penalty. Ultra textures are limited to cards with 6GB of RAM, and to be honest, the increase in detail level isn't that stunning at 1080p resolution.

Nvidia GeForce GTX 960: the Digital Foundry verdict

The GTX 960 is a solid, but not spectacular performer. Priced at £160/$199, it falls slap bang in the middle of AMD's two offerings in the same price range. It runs some games better than its competitors, but falls short in others - sometimes significantly so in the case of Far Cry 4 and Middle-earth: Shadow of Mordor. Generally speaking though, all three contenders do a similar job at similar price-points, assuming you're running a PC with processing power equivalent to a quad-core Intel chip, or better.

However, while the GTX 960 fails to live up to the expectations generated by the sensational GTX 970, it has charms of its own. It's more flexible than the AMD cards in the kind of systems it can be integrated with, and you don't need to worry much (at all) about heat or noise. The AMD cards draw a lot more power, and you really need to choose a cooler design that's quiet and efficient (for the record, the XFX R9 280 and the Gigabyte R9 285 we tested have really meaty cooling assemblies that do a fantastic job - and are quiet to boot). However, the £160/$199 market is enthusiast territory and while low power consumption is a nice thing to have, we'd venture to suggest that it's not a primary reason behind a GPU purchase.

With that in mind, the GTX 960 is either a little under-powered or over-priced, depending on how you look at it - a surprising state of affairs as the GPU market still reverberates to the aftershock of the GTX 970 megaton Nvidia dropped at the end of last year. Given a 192-bit memory bus and 3GB of RAM, we can't help but feel that Nvidia could have reshaped the so-called 'sweet spot' sector of the graphics market, but there's a genuine sense that this card has been designed with financial considerations first and foremost in mind, rather than the needs of the gamer - the inclusion of just 2GB of RAM is the biggest misstep, just as it was for the Radeon R9 285.

The notion of a 'sweet spot' product offering lower quality textures than console versions of multi-platform titles just isn't going to sit well with enthusiast gamers, and while it's not a huge problem now, we suspect it will be before the year is out as developers aim to extract more from the prodigious amounts of RAM offered by PS4 and Xbox One. In the meantime, the problem is compounded for Nvidia because AMD is offering a 3GB card with competitive performance at a £10-£20 saving. Overall, while there's much to commend the GTX 960, there's a feeling that there are just one or two compromises too many to make it a must-have product for the more budget-conscious PC gaming enthusiast. It's a good effort capable of some sterling performance, but it could have been so much more.