Nvidia's GeForce GTX 1080 is the undisputed GPU king, unmatched in virtually every gaming application - but that situation changes just two weeks from now, on August 2nd, with the release of a new, next-gen Nvidia Titan X. Based on the firm's Pascal architecture, we're looking at Nvidia's largest 16nm processor yet, featuring 3584 CUDA cores (up against GTX 1080's 2560), 11 teraflops of compute power and 12GB of GDDR5X memory.
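That 11 teraflops figure falls out of simple arithmetic: CUDA cores multiplied by the boost clock, times two for the fused multiply-add each core can issue per cycle. A quick sketch, using the 1531MHz boost clock from Nvidia's announced specs:

```python
def fp32_teraflops(cuda_cores, boost_clock_ghz):
    """Peak FP32 throughput: cores x clock x 2 ops (fused multiply-add)."""
    return cuda_cores * boost_clock_ghz * 2 / 1000

# New Titan X: 3584 cores at 1.531GHz boost
print(fp32_teraflops(3584, 1.531))  # ~10.97, rounded up to the quoted "11 teraflops"
```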

Titan by name and seemingly a Titan by nature, the new processor is codenamed GP102 and may share some lineage with the 'Big Pascal' chip found in Nvidia's P100 supercomputer line - the CUDA core count is the same, after all. Using the new 16nm FinFET process, the new Titan X features 12 billion transistors, a vast upgrade over the 7.2 billion found in the GTX 1080's GP104 chip. However, the next-gen HBM2 memory used in the P100 isn't utilised here. Instead, Nvidia is sticking with the GDDR5X technology found in GTX 1080 - albeit with a twist in the form of much increased memory bandwidth.

VRAM allocation aside (12GB vs 8GB), the new Titan X uses a 384-bit memory bus vs the 256-bit interface on GTX 1080. At a stroke, this should see memory bandwidth rise by 50 per cent, bringing us up to 480GB/s. There's no data from Nvidia on ROP count or texture units, but as Anandtech surmises, 224 texture units and 96 ROPs are likely based on the way Nvidia tends to scale its processor architecture.
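That 50 per cent figure is straightforward to sanity-check: peak bandwidth is the bus width in bytes multiplied by the effective data rate. A quick sketch, assuming both cards run GDDR5X at the same effective 10Gbps per pin:

```python
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bytes x effective data rate."""
    return bus_width_bits / 8 * data_rate_gbps

titan_x = memory_bandwidth_gbs(384, 10)   # 480.0 GB/s
gtx_1080 = memory_bandwidth_gbs(256, 10)  # 320.0 GB/s
print(titan_x / gtx_1080 - 1)             # 0.5 - the 50 per cent uplift
```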

Anandtech has also crunched the figures, suggesting that compared to GTX 1080, "the new Titan has 24 per cent more shading/texturing/geometry/compute performance, 50 per cent more memory bandwidth and 33 per cent more ROP throughput". What is notable from Nvidia's figures is that - as is usually the case with its 'big chip' products - the clock-speeds are generally lower than those of the smaller processors in its line-up. The 1733MHz boost clock found in GTX 1080 drops to 1531MHz here - but it'll be interesting to see how real-life boost clocks compare (Nvidia's averages here are generally quite conservative) and also how well it overclocks. All of the Pascal processors we've tested thus far tend to hit their limits at around 2.05GHz. Here's a quick look at how the new Titan X's specs stack up compared to the rest of the Pascal line.

Nvidia's reveal video for the new, Pascal-driven Titan X.

                  Titan X       GTX 1080      GTX 1070      GTX 1060
CUDA Cores        3584          2560          1920          1280
Base Clock        1.42GHz       1.6GHz        1.5GHz        1.5GHz
Boost Clock       1.53GHz       1.73GHz       1.68GHz       1.7GHz
Memory            12GB GDDR5X   8GB GDDR5X    8GB GDDR5     6GB GDDR5
Memory Bandwidth  480GB/s       320GB/s       256GB/s       192GB/s
TDP               250W          180W          150W          120W
Processor         GP102         GP104         GP104         GP106
Transistors       12bn          7.2bn         7.2bn         4.4bn
US Price          $1200         $599          $379          $249
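Anandtech's uplift estimates can be reproduced from the table above: shading throughput scales with CUDA cores times boost clock, ROP throughput with ROP count times boost clock. A rough sketch - bearing in mind the 96 vs 64 ROP counts are Anandtech's surmise, not Nvidia-confirmed:

```python
# Specs from the table; ROP counts are Anandtech's estimate, not official.
titan_x = {"cuda": 3584, "boost_ghz": 1.531, "rops": 96, "bw_gbs": 480}
gtx_1080 = {"cuda": 2560, "boost_ghz": 1.733, "rops": 64, "bw_gbs": 320}

def uplift_pct(new, old):
    """Percentage gain of new over old."""
    return (new / old - 1) * 100

shading = uplift_pct(titan_x["cuda"] * titan_x["boost_ghz"],
                     gtx_1080["cuda"] * gtx_1080["boost_ghz"])  # ~23.7 per cent
rop = uplift_pct(titan_x["rops"] * titan_x["boost_ghz"],
                 gtx_1080["rops"] * gtx_1080["boost_ghz"])      # ~32.5 per cent
bandwidth = uplift_pct(titan_x["bw_gbs"], gtx_1080["bw_gbs"])   # 50.0 per cent
```

The results line up with the quoted "24 per cent more shading... 50 per cent more memory bandwidth and 33 per cent more ROP throughput".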

In terms of the new Titan X's physical properties, it's looking very much like an extension of the design language used in GTX 1080 and GTX 1070, while retaining the black metalwork of the last-gen Titan X. We should also expect to see Nvidia's standard line-up of video outputs - 3x DisplayPort, HDMI 2.0b and a dual-link DVI port. Like previous Titans, power is supplied via one eight-pin and one six-pin PCI Express power input on the top of the card. We're looking at the usual blower-style cooler here, using the same vapour chamber technology found in GTX 1080.

So the question is: what will gamers actually be able to do with this card that they can't with GTX 1080? One thing that Nvidia does deserve some kudos for with all of its high-end offerings is the sheer scalability across resolutions - even 1080p saw big gains with GTX 1080, and we're fascinated to see if that valuable quality applies here too, or if we will finally see an end to scalability, particularly in DX11 applications. But with a card like this, it's likely to be 1440p, ultrawide 3840x1440 and full-fat 4K where we should really see just what this card is capable of delivering.

Last weekend, we took a look at MSI's Gaming Z version of the GTX 1080, overclocked it and then ran a 'let's play' 4K60 gameplay video that you can see embedded below. For our money, the GTX 1080 is the first GPU we've come across that can run advanced titles at 4K with high frame-rates (50-60fps) and without unduly limiting visual impact via pared back quality presets. That said, a consistent bottleneck we encounter in virtually every game is a weakness with advanced effects work - principally, alpha transparencies. This suggests a fundamental limitation either in memory bandwidth or ROPs - areas where the new GP102 processor features big, big boosts compared to GTX 1080's GP104. 4K60 locked at ultra or close to it... will the new Titan X take us there?

Our attempts to hit a locked 4K60 on demanding PC titles met with encouraging - if not totally consistent - results with an overclocked GTX 1080. Hopefully the new GP102-powered Titan X will hit the target.

And now for the not quite so welcome news: Nvidia has boosted performance significantly with all of its Pascal products thus far, but they've all seen increases in price too compared to the products they replace (GTX 1080 vs GTX 980, for example). The new Titan X is no exception, with a $1200 ticket price - $200 up from the first Titan X. On the one hand, it's not too difficult to imagine that the new 16nm FinFET production process is probably more expensive for Nvidia than the mature 28nm process it replaces. However, on the other, Nvidia essentially has the high-end to itself and can set its own prices. AMD has concentrated its efforts on reclaiming market share in the mainstream sector, leaving the green team to dominate the high-end with no competition.

For those looking for this level of insane performance but not ready to lay out this much cash, we can assume that a GTX 1080 Ti will be waiting in the wings - possibly aligned for release when AMD's Vega architecture rolls out. And this poses an interesting question: how can Nvidia trim back the new Titan X into a viable GTX 1080 Ti? The last-gen approach principally involved cutting the VRAM back to 6GB - not an option here, as the GTX 1080 would then have a RAM advantage. There may be some leeway in Nvidia cutting the CUDA core count though. That strategy was also deployed on GTX 980 Ti and bizarrely resulted in virtually no performance drop whatsoever compared to the full-fat Titan X.

The bottom line? The new Titan X should cement Nvidia's complete domination of the top-end as the new 'halo' product in the graphics market, but there's still the sense that there's more to come. The increase in shaders vs the increase in transistor count compared to GTX 1080 may suggest that this iteration of the GP102 processor is a salvage part - that there are more deactivated CUDA cores on the die, meaning that the door is open to an even more powerful card. And there is precedent for this - GTX 780 Ti had more cores than the original Titan, while being based on the same processor. Additionally, it's interesting to note that Nvidia still hasn't deployed HBM2 memory for its gaming flagship.

We're fascinated to see just how powerful the new Titan X is, and we'll re-run our 4K60 gameplay test along with the usual benchmarks and bring you all of the media and metrics as soon as we can.



About the author

Richard Leadbetter

Technology Editor, Digital Foundry

Rich has been a games journalist since the days of 16-bit and specialises in technical analysis. He's commonly known around Eurogamer as the Blacksmith of the Future.
