Last week, AMD finally revealed its hand in terms of its next generation graphics line-up. Based on the new RDNA 3 architecture and featuring highly innovative designs, two products were unveiled: a flagship Radeon RX 7900 XTX priced at $999 and a cut-down RX 7900 XT - yours on December 13th for $899 (UK prices for both products remain unknown at this time). The RDNA 3 reveal was AMD's most promising opportunity in years to disrupt a discrete graphics market that sees Nvidia command an 80 percent share - and hopes were high that AMD could reshape the competitive landscape. So how did it fare?
What's clear is that the reality of AMD's products did not live up to the pre-launch hype delivered by leakers who clearly were not in possession of much in the way of actual facts. Talk of 2x performance boosts and 'almost 4GHz GPUs' clearly let down some fans and, rather unfairly, took the sheen away from AMD's actual achievements, which are highly impressive in many ways. For example, a 50 to 70 percent performance uplift is, by and large, exactly what Nvidia achieved with RTX 4090. And we're seeing the first realisation of AMD's chiplet design in the graphics space, where a 5nm compute die sits on an interposer alongside six 6nm memory cache dies - saving on manufacturing costs. This does seem to have come at the expense of clock speeds and thus raw performance - a 2.3GHz game clock is only a small bump over RDNA 2 when other 5nm products have proven prodigiously faster. But the point is, AMD is breaking new ground here with rewards that can only scale positively in future products.
Beyond the highly innovative chiplet design, the RDNA 3 architecture itself looks more like a refinement of RDNA 2 - there is a lot more compute power, but the compute units themselves follow a similar design ethos to their predecessors. There's still no sign of the kind of RT hardware acceleration seen in Intel Arc and Nvidia products and there are seemingly no bespoke machine learning blocks either - everything is built into what AMD describes as its 'unified compute unit'. This has huge advantages in terms of die area (and thus cost) but what it also means is that RT and ML features will continue to lag behind the competition. Based on AMD's own numbers, RT performance vs last-gen rises almost entirely in line with non-RT performance.
- 00:00:00 Introduction
- 00:00:35 News 01: RDNA 3 unveiled!
- 00:24:15 News 02: FSR 3 frame-rate boost tech teased
- 00:31:26 News 03: How does RDNA 3 measure up against RTX 4000?
- 00:45:18 News 04: RT upgrades in Halo Infinite, Forspoken, and Snowdrop
- 00:51:31 News 05: PSVR2 release details announced
- 00:59:33 News 06: Microsoft takes $100-200 loss on Xbox Series consoles
- 01:03:51 News 07: Sackboy PC Stutter Struggles + shader compilation questions
- 01:11:47 DF Supporter Q1: What’s the best value CPU + motherboard + RAM combination for a PC?
- 01:15:41 DF Supporter Q2: Was FSR 2 really worth it for the Series X version of Scorn?
- 01:18:09 DF Supporter Q3: IS ALEX GERMAN OR AMERICAN? I MUST KNOW.
For many though, it's all about the pricing - Nvidia's biggest weakness. On the face of it, AMD has delivered a clear pricing win in that neither of its top-tier products breaches the $1000 barrier, while both of Nvidia's top-end cards do - and by quite some margin. The question is how these cards compare competitively and here we still have no concrete details. After the event, AMD positioned its RX 7900 offerings against Nvidia's upcoming RTX 4080, effectively ceding 'GPU king' status to Nvidia and the RTX 4090. The firm sees the 4090 as a different class defined by its lofty price point, and would therefore not provide competitive benchmarks - RTX 4080 is the target instead, and that product is not available for comparison right now. Fair enough.
However, the benchmark numbers AMD provided for the RX 7900 XTX clearly position the product as delivering a 50 to 70 percent performance uplift over the existing RX 6950 XT, which would see its non-RT performance sit within striking distance of the RTX 4090 and conceivably ahead of the RTX 4080. Ray tracing performance? The consensus is that both of the RTX 4000 cards will be faster, with the AMD flagship more in line with Nvidia's last-gen.
The lack of competitive benchmarking does mean that PC outlets have produced 'projected' numbers which look great for AMD in non-RT workloads - but publishing estimated numbers only adds noise right now, in my opinion, as what AMD actually benched and at what settings is not clear in most instances. We need independent testing that is not based on vendor metrics. Only AMD know how close to the RTX 4090 its products are - and if they are offering 80 to 90 percent of the non-RT performance as the estimates suggest, I'm surprised we didn't see that in the presentation.
Despite not talking about competitive performance, AMD couldn't resist some comparisons against the competition in terms of RTX 4000's form factor, power inputs and display connectivity. The RDNA 3 cards follow on in design language from RDNA 2, where AMD finally delivered excellent reference designs after years of disappointing blower-based offerings. Heavily implying that Nvidia's RTX 4000 cards are too large and require a power supply upgrade seemed a bit needless though - none of us here have had any issues with Founders Edition cards. Third party monsters? That's another issue and it looks like AMD will be getting its fair share of those too.
Perhaps more on point is the power input situation, where the new AMD cards - just like the old ones - require just two eight-pin inputs, because power requirements are lower than the new Nvidia cards'. The green team has pushed board power hard - perhaps unnecessarily so, in fact - with extremely limited returns at the high-end. It may well be that RDNA 3's constrained clock speeds simply mean no more power is required, but we'll have to wait until the reviews to see exactly how efficient the new offerings are.
Other aspects of the presentation that caught our eye included support for DisplayPort 2.1, up against Nvidia's DisplayPort 1.4a. AMD spent a lot of time talking about high resolution, high refresh rate displays that do not exist yet, that no GPU currently on the market can really service - so I see this point of differentiation as a 'nice to have' for AMD and a disappointing omission from Nvidia, but not a game-changer.
Meanwhile, I found AMD's announcement of FSR3 - seemingly a temporal frame generation technique similar to DLSS 3 - rather intriguing. On the one hand, competition in this space improves standards and the concept is a superb companion for high refresh rate displays. On the other hand, DLSS 3 - while extremely promising - still has teething issues. While some parts of the PC community have written off DLSS 3 as 'fake frames', by developing its own alternative AMD has effectively validated the concept as something desirable and worth having. It may have actually helped Nvidia here by acknowledging the value of the concept and I'll be fascinated to see how the Radeon solution pans out.
We discuss this a lot more in the Direct, but we also try to tackle the biggest question of all: is Nvidia in trouble in the wake of the RDNA 3 reveal? Personally, even if the RX 7900 XTX gets close to the RTX 4090, I feel that product remains unassailable for those who want the best of the best - it should still be ahead in rasterisation performance and it will be a generation ahead with RT. A $600 gulf is huge, but I see the RX 7900 XTX vs RTX 4090 face-off as a re-run of the RX 6900 XT vs RTX 3090 battle of 2020. The 6900 XT was an excellent product for non-RT gaming and it brought the fight to the 3090 admirably, but it failed to excite the audience in the same way - at least based on sales.
Positioning the RX 7900 XTX against RTX 4080, AMD has chosen its battleground wisely, because I see this card as the ultimate litmus test for the power of the Nvidia brand in a time of transition in terms of price vs performance. It's a test for Nvidia's ability to shift premium-priced products because on a technological level, the 4080 is facing challenges across the board. Based on AMD's numbers and what little Nvidia has revealed of the RTX 4080, the RX 7900 XTX should be faster in non-RT gaming. Ray tracing performance should see AMD lag, but if it does match the top-tier Ampere cards, it'll still be capable - especially when combined with FSR2. However, RTX 4080 does not just face competition from AMD, it faces competition from Nvidia too - specifically from the legacy $699 RTX 3080 10GB, a truly excellent product that's still extremely capable today.
| Model | CUs | Game clock | VRAM | Mem. bus | Board power | Launch MSRP |
| --- | --- | --- | --- | --- | --- | --- |
| RX 7900 XTX | 96 | 2.3GHz | 24GB | 384-bit | 355W | $999 |
| RX 7900 XT | 84 | 2.0GHz | 20GB | 320-bit | 300W | $899 |
| RX 6950 XT | 80 | 2.1GHz | 16GB | 256-bit | 335W | $1099 |
| RX 6900 XT | 80 | 2.0GHz | 16GB | 256-bit | 300W | $999 |
| RX 6800 XT | 72 | 2.0GHz | 16GB | 256-bit | 300W | $649 |
A point I'd like to end with isn't exactly optimistic. There's no doubt that AMD's prices are good compared to Nvidia's, but I'm concerned about the RX 7900 XT, where the Radeon team provided almost no performance data whatsoever. The specs suggest that the technological gap between XT and XTX models is actually wider than the difference between last-gen's RX 6800 XT and RX 6900 XT. While the XTX costs the same as the 6900 XT did, the 7900 XT at $899 is roughly 38 percent more expensive than the 6800 XT was back in 2020. That's nowhere near as much of a bump as the RTX 4080 is compared to RTX 3080, but it strongly suggests that AMD, like Nvidia, is reducing margin on its flagship offerings while at the same time raising prices by significant amounts on less capable cards - leading to the worrying conclusion that we may well be looking at less impressive improvements in performance vs cost the lower down the stack we go.
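The generational price shifts described here are simple percentage arithmetic. A minimal Python sketch, using the launch MSRPs quoted in this piece (plus Nvidia's published $1199 RTX 4080 MSRP, which isn't listed above), makes the gaps explicit:

```python
# Generational launch-price comparison - a sketch using the MSRPs
# cited in this article, plus the RTX 4080's $1199 launch MSRP.

def price_bump(new_msrp: int, old_msrp: int) -> float:
    """Percentage increase of a new card's launch MSRP over its predecessor's."""
    return (new_msrp - old_msrp) / old_msrp * 100

# RX 7900 XTX vs RX 6900 XT: both launched at $999 - no increase.
print(f"RX 7900 XTX vs RX 6900 XT: {price_bump(999, 999):.1f}%")

# RX 7900 XT vs RX 6800 XT: $899 vs $649 - a substantial jump.
print(f"RX 7900 XT vs RX 6800 XT: {price_bump(899, 649):.1f}%")

# For contrast, RTX 4080 vs RTX 3080: $1199 vs $699 - a far bigger jump still.
print(f"RTX 4080 vs RTX 3080: {price_bump(1199, 699):.1f}%")
```

The numbers bear out the pattern in the paragraph above: flagship prices held flat while the tier below rose sharply.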
This DF Direct post has proven longer than usual, but the RDNA 3 reveal is a pivotal point for PC graphics and even after filming the show last Friday morning, I've still been thinking a lot about the products - and can't wait to review them. However, there's a lot more to talk about in this week's episode and perhaps the revelation that Xbox Series X still costs Microsoft a supposed $700 to produce two years into production is enough to put the GPU price hikes into perspective. Silicon is clearly not getting cheaper any more and inflation is rife.
Beyond that, we also talk about the PSVR2 launch, its reasonable-for-the-specs price-point but its apparent lack of an absolute 'must have' game at launch. We also tackle the reaction to the Halo Infinite RT upgrade for PC - and yes, provide answers as to whether Alex is American or German! Of course, members of the DF Supporter Program will know this essential information already, so join us to gain access to our brilliant community, bonus material, early access and much more.
Digital Foundry specialises in technical analysis of gaming hardware and software, using state-of-the-art capture systems and bespoke software to show you how well games and hardware run, visualising precisely what they're capable of. In order to show you what 4K gaming actually looks like we needed to build our own platform to supply high quality 4K video for offline viewing. So we did.
Our videos are multi-gigabyte files and we've chosen a high quality provider to ensure fast downloads. However, that bandwidth isn't free and so we charge a small monthly subscription fee of £4.50. We think it's a small price to pay for unlimited access to top-tier quality encodes of our content. Thank you.