

AMD Radeon RX 5500 XT vs GTX 1650 Super: performance analysis

Assassin's Creed Odyssey, Assassin's Creed Unity and Battlefield 1 tested.

Our performance analysis of the new RX 5500 XT and its GTX 1650 Super competitor kicks off with a range of titles from 2014 to 2018, where the two newcomers are stacked up against established rivals. Our frame-rate figures here are recorded from a system using the steadfast Core i7 8700K, overclocked to an all-core turbo frequency of 4.7GHz. This is backed with 16GB of 3400MHz DDR4 memory running in dual-channel mode and a single 2TB SSD from Gigabyte (a PCIe 4.0 model, but running at PCIe 3.0 here). The processor is hooked up to a Gamer Storm Castle 240mm liquid cooler, allowing that all-core turbo to be maintained throughout.

As usual, our results are represented with the bespoke Digital Foundry benchmarking system, which generates its results based on direct capture from the graphics card itself, avoiding the accuracy issues of internal frame-time measurements.

If you're reading on a phone or tablet, you'll see the results presented in a way that makes sense for smaller screens: a simple table for each game with average frame-rate and lowest one per cent measurements at each resolution we tested. On desktop, you get the full-fat Digital Foundry experience.

Play the YouTube videos embedded for each game, and you'll see how each card handles the test scene with real-time frame-rates and frame times. You can add or remove data points from the comparison using controls on the right, so you can see how the same card performs at different resolutions, or perhaps how four cards fare at the monitor resolution you're using at home. Below this, a bar chart provides information on how each card fared on average, including the often-illuminating worst one per cent figure. Remember that you can click to swap from absolute frame-rate readouts to percentage differences, which change as you mouse around the bar chart.
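To make those figures concrete, here's a minimal sketch of how an average frame-rate and a 'lowest one per cent' figure can be derived from captured per-frame times, along with the percentage-difference calculation shown in the bar charts. This is an illustrative reconstruction, not Digital Foundry's actual tooling: the function names are hypothetical, and the exact definition of the one per cent metric varies between capture tools.

```python
def fps_metrics(frame_times_ms):
    """Average frame-rate and lowest one per cent from per-frame times in ms.

    The 'lowest one per cent' here is the mean frame-rate of the slowest
    one per cent of frames - one common definition, assumed for this sketch.
    """
    fps = [1000.0 / t for t in frame_times_ms]
    average = sum(fps) / len(fps)
    # Take the slowest 1% of frames (at least one frame) and average them.
    slowest = sorted(fps)[:max(1, len(fps) // 100)]
    lowest_one_pct = sum(slowest) / len(slowest)
    return average, lowest_one_pct

def percent_diff(card_a_fps, card_b_fps):
    """Percentage difference of card A relative to card B."""
    return (card_a_fps - card_b_fps) / card_b_fps * 100.0
```

A capture where 99 frames take 10ms and one takes 20ms averages 99.5fps but has a one per cent low of 50fps, which is why the worst one per cent figure is often more illuminating than the average.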

Assassin's Creed Odyssey

What's immediately apparent here is that on ultra high settings, four gigs of VRAM simply isn't enough to cope with a fully unleashed Assassin's Creed Odyssey experience, which explains why the older GTX 1060 (benched here in its 6GB guise) is able to effortlessly outperform the GTX 1650 Super and indeed the 4GB iteration of the RX 5500 XT. A console-equivalent 'high' setting would clearly be the best place to start for these cards. Meanwhile, the RX 5500 XT delivers a good 19 per cent performance uplift over the RX 590 - fairly impressive bearing in mind that we're looking at 22 active RDNA compute units up against 36 GCN equivalents.
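That compute-unit comparison can be put into rough numbers with a back-of-envelope calculation. The helper name below is hypothetical, and the sketch deliberately ignores clock-speed differences, which also contribute to the result:

```python
def relative_per_cu_throughput(perf_ratio, cus_a, cus_b):
    """How much work each of card A's compute units delivers relative to
    card B's, given the overall performance ratio and CU counts.
    Ignores clock speeds - a significant simplification."""
    return perf_ratio / (cus_a / cus_b)

# RX 5500 XT ~19 per cent ahead of the RX 590, with 22 RDNA CUs vs 36 GCN CUs:
ratio = relative_per_cu_throughput(1.19, 22, 36)  # ~1.95x throughput per CU
```

Even allowing for clock differences, it illustrates how much more each RDNA compute unit achieves over its GCN predecessor in this test.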

Assassin's Creed Odyssey has never been particularly friendly to AMD hardware, and from a UK pricing perspective, what we're seeing here is that AMD's £160 RX 5500 XT is only on par with or slightly ahead of Nvidia's equivalent, the GTX 1650 Super. The RX 5500 XT in its 8GB incarnation is on par with the GTX 1660, but cheaper and with more memory. Complicating matters, however, is that with examples available for £20 more, the GTX 1660 Super offers a better price-to-performance ratio - albeit with less RAM.
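The price-to-performance point boils down to simple division. In the sketch below, the frame-rates are hypothetical placeholders purely to illustrate the comparison, and the £180 figure simply assumes a £20 premium over the £160 RX 5500 XT:

```python
def fps_per_pound(avg_fps, price_gbp):
    """Price-to-performance: average frame-rate delivered per pound spent."""
    return avg_fps / price_gbp

# Hypothetical frame-rates - not our measured results:
rx_5500_xt = fps_per_pound(80.0, 160.0)      # 0.5 fps per pound
gtx_1660_super = fps_per_pound(92.0, 180.0)  # higher fps per pound
better_value = gtx_1660_super > rx_5500_xt
```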

AC Odyssey: Ultra High, TAA

Assassin's Creed Unity

Legacy title time, and a return to one of our evergreen favourites: Assassin's Creed Unity. This vintage 2014 GPU mangler still stresses kit immensely at higher resolutions, and remains a good stand-in for the kind of demanding current-gen titles that can cause real problems for PC hardware.

In this particular title, the GTX 1650 is outclassed by the similarly priced 4GB iteration of the RX 5500 XT, but in the 1080p heartland, the higher-spec model offers no further advantage. Nvidia's GTX 1660 is faster, and the more expensive 1660 Super streaks ahead with a 20-point lead. Even at higher resolutions, where memory may be more of an issue, Nvidia's lead remains intact despite its deficit in the VRAM department.

Assassin's Creed Unity: Ultra High, FXAA

Battlefield 1

Historically, EA's Frostbite engine has been highly favourable to AMD hardware, with Polaris, Vega and Navi technologies easily beating out Pascal-based Nvidia equivalents. However, things are changing with the arrival of Turing, which competes far better against AMD. The established pattern continues in this bench: the four gig iteration of the RX 5500 XT has a small but significant lead over the GTX 1650 Super, while the 8GB version is on par with the GTX 1660, with that extra two gigs of VRAM to factor into the buying decision.

It's a pretty good narrative for Team Red, but the great performance offered by the GTX 1660 Super makes the purchasing decision a little trickier. It's a difficult one to call, really: with next-gen consoles arriving next year, there's a good argument that extra VRAM will prove more future-proof than extra compute. Another issue to factor in is RX 580 and RX 590 performance. They're very close to the 5500 XT here, they're cheaper and they too have eight gigs of RAM. Many factory-overclocked RX 580s are nigh-on identical in performance terms to the RX 590 too.

There's an interesting discrepancy here we feel obliged to point out. In some of our benchmarks, the 8GB model of the 5500 XT is actually outperformed - marginally - by the 4GB version, with the biggest margin of difference right here in Battlefield 1. We repeated the affected benches with the same results, and it's something of a mystery bearing in mind that our 8GB card has a 15MHz game clock advantage over our 4GB card. Perhaps thermals are making a small impact on sustained boost clocks? It's mostly margin-of-error stuff, but in Battlefield 1 it's a little more than that.

Battlefield 1: Ultra, TAA

AMD Radeon RX 5500 XT vs GTX 16-Series Analysis