
What does it take to run Destiny 2 at 1080p60?

Digital Foundry on how budget PC hardware can deliver a superb experience.

The Destiny 2 beta finally arrived on PC this week, delivering a vast upgrade in terms of customisation over the console builds. Adjustable quality settings, unlocked frame-rate and field of view along with HDR support take pride of place in a package that seemingly does everything it can to capture the heart of the PC gaming enthusiast. It also gives us some idea of just how optimal the core code is, how well it scales across different hardware - and perhaps provides some insight into whether the upclocked CPUs in PS4 Pro and Xbox One X might be able to handle 60fps gameplay.

Bungie's justification for the locked 30fps on all console versions relates to how CPU-heavy the game is in terms of its physics, animation and netcode processing - and bearing in mind the issues we've seen with titles running with unlocked frame-rates on PS4 Pro, we're inclined to give the developer the benefit of the doubt here. That said, the PC version - a collaboration between Bungie and partner studio Vicarious Visions - proves highly adaptable in providing a 60fps experience with even the most meagre of gaming CPUs.

Enter the Pentium G4560: two cores, four threads and a locked 3.5GHz tell you everything you need to know about this one. It's an i3 in all but name. In fact, in virtually all of our gaming tests, pairing it with 2400MHz RAM (its limit) sees performance move into line with the last-gen Core i3 6100 running 2133MHz DDR4 modules. The beauty of it is that this is a $65 processor, with UK prices fluctuating according to exchange rates and availability. Put simply, it's the best price vs performance processor on the market.

Ruling out GPU as a bottleneck by using a GTX 1080 Ti at 1080p with carefully adjusted settings, the Pentium delivers 60fps - even in the beta's most demanding section, the fiery Tower defence stage a few minutes into the action. We get close to 100 per cent utilisation across all four threads, but the $65 processor holds up. Beyond that, PvP and even the Inverted Spire strike cause no problems - in fact, processor load seems lighter here, especially in the Crucible.

Cover image for YouTube video: Destiny 2 PC: What Does It Take To Hit 1080p 60fps?
Want to see our hardware testing in action? Check out just how low you can go hardware-wise with Destiny 2 - and still get a good experience.

On top of that, Destiny 2 on PC also features a properly implemented 30fps cap in its line-up of options (engine-driven cut-scenes seem to be locked at this frame-rate regardless), essentially switching your PC into console mode. Those 30 frames per second are delivered at an even 33ms per frame, ensuring a consistent experience. Of course, 60fps offers a palpably more responsive, smoother experience, but the 30fps Destiny 2 experience is still good (for joypad users at least) - after all, this is how the original was played by millions on its first outing on consoles, and it'll be the same with the sequel too.
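The even frame delivery comes down to simple frame-time arithmetic - a quick sketch (the `frame_time_ms` helper is ours, not anything from the game):

```python
# Frame-time arithmetic behind a consistent frame-rate cap.
def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame must take for perfectly even pacing."""
    return 1000.0 / fps

# A well-implemented 30fps cap delivers a new frame every ~33.3ms;
# 60fps halves that to ~16.7ms, which is why it feels so much more responsive.
print(f"30fps cap: {frame_time_ms(30):.1f}ms per frame")
print(f"60fps:     {frame_time_ms(60):.1f}ms per frame")
```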

The key point is that the CPU burden drops accordingly with the halving of frame-rate, meaning that even less capable processors than the Pentium should still offer an experience that matches the consoles. We tested an old Pentium G3258 - a 3.2GHz dual-core chip with no hyper-threading, paired with 1333MHz DDR3. At 30fps, it held up OK but could stutter - a factor of its lack of threads and depleted memory bandwidth. Chips with more threads are likely to fare better, though: on modern game engines, this dual-core Pentium hands in ugly stutter that a slower, hyper-threaded i3 from the same generation manages to avoid. In turn, this opens the door to a range of budget-priced AMD CPUs and APUs.

The bottom line? 30fps is a doddle on PC and even 60fps gameplay is easily attainable - we'd say that modern i3s and older i5s going back to the classic 2500K should be capable of delivering that crucial doubling of frame-rate over the console experience. Go higher and the amount of overhead available is breathtaking: a Core i7 7700K at 4.5GHz is capable of delivering frame-rates between 120-200fps, excellent news for competitive players and owners of high refresh rate monitors. It's actually very difficult to find a modern game that doesn't encounter some kind of system bottleneck at very high frame-rates; the likes of Battlefield 1 and Overwatch manage it, and Destiny 2 can proudly take its place alongside them.

We'd say that the Pentium G4560 or Core i3 6100 are the baseline CPUs for running Destiny 2 at 60fps - and you'll need an Nvidia card to pair with them. With an AMD GPU, the increased driver overhead requires an i5-level processor.

With performance this good, could PS4 Pro and Xbox One X deliver 60fps gameplay? In the initial stages of the beta, Bungie allowed third party monitor overlays to operate (they were disabled on Wednesday, annoyingly) and we could get an idea of what causes higher CPU loads. Crucible PvP barely troubled our Pentium at all at around 50 to 60 per cent utilisation, while all other gameplay essentially saw load rise in line with the number of entities in play at any given point. That stands to reason - some of the CPU's primary jobs include processing AI and physics and calculating animation. The more characters on-screen at any given point, the harder that job gets.

If we assume that PS4's 1.6GHz CPU can handle 30fps with some overhead to spare, logic suggests that you'd require far more than a 31 per cent increase in processing power to run everything at 2x speed, which rules out PS4 Pro. Xbox One X hands in a 44 per cent increase in CPU power, but again, this still doesn't seem like anywhere near enough brute force to get the job done. In terms of our PC testing, our only regret is that we didn't have an AMD Jaguar-based CPU available. Although designed for mobile applications, quad-core Jaguars did appear in AMD's Kabini-based line-up - and overclocking the Athlon X4 5350 or rarer 5370 might have offered up some interesting insights.
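As a back-of-the-envelope sketch of that reasoning - assuming CPU cost scales roughly linearly with frame-rate, so doubling frame-rate needs roughly double the per-frame throughput - the shortfall looks like this:

```python
# Rough check of the console CPU uplifts quoted above, assuming a
# 30fps -> 60fps jump needs ~100 per cent more CPU throughput.
REQUIRED_UPLIFT = 1.00   # ~2x throughput to double frame-rate
PRO_UPLIFT = 0.31        # PS4 Pro CPU clock increase over base PS4
X_UPLIFT = 0.44          # Xbox One X CPU clock increase over base PS4

for name, uplift in (("PS4 Pro", PRO_UPLIFT), ("Xbox One X", X_UPLIFT)):
    shortfall = REQUIRED_UPLIFT - uplift
    print(f"{name}: +{uplift:.0%} on offer, ~{shortfall:.0%} short of 2x")
```

Clock speed isn't the whole story, of course - but with the same Jaguar architecture in play, neither machine gets close to the doubling the maths demands.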

But what about the GPU? Well, once again, the 30fps limiter brings older and less capable hardware back into contention. The GTX 750 Ti - the second most popular gaming graphics card according to August 2017's Steam hardware survey - can be run at close to max settings at a locked 30fps. Just dial back depth of field and ambient occlusion a notch and swap out MSAA anti-aliasing for SMAA and you're good to go. And in fact, the same formula works beautifully for using the classic GTX 970 at full-fat 4K in all but the most demanding scenes - some drops to medium may be required to ensure a solid lock there though.

It takes a mixture of highest, high, medium and low settings, but combined with the Pentium G4560, a GTX 1050 can produce great results for 60fps gameplay with only minor drops to performance in the most demanding scenes. Go for GTX 1060 to push your settings up to the high level, and GTX 1070 for the highest.

As things stand, Destiny 2 is very scalable in terms of its graphics settings, and a mixture of medium and high options, along with 16x anisotropic filtering, looks great and allows for 60fps gameplay on a £100/$120 GTX 1050. The biggest drains on GPU resources are MSAA anti-aliasing, 3D ambient occlusion and the depth of field effect - not to mention shadows, of course. Dial back the internal scaler to 83 per cent (effectively rendering at 900p with a full HD HUD) and the GTX 750 Ti hands in decent full frame-rate results too. Once again you're looking at GTX 1060 and AMD's Polaris-based RX 470/480/570/580 for pushing the boat out on graphical features at 1080p, but you may get better image quality by tweaking the high quality preset (texture quality and filtering can go higher; adjust other settings to medium as required) and ramping up the internal scaler for super-sampling instead. Destiny's anti-aliasing isn't exceptional - even MSAA produces underwhelming results - and nothing beats super-sampling.
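The 83 per cent scaler works out like this - a quick arithmetic sketch, assuming a 1920x1080 base resolution:

```python
# Effective internal resolution at an 83 per cent render scale.
SCALE = 0.83
BASE_W, BASE_H = 1920, 1080

render_w, render_h = round(BASE_W * SCALE), round(BASE_H * SCALE)
pixel_saving = 1 - SCALE ** 2  # fraction of pixels shaved off each frame

# ~1594x896 - close enough to 900p, with the HUD still drawn at full HD.
print(f"{render_w}x{render_h} internal ({pixel_saving:.0%} fewer pixels)")
```

Shaving nearly a third of the pixels per frame is what brings the 750 Ti back into 60fps contention, while the full-resolution HUD keeps text and UI elements sharp.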

One interesting aspect of tweaking Destiny's settings concerns how much VRAM texture quality requires. It may well be the case that the highest quality textures aren't included in the beta download, but as things stand, with this setting ramped up to the max, a 2GB graphics card seems to be able to hold all the required data with no signs of swapping to system memory - even when the game's internal memory meter reads much higher based on your settings. This is excellent news - there are still a huge number of capable GPUs out there, let down only by their 2GB framebuffers. These products - along with today's entry-level GPUs - aren't ruled out of contention with this title. It's remarkably well optimised, seemingly in every way.

So, what does it take to run Destiny 2 at 1080p60? Well, a £60/$65 CPU paired with a £100/$120 graphics card does the job just fine with the price and performance requirement dropping drastically if you're happy with the console-level 30fps - which still feels good to play. Maxing everything moves up the GPU requirement to a hefty GTX 1070/Vega 56, but you can keep most of the bling and get a great experience with GTX 1060 and the Polaris RX chips. One potential fly in the ointment concerns AMD's DirectX 11 driver - we swapped out GTX 1080 Ti for a Vega 64 and found that the Pentium G4560's lock on 60fps could drop to the mid-40s in the heaviest scenes, a scenario you're likely to encounter with Core i3 chips too. In this case, the CPU requirement for 60fps Destiny 2 increases to the Core i5 level when using an AMD card.

In conclusion, while there were some concerns about the quality of the PC Destiny 2 port owing to the recruitment of a partner studio, the fact is that this is one of the most technically accomplished PC versions we've seen to date, adapted skilfully to accommodate the unique strengths of the platform. The level of scalability on offer in this build is just exceptional, the additional features are well-implemented and for its return to the PC gaming market, Bungie and Vicarious Visions have seemingly knocked it out of the park. The only fly in the ointment? The game's internal frame-rate monitor is astonishingly inaccurate and disabling third party monitoring tools (FRAPS, Riva Tuner etc) is unnecessary and annoying. Fingers crossed that the developers will resolve both of these issues before the game's October 24th release date.