Future-proofing your PC for next-gen gaming

Component upgrade and new build advice from Digital Foundry.

Will a dual-core PC still hack it when PS4 hits? What should you look for in a graphics card upgrade? CPU - Intel or AMD? Digital Foundry investigates.

The arrival of next-gen consoles could well prove to be a double-edged sword for PC owners used to enjoying the best gameplay experience. On the one hand, it's extremely good news: developers no longer need to create engines for multiple hardware types with little common ground - console and PC development will all be based on x86 computer architecture. By extension, the need to use brute-force processing power to overcome unoptimised PC ports will hopefully become less of an issue, leaving gamers to enjoy the more positive aspects of the platform - upgrading, customising, shaping the experience towards their own requirements.

On the flipside, PlayStation 4 in particular offers a substantial challenge to the PC as the top-end gaming platform - a state of affairs that may surprise many. Sony's new console has often been described as a mid-range gaming PC in terms of its overall technological make-up. Rip apart the various components and the claims have some merit, but with the benefits of a closed box design and a unified memory set-up, the new console has certain qualities that could even give high-end PC rigs a run for their money.

All of which leads us to the point of this article. If you own a PC now, what upgrade paths are available to keep your rig competitive with the next generation of consoles? And if you're planning to buy or build your own gaming PC, what components should you choose to ensure that your hardware provides an excellent experience in line with the capabilities of the next Xbox and PlayStation 4?

Buying new - choosing a platform

Intel or AMD? Since the arrival of Intel's Core 2 Duo processors, AMD has struggled to remain competitive, staying in the game by pricing its higher-tier parts aggressively. In recent years it has bet the farm on multi-core performance - its latest flagship, the FX-8350, offers eight cores at 4.0GHz with no overclocking restrictions, while its Intel competitor - the Core i5 3570K - offers four cores at 3.4GHz. In a world where single-core performance still dominates, the Intel chip is generally considered the better buy - it's certainly more power-efficient and has more overclocking headroom.

We approached a number of developers on and off the record - each of whom has helped to ship multi-million-selling, triple-A titles - asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K.

Perhaps it's not entirely surprising - Crytek's Crysis 3 is a forward-looking game in many ways, and as these CPU tests by respected German site PC Games Hardware demonstrate, not only does the FX-8350 outperform the i5, it also edges out the Core i7 3770K - a processor around £100 more expensive than the AMD chip. Only the six-core Intel Core i7 3930K - a £480 processor - beats it comprehensively.

A comparison of Epic's Elemental demo running on PS4 and the year-old version running on a Core i7 PC with GTX 680. We should expect many of the launch next-gen titles to be PC ports, rather than games designed to get the most out of the new console architecture.

It's a surprising state of affairs bearing in mind how modern games development typically works. In recent times, parallelising code over multiple cores has taken priority - it's the best way to get the same code working on Xbox 360 (three cores, six hardware threads), PS3 (one core, two hardware threads, plus six SPUs) and PC (anything from two to eight cores). Tasks are allocated as "job queues" that are spread out over whatever processing elements are available and executed in parallel. Now, PlayStation 4 may well have eight cores, but they're running at just 1.6GHz. A Core i5 not only has massively superior single-thread performance, it's also running at over twice the clock speed. The FX-8350 matches the PS4's core count while enjoying a similarly commanding clock speed advantage. So in theory, chips from both vendors should easily outperform the next-gen consoles, but AMD has the potential to offer more performance at the same price-point - as Avalanche Studios' Chief Technical Officer, Linus Blomberg, tells us.

"I'd go for the FX-8350, for two reasons. Firstly, it's the same hardware vendor as PS4 and there are always some compatibility issues that devs will have to work around (particularly in SIMD coding), potentially leading to an inferior implementation on other systems - not very likely a big problem in practice though," he says.

"Secondly, not every game engine is job-queue based - even though the Avalanche Engine is - and some games are designed around an assumption of available hardware threads. The FX-8350 will clearly be much more powerful [than PS4] in raw processing power considering the superior clock speed, but in terms of architecture it can be a benefit to have the same number of cores so that an identical frame layout can be guaranteed."
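The job-queue model Blomberg describes - a frame's work broken into independent tasks that any available core can pick up - can be sketched in a few lines of Python. This is a toy illustration, not real engine code; the workload function is a stand-in:

```python
from concurrent.futures import ThreadPoolExecutor
import os

# Toy "frame": each job is an independent slice of work (animation,
# particles, culling and so on) that can run on any core.
def run_job(job_id):
    return sum(i * i for i in range(10_000))  # stand-in workload

def run_frame(num_jobs, num_workers):
    # The engine doesn't care whether it has four fast cores or eight
    # slower ones: jobs are pulled from the queue by whatever workers exist.
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return list(pool.map(run_job, range(num_jobs)))

# Same job list regardless of core count - the frame layout stays identical.
results = run_frame(num_jobs=64, num_workers=os.cpu_count() or 4)
print(len(results))  # 64 jobs completed
```

The point Blomberg makes about "identical frame layout" follows naturally: the job list is the same everywhere, only the number of workers draining it changes.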

In the here and now, games that favour AMD like Crysis 3 are the exception and not the rule. Intel is demonstrably the better choice for the current generation of games as pretty much every CPU review over the last several years demonstrates. However, bearing in mind how well established parallelisation is, it's surprising that AMD hasn't enjoyed more success. One source, who chooses to remain anonymous, tells us that the disparate architectures found in the current-gen consoles are partly responsible for this.

"Getting a common game architecture to run across both [Xbox 360 and PS3] is no easy feat and you have to take 'lowest common denominator' sometimes. This can mean that your engine, which is supposed to be 'wide' (ie. runs in parallel across many cores) ends up having bottlenecks where it can only run on a single core for part of the frame," he says.

"This usually isn't an issue, except when you come to scaling up to PC architecture. If your engine works in a certain way then running more in parallel helps for part of the frame, but you still get stuck on the bottlenecks. This is why, I think, that most games that are 'ported' to PC work better with fewer more powerful cores, like the i5. The single-threaded grunt is enough to get you through the bottlenecks and drive a faster frame-rate."
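The bottleneck argument our source describes is essentially Amdahl's law: if part of the frame can only run on one core, single-threaded grunt caps your frame-rate no matter how many cores you add. A quick back-of-the-envelope model makes the point - the serial fractions and per-core speeds below are illustrative assumptions, not measured figures:

```python
def frame_time(total_work, serial_fraction, cores, per_core_speed):
    """Time for one frame: the serial part runs on a single core, the
    parallel part is split evenly across all cores (Amdahl's law)."""
    serial = total_work * serial_fraction / per_core_speed
    parallel = total_work * (1 - serial_fraction) / (cores * per_core_speed)
    return serial + parallel

WORK = 100.0  # arbitrary units of work per frame

# Hypothetical chips: four fast cores vs eight cores at 60% the per-core speed.
# With 30% of the frame stuck on one core, the quad wins comfortably.
fast_quad = frame_time(WORK, serial_fraction=0.30, cores=4, per_core_speed=1.0)
slow_octo = frame_time(WORK, serial_fraction=0.30, cores=8, per_core_speed=0.6)
print(fast_quad, slow_octo)

# With a properly "wide" engine (5% serial), the picture tips the other way.
fast_quad_wide = frame_time(WORK, serial_fraction=0.05, cores=4, per_core_speed=1.0)
slow_octo_wide = frame_time(WORK, serial_fraction=0.05, cores=8, per_core_speed=0.6)
print(fast_quad_wide, slow_octo_wide)
```

Which is exactly the source's claim: today's ports favour the i5's bottleneck-busting single cores, while engines built wide from the start favour more cores.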

Crysis 3 - here benchmarked on a GeForce Titan and a GTX 680 using a six-core i7 overclocked to 4.8GHz. V-sync is disabled to maximise GPU output - as soon as a frame is ready it is displayed on-screen. Interestingly, the Titan having 3x the RAM of the 680 doesn't seem to make any difference, even with the level of detail seen in this richest of games. It'll take time for devs to truly make the most of the huge amount of memory next-gen consoles offer.

The same source also sees AMD as a better long-term bet than Intel:

"This (Sony) approach of more cores, lower clock, but out-of-order execution will alter the game engine design to be more parallel. If games want to get the most from the chips then they have to go 'wide'... they cannot rely on a powerful single-threaded CPU to run the game as first-gen PS3 and Xbox 360 games did. So, I would probably go for the AMD as well, as this might better match a console port of a game... based on what we know so far."

Engines like Frostbite 2/3 and CryEngine 3 are built with the future in mind - they are tailored towards getting the most out of PC in the present, with the developers knowing that the investment here will directly transition across to next-gen console development. It's a trend we're likely to see becoming more prevalent as x86 processors become the standard across all major triple-A platforms.

There are reasons to stick with Intel, of course. Power efficiency is markedly improved, you can overclock virtually any Sandy Bridge or Ivy Bridge chip to 4.2GHz (and perhaps beyond) very easily, plus you will get that performance boost in older games over the AMD architecture. But it's worth bearing in mind that there's no upgrade path with the current socket 1155 boards used to run mainstream Intel processors (a new 1150 standard arrives with the Haswell architecture in the summer), while it's believed that the current AMD AM3+ socket standard is good for at least one more CPU generation.

For existing PC owners suddenly looking to jump ship from Intel to AMD, pause for a moment - of all the components, CPU power is probably the least of the concerns the PC platform has, compared to the PlayStation 4 at least. After all, the AMD Jaguar cores in the next-gen consoles were designed to compete with Intel's low-power Atom architecture, created with tablets and low-power laptops in mind. Even with eight of them, today's quad-core and octo-core desktop processors outright own them in terms of processing power. What really sets PlayStation 4 apart from PC is graphics power and bandwidth across the system - the amounts of data that flow freely between the major processing elements.

Why PS4 makes your choice of graphics card crucial

In the current-gen console era, even relatively modest graphics cards easily out-muscle both Xbox 360 and PlayStation 3. The relentless progress of technology is what made the £300 Digital Foundry PC so much more capable than the current-gen consoles, and a great jumping-on point for cheap, quality PC gaming. In our recent budget graphics card shoot-out, the Nvidia GTX 650 won by the narrowest of margins over the Radeon HD 7770, but both of them effortlessly power past the Xenos chip in the 360. In the next-gen era, that all changes, and matching or indeed surpassing console power becomes a much more expensive proposition.

Similar to the CPU set-up, the graphics cores in both new consoles derive from AMD's PC tech, dubbed GCN (Graphics Core Next). The new Xbox's GPU has much in common with the new Radeon HD 7790 (which we'll be reviewing soon), while the PlayStation 4's chip sits neatly between the higher-end 7850 and 7870. Just to match next-gen console from a core processing perspective, we're looking at investing anything between £130 and £180 in a graphics card. Factoring in the advantages developers have in addressing the technology more directly in a fixed architecture design, ideally we'd be looking to move beyond that, taking us into £200-£230 territory where we find two excellent products: the GeForce GTX 660 Ti and the Radeon HD 7950.

In terms of pure processing power, the chances are that we now have the horsepower to exceed the first and second generation of games seen on next-gen console. But what still isn't addressed to a satisfying degree is the question of on-board video RAM. Both the Microsoft and Sony machines feature 8GB of RAM with fast access to the GPU. We're currently living in a world where even a £400 GeForce GTX 680 only ships with 2GB - and that's a worry.

"I think we can assume that most games will use a majority of the 8GB for graphics resources, so I'd go for as much GDDR5 on the GPU as possible," says Avalanche's Linus Blomberg.

"For the CPU I'd say at least 8GB DDR3, depending on how much stuff you'll have running in the background. But this is a tricky one! In Avalanche Studios' upcoming titles we'll use a lot of tricks that take advantage of the unified memory layout. But on high-end GPUs there will be ways of compensating for that, to some extent at least."

Others sound a more cautious note:

"Replicating the 8GB unified RAM of the Sony console will be impossible," another well-placed source tells us.

"The problem with Windows is that there is always a DirectX type 'layer' between the game and the actual hardware. This marshals and controls the movement of textures/shaders/vertices from the main PC memory to the memory on the GPU. Unless PC games programmers get direct control of the hardware (very unlikely), you will always be fighting against this issue. You never know where your textures are and when they will be uploaded to the GPU, which can cause stalls or micro-stutters in a frame as resources are shunted between the memory types."
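The micro-stutter the source describes can be visualised with a toy frame-time model: most frames hit their budget, but whenever a resource has to cross from system RAM to the GPU, that frame pays extra. All figures here are made up purely for illustration:

```python
# Toy model of micro-stutter: a steady frame occasionally pays an extra
# cost when the driver decides to move a texture into video memory.
BASE_FRAME_MS = 16.7      # 60FPS budget per frame
UPLOAD_STALL_MS = 12.0    # hypothetical cost of a texture upload

def simulate(frames, upload_every):
    times = []
    for f in range(frames):
        cost = BASE_FRAME_MS
        if f % upload_every == 0:  # an upload lands on this frame
            cost += UPLOAD_STALL_MS
        times.append(cost)
    return times

times = simulate(frames=120, upload_every=30)
spikes = [t for t in times if t > BASE_FRAME_MS]
print(len(spikes), max(times))  # a handful of frames blow the 60FPS budget
```

The player perceives those occasional long frames as stutter even though the average frame-rate barely moves - which is why a unified memory pool, with no upload step at all, is such an attractive property.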

And again, similar to the CPU recommendations, we see consensus from all of our sources on how best to future-proof your PC in this respect - buy a graphics card "with as much memory as you can afford". Realistically that means setting your sights on 2GB as a minimum. In our budget GPU piece, we gave the Radeon HD 7850 1GB - often found for £130 - an unreserved recommendation, noting no real difference in performance compared to its 2GB sibling. It's still remarkable value, but the decision not to invest the extra £20 in the 2GB model may yet prove to be an issue in the longer term.

More than that, there isn't really much choice in GPUs with more than 2GB of onboard memory - something we must consider if Linus Blomberg is right and most of the PS4's RAM will be dedicated to graphics. High-end AMD and Nvidia cards are available with 6GB of GDDR5, which should have you more than covered, but they're hugely expensive. Perhaps in response to the PS4 reveal, we're now seeing some reasonably priced GPUs with beefed-up memory hit the market. The cheapest high-RAM card we could find was the Radeon HD 7950, available in a 3GB configuration for around £220, followed closely by the £250 GeForce GTX 660 Ti.

In terms of bandwidth - the rate at which data shuttles around the system - the major bottleneck is the transfer of data from main memory to the graphics card's GDDR5 pool. In the fullness of time there'll doubtless be new solutions or faster RAM (DDR4, most likely), but it's down to developers, GPU vendors and perhaps even Microsoft with its DirectX 11 API to optimise the flow of data if it proves to be an issue.
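Some rough numbers show the scale of the problem. Sony quotes 176GB/s for PS4's unified GDDR5 pool, while a PCI Express 3.0 x16 slot - the pipe PC data must cross to reach the graphics card - peaks at around 15.75GB/s. These are headline maximums rather than sustained real-world rates, but the gap is stark:

```python
# Headline peak bandwidth figures in GB/s - quoted maximums, not sustained rates.
PS4_GDDR5 = 176.0       # Sony's quoted figure for PS4's unified memory
PCIE3_X16 = 15.75       # ~985MB/s per lane x 16 lanes (PCI Express 3.0)
DDR3_1600_DUAL = 25.6   # typical dual-channel desktop system RAM

# A PC GPU's own GDDR5 is plenty fast once data is resident on the card -
# the pinch point is getting it there across the PCIe bus.
print(f"PS4 unified pool vs PCIe 3.0 x16 transfer: {PS4_GDDR5 / PCIE3_X16:.1f}x")
print(f"PS4 unified pool vs desktop DDR3:          {PS4_GDDR5 / DDR3_1600_DUAL:.1f}x")
```

In other words, any data that has to move between the PC's memory pools does so at roughly a tenth of the speed at which PS4's processors can touch any byte in the system.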

If it doesn't happen, we're left with just one option. While we can talk about PlayStation 4 as a mid-range PC in a miniature box, to comprehensively best the console's most powerful elements, once again it seems likely that PC owners will need to brute-force their way through to improved performance.

If you currently own a mid-range gaming PC and you're worried about your system becoming obsolete at the dawn of the next-gen era, don't panic. To recoup their investment, games need to be scalable across a range of systems. Here's Crysis 3 running on medium settings at 1080p with high quality textures and FXAA on a modest 1GB GeForce GTX 650 Ti. You'll get between 30 and 50FPS, which isn't bad at all...

Conclusions: everything in perspective

In this piece we've tried to anticipate the general levels of processing power offered by next-gen consoles - PlayStation 4 in particular - and come up with suggestions that keep the PC experience top of the pile. But it's important to note that the arrival of the new Sony and Microsoft consoles isn't going to instantly make your existing system obsolete.

For example, take the GeForce GTX 650 Ti - a 1GB graphics card available for as little as £99. As you'll note from the video above, it makes a good fist of playing Crysis 3 (possibly the most "next-gen" game we currently have available for testing) at medium settings at 1080p with high quality textures and FXAA - it's obviously no Titan, but it plays the game very nicely on more modest settings and it's still a good experience.

In the short to medium term at least, scalability is the name of the game. To get their money back on hugely expensive development costs, studios need to ensure that their titles run well on a wide range of hardware - big-name games like Battlefield 4, Destiny and Watch Dogs are designed to run on anything from PlayStation 3 and Xbox 360 upwards, so mid-level enthusiast PCs should still offer an experience that compares well with the next-gen console standard.

Also worth remembering is that developers have only received final next-gen console hardware in the last few months - games typically take over two years to develop, and most of that work would almost certainly have been carried out on PC. It takes time to get the most out of console hardware, and in the meantime we fully expect enthusiast PCs to continue to deliver the goods. Equally, it's fair to say that a gaming PC which runs existing titles at 1080p60 or higher may well have problems matching that level of performance once the next-gen era kicks in at full force. But even then, for those eager for 60FPS gameplay, there should still be options available - even without an expensive upgrade. It all comes down to marshalling the resources available and defining the experience on your own terms.

"Not all games will provide you with the option to go from 30 to 60FPS, as it's an architectural challenge too and usually comes with other drawbacks," says Avalanche's Linus Blomberg. "But if they do, it will always be a trade-off between resolution and frame-rate. A PC card will most often have higher FLOPS, but you'll also typically run at a higher resolution. If you'd stick to 720p, as on most console games, then 60FPS should definitely be feasible. In my opinion 720p at 60FPS provides a superior visual improvement compared to 1080p at 30FPS."
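Blomberg's trade-off is easy to quantify: 720p60 and 1080p30 demand almost the same raw pixel throughput, which is a big part of why a card that manages one can usually manage the other. (Per-pixel fill-rate isn't the whole frame cost, of course, but it's a useful first approximation.)

```python
def pixels_per_second(width, height, fps):
    # Raw fill requirement: every pixel rendered, every second.
    return width * height * fps

p720_60 = pixels_per_second(1280, 720, 60)    # 55,296,000
p1080_30 = pixels_per_second(1920, 1080, 30)  # 62,208,000
p1080_60 = pixels_per_second(1920, 1080, 60)  # 124,416,000

# 1080p30 pushes only 12.5% more pixels than 720p60...
print(p1080_30 / p720_60)  # 1.125
# ...while 1080p60 demands 2.25x the throughput of 720p60.
print(p1080_60 / p720_60)  # 2.25
```

That 2.25x jump is exactly why Blomberg frames 60FPS as a resolution trade-off rather than a free upgrade.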
