
Inside Digital Foundry: How we analyse the PC gaming experience

Taking benchmarking beyond the bar-chart.

Digital Foundry owes a small measure of gratitude to the PC tech media. Two years ago, journalists from The Tech Report and PC Perspective carried out an in-depth analysis of micro-stutter issues in multi-GPU gaming PCs. The end result was a new type of analysis based on FCAT - a system that sees every frame produced by the graphics hardware marked up with a coloured border. Frames are captured, the video analysed and from there, accurate frame-rates can be calculated - but more than that, the quality of the experience comes into focus. This manifests in the form of frame-time - the amount of time (typically measured in milliseconds) that each frame persists on-screen.
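
To illustrate the principle - and this is a simplified sketch, not Nvidia's actual analysis code - FCAT-style frame-time extraction boils down to counting how long each border colour persists in the capture. The border_colours input and the 60Hz capture rate here are assumptions for illustration; real FCAT analysis also has to cope with tearing, where a single captured frame contains scanlines from several rendered frames.

```python
# Simplified sketch of the FCAT principle: each rendered frame carries a
# coloured border, so counting how many captured frames (at a fixed capture
# rate) show the same colour tells us how long that frame persisted on-screen.

CAPTURE_INTERVAL_MS = 1000.0 / 60.0  # assuming a 60Hz capture

def frame_times_ms(border_colours):
    """Collapse each run of identical border colours into one frame-time."""
    times = []
    previous = None
    run_length = 0
    for colour in border_colours:
        if colour == previous:
            run_length += 1
        else:
            if previous is not None:
                times.append(run_length * CAPTURE_INTERVAL_MS)
            previous, run_length = colour, 1
    if previous is not None:
        times.append(run_length * CAPTURE_INTERVAL_MS)
    return times

# A frame whose colour appears in three consecutive 60Hz captures persisted
# for ~50ms - a visible hitch against a 16.7ms target.
```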

And that's where Digital Foundry's approach to PC performance diverges. Nvidia's FCAT suite includes a set of scripts for scraping the relevant data from the video captures, and these are utilised by much of the PC specialist press. We carry out our own analysis using a bespoke tool - FPSGui. The captured videos are imported, the FCAT border mark-up is scanned and from there, our tool places the performance data on-screen alongside the video itself.
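
FPSGui isn't something we publish, but the overlay concept is straightforward to demonstrate. Here's a minimal sketch using OpenCV - our own pipeline differs - assuming per-frame fps values have already been computed from the border mark-up:

```python
import cv2  # OpenCV, assuming the capture is an ordinary video file

def overlay_metrics(in_path, out_path, per_frame_fps):
    """Burn a per-frame fps read-out onto each frame of the capture -
    a toy version of placing performance data alongside the video."""
    reader = cv2.VideoCapture(in_path)
    fps = reader.get(cv2.CAP_PROP_FPS)
    width = int(reader.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(reader.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"),
                             fps, (width, height))
    for value in per_frame_fps:
        ok, frame = reader.read()
        if not ok:
            break
        cv2.putText(frame, f"{value:.0f}fps", (32, height - 32),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (255, 255, 255), 2)
        writer.write(frame)
    reader.release()
    writer.release()
```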

We can extract the same metrics from the captured video as Nvidia's scripts, but the difference is that by overlaying the performance data on top of the actual video, we can judge performance in context. The crucial part of what FCAT offers - 'in the second' frame-times for every image output by the GPU - makes a lot more sense when you can actually see what causes any issues. FCAT also adds an extra layer of reliability: all benchmarks are measured from data derived from the video output of the PC (ie what you actually see on-screen), while tools such as FRAPS extract performance data internally - a difference that led to discrepancies in results in the pre-FCAT era.
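
A quick worked example of why per-frame data matters - the numbers here are hypothetical, not from any of our captures. Two runs can report an identical average frame-rate while delivering very different experiences:

```python
# Two hypothetical one-second runs containing the same number of frames.
smooth = [16.7] * 60            # every frame arrives on a 60Hz beat
stutter = [8.3, 25.0] * 30      # the same 60 frames, delivered unevenly

def avg_fps(frame_times_ms):
    """Average frame-rate: total frames over total elapsed time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(avg_fps(smooth), avg_fps(stutter))  # both report ~60fps...
# ...but every other frame in 'stutter' persists for 25ms - visible judder
# that a bare frame-rate figure completely hides.
```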

In the video below, we highlight a recent example of how FCAT allows us to communicate the quality of the PC gameplay experience, using Avalanche Studios' Just Cause 3. We benchmarked a range of cards in our original feature, but the GTX 960 vs R9 380 comparison in particular was fascinating. AMD's hardware is unequivocally more powerful than the Nvidia equivalent, and this is reflected in frame-rate metrics that put the red team in pole position. However, intrusive stutter on the R9 380 arguably made it the less enjoyable experience overall. Using our tools in combination with FCAT, we could isolate the areas causing problems, and after multiple GPU tests across the stack of available hardware, we were fairly sure that driver issues were to blame. One month after Just Cause 3's release, the Radeon Crimson 16.1 hotfix driver update resolved almost all of the issues (on the R9 380 at least) - something we could confirm with A to B testing.
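
The isolation step is conceptually simple. A minimal sketch, assuming frame-times have already been extracted from the capture, might flag every frame that badly overruns the target so we can scrub straight to those moments in the video:

```python
def stutter_spikes(frame_times_ms, target_ms=16.7, factor=2.0):
    """Flag frames that persist at least 'factor' times longer than the
    target frame-time - the moments worth scrubbing to in the capture to
    see what the game was doing when performance broke down."""
    threshold = target_ms * factor
    return [(index, ft) for index, ft in enumerate(frame_times_ms)
            if ft >= threshold]
```

Running the same pass over captures taken before and after a driver update is what makes A to B testing straightforward: if the fix works, the spike list shrinks accordingly.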

Inside Digital Foundry: How We Measure PC Performance
A live demo of Digital Foundry's PC performance analysis software - and how performance in context is key to understanding the quality of the gameplay experience.

The video also highlights how we test GPU performance for our graphics card reviews and our GPU upgrade guide. The exact same techniques are used, except that identical sequences - matched to the frame - are captured with FCAT enabled. When the sequence video is imported, FPSGui creates a cache file with all the border mark-up data, and as the content of the video itself is essentially identical between each test run, we can bin the multi-gig captures. All of the FCAT data is then overlaid onto pristine 1080p60 v-synced captures of each sequence, making the final benchmark videos significantly more 'watchable' (no performance dips, no tearing etc).
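
The caching idea works along these lines - a hedged sketch, where scan_borders stands in for FPSGui's actual mark-up scanner, which we haven't detailed here:

```python
import json
import os

def load_or_scan(capture_path, scan_borders):
    """Parse the FCAT border mark-up once, then reuse the cached result so
    the multi-gigabyte capture itself can be deleted. 'scan_borders' is a
    stand-in for a function that walks the video and returns per-frame data."""
    cache_path = capture_path + ".fcat.json"
    if os.path.exists(cache_path):
        with open(cache_path) as f:
            return json.load(f)
    data = scan_borders(capture_path)
    with open(cache_path, "w") as f:
        json.dump(data, f)
    return data
```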

Benchmarking generally throws up fewer issues than per-game analysis, for a number of reasons. Firstly, drivers are generally mature, meaning that issues like Just Cause 3's launch performance on AMD hardware would be a thing of the past in a typical GPU test. Secondly, by necessity we can only benchmark a relatively short sequence of action. Benchmarks should never be considered an assessment of how well a particular game will run - it's more a comparative analysis of how well a game engine runs on each GPU. Once the entire benchmark suite is complete, we gain a more complete picture of how the hardware is performing.

However, as the Far Cry 3 benchmark run highlighted in the video reveals, we can get a grip on certain issues. For example, Ubisoft's open-worlder demands at least 3GB of VRAM for smooth(ish) 1080p gameplay at high frame-rates. Comparing the GTX 960 and R9 380 again - two cards available in both 2GB and 4GB iterations - a couple of conclusions can be drawn. Firstly, some games really do need that additional memory: intrusive stutter is observed on the 2GB cards and absent on the 4GB variants from both vendors. And secondly, the 2GB Nvidia card utilises its memory more efficiently than its Radeon equivalent: the AMD card stutters more often, and with longer split-second freezes.
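
VRAM-related pauses are long enough that even a crude summary captures them well. A sketch - the 100ms freeze threshold here is an illustrative choice rather than a figure from our testing:

```python
def freeze_report(frame_times_ms, freeze_ms=100.0):
    """Summarise the long pauses typical of VRAM over-commitment:
    how often the game froze, and for how long in the worst case."""
    freezes = [ft for ft in frame_times_ms if ft >= freeze_ms]
    worst = max(freezes) if freezes else 0.0
    return {"count": len(freezes), "worst_ms": worst}
```

Running the same report over captures from the 2GB and 4GB cards puts a number on the difference the extra memory makes.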

This doesn't happen in every game, but titles like this, Assassin's Creed Unity, Rise of the Tomb Raider, Shadow of Mordor and of course Batman: Arkham Knight show that the demand for VRAM is only moving in one direction. For anyone considering a higher-end GPU purchase at £150 or above, four gigs really is the way forward - and via our tools, we can visualise the impact on the gameplay experience when VRAM is over-committed.

Performance in context allows us to capture and analyse the actual experience of PC gaming. While bar-charts and tables are invaluable in providing an at-a-glance look at performance, the work we've carried out here is more informative: faster isn't always better, and generally speaking, consistency is king. There's still much more work to do - for example, in better reflecting 'in-the-second' experience issues in our at-a-glance metrics, not to mention optimising the whole process (capture and analysis is time-consuming). But our aim is to supplement the metrics provided elsewhere, concentrating more on the actual gameplay experience and how new hardware can genuinely improve playability in PC titles.
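
Percentile frame-times are one standard way of folding consistency into an at-a-glance number - this sketch is a generic illustration, not a description of FPSGui's internals:

```python
def percentile_frame_time(frame_times_ms, pct=99.0):
    """The frame-time that pct per cent of frames come in under. A low
    average paired with a high 99th percentile signals a fast but
    stuttery experience - exactly what a bare average conceals."""
    if not frame_times_ms:
        return 0.0
    ordered = sorted(frame_times_ms)
    index = min(len(ordered) - 1, int(len(ordered) * pct / 100.0))
    return ordered[index]
```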

Inside Digital Foundry is an occasional series where the team discusses the techniques and tools used to produce our articles and videos. Other topics covered so far: