The surprise Nvidia announcement at CES? Surely it has to be the reveal of adaptive sync/VRR support for GeForce products - or to put it more plainly, Nvidia graphics cards now deliver FreeSync functionality. Right now, only 10-series Pascal and 20-series Turing GPUs get the required support, but this is a highly significant development: over 550 monitors out there adhere to the variable refresh open standard, and with the arrival of a new driver earlier this week, you can test your GeForce graphics card on any of them.
In short, Nvidia has folded FreeSync support into its G-Sync brand - and as cool as that sounds, it should be stressed that it's not all plain sailing. Out of 400 tested monitors, Nvidia has validated just 12 as delivering an adaptive sync experience worthy of the standards set by its proprietary G-Sync technology. The firm warns of potential incompatibilities and artefacts including strobing and ghosting, while other displays throw up further compatibility problems. This Reddit thread and its associated megasheet are slowly building up a picture of how Nvidia products interface with a range of FreeSync displays.
I don't own one of Nvidia's favoured dozen displays, but I do have access to an Asus VP28U - an entry-level 4K display with FreeSync support that I previously tested for Xbox One's VRR upgrade. It's a screen that's built to a price, and there's no way on Earth that Nvidia would give this G-Sync certification - but it's for exactly that reason that I wanted to check it out. Can you still get a good experience from entry-level kit?
I initially started tests by hooking up an RTX 2080 Ti to the Asus monitor via DisplayPort (there's no HDMI VRR support yet) but quickly moved onto a less capable GTX 1080 Ti instead - the new Turing flagship is just too fast to put VRR fully through its paces. This turned out to be a great test, as locking to 4K60 with the GTX 1080 Ti can be challenging. In common with many entry-level FreeSync screens, the Asus VP28U can only support VRR in a tight 40-60Hz window, or more specifically with 16.7ms to 25ms frame-times. If your game performance falls out of that window, you lose all the benefits of VRR. One of the many advantages of 'pure' G-Sync is low frame-rate compensation, designed to smooth out the experience if you do drop out of the VRR window. It's a very useful feature that's only available on a subset of FreeSync screens.
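To put numbers on that, frame-time and refresh are simple reciprocals of one another, and low frame-rate compensation amounts to repeating each frame enough times to land the effective refresh back inside the display's range. Here's a quick sketch of the arithmetic - the 40-60Hz figures are the VP28U's window quoted above, and the function names are my own, not anything from a real driver:

```python
import math

def frame_time_ms(fps: float) -> float:
    """Convert a frame-rate in fps to a frame-time in milliseconds."""
    return 1000.0 / fps

def in_vrr_window(fps: float, lo_hz: float = 40.0, hi_hz: float = 60.0) -> bool:
    """True if a given frame-rate falls inside the display's VRR range."""
    return lo_hz <= fps <= hi_hz

def lfc_multiplier(fps: float, lo_hz: float = 40.0, hi_hz: float = 60.0):
    """Smallest integer frame-repeat factor that puts fps back inside the
    VRR window, or None if the window is too narrow for any factor to fit."""
    n = math.ceil(lo_hz / fps)
    return n if fps * n <= hi_hz else None

# The VP28U's 40-60Hz window corresponds to these frame-times:
print(round(frame_time_ms(60), 1))  # 16.7 (ms)
print(round(frame_time_ms(40), 1))  # 25.0 (ms)

# LFC on a wide-range screen: 30fps can be shown as 60Hz (each frame twice)...
print(lfc_multiplier(30))  # 2
# ...but in a narrow 40-60Hz window, 35fps has no workable repeat factor
# (doubling gives 70Hz, above the maximum) - which is why LFC needs a
# display whose maximum refresh is at least roughly double its minimum.
print(lfc_multiplier(35))  # None
```

This also illustrates why cheaper FreeSync panels like this one can't offer LFC at all: a 40-60Hz range is simply too narrow for frame repetition to work.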
First impressions? It works, and as long as you remain in the VRR sweet spot, the experience is game-changing. The GPU is now in charge of when a new frame is displayed on-screen, rather than the PC needing to synchronise with the display refresh. V-sync judder when operating at frame-rates below 60fps is gone. The horrible screen-tear associated with turning v-sync off is banished forever - until your frame-rate dips under 40fps, of course. After years of working on Digital Foundry, I'm very sensitive to frame-rate drops and to v-sync judder. Spinning around on the spot in Crysis 3's jungle stage, frame-rates vary between the high 40s and the top-end 60fps. The judder is easy to spot with standard v-sync, but on the cheapo VP28U, the smoothness and fluidity still impresses. This screen has some issues, but the core adaptive sync experience works.
Adjusting settings to stay within the VRR window of your display is a must - and the FreeSync range can vary dramatically from screen to screen. If you are considering a non-G-Sync monitor purchase to pair with an Nvidia GPU, this is essential information you need before pulling the trigger. I'd also take a good, hard look at the crowd-sourced megasheet, as many displays have a strobing effect in VRR mode that is highly distracting. I noticed this crop up sporadically on the Asus VP28U, depending on the content and often at the lower end of the VRR window. G-Sync certification on a display may add a price premium, but generally, you don't need to worry about these issues.
Adaptive sync has found a home with high frame-rate displays, but it's in the 40-60fps range where I first saw the technology demonstrated, and I still think this is where VRR is most potent. The effectiveness of the experience is very much in the eye of the beholder, but I find it really difficult to notice the difference in performance between a game running at 50fps and the more optimal 60fps. You may spy a little ghosting on-screen, or slightly heavier controls (most noticeable on a mouse), but the feel of consistency is exceptional, and in this optimal range, it's difficult to tell G-Sync and FreeSync apart. Once we reach the low to mid 40s, you can tell that something's not quite right, but it's still a huge improvement over v-sync judder or constant screen-tear.
Adaptive sync technology is brilliant - especially at 4K resolution - where the demands of the latest games make locking to 60 frames per second extremely challenging. Battlefield 5 runs well for the most part on the GTX 1080 Ti at 4K/ultra. However, once the post-processing effects kick in hard during intense combat, gameplay that's more usually in the 55-60fps range can see a 10fps drop. On a normal screen, this would be too distracting, but with VRR, the experience is sufficiently smooth - and short-lived - that I'm comfortable to stay at ultra settings, rather than tweaking settings to match worst-case scenario performance (and you can see the lengths this can take by checking out my attempts to run Battlefield 5 on the RTX 2060 at a locked 1080p60 with ray tracing enabled). In short, adaptive sync technology doesn't just produce a smoother, more consistent refresh - it also makes the job of tweaking performance a lot easier, with more flexibility in jacking up settings.
Nvidia supporting open standards adaptive sync is a big deal. PC monitors with adaptive sync support could become more commonplace - if not the standard - now that the GPU vendor with by far the largest market penetration supports the feature. And while artefacts like strobing are commonplace on a lot of adaptive sync screens right now, the fact that the issue is being highlighted may bring about a general increase in the quality of VRR support. On top of that, it paves the way for Nvidia to provide compatibility for the upcoming wave of HDMI 2.1 televisions, where VRR is a part of the spec. Selected 4K screens from Samsung are already FreeSync-capable, but for now at least, Nvidia's VRR functionality is DisplayPort only - hopefully the firm will indeed implement HDMI support at some point.
In the here and now, though, for Nvidia GPU users, the barrier to entry is now much lower for one of the most profound improvements to the PC gaming experience. However, in choosing a cheaper option, it's worth remembering that the quality of the implementation will vary on a display-by-display basis and that the functionality may not be as robust as on existing G-Sync screens. But with support from the market leader in PC graphics hardware, open standards VRR has just received a huge boost and I'm hugely excited to see how this changes the landscape going forward for display technology.