A battle royale game with only 12 players? How is that going to work? We recently had the opportunity to go hands-on with Dying Light: Bad Blood - Techland's answer to that very question - and came away really impressed. While the sheer size and scope of the concept is miniaturised somewhat, the action is no less thrilling - in fact, the close-quarters intimacy of the encounters, paired with Dying Light's parkour traversal system, makes for a unique take on the battle royale concept.
Settings tweaks and hardware recommendations for a truly demanding game.
And the challenges facing developers using the cutting-edge RTX tech.
The embargo lifts today on video capture of Battlefield 5's beautiful new Rotterdam map, which looks all the better when rendered in RTX - Nvidia's brand new ray tracing technology for its upcoming 20-series cards. We had the chance to go hands-on with an RTX-enabled version of the game, and to talk directly with the graphics engineers responsible. How does ray tracing work? What are its limitations? And with performance such a hot topic surrounding RTX titles, what are DICE's plans for future optimisation and further features?
Just how demanding is Monster Hunter World's PC port? Is it really heavy on CPU as coverage of the closed beta seemed to suggest? And if so, what PC hardware is actually required to run the game at a consistent 60 frames per second? We went into this one expecting a battle - lowering CPU requirements is far more challenging with far less room for manoeuvre than tweaking graphics settings. But after extensive testing, the reality is that it is indeed the GPU side of the equation that makes running this title so challenging - and even a GTX 1070 running at just 1080p can't lock to 60 frames per second at max settings.
A surprise - but a genuinely good one! As a series debut for the franchise on PC, I went into Yakuza 0 not really knowing what to expect, especially considering the dubious history of many late-arriving PC ports - but Sega has delivered here. As you might expect, the port doesn't deliver a massive improvement over the existing PlayStation 4 game, but what it does offer up is scalability in both resolution and frame-rate, plus some small but welcome extras. Whether you're looking to game at 120Hz or on 4K or ultrawide displays, Yakuza 0 has you covered.
No Man's Sky's recently released Next update radically improves content and visuals, and we've had a lot of fun playing it in its new Xbox One X incarnation - but we've got to say that the PC version still needs a lot of work. Performance doesn't seem to be where it should be, even on higher-end GPUs, while basics like v-sync don't seem to work properly. On top of that, right from the off, poor user-friendliness and presentation make for a genuinely rough introduction to the game. For a title that has improved so dramatically since launch, we genuinely hope to see Hello Games make one last push to make life easier for PC users.
At its best, rendering technology doesn't just make a game look great and run smoothly, it has an intimate relationship with gameplay - and it's for that reason that Crytek's The Hunt: Showdown is well worth checking out. In this fascinating multiplayer first-person survival horror shooter, CryEngine is used for more than just window dressing. Yes, it looks beautiful and well up to triple-A standards, but the technology is fundamental in creating some of The Hunt's most impactful moments, as well as establishing its unique atmosphere.
Anthem's return at E3 2018 was just as spectacular as its debut a year earlier, and last week, Electronic Arts posted a full 20-minute gameplay video of the latest demo - with developer commentary, no less. Presented in full 4K, the firm shared a source quality version of the asset with us, allowing us to take a closer look at the game's Frostbite foundations, and compared to other EA titles using the tech, we're witnessing a use of the engine quite unlike anything else we've seen before. Developer BioWare isn't talking about performance targets, but the sheer intensity of detail and the integrity of the open world suggest that this will likely be a 30fps game on consoles - and an interesting counterpoint to the 60fps heroics of Battlefront and Battlefield.
Players of the recent Battlefield 5 alpha have been witness to quite a treat. Building on DICE's excellent work in BF1 and Battlefront 2, we're looking at an exceptionally handsome game that, small bugs aside, almost feels like the finished article. It's visually outstanding, in fact - the only disappointment, if you can call it that, being that the signs point towards an evolution of the Battlefield formula and its Frostbite engine, as opposed to a full-on next-gen revolution.
It's been ten years since Crysis first released on PC. In 2007, it pushed real-time rendering to new heights and spawned the memetic phrase, "but can it run Crysis?". No game before had pushed hardware and engine technology so hard, and none has since. In fact, combine the latest and greatest Intel Core i7 8700K overclocked to 5.0GHz with an Nvidia Titan Xp and there'll still be areas of the game that drop beneath 60fps - even at 1080p. For its own very specific reasons, Crysis is still more than capable of melting the most modern, top-end PCs, but regardless, it remains a phenomenal technological achievement. It deserves a remaster at the very least, but a franchise of this standing really deserves a full next-gen sequel, with state-of-the-art rendering and back-to-basics gameplay.
Have you ever loaded up a new PC title, run the in-game benchmark, tweaked settings for optimal performance then discovered that actual gameplay throws up much lower frame-rates, intrusive stutter or worse? It's a particular frustration for us here at Digital Foundry, and it leads to a couple of very obvious questions: firstly, if benchmark modes are not indicative of real-life performance, what use are they? And secondly, if their use is limited, how representative of real-life gaming are the graphics card reviews that use them, including ours?
Under Ubisoft's stewardship, the Far Cry franchise is now celebrating its 10th year - a full decade of series entries and offshoots that have seen the focus of the gameplay and the technology shift dramatically. And this has led to some interesting YouTube offerings from Mark Brown and CrowbCat, showing what look like substantial engine downgrades over the years. So what's going on here? Has the massive increase in processing power provided by the current-gen consoles been matched with a simplification in aspects of the technology?