Andy Serkis' alien Macbeth: a giddying demo of game characters from the future

Gollum gosh!

Daydream a moment. Pause those critical voices in your head - we'll listen to them in a minute - and watch renowned actor Andy Serkis show us a character performance from a video game of the future.

This is a demo shown by Unreal Engine maker Epic Games this week at the Game Developers Conference in San Francisco. In it, Serkis recites a monologue from Shakespeare's Macbeth, his face contorting believably in anguish as water pools in his eyes. It's very impressive.

Watch on YouTube

But what's even more impressive is how, a few moments later, the same performance is translated onto the face of an alien. Look at the eyes, the mouth, the facial movement: you can feel the malevolence - there's an energy and a power radiating from it. It's even better than the human Andy Serkis performance if you ask me. Imagine this kind of confrontation in a cinematic, character-driven science fiction game like Mass Effect.

Watch on YouTube

It's a demonstration of a kind of performance capture that isn't a million miles away. And it's a performance apparently capable of being rendered in real-time - which could save developers an enormous amount of production time. Epic is demonstrating this at GDC by rendering, live, the performance of an actor at the show. This demonstration is introduced in the below video.

Watch on YouTube

One of the big new engine buzzwords wrapped up in these demos is raytracing - real-time raytracing. It's something graphics-maker Nvidia showed off this week in a demo built by Remedy Entertainment, and something EA flaunted in a charming Project Pica Pica demo.

But Epic Games chose a more eye-catching subject for its Unreal raytracing demo: Star Wars.

Watch on YouTube

Time to let those critical voices back in. This kind of graphical showboating is powered by incredibly expensive specialist hardware, and it's rendering below the gaming standard of 30 frames per second. It'll be years before this kind of scene is broadly achievable in games.

But as Epic's chief technology officer told GamesIndustry.biz: "Honestly, between five and 10 years from now, I don't think you're going to be able to tell the difference between the real and the virtual world.

"You'll see hardware that can support these kinds of capabilities pretty shortly, and then, finally, the greatest blockbuster with the most complicated effects: within 10 years, you'll be able to do that in real-time."
