
Epic's new MetaHuman Creator delivers super-realistic digital actors

See the new technology in action.

Epic Games today lifts the lid on MetaHuman Creator - a new tool designed to bring the highest fidelity facial rendering to the wider development community. It's part of Epic's ongoing mission to democratise high-end graphics technology, giving a wider range of development studios the chance to deliver characters up there with the industry's best and to get the most out of today's hardware. According to Epic, we're looking at the kind of facial quality and animation seen in high-end titles like The Last of Us Part 2 - and you can see for yourself just how close Epic's technology gets via the embedded video on this page.

MetaHuman Creator takes the form of a browser-based app, plumbed into Unreal Engine Pixel Streaming. Vladimir Mastilovic, VP of Digital Humans Technology at Epic, described the initial process to us as being as simple as playing a game, with no programming knowledge required as developers create and sculpt their digital actors - you get a sense of that in the video below. As changes and enhancements are made, MetaHuman Creator intelligently uses data from its cloud-based library to extrapolate a realistic digital person. At the end of the process, the final creation can be imported into Unreal Engine via Quixel Bridge, with full animation rigging and Maya source data provided. At that point, a massive degree of rendering customisation is available via the features of Unreal Engine itself - and the data is, of course, compatible with both UE4 and the upcoming UE5.

A look at Epic's MetaHuman technology in action.

Based on the quality of the sample (and there's another one here), this looks like an impressive showing for Epic, though reaching the fidelity of character rendering found in the first-party triple-A juggernauts takes more than graphics alone - the quality of performance and motion capture will be key. Even so, we are clearly seeing some cutting-edge technology here and these initial demos are striking. Skin shading, texture quality and geometric density are very impressive, while the eyes look expressive. Hair is always a particularly tricky part of rendering convincing characters, but MetaHuman Creator can tap into the very latest strand rendering technology to produce a convincing look - a 'next-gen' feature we've only really seen in proprietary engines so far. And while strand-based hair is likely too demanding to run on anything other than next-gen consoles and high-end systems, MHC can fall back to more standard texture 'cards' for hair rendering. In fact, the system scales its creations across eight LOD levels, ensuring scalability from powerful systems down to mobile platforms.
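To give a sense of how that kind of eight-level scaling typically works, here's a minimal sketch of screen-size-based LOD selection. To be clear, this is an invented illustration of the general technique: the threshold values, function name and selection rule are assumptions for this sketch, not Epic's actual implementation.

```python
# Hypothetical sketch of eight-level LOD selection, similar in spirit to how
# MetaHuman assets scale from high-end systems down to mobile. Thresholds
# and names are invented for illustration, not taken from Epic's tooling.

# Screen-height fractions above which each detail level applies
# (indices 0-6; anything smaller falls through to level 7).
LOD_SCREEN_HEIGHT_THRESHOLDS = [0.60, 0.35, 0.20, 0.12, 0.07, 0.04, 0.02]

def select_lod(screen_height_fraction: float) -> int:
    """Return a LOD index (0 = full detail ... 7 = lowest) based on how much
    of the screen the character occupies vertically."""
    for lod, threshold in enumerate(LOD_SCREEN_HEIGHT_THRESHOLDS):
        if screen_height_fraction >= threshold:
            return lod
    return 7  # character is tiny on screen: use the cheapest level

# A close-up character gets full detail; a distant one drops to a cheap level.
print(select_lod(0.8))   # close-up
print(select_lod(0.05))  # mid-distance
print(select_lod(0.001)) # far away
```

The point of a scheme like this is that the expensive assets (strand hair, dense geometry) are only paid for when the character is large on screen, which is what lets one authored character span everything from a high-end PC to a phone.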

It's still early days for the MetaHuman Creator system, with a limited number of preset characters in the cloud to work from, but work continues apace. Epic Games is looking to increase overall diversity, and also the basic 'types' of character: in a press briefing, it was acknowledged that adults are the current focus and there's a lot more work to do - adding children at various stages of development, for example, is part of the plan. But Epic is clearly proud of its achievement and eager to share an impressive technology, to the point where the firm will release two completed sample characters for developers to experiment with before moving onto an early access programme at some point in the next few months.