Epic Games showcases the Unreal future of graphics with real-time ray tracing in Star Wars

March 22, 2018 19:21

Over at the Game Developers Conference in San Francisco this week, Epic Games hosted a State of Unreal presentation to demonstrate future improvements to its graphics engine. The biggest highlight for most people will likely be the company’s real-time ray tracing showcase, featuring a cute elevator scene from the Star Wars universe. But there was also a spookily realistic lizard-like alien called Osiris, voiced and animated by the performance of Gollum actor Andy Serkis, and a photorealistic “digital human” named Siren.

The Star Wars scene was designed to show off the various real-time light reflections and cinematic effects that can be achieved with ray tracing. Nvidia just announced real-time ray tracing, dubbed Nvidia RTX, as a feature of its next generation of graphics cards, and Epic Games is among the first to support it. This rendering technique has been “a dream of the graphics and visualization industry for years,” said Nvidia senior vice president Tony Tamasi, but the hold-up until now has been the lack of graphics chips powerful enough for it. With the help of Nvidia’s RTX and Microsoft’s new DirectX Raytracing (DXR) API, Epic Games will be making real-time ray tracing available to Unreal Engine developers later this year.

The Osiris monologue is all about facial animations and how they can be mapped directly from the human actor to the digital character. Working with 3Lateral’s so-called Meta Human Framework technology for capturing actors’ facial performances, the Unreal Engine shows itself capable of producing extraordinarily lifelike (for a fictional alien, anyway) animations. The level of graphical detail and fidelity in this scene is frankly staggering, with all sorts of little twitches and convulsions happening across the alien dude’s face. And it’s all done in real time. Unlike the ray tracing demo, this proof of concept isn’t destined for any sort of immediate deployment; it serves only to show what will one day be possible.

You’ll have to look closely to tell that the Siren performance is digital rather than human. It’s another proof of concept, this time produced in partnership between Epic Games, Tencent, Vicon (who did the finger and body motion capture), Cubic Motion, and 3Lateral. The video was done by mapping the likeness of actor Bingjie Jiang onto the movements performed by actor Alexa Lee. The techniques for achieving the lifelike realism of motion are still quite demanding — including full-body motion capture rigs — but the advancement from the Unreal Engine is in streamlining the rendering process and raising the bar for visual fidelity of the final product.


Related videos

State of Unreal | GDC 2018 | Unreal Engine
Unreal Engine 3 Samaritan Real Time Demo HD HQ