Epic Games shows off photo-realistic ray tracing demos, coming to Unreal Engine soon

William Gayde


It is usually pretty easy to tell the difference between a computer-generated scene and a real-life one, but some new demos from Epic Games are making that harder and harder.

Epic hosted its State of Unreal presentation at the Game Developers Conference in San Francisco this week, where it showed a collection of scenes demonstrating its new real-time ray tracing, motion capture and real-time facial animation mapping technology.

Also read: Microsoft's 'DirectX Raytracing' technology aims to bring movie-quality lighting to video games

This Star Wars scene showcases Epic Games' implementation of Nvidia's new RTX technology. Ray tracing is a rendering technique that traces the paths of light rays through a scene to produce realistic shading, reflections and depth of field. It is incredibly demanding computationally, and we are only now seeing real-time implementations thanks to Nvidia's Volta GPUs. Epic will use Microsoft's DirectX Raytracing (DXR) API to make the feature available to Unreal developers later this year.
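To make the idea concrete, here is a minimal toy sketch of ray tracing in Python (my own illustration; it has nothing to do with Epic's or Nvidia's actual code, and DXR exposes this very differently). It casts one primary ray per pixel at a single sphere and prints an ASCII silhouette of the hits; production renderers additionally bounce rays around the scene to get the reflections, shadows and depth of field mentioned above:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t,
    # assuming direction is unit length (so the quadratic's a = 1).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width=48, height=24):
    # Camera at the origin looking down -z, one sphere in front of it.
    center, radius = (0.0, 0.0, -3.0), 1.0
    for y in range(height):
        row = ""
        for x in range(width):
            # Map the pixel onto the image plane, build a unit ray direction.
            px = (2.0 * (x + 0.5) / width - 1.0) * (width / height) * 0.5
            py = 1.0 - 2.0 * (y + 0.5) / height
            norm = math.sqrt(px * px + py * py + 1.0)
            d = (px / norm, py / norm, -1.0 / norm)
            row += "#" if ray_sphere_hit((0.0, 0.0, 0.0), d, center, radius) else "."
        print(row)

render()
```

Running it prints a filled circle. The expensive part in a real scene is that every reflective, shadowed or out-of-focus pixel needs many more such intersection tests per frame, which is why the hardware requirements are so steep.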

If you still don't believe the scene was computer-generated, here is a behind-the-scenes explanation of what is going on.

Although not as flashy as the Star Wars scene, the next demo is just as incredible. Siren is a digital human developed through a partnership between Epic Games, Tencent, Cubic Motion and 3Lateral. It was created by mapping the appearance of one actor onto the movements of another.
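One common way to do that kind of mapping (a general sketch of blendshape retargeting with made-up toy data, not necessarily the actual Siren pipeline) is to capture per-frame expression weights from the performing actor and apply them to the target character's blendshape rig:

```python
def apply_blendshapes(neutral, deltas, weights):
    # Deform each vertex of the target mesh by the weighted sum of its
    # blendshape deltas; the weights come from the source performance.
    result = []
    for i, (vx, vy, vz) in enumerate(neutral):
        ox = oy = oz = 0.0
        for name, w in weights.items():
            dx, dy, dz = deltas[name][i]
            ox += w * dx
            oy += w * dy
            oz += w * dz
        result.append((vx + ox, vy + oy, vz + oz))
    return result

# Toy two-vertex mesh with "smile" and "jaw_open" shapes; in practice the
# per-frame weights would be solved from tracked video of the actor.
neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
deltas = {
    "smile":    [(0.0, 0.1, 0.0), (0.0, 0.1, 0.0)],
    "jaw_open": [(0.0, -0.2, 0.0), (0.0, -0.2, 0.0)],
}
frame_weights = {"smile": 0.8, "jaw_open": 0.3}
print(apply_blendshapes(neutral, deltas, frame_weights))
```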

Motion capture technology isn't new, but recent advancements have made the rendering process easier and have steadily increased its realism. Epic also showed a demo of 3Lateral's Osiris character. Note how realistic the facial animations and mannerisms are, then remember that this is all being rendered in real time.


 
Second one isn't as believable. Photo-realism doesn't work as well on matte surfaces, which is probably why her face seems a little weird.
 
Second one isn't as believable. Photo-realism doesn't work as well on matte surfaces, which is probably why her face seems a little weird.

The fact it's rendered in real time is very cool. You're not wrong though; the face and movements look off, but it's still pretty realistic.
 
Amazing progress.

It wasn't that long ago that Pac-Man was considered state-of-the-art computer graphics.
 
Did they mention anywhere what hardware they had to get this running in realtime?
It was running on a $60k Nvidia DGX Station, which has four Tesla V100 GPUs.
Also, the 24 fps Star Wars scene was not fully ray-traced, just parts of it. It's a hybrid that uses normal rasterization plus ray tracing.

From my understanding, the key innovation seems to be dynamically scaling the number of rays used for ray tracing depending on the importance of the scene/object/etc. The rest still needs brute-forcing with as much hardware as you can throw at it. We are still 3-4 years away from this technology truly becoming viable for games.
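If that reading is right, the concept resembles classic adaptive sampling: spend more rays where they matter and clean up the rest with denoising. Here's a toy sketch of the idea (my own illustration with made-up names, not Epic's code), where a per-pixel importance score decides how many lighting samples get averaged:

```python
import random

def shade_sample(pixel):
    # Stand-in for one ray-traced lighting sample; a real engine would
    # trace an actual ray into the scene here.
    return pixel["base_light"] + random.uniform(-0.2, 0.2)

def adaptive_shade(pixel, min_rays=1, max_rays=64):
    # Important pixels (shiny, in focus, near a silhouette edge) get many
    # rays; unimportant ones get a cheap, noisy estimate that a denoiser
    # would smooth out afterwards in a hybrid pipeline.
    rays = min_rays + int((max_rays - min_rays) * pixel["importance"])
    samples = [shade_sample(pixel) for _ in range(rays)]
    return sum(samples) / len(samples)

# A glossy highlight gets the full 64 samples; a flat background wall gets 1.
highlight = {"base_light": 0.9, "importance": 1.0}
wall = {"base_light": 0.3, "importance": 0.0}
print(adaptive_shade(highlight), adaptive_shade(wall))
```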
 
It was running on a $60k Nvidia DGX Station, which has four Tesla V100 GPUs.
A DGX Station in the UK is £75,000.
 
Second one isn't as believable. Photo-realism doesn't work as well on matte surfaces, which is probably why her face seems a little weird.

You could "achieve" this look on real footage with poor lighting and make up but the animation should be less "edgy" to make it look real.
 
I don't believe we'll see any of this in real games any time soon. We've been here before. Remember Epic Games' 2011 "Samaritan" demo? They promised that level of photo-realistic graphics was coming, but games have stalled graphically since Crysis in 2007, in my opinion. I blame consoles.
 
You could "achieve" this look in real footage with poor lighting and makeup, but the animation would need to be less "edgy" to look real.

Look at her teeth, that's obvious. Then her skin just seems odd; it's a matte color, but there is no texture to it.
 
It still looks like the movement is out of sync with the audio. The movement is too slow, not snappy enough, and it lags behind the audio. It appears to be a poor match of voice and voice tempo to the shapes created by the mouth, neck and lips. They need to pair the voices better, or fine-tune it. Still wicked for real time, though.
 
Did they mention anywhere what hardware they had to get this running in realtime?

Probably a couple of Titan Vs in SLI paired with an i9.

"It requires incredibly powerful hardware to do quickly and we are only now seeing real-time implementations thanks to Nvidia's Volta GPUs. Epic will use the DirectX Raytracing API (DXR) to make this feature available to Unreal developers later this year."

Volta GPUs. I was at GDC and heard it took four of them, but I can't confirm that anywhere.
 
"It requires incredibly powerful hardware to do quickly and we are only now seeing real-time implementations thanks to Nvidia's Volta GPUs. Epic will use the DirectX Raytracing API (DXR) to make this feature available to Unreal developers later this year."

Volta GPU's. I was at GDC and had heard it took four of them, but can't confirm that anywhere.

What are the chances of the first few iterations of this working like the early PhysX setups? One GPU for the standard rendering, the other(s) dedicated to the ray-tracing calculations?
 