
Epic Games shows off photo-realistic ray tracing demos, coming to Unreal Engine soon

By William Gayde · 20 replies
Mar 22, 2018
  1. It is usually pretty easy to tell the difference between a computer-generated scene and a real-life one, but some new demos from Epic Games are making that harder and harder.

    Epic hosted its State of Unreal presentation at the Game Developers Conference in San Francisco this week, where it presented a collection of scenes showing off its new real-time ray tracing, motion capture and real-time facial animation mapping.

    Also read: Microsoft's 'DirectX Raytracing' technology aims to bring movie-quality lighting to video games

    This Star Wars scene showcases Epic Games' implementation of Nvidia's new RTX feature. Ray tracing is a rendering technique that traces the paths of light rays through a scene to produce realistic shading, reflections and depth of field. It requires incredibly powerful hardware to do quickly, and we are only now seeing real-time implementations thanks to Nvidia's Volta GPUs. Epic will use the DirectX Raytracing (DXR) API to make this feature available to Unreal developers later this year.
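    The core idea is simple to sketch: for each pixel, cast a ray from the camera into the scene, find the nearest surface it hits, and shade that point based on the light direction. Below is a minimal, illustrative Python sketch (one sphere, one directional light, Lambertian shading); the names and scene setup are invented for this example and have nothing to do with Epic's or Nvidia's actual implementations, which run vastly more rays with far more sophisticated shading.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0.
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so the quadratic's a = 1
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade_pixel(x, y, width, height):
    # Camera at the origin looking down -z; map the pixel to a ray direction.
    origin = (0.0, 0.0, 0.0)
    u = (x + 0.5) / width * 2.0 - 1.0
    v = 1.0 - (y + 0.5) / height * 2.0
    direction = normalize((u, v, -1.0))
    center, radius = (0.0, 0.0, -3.0), 1.0  # a single sphere in front of the camera
    t = hit_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # background: black
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(sub(hit, center))
    light_dir = normalize((1.0, 1.0, 1.0))
    # Lambertian (diffuse) term: brightness falls off with the angle to the light.
    return max(0.0, dot(normal, light_dir))
```

    A pixel near the center of the image (looking straight at the sphere) returns a positive brightness, while a corner pixel misses and returns 0. Real ray tracers repeat this per pixel with many rays for reflections, shadows and depth of field, which is why real-time performance has needed Volta-class hardware.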

    If you still don't believe the scene was computer-generated, here is a behind-the-scenes explanation of what is going on.

    Although not as flashy as the Star Wars scene, the next demo is just as incredible. Siren is a digital human developed through a partnership between Epic Games, Tencent, Cubic Motion and 3Lateral. It was created by mapping the appearance of one actor onto the movements of another.

    Motion capture technology isn't new but recent advancements have made the rendering process easier and have constantly been increasing its realism. Epic also showed a demo of 3Lateral's Osiris character. Note how realistic the facial animations and mannerisms are, then remember that this is all being rendered in real-time.

  2. Burty117

    Burty117 TechSpot Chancellor Posts: 3,351   +1,133

    Did they mention anywhere what hardware they had to get this running in realtime?
     
    stewi0001 likes this.
  3. davislane1

    davislane1 TS Grand Inquisitor Posts: 5,231   +4,374

    Probably a couple of Titan Vs in SLI paired with an i9.
     
  4. stewi0001

    stewi0001 TS Evangelist Posts: 2,002   +1,393

    That is exactly what I would like to know. Considering that they never say, I'm betting it requires some heavy horsepower; otherwise they would brag about it not needing such.
     
  5. yRaz

    yRaz Nigerian Prince Posts: 2,646   +1,870

    The second one isn't as believable. Photo-realism doesn't work as well on matte surfaces, which is probably why her face seems a little weird.
     
  6. Prosercunus

    Prosercunus TS Maniac Posts: 258   +105

    The fact it's rendered in real time is very cool. You're not wrong though, the face and movements look off but still pretty realistic.
     
  7. TomSEA

    TomSEA TechSpot Chancellor Posts: 2,972   +1,320

    Amazing progress.

    Wasn't that long ago when Pac-Man was considered state-of-the-art computer graphics.
     
    poohbear likes this.
  8. Puiu

    Puiu TS Evangelist Posts: 3,131   +1,559

    It was running on a $60k Nvidia DGX Station, which has 4x Tesla V100 GPUs.
    Also, the 24fps Star Wars scene was not fully ray-traced, just parts of it. It's a hybrid using normal rasterization and ray tracing.

    From my understanding, the key innovation seems to be dynamically scaling the number of rays used for ray tracing depending on the importance of the scene/object/etc. The rest still needs brute-forcing with as much hardware as you can throw at it. We are still 3-4 years away from this technology truly becoming viable for games.
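    The hybrid approach described in this comment can be sketched in a few lines: rasterize every pixel cheaply, then spend a fixed ray budget only on the pixels flagged as most important (mirrors, chrome, soft shadows). This is a toy illustration of the idea, with made-up names and values; it is not Epic's or Nvidia's actual pipeline.

```python
def hybrid_frame(pixels, importance, ray_budget):
    """Toy hybrid renderer: every pixel gets a cheap rasterized color;
    the limited ray budget is spent where importance is highest."""
    # Rasterize pass: flat placeholder shading for every pixel.
    frame = {p: 0.5 for p in pixels}
    # Ray-trace pass: most important pixels first, until the budget runs out.
    for p in sorted(pixels, key=lambda p: importance[p], reverse=True)[:ray_budget]:
        frame[p] = 1.0  # stand-in for an expensive traced result
    return frame

# Hypothetical scene: only the reflective surfaces get traced.
pixels = ["floor", "mirror", "wall", "chrome_droid"]
importance = {"floor": 0.1, "mirror": 0.9, "wall": 0.2, "chrome_droid": 0.8}
frame = hybrid_frame(pixels, importance, ray_budget=2)
```

    Here "mirror" and "chrome_droid" receive the traced result while "floor" and "wall" keep the raster shading, which is the gist of spending rays only where they visibly matter.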
     
    Last edited: Mar 22, 2018
  9. GreyFoxx

    GreyFoxx TS Booster Posts: 85   +61

    DGX in the UK is £75,000.00
     
  10. Jamlad

    Jamlad TS Maniac Posts: 172   +155


    Only 40 years. So, half a lifetime?
     
    poohbear likes this.
  11. regiq

    regiq TS Addict Posts: 203   +80

    You could "achieve" this look on real footage with poor lighting and make up but the animation should be less "edgy" to make it look real.
     
  12. TomSEA

    TomSEA TechSpot Chancellor Posts: 2,972   +1,320

    (shrugs) First electronic computer showed up in 1937, so I'm still thinking relatively new. ;-)
     
  13. Badelhas

    Badelhas TS Booster Posts: 87   +31

    I don't believe we'll see any of this in real games any time soon. We've been here before. Remember the 2011 Epic Games "Samaritan" demo? They promised that level of photo-realistic graphics was coming, but games have stalled graphically since Crysis 1 in 2007, in my opinion. I blame consoles.
     
  14. Badelhas

    Badelhas TS Booster Posts: 87   +31

  15. yRaz

    yRaz Nigerian Prince Posts: 2,646   +1,870

    Look at her teeth, that's obvious. Then her skin just seems odd; it's a matte color, but there is no texture to it.
     
  16. pcnthuziast

    pcnthuziast TS Guru Posts: 405   +43

    Not saying they're crappy demos, but all it really showed me is how much farther we have to go.
     
  17. richcz3

    richcz3 TS Enthusiast Posts: 28   +11

    NVIDIA RTX Technology running on NVIDIA Volta architecture GPUs
     
  18. qking

    qking TS Booster Posts: 48   +28

    It still looks like the movement is out of sync with the audio. The movement is too slow, not snappy enough, and lags behind the audio. It appears to be a poor match of voice and voice tempo to the shapes created by the mouth, neck and lips. They need to pair voices better, or fine-tune. Still wicked for real time, though.
     
  19. Bubbajim

    Bubbajim TS Evangelist Posts: 497   +477

    Well, yeah, but the pointy stick was the height of advanced technology for like tens of thousands of years...
     
  20. BananaKing

    BananaKing TS Rookie

    "It requires incredibly powerful hardware to do quickly and we are only now seeing real-time implementations thanks to Nvidia's Volta GPUs. Epic will use the DirectX Raytracing API (DXR) to make this feature available to Unreal developers later this year."

    Volta GPUs. I was at GDC and heard it took four of them, but I can't confirm that anywhere.
     
  21. davislane1

    davislane1 TS Grand Inquisitor Posts: 5,231   +4,374

    What are the chances of the first few iterations of this working like early PhysX setups? One GPU for the standard rendering, the other(s) dedicated to the ray tracing calculations?
     
