Metro Exodus Ray Tracing Benchmarked

Julio Franco

After watching a video of RTX on this game, I now really want an RTX card. I've always loved the Metro games, and I can't think of a title more suited to the technology than this one.

Also great to see a smaller performance hit in this game with RTX on. In my opinion, the technology took a massive step forward with this release.

Now, do I buy a 2080 Ti, or wait until the next RTX flagship releases? (Which I fully expect to also cost $1,200.)
 
Sub-60 fps @ Full HD in 2k19 on a 2080 Ti :joy::joy::joy::joy::joy::joy::joy::joy:

Nice job Nvidia, so you made it run badly on purpose with RTX off so that RTX on doesn't look so bad.

Done with this company. And it doesn't look any better than the previous Metro, just sharper textures.
 
Just bought a new 1440p 165 Hz G-Sync monitor, so I bought a 2080 as well for $650. Not for ray tracing, I could not care less about it, TBH. I bought the 2080 because, in terms of traditional raster rendering, it's going to be about as good as it gets for the next 12 months or so. Sure, a 2080 Ti is better, but it's not an extra $400-$500 better.

If I can use ray tracing in games at decent frame rates, then fine, I'll use it. If not, I'll turn it off and know that my 2080 is playing my games about as fast as they can be played for $650 in 2019.
 
Sub-60 fps @ Full HD in 2k19 on a 2080 Ti :joy::joy::joy::joy::joy::joy::joy::joy:

Nice job Nvidia, so you made it run badly on purpose with RTX off so that RTX on doesn't look so bad.

Done with this company. And it doesn't look any better than the previous Metro, just sharper textures.

lol, done with this company? Sounds like you're not a fan to begin with.

Show me an AMD product that comes close to this performance, even without RTX. I suppose you think market-leading performance should be free?
 
After watching a video of RTX on this game, I now really want an RTX card. I've always loved the Metro games, and I can't think of a title more suited to the technology than this one.

Also great to see a smaller performance hit in this game with RTX on. In my opinion, the technology took a massive step forward with this release.

Now, do I buy a 2080 Ti, or wait until the next RTX flagship releases? (Which I fully expect to also cost $1,200.)

I just bought the most powerful 2070 available for $580 - the MSI RTX 2070 GAMING Z 8G - and it runs at 2080 levels. That's the best bang for your buck right now if ray tracing is important to you.
 
I just bought the most powerful 2070 available for $580 - the MSI RTX 2070 GAMING Z 8G - and it runs at 2080 levels. That's the best bang for your buck right now if ray tracing is important to you.
Sounds like you have a use for it where it makes a difference. (y) (Y)
 
Either I've gone blind or there is no difference between off and on. I looked for minutes, but both images look exactly the same to me. Great feature, well done :D
 
RTX looks good in this game, but the differences are slim. The true revolution in gaming will come when we move to a new generation of consoles with at least RTX 2080 Ti performance. That is when developers will be able to afford to spend more time on better textures, better techniques, etc. Take Red Dead Redemption 2, for example. It doesn't use ray tracing or any fancy tech, but it is a job done right over a few years (unlike most developers, who churn out games on a yearly basis). Snow, grass, skies, animals, even the air looks amazing running on an RX 480-class GPU (Xbox One X).
What Nvidia is trying to do is nice, but we also need developers to invest more time in improving graphics (which means slower release cycles), so it probably won't happen.
 
What I find odd is that GI is not a raytraced feature in CGI to begin with, so calling it that here is an outright misnomer, from Nvidia's end and from the game devs'. The RT cores are simply processing their GI solution. It's not raytraced at all.

So why don't they just use the regular GPU cores for this? You don't need RT cores to process GI calculations, as we've already seen from V-Ray RT, Redshift, Octane, and other GPU renderers.
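To make the "regular cores can do it" point concrete, here's a rough toy sketch of what a diffuse GI estimate boils down to. Everything in it (the one-plane scene, the sample count, the helper names) is made up for illustration; it isn't Metro's or Nvidia's code. The work is just visibility-ray tests plus an average, arithmetic any compute unit can run; dedicated RT cores mainly accelerate the intersection step.

```cpp
// Minimal sketch (assumptions only, not anyone's shipping code): a one-bounce
// diffuse GI / sky-occlusion estimate at a single surface point, done with
// plain floating-point math on the CPU.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Toy "scene": a single horizontal occluder plane at height y = 1.
// Returns true if a ray from 'origin' along 'dir' hits it within range.
static bool occluded(Vec3 origin, Vec3 dir) {
    if (dir.y <= 0.0f) return false;          // ray points away from the plane
    float t = (1.0f - origin.y) / dir.y;      // parametric distance to the plane
    return t > 0.0f && t < 10.0f;
}

// Cosine-weighted hemisphere sample around the up normal (0,1,0),
// the standard estimator for diffuse (Lambertian) indirect light.
static Vec3 sampleHemisphere(std::mt19937 &rng) {
    std::uniform_real_distribution<float> u(0.0f, 1.0f);
    float r = std::sqrt(u(rng));
    float phi = 2.0f * 3.14159265f * u(rng);
    float x = r * std::cos(phi), z = r * std::sin(phi);
    float y = std::sqrt(std::fmax(0.0f, 1.0f - x * x - z * z));
    return {x, y, z};
}

int main() {
    std::mt19937 rng(42);
    Vec3 point{0.0f, 0.0f, 0.0f};   // the surface point being shaded
    float skyRadiance = 1.0f;       // light arriving where the sky is visible
    const int samples = 256;

    // Average visibility-weighted sky light over the hemisphere. This sum *is*
    // the GI term; the math is the same whether the visibility rays are traced
    // by dedicated RT cores or by ordinary shader/CUDA cores.
    float indirect = 0.0f;
    for (int i = 0; i < samples; ++i) {
        Vec3 dir = sampleHemisphere(rng);
        Vec3 origin = add(point, scale(dir, 0.001f));  // nudge off the surface
        if (!occluded(origin, dir))
            indirect += skyRadiance;
    }
    indirect /= samples;

    std::printf("estimated indirect light at the point: %.3f\n", indirect);
    return 0;
}
```

In a real frame the per-pixel sample and ray counts are orders of magnitude higher, which is where fixed-function intersection hardware earns its keep even if it isn't strictly required.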
 
What I find odd is that GI is not a raytraced feature in CGI to begin with, so calling it that here is an outright misnomer, from Nvidia's end and from the game devs'. The RT cores are simply processing their GI solution. It's not raytraced at all.

So why don't they just use the regular GPU cores for this? You don't need RT cores to process GI calculations, as we've already seen from V-Ray RT, Redshift, Octane, and other GPU renderers.
As I see it, you would have to dig into the internal workings of Nvidia for the answer to that.

Speculation on my part: given what we have seen in all the benchmarks, it may just be an attempt by their marketing department to add spin to the sales pitch for this generation of cards, which might imply that Nvidia knew this gen's performance improvements were piddling and they needed "spin" to sell the cards.
 
You guys have the i9 9900K. I only have the older i9 7980 and 32 GB of HyperX DDR4.

As far as I've seen, my 2080 Ti runs quietly and at high FPS with RTX turned up.

I haven't tested Metro though.
 
As I see it, you would have to dig into the internal workings of Nvidia for the answer to that.

Speculation on my part: given what we have seen in all the benchmarks, it may just be an attempt by their marketing department to add spin to the sales pitch for this generation of cards, which might imply that Nvidia knew this gen's performance improvements were piddling and they needed "spin" to sell the cards.

Nice to see we agree on something. In my opinion, it will be interesting to see how AMD responds to this with Navi, or whatever comes next from the Red Guys, if only to compare solutions. Will they also add dedicated calculation cores to match Nvidia's, or simply feed all of that through their main GPU cores? I've got some digging to do on the differences between RT cores and regular CUDA cores, but V-Ray Next already uses both as far as I know.

So far, I haven't been able to find the information I saw previously about those RT cores in regular rendering to see what kind of boost they give, but here are some more comparisons between cards:

https://www.pugetsystems.com/pic_disp.php?id=51596

"Surprisingly, the RTX 2070 actually outperformed the more expensive 2080 - just by a hair, but I had expected it to be slower. To verify this result I looked up several systems we have sold with the 2070 and 2080 in recent weeks, and sure enough: the RTX 2080 came in at 68 to 70 seconds, while the 2070 ranged from 67 to 69 seconds."

But then over on the official V-Ray Benchmark site, we don't see the RTX cards keeping up at all with multi-GPU systems in the GPU benchmark. We don't see a 2080 Ti (x4) until 10th place, where it's boosted by a Titan X, and then 14th on its own. Granted, these are all different CPU systems (an Epyc at #3, and that #14 was a TR 1920), but it seems like the RTX flagship should at least keep up with some of the previous-gen tech. A lot of variables in these benches, though. I suppose the best test for this type of application would be with RT cores on and then off.

https://benchmark.chaosgroup.com/gpu

But those types of benches aside, of course the 2080 Ti is still a monster for GPU calcs:

https://techgage.com/wp-content/upl...orce-and-Quadro-Performance-V-Ray-680x451.png
https://techgage.com/article/nvidia-geforce-rtx-performance-in-octanerender-redshift-v-ray/

But regarding that benchmark:
"Remember, this is without the Tensor and RT cores being fully utilized."
 
We all know the underwhelming performance of the current "RTX" cards. They look like prototype cards; you could even call them pre-beta cards.

The future looks good for gaming graphics, but we're not there yet without taking performance hits.

We'll need to wait another 2 or 3 generations before we can play with RTX Ultra settings comfortably and fluidly.

The differences are quite subtle.

If you really have to study the different shots intently and can't tell them apart at a glance, then RTX technology needs to improve a lot before it's worth the asking price.

For now, if we want the REAL RTX, all we need to do is open the door and look at the real-world lighting outside. Nothing beats it.

I miss the days when games were just games and all that mattered was the gameplay.
 
Still not convinced. I see almost no difference, and I don't think it's worth a penny to people who value their money.
Nice try, nGreedia
 