The State of Nvidia RTX Ray Tracing: One Year Later

Modern Warfare got RTX with not much performance loss: a 2060 Super can hit 60 fps at 1440p Ultra with RTX on, while a 2080 Ti can do 60 fps at 4K. Although it's only shadows and the effects are subtle, it does contribute to a more realistic environment.
 
I feel like it's worth pointing out that both of the new consoles coming next year will be using entirely AMD hardware inside, including for the ray tracing. Because of this, there will likely be a pretty decent increase in the number of games with ray tracing. But these games will also probably be created and tested on AMD hardware, since the consoles aren't using Nvidia. So if you're looking to upgrade any computer parts or just build a new machine, you're probably better off waiting for AMD to release the large Navi die GPU with ray tracing next year. The hardware is much cheaper, and not only is AMD already on 7nm, they have finished Zen 3 on 7nm+ and are already in the middle of 5nm work. And since the consoles will be using AMD tech, there's a good chance AMD ray tracing will run smoother in games than Nvidia's.
 
Yet in MP games it's totally useless, especially in BFV, where the lower the graphics, the higher the advantage. The same applies to CoD:MW 2019. In singleplayer/campaign it makes sense for a true-to-life impression, but in MP every frame is accounted for and investigated if any go missing... :D
 
I bought an MSI RTX 2060 Gaming Z a little after release because it was a very good deal where I live at the time. It is a great gaming card with low temps and low noise; however, I haven't used it for any AAA games with ray tracing. I just don't play those few titles, and I don't really care either. All the games I play run perfectly at 1440p on a 75Hz FreeSync IPS monitor that works flawlessly with G-Sync. That FreeSync support from Nvidia was the big thing for me.

I only tried the Quake 2 RTX demo, and I could hardly lower the ray tracing to a level where the fps was playable, so clearly this will take years to be adopted across the industry, and definitely not with first-gen RT cards.

I agree it is promising for the future, but definitely nothing exciting for me today. It is a bit disappointing that the RT features I paid for are nowhere to be found in games in a playable fashion.

All in all, I am very happy with my RTX 2060, even without enjoying any RT features.
 
Looks like ray tracing is gaining ground and folks without it are missing out. When you can see the difference in screenshots, it will be very noticeable in many situations, as the tester implied. Even though it's still in its infancy, it's just another reason Nvidia is, has been, and will continue to be the best gaming GPU money can buy at every price point. And like G-Sync compared to FreeSync, and many of their other much-copied technologies, it will be more polished as well.
The other HUGE and MASSIVE awesomeness factor no one really thought of initially is the implementation in older games. If they do remake HL2 with ray tracing, yours truly will smash through the campaign one last time down the road when I get an RTX GPU. If they redo Unreal with it, I will sell my overclocked-to-the-beans, RTX 2070-smoking GTX 1080 and get an RTX card.
Great review; so far so good with ray tracing. Steam gamers know what's up, which is why 80-90% use Nvidia.
And to the haters and naysayers: A) it's obvious, every thread, just an FYI, and B) umm... make more money, I suppose.

Awesome!

You must be new to computer games...!


Back in 2015, John Carmack told us that the GAMING INDUSTRY was centered on 2020 for the release of widespread ray tracing in games. Many other leading developers have also targeted 2020 as the year in which they, too, would introduce low-level ray tracing to their games. This didn't just happen; it was all fashioned around the Microsoft OS and the AMD consoles having the framework in place.

DirectX Raytracing (DXR) is a known thing and has been discussed in detail for many years.



That said^...
It was Jensen at Nvidia who decided to take a massive datacenter chip and sell a broken version of it to gamers, to try and market Nvidia as a leader in ray tracing. But that is not what happened...

Because Nvidia had to pay their own engineers to work with developers to implement Nvidia's proprietary RTX into their games.

Most devs would rather just be coding for DXR instead, since DXR is universal and doesn't need special coding. The special coding exists to make use of Nvidia's gimmick hardware (passed down from the datacenter), so Nvidia could claim "it just works" and that it used RTX.

All a hoax, and now most games coming out in 2020 will simply use DXR, as all gaming hardware coalesces around this gaming standard. No developer is going to bother with the added cost of another proprietary API just to perpetuate Jensen's datacenter hardware hoax. And the reason Jensen had teams working closely with developers is that developers don't want to incur the added RTX cost when DXR is where the mainstream is.

Nvidia's Turing is not as advanced as AMD's scalable RDNA. That is why you see Nvidia rebranding all their cards as "SUPER". All because of one single RDNA die, called Navi 10.
 
It would be an epic fail not to buy a first-generation RTX card for ray tracing! It's not as if future generations will offer better ray tracing performance for less money. Plus, you want to be on that vanguard bandwagon! Remember Nvidia HairWorks? The list of games that support it is now up to five!... if you include Nvidia VR Funhouse as a game... and The Witcher 3 and its DLC as two separate games. AMD video card owners were left cursing at their monitors, unable to play Call of Duty: Ghosts in all its wavy-hair glory.
Off the top of my head: Call of Duty: Ghosts, Far Cry 4, Final Fantasy XV, Monster Hunter Online, and The Witcher 3. There's also another game I don't remember (a MOBA, an MMO or something like that). I'd argue that Nvidia GameWorks effects are actually more noticeable than ray tracing. But just like RT, they have an incredible impact on performance that's not worth the trade for the visual enhancements.
 
Off the top of my head: Call of Duty: Ghosts, Far Cry 4, Final Fantasy XV, Monster Hunter Online, and The Witcher 3. There's also another game I don't remember (a MOBA, an MMO or something like that). I'd argue that Nvidia GameWorks effects are actually more noticeable than ray tracing. But just like RT, they have an incredible impact on performance that's not worth the trade for the visual enhancements.

Visual enhancements must come before performance; no one is interested in CSGO graphics at 300+ fps for single-player games. It is the basis on which AMD/Nvidia sell their hardware anyway. Just think of GameWorks and RTX as settings above Ultra.
 
A few great replies about this feature from a Reddit discussion, which I thought I'd share here:

Give it time. Back when 3D accelerators first came out, people dropped $600 on a card that would give them 30fps at 640x480, and they still needed to spend $400 on a good quality 2D card on top of that.

IMO the "disappointing" results of ray tracing are a testament to the artists and programmers who came up with all the "tricks" they could over the years to make games look as good as they do. In most cases it is subjective whether ray tracing looks better or not; often you need to point a finger to tell someone what is actually different.
In none of their pictures can I say, "yes, this is an impressive improvement". Even in the Quake 2 screenshots, the biggest improvement actually comes from the high-resolution textures, and if you compare it to other graphics mods with those textures plus bump mapping and specular lighting, it doesn't look impressive at all.

This is what I've been saying since the start of the RT hype in games. Devs have had to make rasterization look pretty over decades due to HW limitations. They've gotten so good at it, with character models in Death Stranding, for example, running on a weak console and looking like pre-rendered quality from not too long ago.

Even sub-surface light scattering on their skin, moisture layers on lips, eyes, etc.

We've had great lighting, shadows and GI without the massive performance hit because it was a necessity for devs to solve these visual problems with the least amount of graphics hardware grunt required.
 
The author makes a mistake by calling RTX "ray tracing".

RTX can do ray tracing, but RTX is its own API... one that Nvidia engineers have to work with developers to implement (it just doesn't work on its own). Ray tracing proper is something like DX12's DXR: a developer just makes the game, and whatever video card receives the DXR calls renders the ray tracing. No specific or re-engineered code is needed to get it to work on specific hardware.
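For what it's worth, here's a minimal sketch of how vendor-agnostic the DXR side looks from an application's perspective; `SupportsDXR` is a made-up helper name, and `device` is assumed to be an ID3D12Device created elsewhere:

```cpp
// Minimal sketch: detecting DXR support through plain Direct3D 12.
// No vendor-specific API is involved; whether the driver executes rays
// on dedicated RT cores (Turing) or on compute shaders (the Pascal
// fallback) is invisible at this level.
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device)  // hypothetical helper; device created elsewhere
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // Tier 1.0 or higher means the driver exposes DirectX Raytracing.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```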

Jensen tried really hard to market RTX = ray tracing, and many people bought into his proprietary hoax. Most of the RTX hardware is not used in DXR.
Not at all and quite the contrary. Ray tracing is one thing, and RTX is Nvidia's consumer implementation that is the "first" to gain support in games. This is an evaluation of that first year since the first RTX GPUs arrived.

Nobody implied RTX = ray tracing. But for now, it is the sole hardware implementation available.
 
Techspot is good
Techspot is great
Ray Tracing rocks
DXR wears Ray Tracing socks
 
m3tavision said:
Most devs would rather just be coding for DXR instead, since DXR is universal and doesn't need special coding. The special coding exists to make use of Nvidia's gimmick hardware (passed down from the datacenter), so Nvidia could claim "it just works" and that it used RTX.

All a hoax, and now most games coming out in 2020 will simply use DXR, as all gaming hardware coalesces around this gaming standard. No developer is going to bother with the added cost of another proprietary API just to perpetuate Jensen's datacenter hardware hoax. And the reason Jensen had teams working closely with developers is that developers don't want to incur the added RTX cost when DXR is where the mainstream is.

You obviously don't get what DXR is.
Ray-traced games are built on DirectX 12's DirectX Raytracing API, DXR; it's not something Nvidia is restricting through gaming drivers or game engines. You may be an IT journalist, but I'm an IT specialist; if you plan to confront me and say something like 'I am new to PC gaming', try not to sound so clueless when doing so.

Not at all and quite the contrary. Ray tracing is one thing, and RTX is Nvidia's consumer implementation that is the "first" to gain support in games. This is an evaluation of that first year since the first RTX GPUs arrived.

Nobody implied RTX = ray tracing. But for now, it is the sole hardware implementation available.

I was going to try and explain this all to him, but he'll figure it out if he keeps researching.
 
You obviously don't get what DXR is.
Ray-traced games are built on DirectX 12's DirectX Raytracing API, DXR; it's not something Nvidia is restricting through gaming drivers or game engines. You may be an IT journalist, but I'm an IT specialist; if you plan to confront me and say something like 'I am new to PC gaming', try not to sound so clueless when doing so.

I was going to try and explain this all to him, but he'll figure it out if he keeps researching.


No, I understand exactly what DXR is.

I was making a specific point: Turing has specific hardware on its chip, making use of that hardware is called RTX, and no developers are using "RTX On" (as Jensen first tried to market it) except for the ones Nvidia itself is working with. Because it is NOT the same as DXR. Even though Turing can do DXR, it doesn't use the specific RTX hardware to do it.

As a matter of fact, Navi and Nvidia's GTX cards can do DXR. But when using DXR, Turing's added hardware is not used. You paid for a faster GPU and got it, but "RTX On" is essentially a hoax... because it's just ray tracing that any GPU can do.

It's only that faster cards... are faster at DXR.

Not because of specific RTX hardware. Games that make use of Turing's added "RTX hardware" have to be specifically coded for it. And that's the reason some developers came out and openly admitted they are supporting DXR, not RTX, and are no longer soliciting Nvidia for help with Nvidia-only coding.

It doesn't just work...
 
Nvidia said:
The Turing architecture used in GeForce RTX GPUs was designed from the start for DXR-type workloads. Pascal, on the other hand, was launched in 2016 and was designed for DirectX 12. As such, Pascal and older-generation GPUs were not designed like Turing.
RT Cores on GeForce RTX GPUs provide dedicated hardware to accelerate BVH traversal and ray / triangle intersection calculations, dramatically accelerating the ray tracing process. On GeForce GTX hardware, these calculations are performed on the programmable shader cores, a resource shared with many other graphics functions of the GPU.
On Pascal-architecture GPUs, we see that ray tracing and all other graphics rendering tasks are handled by FP32 Pascal shader cores. This takes longer to perform, meaning the gamer encounters a lower framerate. The Turing architecture introduced INT32 Cores that operate simultaneously alongside FP32 Cores, helping reduce frame time. And with RT Cores and Tensor Cores, execution time shrinks significantly, translating to 2-3x faster in-game performance.
The only difference with the Turing architecture, as Steve has stated in reviews, is how the hardware handles the execution of DXR workloads. You keep saying 'specially coded for' like it's something Nvidia is restricting, which still means you do not understand.

I think you need to read this in its entirety.
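To give a sense of the work those RT cores offload, here is a toy CPU-side sketch of the ray/triangle intersection test (the classic Möller-Trumbore algorithm), one of the two operations the quote above says Turing accelerates in fixed-function hardware. This is purely illustrative, not Nvidia's implementation; all names are my own.

```cpp
// Toy sketch of Moller-Trumbore ray/triangle intersection. RT cores run
// tests like this in dedicated hardware, millions of times per frame;
// on GTX cards the same math competes with shading work on the FP32 cores.
#include <array>
#include <cmath>
#include <optional>

using Vec3 = std::array<float, 3>;

static Vec3 sub(Vec3 a, Vec3 b)   { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
static float dot(Vec3 a, Vec3 b)  { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
}

// Returns the distance t along the ray if it hits triangle (v0, v1, v2).
std::optional<float> Intersect(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2)
{
    const Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    const Vec3 p  = cross(dir, e2);
    const float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return std::nullopt;  // ray parallel to triangle
    const float inv = 1.0f / det;
    const Vec3 tv = sub(orig, v0);
    const float u = dot(tv, p) * inv;                 // first barycentric coordinate
    if (u < 0.0f || u > 1.0f) return std::nullopt;
    const Vec3 q = cross(tv, e1);
    const float v = dot(dir, q) * inv;                // second barycentric coordinate
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    const float t = dot(e2, q) * inv;
    if (t <= 1e-8f) return std::nullopt;              // hit is behind the ray origin
    return t;
}
```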
 
> Especially in the various train scenes, the game looks natural and well lit with RTX on [...]

You must certainly be joking, right?

The right side of the train carriage there in your Metro Exodus screenshots is just one big row of windows. On a bright sunny day (as can be seen outside the carriage windows in the screenshots), the interior of the carriage should be rather brightly lit, and absolutely not look as dark as a cellar. Frankly, from all three screenshots, the one with "RTX off" seems to be "more" natural (relatively speaking), as it is the one with the least dark carriage interior.

Now, how the game looks and feels in motion with "RTX on" I can't judge, but I really can't figure out how one would be able to agree with the argument of the game looking "natural and well lit with RTX on" while looking at those screenshots.

And with regard to the screenshots of the Two Colonels DLC, I can't avoid concluding that the "RTX off" scenery/visuals have been willfully botched to look substantially worse than their "RTX on" counterparts. Yeah, render almost the whole scene near-black so it appears much, much darker than the already dark "RTX on" output. How am I supposed to take those comparison screenshots (or that DLC itself) seriously with a straight face?

> We suspect and hope this coming year of ray tracing will be a lot better than the first.

I hope so too, but as others already opined, the breakthrough will likely come with the upcoming console generation. So far, the current state of real-time RT on consumer hardware is nothing more than a proof of concept.
 
> Especially in the various train scenes, the game looks natural and well lit with RTX on [...]

You must certainly be joking, right?

The right side of the train carriage there in your Metro Exodus screenshots is just one big row of windows. On a bright sunny day (as can be seen outside the carriage windows in the screenshots), the interior of the carriage should be rather brightly lit, and absolutely not look as dark as a cellar. Frankly, from all three screenshots, the one with "RTX off" seems to be "more" natural (relatively speaking), as it is the one with the least dark carriage interior.

Now, how the game looks and feels in motion with "RTX on" I can't judge, but I really can't figure out how one would be able to agree with the argument of the game looking "natural and well lit with RTX on" while looking at those screenshots.

And with regard to the screenshots of the Two Colonels DLC, I can't avoid concluding that the "RTX off" scenery/visuals have been willfully botched to look substantially worse than their "RTX on" counterparts. Yeah, render almost the whole scene near-black so it appears much, much darker than the already dark "RTX on" output. How am I supposed to take those comparison screenshots (or that DLC itself) seriously with a straight face?

> We suspect and hope this coming year of ray tracing will be a lot better than the first.

I hope so too, but as others already opined, the breakthrough will likely come with the upcoming console generation. So far, the current state of real-time RT on consumer hardware is nothing more than a proof of concept.

Hm, so you think wooden surfaces can bounce light like walls with white paint? Global illumination makes the parts of the scene not shone on directly by the sun darker, and that is how it is in real life. You can't look at evenly lit corridors and call them more natural looking.
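As a rough back-of-the-envelope (the albedo numbers below are ballpark assumptions, not measurements): each diffuse bounce scales the light by the surface's reflectance, which is why a corridor lit mostly by light bounced off dark wood stays dim:

```cpp
// Back-of-envelope sketch: each diffuse (Lambertian) bounce multiplies
// the light by the surface albedo, so bounce lighting off dark wood is
// far weaker than off white paint. Albedo values are rough assumptions.
#include <cstdio>

int main()
{
    const float sun          = 1000.0f; // direct light hitting a surface (arbitrary units)
    const float albedo_paint = 0.8f;    // assumed reflectance of white paint
    const float albedo_wood  = 0.3f;    // assumed reflectance of varnished wood

    std::printf("one bounce off white paint: %.0f\n", sun * albedo_paint);              // 800
    std::printf("one bounce off wood:        %.0f\n", sun * albedo_wood);               // 300
    std::printf("two bounces off wood:       %.0f\n", sun * albedo_wood * albedo_wood); // 90
    return 0;
}
```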

Nvidia is shipping RTX hardware to developers now, and future DXR games are being developed on RTX hardware atm. One thing for certain is that each iteration of RTX will be better implemented than the previous one; just look at Control and the newly released COD Modern Warfare.

Yes, when DXR is ubiquitous, the current buyers of Navi will be left in the dust. If you think Navi can handle DXR, take a look at how the 1080 Ti plays Modern Warfare with RTX:
https://www.pcgameshardware.de/Call...tracing-Test-Release-Anforderungen-1335580/2/
 
Hm, so you think wooden surfaces can bounce light like walls with white paint?
No, I don't think that. But what does that have to do with what I argued in my previous comment? Just because the corridor is not painted white, would you think the corridor has to be almost completely dark despite one side of the corridor being nothing but windows? Come on, you can't be serious.

It would make a stronger point if one were to use, you know, actual nature as a reference for the argument about what is or is not "natural" and what is or is not "well lit". Here, let me do it first. Some photos of wooden train carriage corridors on bright days:


Even with curtains in front of the window, look how well lit this picture is:

www.monkeyshrine.com/wp-content/uploads/2017/01/Corridor-on-Russian-train-train-7-Vladivostok.jpg
Yes, it has a bright carpet and some weak-ish ceiling lights on, but pay attention to the wooden wall/doors on the right side and how they are illuminated by the daylight coming through the windows.

Sure, with regard to the photos, one could start arguing about how much the chosen exposure time affects the perceived brightness, but it is still pretty clear from the pictures that on a bright day the corridor interior would look anything but dark...
 
Yeah, sorry to say, what you are talking about has nothing to do with RTX. How lit the corridor is has no importance here; that depends on what brightness setting you use, or the atmosphere the developers want. What matters with RTX is what is being lit and what is not.
Take a look at your photo and you will see that the areas in red circles are not lit, because no light is being shone there:
[attached image: the quoted corridor photo, annotated with red circles marking areas that receive no direct light]


Another photo of a train corridor:
[attached image: another train corridor photo, with red circles marking unlit areas]

Do you see how those circled areas are not lit?

Now look at the non-RTX picture that Techspot used:
[attached image: Techspot's Metro Exodus screenshot with RTX off]


It is not natural that those areas in red circles are lit, because no light is being shone there; just compare it with the RTX-on picture and you will notice that RTX on makes it more photo-realistic. Of course, developers can imitate these shadows by spending a ton of human resources creating fake shadows, or they can just use RTX.
(I don't know why the sunshine in the RTX On Ultra shot is less intense than in RTX High, if that is what you meant; probably different times in the game have different intensities, you know, just like real life.)
On another note, when the author said "the game looks natural and well lit" with RTX, I think he meant the scenes are being lit correctly.
 
Yeah, sorry to say, what you are talking about has nothing to do with RTX. How lit the corridor is has no importance here; that depends on what brightness setting you use, or the atmosphere the developers want. What matters with RTX is what is being lit and what is not.
You might want to revisit my first comment on the topic. It is pretty clear what I was talking about, and it was not about display or in-game brightness settings or artistic choices made by the designers/developers. Note the quote I began my first comment with. As with your previous comment, it seems you again missed what I was arguing about.

Also, I am quite impressed by how you managed to completely ignore the brightly lit wall/doors there in the photos. How about, instead of making a couple of tiny red circles in the images, you make a big red circle around the brightly lit wall/doors in the photos, and then an equally big red circle around the very dark wall/doors in the "RTX on" screenshots? Try it, it's not that difficult... ;-)
 
You might want to revisit my first comment on the topic. It is pretty clear what I was talking about, and it was not about display or in-game brightness settings or artistic choices made by the designers/developers. Note the quote I began my first comment with. As with your previous comment, it seems you again missed what I was arguing about.

Also, I am quite impressed by how you managed to completely ignore the brightly lit wall/doors there in the photos. How about, instead of making a couple of tiny red circles in the images, you make a big red circle around the brightly lit wall/doors in the photos, and then an equally big red circle around the very dark wall/doors in the "RTX on" screenshots? Try it, it's not that difficult... ;-)

Yeah, you might wanna read the last part of my post, since I added one last sentence while you were typing yours.
Again, you misunderstood the author; what he meant by "well lit" is that the scenes in the game are being lit correctly.

Take two seconds to change your brightly lit corridor to this, just by changing the brightness and contrast; tbh the stock photo looks overexposed.
[attached image: the same corridor photo with brightness and contrast turned down]

Do you see the shadows now? Metro Exodus is an FPS horror game; of course the developers want it to look dark. And well, if you can't handle horror games, just turn the brightness all the way up, that will put ya at ease.
 
The only difference with the Turing architecture, as Steve has stated in reviews, is how the hardware handles the execution of DXR workloads. You keep saying 'specially coded for' like it's something Nvidia is restricting, which still means you do not understand.

I think you need to read this in its entirety.

Again, you are mistaken as to what Nvidia is claiming and how it works. You are using marketing instead of actual DXR benches. (The code for GTX and RTX is different...)

For DXR, the developer only has to develop for DirectX.
For RTX, developers have to spend extra time writing special code to get the RT cores to work... (it doesn't just work).

Many of these developers didn't want to spend the extra time, so Jensen started working with them, so he could claim his special RT cores speed things up... (True, but ONLY IF THEY CODE FOR IT...!) But using the same exact code, RTX cards are only slightly faster than GTX, because they are innately faster cards.

That is the key point... GTX cards can do DXR.
In the future, there are going to be very few true "RTX On" games, because (again) most developers will be using straight DirectX. And once the consoles land, nobody will worry about Nvidia's specific hardware, or even code for it... unless Nvidia pays them.

Ampere will be more like Navi.... because of this.

Not too many developers today are going to go down the route they did with Nvidia's PhysX when the upcoming consoles won't support Nvidia's proprietary method. "Some rich kids bought a 2080 Ti" is no excuse. It is Nvidia's problem to set up games that make specific use of their specific RT cores.
 
Again, you are mistaken as to what Nvidia is claiming and how it works. You are using marketing instead of actual DXR benches. (The code for GTX and RTX is different...)

For DXR, the developer only has to develop for DirectX.
For RTX, developers have to spend extra time writing special code to get the RT cores to work... (it doesn't just work).

Many of these developers didn't want to spend the extra time, so Jensen started working with them, so he could claim his special RT cores speed things up... (True, but ONLY IF THEY CODE FOR IT...!) But using the same exact code, RTX cards are only slightly faster than GTX, because they are innately faster cards.

That is the key point... GTX cards can do DXR.
In the future, there are going to be very few true "RTX On" games, because (again) most developers will be using straight DirectX. And once the consoles land, nobody will worry about Nvidia's specific hardware, or even code for it... unless Nvidia pays them.

Ampere will be more like Navi.... because of this.

Not too many developers today are going to go down the route they did with Nvidia's PhysX when the upcoming consoles won't support Nvidia's proprietary method. "Some rich kids bought a 2080 Ti" is no excuse. It is Nvidia's problem to set up games that make specific use of their specific RT cores.

Funny that COD Modern Warfare supports DXR on Pascal on day 1 without any patch, and the performance on the 1080 Ti with DXR is horrible.

Without DXR
[attached image: Modern Warfare benchmark chart without DXR]


With DXR
[attached image: Modern Warfare benchmark chart with DXR enabled]


Source https://www.pcgameshardware.de/Call...tracing-Test-Release-Anforderungen-1335580/2/

Yeah, the 1080 Ti is pretty much slower than or equal to the 2060 in any game with DXR on. Let's say AMD has the resources to support DXR: without the dedicated hardware, there is just no hope for Navi to compete with RTX cards on DXR performance. That's why AMD is ignoring DXR for now and hoping that people like cheap rasterization performance better. Good luck spending more money on a higher-end Navi that can't use DXR, though.
 
I am glad I didn't get sucked into Nvidia's BS with RT.

I am still running a Vega 56 I got last December for $229 on sale, and I will most likely wait until 2021 to upgrade, and even then only if the prices are right. The 20-series cards are not worth what Nvidia is charging at all. The AMD 5700/XT aren't worth it either, since they are basically GTX 1080 performance, and a 1080 can be had a lot cheaper than the 5700 series.
 
I am glad I didn't get sucked into Nvidia's BS with RT.

I am still running a Vega 56 I got last December for $229 on sale, and I will most likely wait until 2021 to upgrade, and even then only if the prices are right. The 20-series cards are not worth what Nvidia is charging at all. The AMD 5700/XT aren't worth it either, since they are basically GTX 1080 performance, and a 1080 can be had a lot cheaper than the 5700 series.

Yup, on price to performance the 5700/XT lose to Vega 56 and offer nothing new feature-wise. Sadly AMD doesn't make any money off Vega 56, so they just killed it off.
Buying a GPU without DXR right now just does not make any sense, though. Wanna save money? Then you shouldn't be playing games in the first place. If you do enjoy games, and gaming literally saves you money compared to other stupid pastimes (drinking, smoking weed, etc.), then why save a few dollars and give up the best experience you can have?
 