Is Ray Tracing Worth the FPS Hit? 36 Game Performance Investigation

It doesn't take a fanboy to understand that Nvidia-sponsored RT implementations are not optimized for AMD cards; heck, they are not even optimized for Nvidia cards. The only thing they are optimized for is making all current-gen cards obsolete so they can sell you a new one. They didn't even try to make it run on anything but the 4090.
 
Anyone remember the days of having separate 2D and 3D cards? Maybe the power play is to make a dedicated ray tracing or path tracing card?

This is exactly what I wrote in a Hardware Unboxed video comments thread on YouTube when ray tracing was first a thing. It absolutely requires a separate card with nothing but RT cores on it to be effective. It's too intensive a process to be a mere add-on to existing graphics card technology. And six years later, the horrendous ROI proves it.
 
When ray tracing was first a thing, I posted a comment on a Hardware Unboxed YouTube thread predicting that ray tracing would take off and replace rasterization, and that DLSS was a dead technology. Wow, that didn't age well, did it? ;) It's okay, I don't mind being wrong. :D What I do mind, however, is just how wrong I was, but that's all on NVIDIA, not me.

The implementation of ray-tracing technology has been absolutely horrendous. NVIDIA refuses to give consumers the necessary amount of RT hardware at an affordable price, whether as an add-on card or as a more generous count of RT cores on its GPUs. Even the highest-end cards can't run the technology at a level consumers find satisfactory, so it's not worth enabling, and so the adoption rate sinks. And now RT, which could and should be the next revolution in PC graphics technology, is nothing more than an also-ran six years later.

DLSS is massively improved, and actually took off the way ray-tracing was expected to instead. But I'd argue that even this is not a good thing, because NVIDIA is just using it as a substitute for the proper actions they should be taking, namely giving us more RT cores at an affordable price. And even with DLSS we still can't get acceptable results from games that actually implement proper ray traced visuals well.

This isn't rocket science. Look at how fast rasterized 3D graphics became a standard in our lifetime, and look how long it's taking ray tracing to do so, if it ever does. If you don't make a technology available at an affordable price en masse, it will not be adopted en masse. Until NVIDIA pulls its head out of Jensen's ***, nothing is going to change.
 
This is exactly what I wrote in a Hardware Unboxed video comments thread on YouTube when ray tracing was first a thing. It absolutely requires a separate card with nothing but RT cores on it to be effective. It's too intensive a process to be a mere add-on to existing graphics card technology. And six years later, the horrendous ROI proves it.
Well, this ray tracing fad went from "a cool gimmick" to "everyone needs it all the time and you're gonna pay 200% the price of raster for the privilege"

Don't get me wrong, I see RT and PT as awesome tech, and I'm happy to see it. The thing is, I wish it were optional. I wish I could buy a $500 GPU and stick a $200-300 RT card in later if I wanted it. I don't like how midrange cards that can't run RT properly ANYWAY have die space taken up on them for it. Whether your GPU can use RT effectively or not, you are still paying for it in the form of die space.

Heck, bring back socketed chips and stick an RT co-processor on the back of the card. Even on high-end cards, you are forced to pay for ray tracing whether or not you even plan to use it.

Consumers really got boned with how Nvidia implemented RT, forcing AMD to follow suit.
 
I just got a 7900 XTX for a bargain, and I do not regret it. Coming from a 3080 ROG Strix, the performance uplift is very good. I also got a UWQHD monitor, so the extra VRAM helps, even if 24 GB is overkill.
AMD's AFMF 2 feature, which generates frames *outside the game* at the driver level, is REALLY efficient, and I think AMD is onto something here. The quality is really good, and getting around 100-140+ FPS in CP 2077 with ray tracing on high is quite the feat IMHO.
AMD's drivers are really good compared to Nvidia's prehistoric interface.
And my last AMD/ATI card was a Radeon HD 7970 GHz Edition, almost 15 years ago.

Overall, I've found I'm not that into ray tracing. If I can enable it and it makes a difference, I will use it, but if it makes no visible difference, then it's off.
The games where it *really* makes a difference are rather few, and with newer cards needing a private power plant, I think I will keep my current setup for quite a while...
 
No...

What I am saying is that Tim and Steve are now doing what Nvidia asked them to do... what they were so vocal about not going to do in the name of journalistic impartiality...

(Ricochet)

🏹 --> :dizzy:

Tim and Steve are evaluating a boatload of games to see if RT is worth it 4 years later. Many people try things again when the situation changes, though some do prefer to make a decision and then permanently stick their head in the sand afterwards regardless of any change.

And their answer here is mostly No with a few exceptions. And Path Tracing will be the future of graphics some time from now. Seems reasonable, so what parts of their conclusions are a problem?
 
Well, this ray tracing fad went from "a cool gimmick" to "everyone needs it all the time and you're gonna pay 200% the price of raster for the privilege"

Don't get me wrong, I see RT and PT as awesome tech, and I'm happy to see it. The thing is, I wish it were optional. I wish I could buy a $500 GPU and stick a $200-300 RT card in later if I wanted it. I don't like how midrange cards that can't run RT properly ANYWAY have die space taken up on them for it. Whether your GPU can use RT effectively or not, you are still paying for it in the form of die space.

Heck, bring back socketed chips and stick an RT co-processor on the back of the card. Even on high-end cards, you are forced to pay for ray tracing whether or not you even plan to use it.

Consumers really got boned with how Nvidia implemented RT, forcing AMD to follow suit.

I totally agree with this. High-end GPUs get RT; lower-tier cards should get faster raster performance for the same money, or the same performance for cheaper. But that's not in Nvidia's or AMD's interest, so...
 
I had a 2070 which I bought just to play Metro Exodus with RT and DLSS.
It was such a bad experience, with temporal artifacts and RT applied only to global illumination (the car headlights didn't cast grass shadows at night the way the sun did during the day).
Within a few days I sold it to a friend.

Sometimes we have to test something before buying, and the online reviews weren't enough.
 
My personal experience with ray tracing effects: absolutely GREAT in Control, nice in many other games, almost distracting in DOOM Eternal.

And absolutely heinous, performance-killing **** in Elden Ring. I have a GeForce RTX 3080 10G, by the way.
 
This is so wrong it should be a criminal offense.

If you have top-shelf gear you aren't playing at 1440p, and especially not 1080p (ffs, a PS2 could crank 1080i).

You burn all that cash to play at 4K, so for those folks (like myself and probably many other forum lurkers) 4K is all you really care about, because if you aren't playing there you've just burned money. I'm still on the fence with RT: yes, it looks great, but that's rare because it needs devs to implement it correctly for it to shine, and many don't, so you end up with wasted performance and terrible image quality.

That may be because Unreal Engine is just a dog; when RT is bolted onto custom-built engines, it seems to shine.
More dots don't do s--t in regards to making something look better. All they do is make the image clearer, not better. Good AO, RT, AA and such are what make a game image pop, which is why it will always be better to play at 1440p and max everything else instead of playing at 4K and having to run the game at Medium, but you do what you want. All I know is that I am enjoying gaming at 1440p and 150+ FPS with everything 100% maxed, and it will forever and ever look better than 4K with Medium settings at barely 45-60.
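The pixel-budget tradeoff described above is easy to sanity-check with back-of-the-envelope math; a minimal sketch (the resolutions used are the standard 1440p and 4K dimensions):

```python
# Pixels shaded per frame at each resolution: the raw cost behind
# the 1440p-vs-4K frame rate gap described in the post.

def pixels(width: int, height: int) -> int:
    """Total pixels in one frame at the given resolution."""
    return width * height

qhd = pixels(2560, 1440)  # 1440p: 3,686,400 pixels
uhd = pixels(3840, 2160)  # 4K:    8,294,400 pixels

# 4K pushes 2.25x the pixels of 1440p, so at a fixed GPU budget the
# extra pixels have to come out of settings or frame rate.
print(uhd / qhd)  # 2.25
```

That 2.25x factor is roughly why maxed-out 1440p and Medium-settings 4K can land at such different frame rates on the same card.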
 
While RT does improve visuals, I generally don't use it because the performance hit is too significant. I feel that if you really want great visuals, you don't need RT and top-end GPUs. Just get an OLED monitor, and those lighting effects will be more impactful with almost no performance lost.
 
It's 99% certain the RDNA4 8800 XT will smack the 7900 XTX in RT, if that's your thing. I can see why AMD is desperately trying to evaporate stockpiles of 7900 cards: an 8800 XT at least as strong as the 7900 XT in raster, much faster than the 7900 XTX in RT, under $600, with much higher AI performance and a lot lower power draw. I wonder if Nv!diots will still wax lyrical about a $1200+ 5080 or $2000+ 5090.
 
Now show the numbers when not using DLSS or FSR to increase frame rates artificially, i.e. 4K, full RT, no DLSS or FSR.
I bet it ain't that great when you can't use other software to render at a lower resolution, and FPS will tank even further.
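For context on the "render at a lower resolution" point: quality-mode upscalers drive the GPU at a fraction of the output resolution on each axis. A rough sketch, assuming the commonly published per-axis scale factors for DLSS/FSR quality modes (an assumption for illustration, not figures from this thread):

```python
# Internal render resolution behind each upscaler quality mode.
# The per-axis scale factors are the commonly published ones for
# DLSS/FSR and are an assumption here, not official vendor data.

SCALE = {
    "quality": 2 / 3,     # ~67% of output resolution per axis
    "balanced": 0.58,     # ~58% per axis
    "performance": 0.5,   # 50% per axis
}

def internal_resolution(output_w: int, output_h: int, mode: str):
    """Resolution the GPU actually renders before upscaling."""
    s = SCALE[mode]
    return round(output_w * s), round(output_h * s)

# "4K with Quality upscaling" renders 1440p-class frames, i.e. ~44%
# of the pixels of native 4K, which is where the FPS headroom comes from.
w, h = internal_resolution(3840, 2160, "quality")
print(w, h)  # 2560 1440
```

So a "4K Quality upscaling" result is really a ~1440p render plus reconstruction, which is why native 4K full-RT numbers fall off so much harder.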
 
"RTX 4090, all three path-traced games run at below 50 FPS at 4K using DLSS Quality upscaling. And that's a high-end, $1,600 graphics card. "
Love how he calls it a $1600 graphics card, then provides direct links to the 4090 at $2500 and the cheapest PNY model at $2300.

Ray tracing is not ready for prime time and often looks worse, with more glitches. If it were free, with no loss in FPS, maybe; but if you have to use frame gen (ugh) to get it, forget it. Not worth it at all.

 
This is so wrong it should be a criminal offense.

If you have top-shelf gear you aren't playing at 1440p, and especially not 1080p (ffs, a PS2 could crank 1080i).

You burn all that cash to play at 4K, so for those folks (like myself and probably many other forum lurkers) 4K is all you really care about, because if you aren't playing there you've just burned money.

I don't think I've burned my money. I have a 7900 XTX, 7800X3D and an AW2725DF. I would go so far as to say that no GPU on the market can perform at 4K natively to a satisfactory level; they are at best 1440p cards. That's before you even start thinking about RT. I want at least 120 FPS natively, and the 4090 only gets 114 FPS, according to TPU. Some people might be happy with that, or with using an upscaled image and frame generation to play on their 4K screens. Great, I'm glad they're happy. I want a different experience.
 
The thing about ray tracing is that if I, with my limited knowledge, made a game and added it, the game would look like a crappy Unreal asset collection.
I remember when NV advertised it for the upcoming Metro Exodus.
I thought it looked nice, but so did the game with RT off.
It is not a magic trick; a talented person still needs to build the visuals for ray tracing to shine.
 
My argument is always that I won't pay a premium for fancy shadows. What is surprising is how far AMD has come. Obviously Nvidia is a lot faster, but compared to AMD's first attempt it is a huge leap. Ray tracing was never AMD's selling point though, as they believe in pure performance. The 4090 costs over 40k here, the 4080 is 25k, and the 7900 XTX is 19k. For 40k they'd better bloody well make sure I can ray trace at 8K, as you can get a decent second-hand car with that money. There is no value in getting a 4090 here.
 
Traditional raster pipelines got so good at emulating or 'faking' what ray-traced effects would look like that they made ray tracing far less impactful when it finally became viable (if not very performant) to do in real time.

I do believe that ray tracing can transform how a game looks for the better, but it relies entirely on what the developer is trying to achieve. While the current consoles are this weak, the majority of games will continue to be developed with ray tracing as an afterthought, rather than as integral to the visual appeal.

Until we see next-gen consoles with muscular ray-tracing acceleration and AI upscaling across the board, it'll still only be impactful in a few technical triumphs on PC like Metro, Alan Wake 2 or Cyberpunk.
 
Techspot in 2020

https://www.techspot.com/news/87946-ugly-side-nvidia-rollercoaster-ride-shows-when-big.html

In a tersely-worded email, Nvidia told Hardware Unboxed (and by extension TechSpot) that it would no longer be providing them with GeForce Founders Edition review units. The stated reason? Spending too little time focusing on RTX ray tracing, as opposed to raster performance. Hardware Unboxed, apparently, did "not see things the same way that we (Nvidia), gamers, and the rest of the industry do."

Techspot in 2024

"Based on visual analysis, we firmly believe path tracing is the future of high-end PC visuals because it's the most likely to transform how a game looks and give that next-generation feel"
Literally 4 years of difference in the tech world, dude.
 