Radeon RX 7900 XTX vs. GeForce RTX 4080

RT will become EVERYTHING (a full replacement for standard rasterization) in the future. The far, far future. When video cards are 1000 times faster than they are now. And then, many years later, physics will become everything. We won't have separate graphics cards, audio chips and physics chips. Everything will be simulated by the physics chip.
Yeah, that's what people don't understand. Nobody is saying that ray tracing is bad or stupid; the concept itself is amazing. The problem is that nVidia pushed it onto the market before it was ready (when did the RTX 20-series come out again?) and, with current hardware, it's both stupid and bad to implement it. I mean, unless you're rocking a $1,600 RTX 4090 for gaming (in which case you have a whole other set of problems to address).
:laughing:
But until then, the only real and noticeable use of RT is shadows in dynamically generated worlds. To me it's irrelevant whether shadows move a tiny bit when the Sun moves a tiny bit. But when you generate new objects in real time (in procedurally generated worlds) there are no shadows at all. The game needs to generate them from scratch, in real time. For every freaking object. Because there are no precomputed lightmaps for objects that didn't exist at design time. So real-time shadows and ambient occlusion must be really fast. And that's where hardware-supported ray tracing can really speed things up.

In worlds made entirely by designers, as opposed to procedural ones, there's no huge need for real-time shadows, as shadows can be precomputed and stored in textures. Yes, they are static and won't follow the Sun's movement, so just fix the Sun's position and problem solved. Or create separate lightmaps for morning, day, evening and night.
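To make the difference concrete, here's a minimal, hypothetical C++ sketch of that idea. None of it is real engine code; the lightmap sample and the shadow-ray test are stand-in stubs, but they show why objects spawned at run time need a fast live visibility query while design-time objects can just read baked data.

```cpp
// Toy illustration only: design-time objects read shadowing from a baked
// lightmap, while objects spawned at run time have no baked data and need a
// live visibility query (with hardware RT, roughly one shadow ray per sample).
#include <cstdio>

struct Vec3 { float x, y, z; };

// Stand-in for a precomputed lightmap texel fetch (design-time geometry only).
float sampleBakedLightmap(float u, float v) {
    // Pretend this texel was baked offline at design time.
    return 0.70f + 0.05f * (u + v);
}

// Stand-in for a runtime visibility test toward the sun; this is the part
// that dedicated RT hardware can accelerate.
bool shadowRayHitsOccluder(const Vec3& point, const Vec3& sunDir) {
    // Pretend the sun is only blocked below some height in this toy scene.
    return point.y + sunDir.y < 0.0f;
}

float shadowTerm(bool spawnedAtRuntime, const Vec3& p, const Vec3& sunDir,
                 float u, float v) {
    if (!spawnedAtRuntime)
        return sampleBakedLightmap(u, v);           // cheap at runtime, but static
    return shadowRayHitsOccluder(p, sunDir) ? 0.1f  // in shadow
                                            : 1.0f; // fully lit
}

int main() {
    Vec3 p{0.0f, 0.0f, 0.0f}, sun{0.3f, 0.9f, 0.3f};
    std::printf("designed object: %.2f\n", shadowTerm(false, p, sun, 0.5f, 0.5f));
    std::printf("spawned object:  %.2f\n", shadowTerm(true,  p, sun, 0.0f, 0.0f));
}
```

The point is just the split: the baked path is essentially free every frame but can never react to new geometry, while the runtime path works for anything you spawn, as long as the visibility query is fast enough to run for every object, every frame.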
That's some great insight, Hodor, and I completely agree with you! :D
 
Havok's history is irrelevant. It works well and wasn't held behind a wall like PhysX was. Not all open-source standards are functionally better (OpenCL comes to mind) but they are better in the sense that anyone can use them. How many nVidia owners are thankful to AMD that FSR works on their cards when nVidia, the company that they stupidly chose to support, decided to completely abandon them?

I don't care if PhysX is now open-source because it only became that way when nobody wanted to use it anymore. I don't understand why you try so hard to defend nVidia because you don't strike me as someone who is new to the scene. Do you not remember this?:
Hot Hardware: nVidia's drivers disable PhysX if a Radeon card is detected in your PC
Does THAT come across as "open-source" to you? Are you really going to sit there and tell me that nVidia was being magnanimous by making PhysX open-source after they milked it for everything that they could? Come on man, nobody buys that.

Agreed. Please understand that people like me who don't care about it only don't care about it YET. It's still not really ready for prime time, but perhaps it will be in the next GPU generation. Only when it can be used effectively by everything but the lowest-end cards will I consider it worth looking at, because as it stands now, most cards' rasterisation performance is too sensitive to the hit they take from having it enabled.

I don't consider a technology with so slight an effect worth paying hundreds more to use. Out of curiosity, I tried out The Witcher III with RT turned on and honestly, it still looks exactly like the game I remember; I couldn't tell any difference. When I'm playing a game, I don't stop to admire the reflections and I DEFINITELY don't notice where the shadows are. In fact, back when my RX 5700 XT was sent to XFX for RMA and I was using my R9 Fury again, I had to turn some settings down in Godfall and AC: Odyssey. The first thing I did was lower shadows all the way down to minimum, then I turned off motion blur. The effect it had on my gaming experience was literally none.

I tried playing Cyberpunk 2077 with RT on and had to search for things that were different because the differences were not immediately apparent. Maybe future applications of RT will be game-changers, but I don't see it ever having the same effect that hardware tessellation did. As far as I'm concerned, CP2077 looks functionally the same with RT off, and I'm fully enjoying it without being a disciple of Jensen Huang, which is something too many people have lost sight of.

People need to give their heads a shake. Seriously, who looks at where the sun is supposed to be when gaming and thinks, "Those shadows aren't 100% accurate; this is negatively impacting my gaming experience!"? Show me one person who does and I'll show you someone who belongs in the loony bin. :laughing:
No doubt. Buy a 7900 XTX and put the $200-$300 toward, IDK, faster memory, an NVMe drive, a new display, a CPU upgrade, more DLC, whatever. It's a no-brainer. Go team red.
 
Ray tracing is not a bad thing. Clearly it delivers in terms of visual quality. But for me, this is not something critical, more like a quality-of-life improvement. The key issue is the performance penalty with RT on. While some may feel that hardware will catch up with the RT requirements, that's only partially true, and mainly if you are using, say, an RTX 4090 to play an RT title from a few years back. If you look at new titles that support RT, performance remains poor. The RTX 4090, hailed just six months ago as a GPU that allows high-refresh-rate 4K gaming in the RT titles tested, now looks more like a 4K 60 FPS GPU, with another year and a half to go before something meaningfully more powerful arrives. So I feel one is not really losing much by opting for RDNA 3 GPUs if they are cheaper, and hopefully AMD will rectify most of the driver issues and keep working on optimization.
 
AMD needs to lower prices.

If they want their GPUs to sell, the 7900 XT needs to be priced lower than the 4070 Ti and the 7900 XTX needs to cost less than the 4080.

People are not paying Nvidia prices for AMD hardware (with fewer features). It's as simple as that. Nine out of ten buyers won't even consider AMD unless the performance per dollar is much better.
 
Surprising to see Steve's constant "underwhelming" feeling towards the 7900XTX.

I thought the 7900 XTX being $200 cheaper, with similar or better performance compared to the 4080, made it a no-brainer.

A $1000 GPU is not a no-brainer regardless of whose name is on it. It's indicative of the problem with this generation of GPUs: too expensive.
Yes, if it were much cheaper it would be easier to recommend. But looking at the trends of current flagship releases from all makers, $1K seems to be the baseline. Sad but true. And why should AMD sell at cheap prices anyway, when Nvidia keeps pushing prices up?

Because buyers feel that NVidia is gouging us on price. AMD could take some serious market share, but it seems that they also want to gouge us. I suspect that driving margin is fueling both companies' pricing strategies. Neither of them has done well over the past year.
And who cares about the RTX fad anyway (other than the die-hard proponents)? It reminds me of the ViRGE cards (the GX variant at least), which were known as 3D 'de'celerators back in the day because they dragged performance down. Soon, I believe, the RTX fad will simmer down like Nvidia's 'HairWorks'.

No one cares because it costs $1,600 to get a GPU that can handle it. When a $500 GPU can handle it well, then everyone will care. I doubt it will simmer down. If I'm spending $1000+ on a GPU then it had better handle RT to some extent.
I would wait for another 6 months or so to upgrade from my 5700XT, though. And I wonder if there will be a "7950XTX"...
There might be a 7950; I've heard that AMD could make a higher-performing GPU, but it would come at the expense of drawing more power.
 
Sure, this is very true, but what ended up happening to PhysX? Oh yeah, it was eventually superseded by Havok, the engine that ATi (and then AMD) chose to support. You can still find games with Havok EVERYWHERE, but PhysX? Not so much.
Not entirely true. There are many games still using PhysX; according to PC Gamer, over 700, including games like Tiny Tina's, Returnal, Star Wars Jedi, Marvel's Midnight Suns, Warhammer and more. Many of these games were released in 2022 and 2023. I can't find much on Havok games released after 2020.
 
FineWine is a barrel of copium for early adopters.

The 7900 XTX will never be a 4090 killer, ever. In 5-6 years, it may end up being 5-7% faster than a 4080, when it no longer matters and the RX 9700 is faster than the 7900 ever was.

This is what happened to the 7000 series. Sure, it ended up being 5% faster than the 680, but by the time that happened the 1080 was out and nobody cared.

Not exactly the same thing: that generation of GCN suffered from geometry bottlenecks, and GCN overall suffers from severe occupancy issues. Most gaming workloads barely scale beyond the first 11 CUs, making hardware utilization a difficult task. Plus, RDNA 3, for better (or worse), relies on some compiler optimizations to keep its dual FP32 pipes in flight, similar to what Ampere and Ada Lovelace do (unlike Turing, which was hard-wired to 64 lanes for INT code and 64 for FP32 data).
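If it helps to picture what "compiler optimizations for dual FP32" means in practice, here's a rough, hypothetical C++ sketch (not AMD or NVIDIA code): a dual-issue unit can only pair floating-point operations that don't depend on each other, so the compiler has to find or create that independence.

```cpp
// Toy contrast between a serial dependency chain (nothing to pair up) and
// two independent accumulators (two FP32 ops per iteration that could, in
// principle, be issued together by a dual-issue or superscalar unit).
#include <cstdio>

// Each multiply-add needs the previous result, so the work is strictly serial.
float dependentChain(const float* a, int n) {
    float acc = 0.0f;
    for (int i = 0; i < n; ++i)
        acc = acc * 0.5f + a[i];
    return acc;
}

// The two multiply-adds per iteration don't depend on each other, giving the
// scheduler (or compiler) independent work to pair.
float independentPairs(const float* a, int n) {
    float acc0 = 0.0f, acc1 = 0.0f;
    for (int i = 0; i + 1 < n; i += 2) {
        acc0 = acc0 * 0.5f + a[i];
        acc1 = acc1 * 0.5f + a[i + 1];
    }
    return acc0 + acc1;
}

int main() {
    float data[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    std::printf("%f %f\n", dependentChain(data, 8), independentPairs(data, 8));
}
```

The two functions don't compute the same number; the only point is the dependency structure, which is the kind of thing RDNA 3's dual-issue path needs the compiler to expose, whereas a fixed INT+FP32 split like Turing's doesn't.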
 