Radeon RX 7900 XTX vs. GeForce RTX 4080

Avro Arrow

Posts: 3,608   +4,640
RT will become EVERYTHING (a full replacement for standard rasterization) in the future. The far, far future. When video cards are 1,000 times faster than they are now. And then, many years later, physics will become everything. We won't have separate graphics cards, audio chips and physics chips; everything will be simulated by one physics processor.
Yeah, that's what people don't understand. Nobody is saying that ray-tracing is bad or stupid because the concept is amazing. The problem is that nVidia pushed it onto the market before it was ready (when did the RTX 20-series come out again?) and with current hardware, it's both stupid and bad to implement it. I mean, unless you're rocking a $1,600 RTX 4090 for gaming (in which case you have a whole other set of problems to address).
:laughing:
But until then, the only real and noticeable use of RT is shadows in dynamically generated worlds. To me it's irrelevant whether shadows move a tiny bit when the Sun moves a tiny bit. But when you generate new objects in real time (in procedurally generated worlds), there are no shadows at all. The game needs to generate them from scratch, in real time. For every freaking object. Because there are no precomputed lightmaps for objects that didn't exist at design time. So real-time shadows and ambient occlusion must be really fast, and that's where hardware-supported ray tracing can really speed things up.

As opposed to procedural worlds, in worlds made entirely by designers there's no huge need for real-time shadows, because shadows can be precomputed and stored in textures (lightmaps). Yes, they are static and won't follow the Sun's movement, so just fix the Sun's position and the problem is solved. Or create separate lightmaps for morning, day, evening and night.
That's some great insight, Hodor, and I completely agree with you! :D
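To put that contrast into rough code terms, here's a minimal sketch. Everything in it (the types, function names and stubbed lookups) is hypothetical, not from any real engine; it just illustrates the difference between reading a baked lightmap and casting a shadow ray at runtime:

```cpp
// Minimal, self-contained sketch. Every type and function here is a
// hypothetical placeholder (stubbed), not taken from any real engine.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Baked lighting: occlusion was computed at design time and stored in a
// lightmap, so at runtime it is just a texture lookup (stubbed here).
float bakedShadowTerm(float u, float v) {
    (void)u; (void)v;
    return 0.75f; // pretend this value came from the lightmap texture
}

// Real-time lighting: trace a ray from the surface point toward the sun and
// check whether anything blocks it (stubbed here). This per-sample ray cast
// is the expensive part that hardware ray tracing accelerates.
bool traceShadowRay(const Vec3& point, const Vec3& toSun) {
    (void)point; (void)toSun;
    return false; // pretend nothing occludes the sun
}

float shadowTerm(const Vec3& point, const Vec3& toSun,
                 bool objectExistedAtBakeTime, float u, float v) {
    if (objectExistedAtBakeTime) {
        // Designer-built world: reuse the precomputed result.
        return bakedShadowTerm(u, v);
    }
    // Procedurally spawned object: no lightmap exists for it, so the
    // shadow has to be computed from scratch, every frame.
    return traceShadowRay(point, toSun) ? 0.0f : 1.0f;
}

int main() {
    Vec3 p{0.0f, 0.0f, 0.0f}, sun{0.0f, 1.0f, 0.0f};
    std::printf("baked: %.2f  dynamic: %.2f\n",
                shadowTerm(p, sun, true, 0.5f, 0.5f),
                shadowTerm(p, sun, false, 0.0f, 0.0f));
    return 0;
}
```

The point is that the baked path is a cheap texture read that only works for objects known at design time, while the dynamic path has to do the ray cast every frame, which is exactly what hardware RT units exist to speed up.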
 
Havok's history is irrelevant. It works well and wasn't held behind a wall like PhysX was. Not all open standards are functionally better (OpenCL comes to mind), but they are better in the sense that anyone can use them. How many nVidia owners are thankful to AMD that FSR works on their cards after nVidia, the company that they stupidly chose to support, decided to completely abandon them?

I don't care if PhysX is now open-source because it only became that way when nobody wanted to use it anymore. I don't understand why you try so hard to defend nVidia because you don't strike me as someone who is new to the scene. Do you not remember this?:
Hot Hardware: nVidia's drivers disable PhysX if a Radeon card is detected in your PC
Does THAT come across as "open-source" to you? Are you really going to sit there and tell me that nVidia was being magnanimous by making PhysX open-source after they milked it for everything that they could? Come on man, nobody buys that.

Agreed. Please understand that people like me who don't care about it only don't care about it YET. It's still not really ready for prime time, but perhaps it will be in the next GPU generation. Only when it can be used effectively by everything but the lowest-end cards will I consider it worth looking at, because as it stands now, most cards take too big a hit to their rasterisation performance from having it enabled.

I don't consider a technology with such a slight effect worth paying hundreds more to use. Out of curiosity, I tried out The Witcher III with RT turned on and honestly, it still looks exactly like the game I remember; I couldn't tell any difference. When I'm playing a game, I don't stop to admire the reflections and I DEFINITELY don't notice where the shadows are. In fact, when my RX 5700 XT was sent to XFX for RMA and I was back on my old R9 Fury, I had to turn some settings down in Godfall and AC: Odyssey. The first thing I did was lower shadows all the way down to minimum, and then I turned off motion blur. The effect it had on my gaming experience was literally none.

I tried playing Cyberpunk 2077 with RT on and had to search for things that were different because the differences were not immediately apparent. Maybe future applications of RT will be game-changers, but I don't see it ever having the same effect that hardware tessellation did. As far as I'm concerned, CP2077 looks functionally the same with RT off, and I'm fully enjoying it without being a disciple of Jensen Huang, which is something that too many people have lost sight of.

People need to give their heads a shake. Seriously, who looks at where the sun is supposed to be while gaming and thinks, "Those shadows aren't 100% accurate, this is negatively impacting my gaming experience!"? Show me one person who does and I'll show you someone who belongs in the loony bin. :laughing:
No doubt. Buy a 7900 XTX and put the $200-$300 you save towards, IDK, faster memory, an NVMe drive, a new display, a CPU upgrade, more DLC, whatever. It's a no-brainer. Go team red.
 

Watzupken

Posts: 789   +673
Ray tracing is not a bad thing; it clearly delivers in terms of visual quality. But for me it is not something critical, more of a quality-of-life improvement. The key issue is the performance penalty with RT on. Some may feel that hardware will catch up with the RT requirements, and that is partially true if you are using, say, an RTX 4090 to play an RT title from a few years back. But if you look at new titles that support RT, the performance remains poor. The RTX 4090, hailed as a GPU that allows high-refresh-rate 4K gaming in the RT titles tested just 6 months back, now looks like a 4K 60 FPS GPU, with another 1.5 years to go before you see something meaningfully more powerful arrive. So I feel one is not really losing much by opting for RDNA 3 GPUs if they are cheaper, and hopefully AMD will rectify most of the driver issues and keep working on optimization.
 

KofeViR

Posts: 107   +60
AMD needs to lower prices.

If they want their GPUs to sell, the 7900 XT needs to be priced lower than the 4070 Ti and the 7900 XTX needs to cost less than the 4080.

People are not paying Nvidia prices for AMD hardware (and fewer features). It's as simple as that. Nine out of ten buyers won't even consider AMD unless the performance per dollar is much better.