Radeon RX 7900 XTX vs. GeForce RTX 4080

"But, DX12 does allow you to enable ray tracing and with the Ultra RT preset enabled… the game ran like crap. Sure it looked nice, but we went from over 200 smooth frames per second at 1440p, to under 100 fps.

The RTX 4080 played reasonably well with 66 fps on average and 1% lows of 48 fps, it wasn't amazing, and certainly not a $1,200 experience, but it was playable. The 7900 XTX, on the other hand, was a stuttery mess, completely unplayable and a bad experience. Both sucked at 4K, so you will need to use upscaling here."

"....but whether or not it's worth buying at $1,000 will depend on how much stock you place in ray tracing performance."

Hmmm, "how much stock?"... Let me think this over for a nano second: Ah yes, ABSOLUTELY zero stock!!

Stop pushing the RT nonsense and making a big deal out of it in those increasingly ludicrous tests!!

There are hardly any games that use this RT gimmick (well, MAYBE 5% of all games?? Is that even worth mentioning??), and when they do, they can't even hold 70 fps on $1,200 to $2,000+ cards (your own words, in the article above)!!

Sometimes I think not all the blame lies with the greedy manufacturers; some of it lies with certain websites pushing the game manufacturers' crazy, super expensive gimmicks... to justify the obscenely expensive GPUs!!
 
Please stop the RT BS, pushed down our throats by the influencers because it's what their Nvidia marketing overlords pay them to say.
RT is currently a gimmick that tanks performance for no visual benefit (except in maybe 5 or so games).

There are thousands of good PC games, plus everything you can play via console emulation, that don't have RT and don't really need it.
Speaking as someone who does not care about RT at all, there is still a middle road here: RT is fine for those who play the games that support it AND care about the visual difference; it just shouldn't be the be-all and end-all.

A good analogy to this, for me, is the ol' PhysX tech. There are hardly any games that ever really used it; Mafia II would show extra particles if you shot up a building, and Mirror's Edge had that cloth-sim bit with the helicopter shooting at you through the scaffolding, but that was about it. It stuck around as an nVidia-exclusive tech for several years, but in the grand scheme of things, it didn't really matter at all.

But then, there was Borderlands 2.

The visual difference in that game with PhysX versus without it was astonishing, primarily because that game's appeal largely centers around absolutely ludicrous amounts of particles and explosions and general chaotic destruction. Blowing up a barrel of sludge, then throwing a black hole grenade and watching all of the goo on the ground get sucked up into it was unbelievably satisfying, and the game was absolutely chock full of effects like that... But since PhysX was nVidia's proprietary tech, playing the game on an AMD card simply wouldn't show it, and when you've played it for so long with PhysX on, the game looks positively bland without it.

Now, there are millions of people who have never and will never play Borderlands 2, and many of the ones who DID also didn't care about it. Objectively, it was simply unnecessary; we have also reached a point where such physics are commonly available in software and no longer require proprietary tech to exist, so ALL titles can take advantage of it as they see fit. However, for me personally during the many years I played BL2, PhysX was an absolutely indispensable part of the experience that genuinely helped me decide to stick with nVidia cards.

Fast forward to Ray Tracing, and it's the same thing again, except this time, I personally am not one of the people that cares. If I can enable it and maintain decent performance in a game I'm playing, cool, but it will not sell me on any product. It's fine for manufacturers to advertise it as a feature, and I would honestly say I don't even mind the nVidia card being more expensive because it has better RT performance, even though both these cards are too damn expensive to begin with. What I do want, however, is to see reviewers stop going "X card is bad because its RT performance is bad." nVidia is objectively better at RT, but being better at RT doesn't objectively make nVidia BETTER.

But for the buying public, live and let live. I firmly believe that most people out there don't actually care about RT at all, but for the few people that actually do enjoy it, I will happily let them have their fun. I don't have to follow their lead, any more than all console gamers need to immediately go out and buy a PC. We can all enjoy whatever we want, let's just stop buying overpriced products in general.
 
Entirely academic to me at this price, but the 71-point swing between the top and bottom of the charts feels a lot bigger than it should be for two cards that both claim to implement the same DX12 / DX11 APIs.

Not sure if TechSpot does this kind of reporting, but it would be interesting to hear quotes from the game developers at both outer edges of the chart for their perspective on why their games perform so differently on the two platforms.

 
@AdamNovagen

I agree wholeheartedly with you.

That's why I have stated in other posts that RT is currently a gimmick: the current hardware and software are simply 3 to 5 generations away from where they need to be (at least in my opinion).
When would that be satisfied? When we can buy a GPU that produces at least 60 FPS @ 4K with all RT options on, and that GPU costs less than US$500.
Right now, the performance hit is beyond insane for very little return.

We will one day get there, but right now the RT nonsense is abused to make the weak-minded desire it for no real reason beyond the marketing push, and coincidentally, to drive them into Dear Leader Jensen's arms.
There are literally thousands of fantastic PC games that many here have not played yet and that don't have RT, yet all you see is the same nonsense: "You need RT in your life yesterday!"

Talking about PhysX, to this day I am still pissed at how it was implemented in all the Arkham games (which I adore and always go back to replay), since it forced me to stay with Nvidia if I wanted those effects.
That's the main reason why I hate Nvidia: the main function of their proprietary tech is to keep you locked in, and I refuse to support that or be limited like that. Plus, I really like open standards.

Like the eye candy provided by PhysX, RT will eventually get there and hopefully will add something to the gameplay, which is something absolutely nobody mentions and which somehow seems to no longer matter in gaming.
 
I'm sticking with my trusty 1080 Ti. I don't mind being a few generations behind. It still does what I need, and I'm not paying $800+ USD for a new GPU. If I did upgrade, it would be to a 3080-class GPU, should I find one that is priced right.



 
Surprising to see Steve's constant "underwhelming" feeling towards the 7900XTX.

I thought the 7900 XTX being $200 cheaper with similar or better performance compared to the 4080 made it a no-brainer.

Yes, if it were much cheaper it would be easier to recommend. But looking at the trend of current flagship releases from all makers, $1K seems to be the baseline. Sad but true. And why should AMD sell at cheap prices anyway, when Nvidia keeps pushing prices up?

And who cares about the RTX fad anyway (other than the die-hard proponents)? It reminds me of the S3 ViRGE cards (the GX variant at least), which were known as 3D 'de'celerators back in the day because they dragged performance down. Soon, I believe, the RTX fad will simmer down like Nvidia's 'HairWorks'.

I would wait for another 6 months or so to upgrade from my 5700XT, though. And I wonder if there will be a "7950XTX"...
 
The problem with RT is the massive performance hit. Nvidia may have cushioned that impact better than AMD, but it is still a massive performance hit, depending on the complexity of the RT implementation. And truth be told, many people will claim the goodness of RT images, but the reality is that they wouldn't even know if RT were switched off, because we don't run the same game side by side with and without RT enabled. If there were a bug in the game where switching RT on didn't actually do anything, they would still assume RT was on, because the image quality really isn't lacking.

To me, shadow quality, realistic light sources and so on have little bearing on how fun a game really is. They are nice to have, but I don't look for them when playing a game with a good story and good gameplay. In fact, despite using an Ampere RTX 3080 12GB, I rarely play any RT titles. The only ones I played were Control and Guardians of the Galaxy.
 
Because AMD was targeting 1440p gaming, not high-end 4K, with the release of the 5000 series.

Ah, sure, obviously... 9 means 4K, 7 means 1440p - that makes total sense!
So the 7970/7950 were 4K cards and the 7750 was for 1440p, right?

The people following this "logic" are probably the same people who are surprised by the mere 10% market share AMD has in dedicated GPUs...
 
Ah, sure, obviously... 9 means 4K, 7 means 1440p - that makes total sense!
So the 7970/7950 were 4K cards and the 7750 was for 1440p, right?

The people following this "logic" are probably the same people who are surprised by the mere 10% market share AMD has in dedicated GPUs...
That's quite a leap and a ridiculous exaggeration for effect.

The 5700 XT was first-gen Navi, and AMD knew its performance was on par with their flagship at the time, the Radeon VII, and to a certain extent the Vega 64. They were targeting the mid/upper-mid range, which was 1440p, and it was priced accordingly at $400.
 
Ah, sure, obviously... 9 means 4K, 7 means 1440p - that makes total sense!
In AMD's grand scheme of RDNA nomenclature, yes -- that's what they were aiming for. x900/x800 is targeted at high-end graphics at 4K; x700/x600 at 1440p; x500/x400 at 1080p.
So the 7970/7950 were 4K cards and the 7750 was for 1440p, right?
Different architecture, different times.
 
In AMD's grand scheme of RDNA nomenclature, yes -- that's what they were aiming for. x900/x800 is targeted at high-end graphics at 4K; x700/x600 at 1440p; x500/x400 at 1080p.

Different architecture, different times.
Adding to that (I could be remembering wrong), back then GPUs were released from the bottom up or the middle up, not like now: halo product first, then the rest trickling down.
 
Adding to that (I could be remembering wrong), back then GPUs were released from the bottom up or the middle up, not like now: halo product first, then the rest trickling down.
The only reason AMD was doing that was simply because they didn't HAVE a high-end GPU ready, or ever. Vega was released over a year after Polaris. RDNA had no high-end competitor.

Go back to when AMD was more competitive, and the likes of Evergreen and early GCN launched with the high-end GPUs first; the 7970/290X came out before the rest of their lineups.

Nvidia has always launched their xx8x series cards first, with the exception of the likes of the 4090.
 
A good analogy to this, for me, is the ol' PhysX tech. There are hardly any games that ever really used it; Mafia II would show extra particles if you shot up a building, and Mirror's Edge had that cloth-sim bit with the helicopter shooting at you through the scaffolding, but that was about it. It stuck around as an nVidia-exclusive tech for several years, but in the grand scheme of things, it didn't really matter at all.

But then, there was Borderlands 2.

The visual difference in that game with PhysX versus without it was astonishing, primarily because that game's appeal largely centers around absolutely ludicrous amounts of particles and explosions and general chaotic destruction. Blowing up a barrel of sludge, then throwing a black hole grenade and watching all of the goo on the ground get sucked up into it was unbelievably satisfying, and the game was absolutely chock full of effects like that... But since PhysX was nVidia's proprietary tech, playing the game on an AMD card simply wouldn't show it, and when you've played it for so long with PhysX on, the game looks positively bland without it.

Now, there are millions of people who have never and will never play Borderlands 2, and many of the ones who DID also didn't care about it. Objectively, it was simply unnecessary; we have also reached a point where such physics are commonly available in software and no longer require proprietary tech to exist, so ALL titles can take advantage of it as they see fit. However, for me personally during the many years I played BL2, PhysX was an absolutely indispensable part of the experience that genuinely helped me decide to stick with nVidia cards.
Sure, this is very true, but what ended up happening to PhysX? Oh yeah, it was eventually superseded by Havok, the engine that ATi (and then AMD) chose to support. You can still find games with Havok EVERYWHERE but PhysX? Not so much.
 
Surprising to see Steve's constant "underwhelming" feeling towards the 7900XTX.
Yeah, it's like he moves the goalposts. He's "underwhelmed" by the performance of the RX 7900 XTX even though its performance MATCHES a card that costs $200 more, a card that he calls "a great performer". His wording shows a completely unfair green bias, as nVidia seems to be able to do no wrong while AMD can do no right.

Just look at what Steve Walton said about the RTX 4070 Ti:
Now look at what everyone else said about the RTX 4070 Ti.

We'll start with Steve Burke:
Then BPS Customs (whom Paul's Hardware used as his template):

Now, to get REALLY crazy, let's see what the guys who are typically green fanboys over at LTT have to say:
And then the ULTIMATE nVidia fanboy, JayzTwoCents:
When JayzTwoCents, of all people, pans a GeForce card, there's no way that it should be given a score of 80 and positively reviewed. If Jay says that the card is terrible, there is no entity on the planet who should argue with him because he NEVER says things like that.

This is why it no longer says "Elite" under my avatar.
 
Sure, this is very true, but what ended up happening to PhysX? Oh yeah, it was eventually superseded by Havok, the engine that ATi (and then AMD) chose to support. You can still find games with Havok EVERYWHERE but PhysX? Not so much.
You might've missed the rest of my post, because that was exactly my point: PhysX was a good enough benefit to me personally to make me stick with nVidia AT THAT TIME, and it's now basically irrelevant. I expect RT as it is now to go exactly the same way, and it happens to NOT be a benefit I personally care about, nor do I think the majority of gamers care. I just think there's no need to make fun of the few folks that genuinely enjoy it.
 
Sure, this is very true, but what ended up happening to PhysX? Oh yeah, it was eventually superseded by Havok, the engine that ATi (and then AMD) chose to support. You can still find games with Havok EVERYWHERE but PhysX? Not so much.
Havok was bought by Intel back in 2007, and then sold to Microsoft in 2015. It's had far more money thrown at it, for promoted use in games, than Nvidia ever could manage with PhysX. Besides, Nvidia bought Ageia to remove a hardware competitor from the market; the software itself is now open source.
I expect RT as it is now to go exactly the same way, and it happens to NOT be a benefit I personally care about, nor do I think the majority of gamers care.
RT isn't going to disappear - AMD, Intel, and Nvidia have made their intentions clear as to the direction GPUs are taking. Microsoft's DXR is four years old now, but has only seen one revision so far; the next one will probably coincide with the next generation of GPUs.
 
You might've missed the rest of my post, because that was exactly my point: PhysX was a good enough benefit to me personally to make me stick with nVidia AT THAT TIME, and it's now basically irrelevant. I expect RT as it is now to go exactly the same way, and it happens to NOT be a benefit I personally care about, nor do I think the majority of gamers care. I just think there's no need to make fun of the few folks that genuinely enjoy it.
I was agreeing with you and continuing on with it. I did give your post a like before I responded to it. You were 100% correct and I understood you just fine. ;)
 
Havok was bought by Intel back in 2007, and then sold to Microsoft in 2015. It's had far more money thrown at it, for promoted use in games, than Nvidia ever could manage with PhysX. Besides, Nvidia bought Ageia to remove a hardware competitor from the market; the software itself is now open source.
Havok's history is irrelevant. It works well and wasn't held behind a wall like PhysX was. Not all open-source standards are functionally better (OpenCL comes to mind) but they are better in the sense that anyone can use them. How many nVidia owners are thankful to AMD that FSR works on their cards when nVidia, the company that they stupidly chose to support, decided to completely abandon them?

I don't care if PhysX is now open-source, because it only became that way when nobody wanted to use it anymore. I don't understand why you try so hard to defend nVidia, because you don't strike me as someone who is new to the scene. Do you not remember this?
Hot Hardware: nVidia's drivers disable PhysX if a Radeon card is detected in your PC
Does THAT come across as "open-source" to you? Are you really going to sit there and tell me that nVidia was being magnanimous by making PhysX open-source after they milked it for everything that they could? Come on man, nobody buys that.
RT isn't going to disappear - AMD, Intel, and Nvidia have made their intentions clear as to the direction GPUs are taking. Microsoft's DXR is four years old now, but has only seen one revision so far; the next one will probably coincide with the next generation of GPUs.
Agreed. Please understand that people like me who don't care about it only don't care about it YET. It's still not really ready for prime time, but perhaps it will be in the next GPU generation. When it can be used effectively by everything but the lowest-end cards, only then will I consider it worth looking at, because as it stands now, most cards take too big a hit to their rasterisation performance from having it enabled.

I don't consider a technology with so slight an effect worth paying hundreds more to use. Out of curiosity, I tried The Witcher 3 with RT turned on and, honestly, it still looks exactly like the game I remember. I couldn't tell any difference from having RT turned on. When I'm playing a game, I don't stop to admire the reflections and I DEFINITELY don't notice where the shadows are. In fact, back when my RX 5700 XT was sent to XFX for RMA and I was back on my R9 Fury, I had to turn some settings down in Godfall and AC: Odyssey; the first thing I lowered was shadows, all the way down to minimum, and then I turned off motion blur. The effect it had on my gaming experience was literally none.

I tried playing Cyberpunk 2077 with RT on and had to search for things that were different, because the differences were not immediately apparent. Maybe future applications of RT will be game-changers, but I don't see it ever having the same effect that hardware tessellation did. As far as I'm concerned, CP2077 looks functionally the same with RT off. I'm fully enjoying it without being a disciple of Jensen Huang, which is something too many people have lost sight of.

People need to give their heads a shake. Seriously, who looks at where the sun is supposed to be when gaming and thinks "Those shadows aren't 100% accurate, this is negatively impacting my gaming experience!". Show me one person who does and I'll show you someone who belongs in the loony bin. :laughing:
 
People need to give their heads a shake. Seriously, who looks at where the sun is supposed to be when gaming and thinks "Those shadows aren't 100% accurate, this is negatively impacting my gaming experience!".

RT will become EVERYTHING (a full replacement for standard rasterization) in the future. The far, far future. When video cards are 1,000 times faster than they are now. And then, many years after that, physics will become everything. We won't have separate graphics cards, audio chips and physics chips; everything will be simulated by one physics processor.

But until then, the only real and noticeable use of RT is shadows in dynamically generated worlds. To me it's irrelevant whether shadows move a tiny bit when the Sun moves a tiny bit. But when you generate new objects in real time (in procedurally generated worlds) there are no shadows at all. The game needs to generate them from scratch, in real time, for every freaking object, because there are no precomputed lightmaps for objects that didn't exist at design time. So real-time shadows and ambient occlusion must be really fast, and that's where hardware-supported ray tracing can really speed things up.

In worlds made entirely by designers, as opposed to procedural ones, there's no huge need for real-time shadows, since shadows can be precomputed and stored in textures. Yes, they are static and won't follow the Sun's movement, so just fix the Sun's position and the problem is solved. Or create separate lightmaps for morning, day, evening and night.
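To make that concrete, here's a rough toy sketch in plain Python (not any engine's real API; the single sphere occluder, the made-up `shadow_ray_query` helper and the tiny 2x2 lightmap are just illustrative stand-ins) of the difference I mean: hand-placed geometry can simply read its baked lighting back out of a texture, while an object spawned at runtime has to answer "am I lit?" with a fresh ray cast toward the Sun.

```python
from dataclasses import dataclass
import math

@dataclass
class Sphere:
    cx: float
    cy: float
    cz: float
    r: float

def ray_hits_sphere(origin, direction, sphere):
    """Return True if a ray from `origin` along (normalized) `direction` hits the sphere."""
    ox = origin[0] - sphere.cx
    oy = origin[1] - sphere.cy
    oz = origin[2] - sphere.cz
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - sphere.r * sphere.r
    disc = b * b - 4.0 * c              # a == 1 because direction is normalized
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 1e-4                     # intersection must lie in front of the origin

def shadow_ray_query(point, sun_dir, occluders):
    """Runtime answer for a procedurally spawned object: trace a ray toward the Sun."""
    return any(ray_hits_sphere(point, sun_dir, s) for s in occluders)

def baked_lookup(lightmap, u, v):
    """Design-time answer for static geometry: just read the precomputed texel."""
    return lightmap[v][u]

if __name__ == "__main__":
    sun_dir = (0.0, 1.0, 0.0)                    # Sun straight overhead, normalized
    occluders = [Sphere(0.0, 5.0, 0.0, 1.0)]     # something hanging above the spawn point
    # Object that didn't exist at design time: needs the ray query (every frame, in a real game).
    print("spawned crate in shadow:", shadow_ray_query((0.0, 0.0, 0.0), sun_dir, occluders))
    # Hand-placed wall: its shading was baked offline into a (tiny, made-up) lightmap.
    lightmap = [[1.0, 0.2],
                [0.9, 0.1]]
    print("baked texel brightness:", baked_lookup(lightmap, u=1, v=0))
```

The point of the toy is just the shape of the two code paths: the baked path is a constant-time texture read no matter how complex the scene is, while the ray-query path has to touch scene geometry every time, and that per-object, per-frame work is exactly what the RT hardware exists to accelerate.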
 
I'm sticking with my trusty 1080 Ti. I don't mind being a few generations behind. It still does what I need, and I'm not paying $800+ USD for a new GPU. If I did upgrade, it would be to a 3080-class GPU, should I find one that is priced right.

Have a look on eBay, every day, several times a day. I did this and found an Asus ROG Strix 3080 OC Gaming for 620 euros, almost new, as the guy who had it (well, them: 2 cards bought by his employer to do spreadsheets...) had barely used it! No traces of use or dust on the fans, as they had probably done little more than display the Windows desktop... Be patient and scout the second-hand market very carefully; it does pay off in the end.
 