AMD Radeon RX 6900 XT Review: Can AMD Take the Performance Crown?

But he is NOT correct in this sort of thinking. It's simplistic and wrong. CPUs, GPUs, and tech in general always need the 'stupid high end' that will never make much sense to 99% of people. Guess what? Those products *are not aimed at you*. Please try to understand this.
They exist for very good reasons. You and Steve can stamp your feet all day about it, or actually try to understand the R&D behind why these sorts of products exist in many areas of society. If you don't want to take the time to understand why, that's fine, but you both remain wrong.
The problem with that kind of thinking is that the same price hikes and problems trickle down to the rest of the stack. The PC market does need ultra-high-end products, but it doesn't need ultra-stupid prices. You are not buying a professional card with the extra bells and whistles that come with one; the 3090 is a consumer-grade product with artificial limits put in place to differentiate it from the Titan- and Quadro-class cards.
 
The problem with that kind of thinking is that the same price hikes and problems trickle down to the rest of the stack.
No they don't -- and if forty years of evidence to the contrary in the computer industry doesn't make you see that, nothing will.

NVidia has 13,000+ employees, with a combined annual salary of several billion dollars. None of those employees, I point out, actually manufacture these graphics chips; they simply design them, market them, and write drivers for them. Those salaries have to be amortized into the cost of these boards. You think the measly margins on low-end cards will cover those billions of dollars spent every year? Bleeding-edge products provide an outsized share of the financing for the R&D behind new technology.

The PC market does need ultra-high-end products, but it doesn't need ultra-stupid prices.
So you want ridiculously expensive products, you simply don't want to pay for them? Hey, I have an idea. Have the government place a maximum price of $500 on all graphics cards. Or why not $100? Then everyone can afford the latest and greatest technology, right?

the 3090 is a consumer-grade product with artificial limits put in place to differentiate it from the Titan- and Quadro-class cards.
If it didn't have those limits, why would anyone buy a Titan or Quadro? And, if they didn't buy those cards, NVidia would be forced to charge gamers an even higher percentage of their R&D costs, and you'd pay even more. So don't criticize the existence of these cards. Be thankful for them.
 
No they don't -- and if forty years of evidence to the contrary in the computer industry doesn't make you see that, nothing will.

NVidia has 13,000+ employees, with a combined annual salary of several billion dollars. None of those employees, I point out, actually manufacture these graphics chips; they simply design them, market them, and write drivers for them. Those salaries have to be amortized into the cost of these boards. You think the measly margins on low-end cards will cover those billions of dollars spent every year? Bleeding-edge products provide an outsized share of the financing for the R&D behind new technology.

So you want ridiculously expensive products, you simply don't want to pay for them? Hey, I have an idea. Have the government place a maximum price of $500 on all graphics cards. Or why not $100? Then everyone can afford the latest and greatest technology, right?

If it didn't have those limits, why would anyone buy a Titan or Quadro? And, if they didn't buy those cards, NVidia would be forced to charge gamers an even higher percentage of their R&D costs, and you'd pay even more. So don't criticize the existence of these cards. Be thankful for them.
Wow, "40 years of evidence" - can you back that claim up with numbers? If not, then I'll just say that in 40 years we've seen the top-end cards go up and UP and UP in price, while the midrange went from $200 to $400. Remember when $400-500 was the most you would pay for the best single-GPU card?

I don't want "ridiculously expensive products" while paying little for them; I want prices to reflect their value. Paying 200% of the price for 10-15% more performance is not decent. That's an indecent mark-up even with the extra VRAM, and the same goes for AMD's $1,000 card too.

At that price point I expect professional cards, and your argument of "why would anyone buy Titans or Quadros" doesn't make sense. Quadro cards start from $1,000 and go up to $5,000. The Titan was expensive because it also included some of the features that were meant only for Quadro cards, so that people could get the best of both worlds. The 3090 doesn't have them.

The 3090 and 6900 XT are just glorified trophies, and you defending their "value" is just... weird. There is no value to be had there. People should not get used to such prices, because once this becomes normal, people like you appear who defend such practices to the detriment of consumers, and the whole stack of products goes up in price.
 
Great unbiased review! Congrats! I wished I could buy a 3070 to upgrade my 1070 but availability and current prices are just outrageous.
Alternate.de has availability for the 3070... starting at €200 over the FE's MSRP.
Hopefully prices will come down early next year, as I refuse to buy either an AMD- or nVidia-based card for more than 5-10% over MSRP. And in that case, the AIB model has to be better than the FE/reference card, not just clocked higher and louder.
But he is NOT correct in this sort of thinking. It's simplistic and wrong. CPUs, GPUs, and tech in general always need the 'stupid high end' that will never make much sense to 99% of people. Guess what? Those products *are not aimed at you*. Please try to understand this.
They exist for very good reasons. You and Steve can stamp your feet all day about it, or actually try to understand the R&D behind why these sorts of products exist in many areas of society. If you don't want to take the time to understand why, that's fine, but you both remain wrong.
Imho, the "stupid high end" should be just that - it should offer the "wow" factor even if it's expensive.

Offering just a little bit more of the same isn't it.
 
I'm pretty certain that the RT version isn't going to make your kid any better at visual or logical design building. The skill set that Minecraft cultivates doesn't depend on RT capability. The things I've seen built in Minecraft can be mind-blowing, and that was long before the RT version existed. Your kid will be just as well off with the Java version. :D
Absolutely agree. My point was that my kid prefers the Java version because it offers more in terms of playability for him (mods, online play)...
The newer version may look better, but he doesn't care - it's function over form. And this actually makes me happy.
 
True. Many of the young 'uns, especially millennials, have started to adopt the mindset that the most expensive is the best, or at least they buy to brag, which is absolutely childish and wasteful. Wisdom and greedy foolishness never go together.

Being sold on a "luxury" moniker actually makes buyers look like dunces.
In some cases expensive is actually the best. This is often the case with bespoke products or the ones made by perfectionists in small series where they put an amazing amount of thought and effort into building the best product for the use case.

At the same time, these are not things you can use to brag as most people either would not know them or identify them as luxury items, which makes them all the nicer.
 
Wow, "40 years of evidence" - can you back that claim up with numbers? If not, then I'll just say that in 40 years we've seen the top-end cards go up and UP and UP in price
Well, let's just look back ten years or so, shall we? NVidia released the GeForce GTX 295 at $500, or about $650 in today's dollars. The 3090 is ten times faster at a bit over twice that price. That's a performance-per-dollar increase of more than 300%.

Or we can go back 20 years, to the days of the GeForce 256: more than 100x slower than the 3090. Compare the two cards in performance per (inflation-adjusted) dollar, and the 3090 comes out some 70 times cheaper!

Now: here's the real question. Where do you think NVidia acquired the revenues to finance that massive increase in performance-per-dollar? You think they got it by selling budget cards at razor-thin margins?
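The performance-per-dollar arithmetic above can be sketched out explicitly. This is illustrative only: the 10x performance multiple and the inflation-adjusted $650 figure are the claims made in this thread, not verified benchmarks, and the 3090's $1,499 launch MSRP is assumed.

```python
# Performance-per-dollar comparison using the figures claimed above
# (illustrative sketch; the performance multiple is not a verified benchmark).

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance units per inflation-adjusted dollar."""
    return relative_perf / price_usd

# GTX 295: $500 at launch, roughly $650 in today's dollars (baseline perf = 1).
gtx295 = perf_per_dollar(1.0, 650.0)

# RTX 3090: claimed ~10x the GTX 295's performance, at an assumed $1,499 MSRP.
rtx3090 = perf_per_dollar(10.0, 1499.0)

print(f"perf/$ improvement: {rtx3090 / gtx295:.1f}x")  # roughly 4.3x
```

With those inputs the improvement works out to roughly 4x; change either the performance multiple or the prices and the ratio moves accordingly.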

I want prices to reflect their value. Paying 200% for 10-15% is not decent. That's an indecent mark-up even with the extra VRAM
You're cherry-picking comparisons. A much better comparison is the 3090 against the prior generation's 2080 Ti: for an extra 25%, you get 45% better performance and more than twice as much VRAM. Quite a bargain, eh?

Or, instead of comparing the 3090 to the 3080, let's flip that around and compare the 3080 to its big brother. You get 90% of the performance for half the price ... even if you do give up most of your VRAM. WTG NVidia!
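The generational comparison above can be checked the same way. The 2080 Ti's $1,199 and the 3090's $1,499 launch MSRPs are assumptions here, and the 45% performance figure is the claim made in this thread, not a measured result.

```python
# Generation-over-generation value check using the figures claimed above.
# Assumed launch MSRPs: RTX 2080 Ti at $1,199, RTX 3090 at $1,499.
price_2080ti, price_3090 = 1199.0, 1499.0
perf_gain = 1.45                            # "45% better performance", as claimed
price_ratio = price_3090 / price_2080ti     # ~1.25, the "extra 25%"

value_change = perf_gain / price_ratio
print(f"perf/$ vs 2080 Ti: {value_change:.2f}x")  # ~1.16x, i.e. about +16%
```

So under those assumptions the 3090 is a modest perf-per-dollar step over the 2080 Ti, not a dramatic one.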
 
I can't imagine paying top money for a GPU to play.... Minecraft.

But you do you.
I'm just saying, paying $999 and not getting ray tracing/DLSS etc. is a bit rubbish. I mean, it's not as if anyone is buying these parts for their value for money.
 
Well, let's just look back ten years or so, shall we? NVidia released the GeForce GTX 295 at $500, or about $650 in today's dollars. The 3090 is ten times faster at a bit over twice that price. That's a performance-per-dollar increase of more than 300%.

Or we can go back 20 years, to the days of the GeForce 256: more than 100x slower than the 3090. Compare the two cards in performance per (inflation-adjusted) dollar, and the 3090 comes out some 70 times cheaper!
Heck, the Atari 800 was introduced at $999.95. That's $3,430.50 in today's money. Look at the performance and features you get from a $35 Raspberry Pi versus that. I think perf/$ is probably 100,000 times that of the Atari.

And don't even get me started on the first pocket calculator...
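The Atari comparison above can be sketched the same way. The inflation-adjusted price is the figure given in the comment; the 1,000x raw performance multiple for the Pi is a loose illustrative assumption, not a benchmark.

```python
# Price and perf/$ comparison from the figures given above.
atari_800_today = 3430.50   # $999.95 in 1979, inflation-adjusted (as claimed)
pi_price = 35.0             # Raspberry Pi

price_ratio = atari_800_today / pi_price
print(f"price ratio: {price_ratio:.0f}x")   # about 98x cheaper

# Assume the Pi delivers on the order of 1,000x the Atari's raw performance
# (a rough illustrative figure, not a measurement).
perf_multiple = 1000.0
print(f"perf/$ ratio: {price_ratio * perf_multiple:,.0f}x")  # ~98,000x
```

That lands in the same ballpark as the "100,000 times" figure above, even with a conservative performance multiple.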
 
Well, let's just look back ten years or so, shall we? NVidia released the GeForce GTX 295 at $500, or about $650 in today's dollars. The 3090 is ten times faster at a bit over twice that price. That's a performance-per-dollar increase of more than 300%.

Or we can go back 20 years, to the days of the GeForce 256: more than 100x slower than the 3090. Compare the two cards in performance per (inflation-adjusted) dollar, and the 3090 comes out some 70 times cheaper!

Now: here's the real question. Where do you think NVidia acquired the revenues to finance that massive increase in performance-per-dollar? You think they got it by selling budget cards at razor-thin margins?

You're cherry-picking comparisons. A much better comparison is the 3090 against the prior generation's 2080 Ti: for an extra 25%, you get 45% better performance and more than twice as much VRAM. Quite a bargain, eh?

Or, instead of comparing the 3090 to the 3080, let's flip that around and compare the 3080 to its big brother. You get 90% of the performance for half the price ... even if you do give up most of your VRAM. WTG NVidia!
Just because the card is faster now doesn't mean we should think of it in terms of the performance and price value of the 256. That's just flawed reasoning. The top end was $500 ($650 after inflation, as you said), period.

FYI, you don't know the margins NVIDIA makes, so commenting on them is a moot point. The RTX 2080 Ti is rumoured to have earned NVIDIA a 40% profit margin (at the mythical $999 MSRP, not the real street prices). Don't say "WTG" when NVIDIA was ripping you off with exorbitant prices and huge profit margins; you're just acknowledging that you got ripped off and liked it. There's a reason the RTX 2000 series heavily undersold: smart people waited for something that made sense.

I didn't cherry-pick any comparison. I just stated the facts: today's GPU prices are too high at the high end, which in turn drives prices at the lower end up too. It's a lose-lose situation at the moment, created by both Nvidia and AMD.

I actually thank you for comparing the 3090 to the 2080 Ti. It just shows how badly valued both are.
 
Not sure how much you love those 4 games but that number is too low for me to consider dropping the coin on one of the RTX cards.

But suppose you were in the market for a new card, would you buy the one that has substandard RT performance and compatibility, for no other particular benefits to speak of?
 
How many total games do you own? (I'm at 1,398 according to GOG, most of which do not have ray tracing - still rocking a 1080 Ti)...

Not sure. About 400 or so. I'd get the card that plays those 1,398 AND supports RT but that's just me.
 
But suppose you were in the market for a new card, would you buy the one that has substandard RT performance and compatibility, for no other particular benefits to speak of?

Considering that the games I play have zero RT support, that's an easy answer :)
 
Absolutely agree. My point was that my kid prefers the Java version because it offers more in terms of playability for him (mods, online play)...
The newer version may look better, but he doesn't care - it's function over form. And this actually makes me happy.
He's a smart kid. I think I can see where he gets it from. :D
 
Thanks for the thoughtful feedback. I'm just wondering: with HBM, could it be that it was previously the memory that was starving for data from the chip? Just wondering #wink
Absolutely. It was complete overkill and still would be today. That's why I think that "special" VRAM is just a marketing ploy to make people think one is better than the other.

I look at VRAM pretty much the way I look at regular RAM. I'd rather have more of the slow stuff than less of the fast stuff because nothing is slower than using a page file. :laughing:
 
Hi, just to say that I bought a 3090 and I'm not trying to compensate for anything. Nor am I pushing it in anyone's face that I have a 3090.

I was going to buy a 3080, but I got fed up waiting to replace my nearly five-year-old 1080, so I bought the 3090 instead. Is it overpriced? You betcha! Would I have bought it under normal circumstances? Not a chance...

However, these are not normal circumstances, and graphics cards (and tech in general) at all levels are very hard to come by - almost everything is sold out, at least in the UK. Life is short (global pandemics have a way of sharpening focus), and I did not want to wait until, say, April next year to turn up the settings in the games I play (I'm at 5120x1440). I wanted an upgrade now and I could afford it, so I bought the 3090. I know quite a few people in the same boat who bought for similar reasons.

BTW I started playing on the ZX Spectrum so that gives my age away - I'm no Millennial with an e-peen.
As you say, you don't talk about it all the time because you know better. I was of course not taking the availability factor into account; I was just talking in general. Your reasons for your purchase are, of course, completely valid.

The ZX Spectrum... wow. I'll date myself for you in return. My favourite Atari game was "Combat" and my first "real computer" (well, it wasn't technically "mine") was a TRS-80 Color Computer with a whopping 32K of RAM. I must give my parents credit though because they gave me a CoCo II (64K) for my birthday and back then, that was expensive as hell and we weren't exactly rich. My stepdad was a manager at a Radio Shack so he probably got a great deal on it.

We must be pretty close in age because we both started in the early 80s. That means you understand the saying "I RLL'ed the MFM out of it!". :laughing:
 
Why miss out, though? It's not like there are any compelling reasons to explicitly pick Navi over Nvidia.
This is actually a very good point. The pricing of the ATi card isn't compelling, the performance isn't compelling, and the feature set isn't compelling. I'd say that despite the "Radeon Resurgence", AMD managed to snatch defeat from the jaws of victory. AMD knows CPUs, but it's ATi that knows GPUs, and the corporate structure isn't vertically integrated. ATi is still internally called ATi.

I really think that ATi was better off without AMD acting like an albatross around its neck. Left to their own devices, I have no doubt that ATi could have defeated nVidia with what they have, because they would have understood that they're not the market leader, they're the alternative, and would have priced their cards accordingly. They would also have recognised that with their limited ray tracing capability and (more importantly) lack of an answer to DLSS 2.0, their cards were worth a good deal less than their nVidia counterparts, even if they were on par in rasterisation performance.

AMD doesn't recognise this and, even worse, they didn't recognise that for most people in the GPU space, AMD is more of an off-brand than Intel is. They should have left the ATi branding intact. Right now, nVidia offers similar rasterisation performance, similar pricing (with the obvious exception of the RTX 3090, but that's irrelevant), superior ray tracing performance (I don't care about this, but enough people do), and better features like DLSS 2.x.

Right now, there is no compelling reason to buy an RX 6000-series card unless it's the only thing that you can get your hands on. I say this as someone who has had an all-AMD rig since my Phenom II X4 940/XFX Radeon HD 4870 rig over ten years ago.
 
But suppose you were in the market for a new card, would you buy the one that has substandard RT performance and compatibility, for no other particular benefits to speak of?
Good question. RT right now is a bonus, but if you mostly play multiplayer games, speed rules over reflective puddles.
If we reach the point where it makes essentially no difference to rendering performance and is also used to improve gameplay, then it becomes a deciding factor.

Other than that, the reason to go for Radeon right now would be that I appreciate choice, which makes supporting the smaller player worth it. Also, I do not feel I have sacrificed anything by buying AMD-based cards that were, admittedly, at the lower end of the market.

But there's a caveat: I am not willing to pay market-leader prices for a product that isn't market-leading. CPU-wise AMD is there; GPU-wise they still have a good way to go, although they are making progress.
 
This is actually a very good point. The pricing of the ATi card isn't compelling, the performance isn't compelling, and the feature set isn't compelling. I'd say that despite the "Radeon Resurgence", AMD managed to snatch defeat from the jaws of victory. AMD knows CPUs, but it's ATi that knows GPUs, and the corporate structure isn't vertically integrated. ATi is still internally called ATi.

I really think that ATi was better off without AMD acting like an albatross around its neck. Left to their own devices, I have no doubt that ATi could have defeated nVidia with what they have, because they would have understood that they're not the market leader, they're the alternative. AMD doesn't recognise this and, even worse, they didn't recognise that for most people in the GPU space, AMD is more of an off-brand than Intel is. They should have left the ATi branding intact.

If AMD hadn't bought ATI, someone else would have; they were bleeding money.
 