Five Years Later: Revisiting the GeForce GTX 970

Great revisit, Steve! This will be very helpful for plenty of 290/970 owners contemplating their path forward. Appreciate the hard work.



Oh, you haven't seen the Hardware Unboxed videos where he unboxes the monthly cash stipend from nvidia? It's beautiful. Sometimes he streams it live and you can watch as he straight up just types in extra fps for the nvidia cards based on how many stacks of hundreds they sent him.

Sounds preposterous? Yes. Good, I was just trying to keep my story as whack-a-doo as your constant "AMD oppression" conspiracy hunt.

I mean, you couldn't pick a worse target to accuse of clandestine bias... the gold standard of rigorous benchmarking who uses n=30+ for his data and conclusions, and who consistently recommends AMD products when the data/value warrants it.

Also, they're 5 year old cards! I feel like I'm taking crazy pills!
And yet you can't bring a single relevant fact to prove me wrong. False equivalencies and appeals to ridicule are cheap tactics that don't convince anyone who is capable of critical thinking.

Your first post was rather long-winded, so we're a bit confused about why you're so fired up, but I'll take a shot at it:

1.) "Steve should have used the R9 390 instead" --- Well the GTX 970 was launched right in the middle if I remember correct, so it could have gone either way. Using the R9 290 helps package it as a 5 year test.

2.) "Steve didn't want to use the R9 390 because it would have really shown the GTX 970 getting smoked since it has 8 GB of vram" ---- Go to the Hardware Unboxed patreon page. Steve is nice enough upload every single test graph. There really is no evidence of the 4 GB cards being slower than the 8 GB cards save Wolfenstein 2. We all know that game is trash with anything less than 6 GB of vram when using mein settings. No other games showed evidence of vram limitation from what I saw as the RX 570 4 GB was always about 15% slower than the RX 580 8 GB in the rest of the games.

3.) "Using said R9 390 would have shown an even bigger gap at 1440p" --- Stop trying to cherry pick an ideal situation. You complained also that Steve did not use enough 2018 or 2019 games yet expect him to test at resolution that very few would use if they still had the card today. Also, 1440p does not increase vram consumption by much and the whole CPU bottleneck thing makes me want to unscrew my head.
 
Your first post was rather long-winded, so we're a bit confused about why you're so fired up, but I'll take a shot at it:

1.) "Steve should have used the R9 390 instead" --- Well the GTX 970 was launched right in the middle if I remember correct, so it could have gone either way. Using the R9 290 helps package it as a 5 year test.
First of all, that is not what I said. I ASKED why specifically the R9 290 and not the R9 390. Secondly, my first post here is about INCLUDING the R9 390, not replacing the R9 290 with the R9 390. He himself says it's the exact same GPU. So... Wouldn't it be fair to put both in the best possible light? Yeah.

2.) "Steve didn't want to use the R9 390 because it would have really shown the GTX 970 getting smoked since it has 8 GB of vram"
Not what I said. This is the second strawman argument already. What is it with people and strawman arguments? I said his own previous article painted a different picture between what he himself says is the same GPU and the GTX 970. I never said that that picture is the R9 390 smoking the GTX 970. I did say that R9 390 AIBs were significantly faster than R9 290 reference cards. And even then, I never said it was because of the RAM in that context.

---- Go to the Hardware Unboxed Patreon page. Steve is nice enough to upload every single test graph. There really is no evidence of the 4 GB cards being slower than the 8 GB cards save Wolfenstein 2. We all know that game is trash with anything less than 6 GB of vram when using Mein Leben settings. No other games showed evidence of a vram limitation from what I saw, as the RX 570 4 GB was always about 15% slower than the RX 580 8 GB in the rest of the games.
Quote from GamersNexus;
"The 8GB RX 480 is now at 83.3FPS AVG, 67.3FPS 1% low, and 61.3FPS 0.1% lows. The average is only slightly faster than the 4GB card, at 80FPS, but the 0.1% lows on the 4GB card are 28.7% slower than the 8GB card."
https://www.gamersnexus.net/guides/2503-amd-rx-480-4gb-vs-8gb-benchmark-is-it-worth-it
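For anyone wondering what those "1% low" and "0.1% low" figures actually measure, here is a rough sketch of how such numbers are typically derived from a captured frame-time log. This is an illustration only: the file name and log format are hypothetical, and reviewers differ on the exact method (average of the slowest slice of frames vs. a straight percentile), so don't read it as GamersNexus' or Hardware Unboxed's actual pipeline.

```python
# Rough sketch: average, 1% low and 0.1% low FPS from a frame-time log.
# Assumes a plain-text file with one frame time in milliseconds per line;
# the file name and format are hypothetical, not taken from any real tool.

def low_metrics(frame_times_ms):
    """Return (average FPS, 1% low FPS, 0.1% low FPS)."""
    if not frame_times_ms:
        raise ValueError("no frame times recorded")
    # Average FPS = total frames / total time, not the mean of per-frame FPS.
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

    worst_first = sorted(frame_times_ms, reverse=True)  # slowest frames first

    def slice_fps(fraction):
        # Average FPS over the slowest `fraction` of all frames.
        count = max(1, int(len(worst_first) * fraction))
        return 1000.0 * count / sum(worst_first[:count])

    return avg_fps, slice_fps(0.01), slice_fps(0.001)

if __name__ == "__main__":
    with open("frametimes_ms.txt") as f:  # hypothetical capture file
        times = [float(line) for line in f if line.strip()]
    avg, low1, low01 = low_metrics(times)
    print(f"AVG {avg:.1f} FPS | 1% low {low1:.1f} FPS | 0.1% low {low01:.1f} FPS")
```

The takeaway is that the lows weight the handful of worst frames, which is exactly where a VRAM shortfall tends to show up before the average moves.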

3.) "Using said R9 390 would have shown an even bigger gap at 1440p" --- Stop trying to cherry pick an ideal situation.
Sticking to 1080p only is not cherry-picking an ideal situation for 3.5GB? The point was that there were legitimate reasons for picking the R9 cards over the GTX 970 back then. In his article he's implying that there basically weren't, and that's false.

You complained also that Steve did not use enough 2018 or 2019 games
Oh really? Where did I do that?

yet expect him to test at resolution that very few would use if they still had the card today.
As I already said previously, framerates under 40 didn't stop him from doing that in his previous article. Why not include it this time? The old article is here:
https://www.techspot.com/review/1410-gtx-970-radeon-390-years-later/

Also, 1440p does not increase vram consumption by much and the whole CPU bottleneck thing makes me want to unscrew my head.
Couldn't care less whether it theoretically does or not. It needed to be tested. That's what benchmarks are all about, aren't they? To separate the theory from the reality.

The 290 and 390 both have 8GB VRAM. The 390 is a rebranded 290 with clock tweaks, and is what started the rebrandeon meme.
You're wrong. All R9 290s had 4GB, and the 290X had 8GB. The R9 390 and 390X both had 8GB. Try googling any R9 290 with 8GB. You won't find one.

And oh, Steve. Just FYI. I'm not trying to undermine your work. I'm trying to make you improve it. See it as constructive criticism for your next articles. I don't think my arguments are unreasonable. Do you?
 
Did you lie before, or are you lying now? Because you're contradicting your own benchmarks.
In almost every benchmark you did before, the Nvidia GTX 970 was faster than the AMD R9 290, as can be seen on your own pages:
https://www.techspot.com/review/885-nvidia-geforce-gtx-970-gtx-980/page3.html

But now you've changed the story, and the GTX 970 is now 3% slower than the R9 290. How is that possible?
1. Did you rig the benchmarks?
2. Did you choose games that support the new version of the truth?
3. Or does Nvidia degrade their drivers for older video cards? Which would again contradict your own findings, where you said that Nvidia is not doing that.

What's the truth? Because there can't be 10 different truths. There's only one. The rest are lies or mistakes.
I can see what you mean. Take, for example, the almighty GTX 780: it's mostly on par with the 7970 GHz Edition in the games tested lately, but it was trading blows with the R9 290/290X back then. I don't know what's going on here, but I doubt it has anything to do with TechSpot.
 
In 5 years, these cards have nearly quadrupled in power, and we still don't have a $1000 card that can run games at Ultra settings at 4K 60 FPS.
 
First of all, that is not what I said. I ASKED why specifically the R9 290 and not the R9 390. Secondly, my first post here is about INCLUDING the R9 390, not replacing the R9 290 with the R9 390. He himself says it's the exact same GPU. So... Wouldn't it be fair to put both in the best possible light? Yeah.
Steve already answered this. He didn't want to run all of the benchmarks again. Add like 10% for the 390 in most games and around 50% for Wolf 2. There, done.

Not what I said. This is the second strawman argument already. What is it with people and strawman arguments? I said his own previous article painted a different picture between what he himself says is the same GPU and the GTX 970. I never said that that picture is the R9 390 smoking the GTX 970. I did say that R9 390 AIBs were significantly faster than R9 290 reference cards. And even then, I never said it was because of the RAM in that context.
Not a strawman argument. I had to paraphrase due to your obsessive whining.

Quote from GamersNexus;
"The 8GB RX 480 is now at 83.3FPS AVG, 67.3FPS 1% low, and 61.3FPS 0.1% lows. The average is only slightly faster than the 4GB card, at 80FPS, but the 0.1% lows on the 4GB card are 28.7% slower than the 8GB card."
https://www.gamersnexus.net/guides/2503-amd-rx-480-4gb-vs-8gb-benchmark-is-it-worth-it
That particular quote from another site was pulled from the most vram-hogging game (Black Ops), which was not even tested here. Most of the other games did great with 4 GB. To make matters worse, the 8 GB card actually had more bandwidth. Cherry-pick much?
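For what it's worth, the bandwidth remark is easy to sanity-check. The reference RX 480 4GB was commonly specced with 7 Gbps GDDR5 versus 8 Gbps on the 8GB model, both on a 256-bit bus; treat those exact figures as assumptions (AIB cards vary), but the arithmetic is just effective data rate times bus width:

```python
# Back-of-the-envelope GDDR5 bandwidth:
#   bandwidth (GB/s) = effective data rate (Gbps per pin) * bus width (bits) / 8
# The 7 / 8 Gbps figures below are the commonly quoted reference RX 480 specs
# (assumption; individual AIB cards can clock the memory differently).

def gddr5_bandwidth_gbs(effective_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return effective_gbps_per_pin * bus_width_bits / 8

print("RX 480 4GB:", gddr5_bandwidth_gbs(7.0, 256), "GB/s")  # -> 224.0 GB/s
print("RX 480 8GB:", gddr5_bandwidth_gbs(8.0, 256), "GB/s")  # -> 256.0 GB/s
```

So part of the 4GB-vs-8GB gap in that comparison could be bandwidth rather than capacity, which is the point being made above.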

Sticking to 1080p only is not cherry-picking an ideal situation for 3.5GB? The point was that there were legitimate reasons for picking the R9 cards over the GTX 970 back then. In his article he's implying that there basically weren't, and that's false.
Given the same texture settings, 1440p does not consume much more vram than 1080p. I have built an entire thread on [H] showing this.

Oh really? Where did I do that?
Guess I didn't follow what you said here: "why is it that if we take a list of say 2015 games, and a list of 2018 games, the R9 290 is generally better competition in the newer games compared to the older games?"

As I already said previously, framerates under 40 didn't stop him from doing that in his previous article. Why not include it this time? The old article is here:
https://www.techspot.com/review/1410-gtx-970-radeon-390-years-later/
Well, that test was done two years ago and with the R9 390, so make of it what you will.

Couldn't care less whether it theoretically does or not. It needed to be tested. That's what benchmarks are all about, aren't they? To separate the theory from the reality.
Reality is that if you are still using these cards, the vast majority will be using them at 1080p. The vram tests are not theoretical:
https://hardforum.com/threads/the-slowing-growth-of-vram-in-games.1971558/
 
Couldn't care less whether it theoretically does or not. It needed to be tested. That's what benchmarks are all about, aren't they? To separate the theory from the reality.

So please, go ahead, do your own benchmarks to prove what you think you already know and we will gladly take it with a grain of salt. Then proceed to criticize, nitpick and question the integrity of said benchmark, request you test more cards/games/resolutions, with and without overclocking every GPU in your benchmark...
 
Not a strawman argument. I had to paraphrase due to your obsessive whining.
Aaaand this is where the "conversation" ends.

So please, go ahead, do your own benchmarks to prove what you think you already know and we will gladly take it with a grain of salt. Then proceed to criticize, nitpick and question the integrity of said benchmark, request you test more cards/games/resolutions, with and without overclocking every GPU in your benchmark...
If I had the cards I would.
 
I dunno what conspiracy you're looking for but there ain't one here.

That was 2014. The problem with 2014 is that games released in 2015-2019 hadn't happened yet because 2015-2019 hadn't happened yet.

The.
Games.
Are.
DIFFERENT.

Steve benchmarked today's games because *today* people are playing *today's* games. Weird but true, eh? He also commented just above (have a look) that Nvidia doesn't degrade their drivers. He's done the tests.


Really??? The games are so much different for one card, but not for another? The games are so different that the card that was slower is now faster? That's the lamest excuse I've ever heard. All the games were faster on Nvidia before, but now the games are faster on AMD. Same Nvidia and same AMD. That's really incredible. I wonder who would believe such an explanation. Except you, of course.
 
Really??? The games are so much different for one card, but not for another? The games are so different that the card that was slower is now faster? That's the lamest excuse I've ever heard. All the games were faster on Nvidia before, but now the games are faster on AMD. Same Nvidia and same AMD. That's really incredible. I wonder who would believe such an explanation. Except you, of course.

Dude, you're funny.

You know why I believe Steve? Because he does the tests. You know, actual work to prove or disprove things, as opposed to just posting random opinions on the 'net.

Go do your own tests and publish them. Until then, your opinions convince nobody.
 
Ahh no it's not mate, we've benchmarked the improvements brought about by new drivers in plenty of games over the years.
Well, you're convinced. I'm still not convinced. I'm not saying drivers didn't bring improvements. But I am saying that if it was (solely) due to drivers, why is it that if we take a list of say 2015 games, and a list of 2018 games, the R9 290 is generally better competition in the newer games compared to the older games?

It's a 290 vs 970 comparison and frankly I didn't feel like benchmarking the same GPU in 33 games again.
I can understand that. Benchmarking is definitely a chore... But why did it need to be the R9 290 and not the R9 390 in the first place? The reason I say this is because:

1) This review (https://www.techspot.com/review/1410-gtx-970-radeon-390-years-later/page12.html) by you shows the R9 390 a lot closer to the GTX 970 than the R9 290 ever got back then. Also, just a tip for next time, please do mention whether they are reference cards or not. I didn't see it in the article (or I'm really blind lol). If you had to benchmark reference cards, then I get it. Reference R9 390 cards were only available from XFX, and chances are you don't have one. If this is the case, disregard the other points below.

2) The R9 390 was competition for the GTX 970 for slightly longer. The GTX 970 was released 10 months after the R9 290, as you mentioned. The R9 290 then competed with the GTX 970 for 9 months, until the R9 390 was released; it was slightly improved to be direct competition for the GTX 970 up until the RX 480 was released 12 months later. The GTX 970 was released between them. You basically gave nVidia the slight upper hand by using their card that released almost a year later, specifically designed to tackle the competition, and (accidentally) robbed AMD of the same chance. And then you compare it to the GTX 1060 and RTX 2060...? What is this? Advertisement for nVidia?

3) Considering the R9 390 AIBs were around ~15% faster than the reference R9 290, had double the RAM, and were priced equal to or cheaper than the GTX 970... Yeah... It's not exactly as black and white as this article paints that exact same GPU. The previously linked article of yours comparing AIB versions of the GTX 970 and the R9 390 tells quite a different story. It makes this article seem like a way to discredit those who recommended AMD back then, despite there being completely legitimate reasons to do so. If I had to do so today, I would still recommend an AIB R9 390 over any GTX 970.

4) It was well known that the R9 390 stretched its legs more at 1440p compared to the GTX 970. Two reasons: CPU overhead and RAM. Yes, I get it. Looking at the 1080p framerates, it doesn't seem that these cards are viable for 1440p anymore. But that didn't stop you back then, in the article linked in point 1. Also, the R9 290 was not viable for 1440p due to its 4GB RAM limitation. This would be a great way to test whether RAM really is a limit or not, by comparing the 8GB card against the 3.5GB card at 1440p. As for the current argument that 3.5GB is fine: not only do I find it incomplete, I honestly find it appalling to put a product in a good light after nVidia deliberately deceived its customers.

Cheers.

I'm glad somebody else came to say it. AMD's real GTX 970 challenger was the R9 390. I think that it should have been included in this revisit article.
 
I'm glad somebody else came to say it. AMD's real GTX 970 challenger was the R9 390. I think that it should have been included in this revisit article.

I'm not sure why no one's told you guys this... but the 390 is the 290 :dizzy::bomb:
 
I'm not sure why no one's told you guys this... but the 390 is the 290 :dizzy::bomb:
That the 390 uses the same GPU doesn't necessarily mean it's the same graphics card. The 300 series had some tweaks to the memory controller, for one. Generally, better memory was also used to reach higher clocks. Little bugs were ironed out, and throttling was a thing of the past.

But thank you for showing us how 'neutral' you really are.
 
That the 390 uses the same GPU doesn't necessarily mean it's the same graphics card. The 300 series had some tweaks to the memory controller, for one. Generally, better memory was also used to reach higher clocks. Little bugs were ironed out, and throttling was a thing of the past.

But thank you for showing us how 'neutral' you really are.

So it wasn't just a factory overclocked 290 with 8GB of VRAM? Sure as hell performed like that was the situation ;) Don't worry about how 'neutral' I am, we all know you're a full-blown AMD fanboy :p
 
So it wasn't just a factory overclocked 290 with 8GB of VRAM? Sure as hell performed like that was the situation ;) Don't worry about how 'neutral' I am, we all know you're a full-blown AMD fanboy :p
You call it factory overclocked based on what? It's a new series using the same GPU. You really expected the same clocks? You purposefully leave out the so-called 'factory overclocked' card to put the 3.5GB card in a positive light. And I'm the fanboy. Good going there. It's well known that the R9 390 was much better competition than the R9 290, but you purposefully want to avoid that situation, don't you?

I wrote a whole list of reasons why you should have included the R9 390. Your reasoning has been a petty "I didn't feel like it" and "it's the same card anyway". PROVE it's the same card then.

You really are immature, you know that? Calling your audience fanboys because they disagree with you and your shallow arguments is pathetic. Expected better from you...
 
You call it factory overclocked based on what? It's a new series using the same GPU. You really expected the same clocks? You purposefully leave out the so-called 'factory overclocked' card to put the 3.5GB card in a positive light. And I'm the fanboy. Good going there. It's well known that the R9 390 was much better competition than the R9 290, but you purposefully want to avoid that situation, don't you?

I wrote a whole list of reasons why you should have included the R9 390. Your reasoning has been a petty "I didn't feel like it" and "it's the same card anyway". PROVE it's the same card then.

You really are immature, you know that? Calling your audience fanboys because they disagree with you and your shallow arguments is pathetic. Expected better from you...

I was just calling you a fanboy, for being a fanboy.
 
I was just calling you a fanboy, for being a fanboy.
Yeah. If one doesn't like the 3.5GB GTX 970, or doesn't like leaving out a card that would make the comparison fair, one MUST be a fanboy, right? Pot, meet kettle.

This conversation is over.
 
You call it factory overclocked based on what? It's a new series using the same GPU. You really expected the same clocks? You purposefully leave out the so-called 'factory overclocked' card to put the 3.5GB card in a positive light. And I'm the fanboy. Good going there. It's well known that the R9 390 was much better competition than the R9 290, but you purposefully want to avoid that situation, don't you?

I wrote a whole list of reasons why you should have included the R9 390. Your reasoning has been a petty "I didn't feel like it" and "it's the same card anyway". PROVE it's the same card then.

You really are immature, you know that? Calling your audience fanboys because they disagree with you and your shallow arguments is pathetic. Expected better from you...

I was just calling you a fanboy, for being a fanboy.

Oh come on Steve, don't you know doubling the vram would have doubled the performance in this test?!

Somehow the argument that the R9 390 is mostly an overclocked R9 290 is a "shallow argument" despite no real evidence to the contrary.

I came for the quality work. I stayed for the fanboy lashing out.
 
More appeals to ridicule. More straw man arguments. More name-calling. When one is out of arguments, that's all that's left, right? And most importantly, the ones that are the most eager to call others fanboys, are often fanboys themselves. The saddest thing is when the bias is so strong, that one is unaware of it in oneself.

I'll simply leave with this...;

When the RX 480 had a (non-)issue with power consumption through the PCI-E slot, it was a huge deal that made the card so-called dangerous and unreliable.
When nVidia deliberately deceived its customers regarding the GTX 970's usable memory size, it was still a great card to buy.

When the GTX 970 had 3.5GB and was a competitor of the R9 390 with 8GB, the GTX 970 is fine.
When the R9 Fury X with 4GB was a competitor to the 980 Ti with 6GB, Fury X has too little VRAM.
(in before "Bbbbut 1440p!!!!")

When AMD had the superior product in terms of power consumption in the late 2000s, only the speed mattered.
When AMD had the best speed with the R9 290 release, their power consumption mattered.

When AMD had driver faults, those were a deal breaker.
When nVidia had drivers that killed their cards, they're still good enough to buy.

When AMD brought FreeSync, they were just copying nVidia and having an inferior version of G-sync.
When nVidia started supporting FreeSync, nVidia is great for doing so.

When nVidia brings out overpriced cards, they are justified in doing so because features, or speed.
When AMD brings out equivalently performing & priced cards (Radeon VII), they are overpriced because they failed to help lower nVidia prices.

And the list goes on and on. The sad thing is that the majority are not even aware that they are doing it. And when reporters fall into this category, it's a truly sad time for gamers. People see nVidia as the default without realizing it. And even sadder is the fact that when this is brought forward, one is accused of playing the victim or being an AMD fanboy. But seeing these things does not make one a fanboy. It makes one aware of the mind share that nVidia has. It makes one... Independent of the herd. It makes one able to make a real choice.

At this point, AMD can bring a $400 card performing like a 2080 Ti, and people will find some excuse not to buy it. Just like the RX 570 is now cheaper than a 1050 Ti and gets you two additional games, and yet Steam is littered with 1050 Ti cards and barely any RX 570s to be seen. All of this is rational consumers picking out the best options. Right? RIGHT? Can't go wrong with nVidia!

The goal posts shift every time in favor of nVidia. Go ahead. Prove me wrong. Or will we only see name-calling and fallacies? Or maybe total silence?
 
I think you may be under the spell of these things you are accusing everyone else of. You mention several monumentally detrimental events or characteristics of AMD cards that I've never heard people use as a factor for their purchasing decisions:
  • When the RX 480 had a (non-)issue with power consumption through the PCI-E slot, it was a huge deal that made the card so-called dangerous and unreliable.
  • When the R9 Fury X with 4GB was a competitor to the 980 Ti with 6GB, Fury X has too little VRAM.
  • When AMD had the best speed with the R9 290 release, their power consumption mattered.
  • When AMD had driver faults, those were a deal breaker.
  • When AMD brought FreeSync, they were just copying nVidia and having an inferior version of G-sync.
  • When AMD brings out equivalently performing & priced cards (Radeon VII), they are overpriced because they failed to help lower nVidia prices.
When my acquaintances, friends, and I buy video cards, it's typically based on what is the fastest card I can purchase with my specific budget TODAY. The problem for the last few years has been that AMD has been a little late to bring that performance to the game, and we're expected to wait a year or two for drivers to mature and future games to take advantage of AMD's work. By that point I'll probably just buy the newer, faster nVidia product if my performance isn't where I want it to be. I'm pretty sure everyone here wants there to be better competition between nVidia and AMD, and a third serious player wouldn't hurt either, but when it comes to GPUs AMD has struggled to keep up, and I believe this is why nVidia can beat us over the head with 2080 Ti pricing. Everyone (except nVidia shareholders) HATES the price of the 2080 Ti, and if AMD did bring something with equivalent performance for a reasonable price (I hate to say it, but < $800 USD sounds reasonable these days) you can bet everyone would be all over that and nVidia would be forced to adjust. They just can't give me a 2080 Ti-performing card that can only perform that way in 20% of the games unless I'm willing to wait a year for drivers to mature enough to get that up to 50% of the games, while forcing me to buy a bigger power supply and better fans.
 
I think you may be under the spell of these things you are accusing everyone else of. You mention several monumentally detrimental events or characteristics of AMD cards that I've never heard people use as a factor for their purchasing decisions:
  • When the RX 480 had a (non-)issue with power consumption through the PCI-E slot, it was a huge deal that made the card so-called dangerous and unreliable.
  • When the R9 Fury X with 4GB was a competitor to the 980 Ti with 6GB, Fury X has too little VRAM.
  • When AMD had the best speed with the R9 290 release, their power consumption mattered.
  • When AMD had driver faults, those were a deal breaker.
  • When AMD brought FreeSync, they were just copying nVidia and having an inferior version of G-sync.
  • When AMD brings out equivalently performing & priced cards (Radeon VII), they are overpriced because they failed to help lower nVidia prices.
When my acquaintances, friends, and I buy video cards, it's typically based on what is the fastest card I can purchase with my specific budget TODAY. The problem for the last few years has been that AMD has been a little late to bring that performance to the game, and we're expected to wait a year or two for drivers to mature and future games to take advantage of AMD's work. By that point I'll probably just buy the newer, faster nVidia product if my performance isn't where I want it to be. I'm pretty sure everyone here wants there to be better competition between nVidia and AMD, and a third serious player wouldn't hurt either, but when it comes to GPUs AMD has struggled to keep up, and I believe this is why nVidia can beat us over the head with 2080 Ti pricing. Everyone (except nVidia shareholders) HATES the price of the 2080 Ti, and if AMD did bring something with equivalent performance for a reasonable price (I hate to say it, but < $800 USD sounds reasonable these days) you can bet everyone would be all over that and nVidia would be forced to adjust. They just can't give me a 2080 Ti-performing card that can only perform that way in 20% of the games unless I'm willing to wait a year for drivers to mature enough to get that up to 50% of the games, while forcing me to buy a bigger power supply and better fans.
You know... I wrote a whole post tackling every single point to prove it... But I really don't need to. I'm quite sure that I'm not the one under a spell. I will show you... I will tackle two of the most important points, and in one of them you yourself will be the protagonist. The other protagonist is going to be Steve... Let's start with Steve...

When AMD had the best speed with the R9 290 release, their power consumption mattered.
Firstly, I meant the R9 290X, to avoid confusion; I will edit that. But going on topic... You say that is never used as a purchasing decision... WOW. OK... Even IF these are not used as direct reasons for purchase, they do contribute to the mind share of how nVidia is always superior, and THAT ultimately has brought us to the state of gaming where we are today, where nVidia can charge what they want. If people want AMD to compete, they need to start supporting AMD when AMD deserves it. Let me quote what our beloved Steve himself said regarding the R9 290X:

"Pros: Similar performance to the GTX Titan at nearly half the cost -- a gutsy play that should provoke a response from Nvidia.
Cons: It's hot enough to remind us of Fermi and it's still priced like a premium card compared to more mainstream Radeons."
https://www.techspot.com/review/727-radeon-r9-290x/page11.html

Although the following point is not directly power-consumption related... Even at half the cost of the Titan while having similar performance, he is STILL complaining about the price. What is the reason that such a price is not justified if it has performance similar to the Titan? Tell me. Give me one good reason. You think that is normal? The only reason I can think of is that, in his mind, Radeon MUST be cheaper than nVidia. Most likely because, in his mind, Radeons are inevitably worse than nVidia and must be priced lower. Even though it was already half the price for the same performance, that was still considered a con rather than a pro. Let THAT sink in. Take all the time you need, and then come back and tell me that I am the one under some sort of spell.

The R9 290X was equal to or faster than the Titan when it came out, at half the price, and it stayed that way for 10 months. How many people bought it? Why didn't more people buy it? Ah, but the power consumption... Yeah, let's tackle that. Because, like Steve himself said, it was hot enough to remind us of Fermi. But people still bought Fermi. WHY??? The performance crown? Why does that work for Fermi and not for the R9 290X? Did the Radeon competition outsell Fermi? So what is it? Performance crown, or power consumption? Yeah... Everyone who's honest knows the answer: whichever one nVidia is better at. That's why Fermi can sell and the R9 290X can't.

Now it's your turn on the stage...

When AMD had driver faults, those were a deal breaker.
Driver faults are never used as a purchasing decision? REALLY? Firstly, it's the standard response you get when you talk about the HD 4850/HD 4870 being superior. That is given as the reason that barely anyone bought them compared to Fermi.
Secondly... The driver argument is only viable if the card is slower at the time of release. In the majority of cases, it isn't. In the majority of cases, the cards are on equal footing, or slightly faster, and the gap only grows over time. And yet the same excuse is used... Even if the card is substantially cheaper...

Lastly... The ace in the hole. To quote your own posts, right here...

"The problem for the last few years has been that AMD has been a little late to bring that performance to the game and we're supposed to be expected to wait a year or two for drivers to mature and future games to take advantage of AMD's work. By that point I'll probably just buy the newer, faster nVidia product if my performance isn't where I want it to be."
And
"They just can't give me a 2080Ti performing card that can only perform that way in 20% of the games unless I'm willing to wait a year for drivers to mature enough to get that up to 50% of the games while forcing me to buy a bigger powersupply and better fans."

Remember that you just said that "When AMD had driver faults, those were a deal breaker" is NEVER used as a factor in purchasing decisions... Oh... You just used that as a reason yourself....
...
Oops.
...
But forget all this. I'm the fanboy. I'm the biased one. I'm the one under a spell. ;)

Right now, it's all on power consumption. Don't forget power consumption. We will forget though, if AMD ever becomes more efficient, just like that neat little fact is still swept under the rug when it comes to Ryzen vs Intel CPUs. Because it's all about performance.... Preferably, single threaded performance....

Yes. People want good AMD cards so they can buy nVidia cheaper. And obviously that's not going to create more competition. The current mentality is biased towards nVidia, and it is not going to change the gaming space for the better. It is only going to make it worse. Blaming AMD is easy. But maybe, just maybe, it's not only AMD that is to blame.
 