Definitely, it's just that pricing now is completely insane. What the Fury cost (AMD's highest-end card at the time) won't even buy you an RTX 4070; maybe an RTX 4060 once it comes out. It might even be RTX 4050 territory, the very bottom of the stack.
I couldn't agree more, but the problem isn't so much nVidia as the people who buy their cards without a second thought. I still see people posting here, who should know better, referring to cards only in terms of RTX 30-this or RTX 40-that. If their brains are so hard-wired as to think only in nVidia part numbers, they're screwed. Do they have to use a Jedi mind trick on themselves?
I had to train my brain to hold two sets of numbers in the same stack for both CPUs and GPUs, because I had started to fall into that category myself with FX, Ryzen, and Radeon numbers. I was losing track of where Intel CPUs and nVidia GPUs fell in the stack, and when I realised this, I stopped treating the two number sets differently. Instead of "AMD numbers" and "Intel numbers", I trained my brain to call that category "CPU numbers". In the same way, I trained my brain to have "GPU numbers" that treated Radeon and GeForce numbers equally. That way of thinking made it easy to keep track of two stacks without having to flip back and forth.
It's like, take this number: 67245691
Is it easier to remember six, seven, two, four, five, six, nine, one, or is it easier to remember sixty-seven, twenty-four, fifty-six, ninety-one?
The human brain is indeed a crazy device.
I can think of some other reasons it didn't sell well.
- Performance wasn't that impressive at launch
- The price was rather high
- It only had 4GB of memory
- It's AMD (the much, much, much smaller player)
It was, however, a card that aged extremely well performance-wise. It started out rather underwhelming, but many driver updates later it was a beast, held back only by the small amount of (but dang fast) memory it had.
You're preaching to the choir here. I have two of these beauties still and their longevity has been nothing less than spectacular.
The R9 Fury truly was the GTX 1080 Ti of its day.
I replaced my GTX 970 with a Fury non-X after finding a very good deal on the used market, and that was an amazing upgrade.
I believe it. I actually got mine six years ago, during the mining boom of 2017. Back then, the RX 580 was over $700 CAD, a price I was not willing to pay for that card. However, out of nowhere, Newegg suddenly had brand-new Sapphire Radeon R9 Fury Nitro+ OC cards for about $350 CAD. I couldn't believe what I was seeing.
I remember feeling a bit trapped because I had twin HD 7970s in CrossFire, but CrossFire stopped being a real thing around that time. That left me stuck with a setup that was pretty badly outdated, but there was no chance I was going to spend that much money on a Polaris card, and I'd rather play on a console than buy a GeForce card. I remember spending hours researching the deal because it seemed way too good to be true. I knew this card was faster than the RX 580, so why was it half the price, and why were new ones suddenly available years after production ended? Nevertheless, I pulled the trigger and got one.
The relief I felt that I was going to have a card that was usable again was palpable because I honestly thought that I was screwed. A couple of weeks later, I saw this video from Greg Salazar:
I actually bought a second, refurbished model for about $150 CAD from Newegg about six months later to use as a backup card. To this day, both of these things can still game. I played games like Godfall, AC: Odyssey, and Far Cry 6 on an R9 Fury while my XFX RX 5700 XT was away for RMA (twice). I was in no hurry to pop it back in when it finally arrived, because the R9 Fury just worked and I was still having a blast gaming on it. I'm actually curious to see what effect FSR will have on the Fury, because that 4GB of HBM makes high-res gaming impossible.