Graphics card prices have become detached from reality, making advertised MSRPs almost meaningless. In this article, we examine real-world GPU prices and reveal which cards offer the best value.
The used market is asking as much as, or more than, the original MSRP for RDNA 3 and Ada cards.
RDNA 2 and Ampere card prices have stagnated for more than two years now.
New cards are at least 20% over MSRP, if you can find one at all.
What a great time to buy GPUs! Right?
> Curious why you didn't include the Intel Battlemage, because that's what I'm looking at.

I hate to break it to you, but Intel is not currently making Battlemage chips. As far as I've been told, Intel is using all of its TSMC capacity to make CPUs/APUs, and its dGPUs have been put on the back burner. I do know that the dGPUs are in "active development," so Intel does intend to keep making GPUs; it's just that they aren't currently making any new ones.
> I hate to break it to you, but Intel is not currently making Battlemage chips. ...

That doesn't make any sense; they already released the card, and I can't imagine them suddenly stopping production. Are we going to get the better versions? Maybe not, but I would definitely like to get the $250 version that was reviewed here two months ago.
> That doesn't make any sense; they already released the card, and I can't imagine them suddenly stopping production. ...

I, too, would love to get my hands on one of Intel's cards. The cards are out, but they aren't being restocked, which might be why you're having difficulty finding one. I'd like a B580 myself, but I've been told it could be a while before they come back into stock. AMD, Nvidia, and now Intel are all relying on TSMC, which is creating a chip shortage.
Curious why you didn't include the Intel Battlemage, because that's what I'm looking at.
The B580 simply isn't in the same performance class as the other cards being discussed here. Future higher-end models would be a better comparison, if Intel actually makes any. More to the point, the B580 is also unavailable at MSRP, and that's if you can find one at all.
> This only accounts for initial cost. Unless you're a kid and your parents pay for your electricity, there is also a cost to run it. I'm more concerned with total cost of ownership than just initial cost.

Yeah, well, if we start factoring the cost of electricity into the equation (which I do and most people don't), then gaming has been dead for like 10 years. There are NO 4K or 1440p cards on the market with at least 10GB of VRAM (because 8GB is already seemingly insufficient) that consume less than 150W of power. Hell, I just ran a benchmark on my own GPU (an RX 7800 XT, which I bought months ago), and at 1080p high in Cyberpunk 2077, the GPU package power is around 260W. Granted, it's undervolted to keep the temps down, but still...
> Yeah, well, if we start factoring the cost of electricity into the equation (which I do and most people don't), then gaming has been dead for like 10 years. ...

Yep, it is, but this is why I'm on my 5600 XT and playing older games. I don't need to play the new stuff anymore; besides, it all sucks.
Which is to say, if you want to engage in the enthusiast-class GPU market or even slightly below that, while taking "dollars per kilowatt hour" into account, just forget about it. That's a losing battle.
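To put rough numbers on the electricity argument, here's a back-of-the-envelope sketch. Only the 260W draw comes from the comment above; the $0.15/kWh rate and 20 hours of gaming per week are placeholder assumptions, so plug in your own figures.

```python
# Rough yearly electricity cost of gaming on a given GPU.
# The 260 W figure is the package power quoted above; the electricity
# rate and hours-per-week are hypothetical placeholders.

def yearly_energy_cost(gpu_watts: float, hours_per_week: float, usd_per_kwh: float) -> float:
    """Yearly cost, in dollars, of running the GPU at this draw while gaming."""
    kwh_per_year = gpu_watts / 1000 * hours_per_week * 52
    return kwh_per_year * usd_per_kwh

# RX 7800 XT at ~260 W, 20 h/week, $0.15/kWh -> about $41/year
print(f"~${yearly_energy_cost(260, 20, 0.15):.0f} per year")
```

At those assumptions the running cost works out to roughly $40 a year for a 260W card; it scales linearly with your own rate and hours.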
> Personally, I don't think cost per frame is that interesting; I'd rather buy the best, or close to it, skipping one or sometimes two generations.
>
> My previous GPU was a 2080 Ti that I used for about 4 years; at 1,300 bucks, that's about 27 bucks per month. I could have sold it, but I gave it to my brother to use in his computer.
>
> I purchased the 4090 in May 2023, so I'm nearing two years of usage, which is about 90 bucks per month thus far. If I buy another PC in 2 years, it would come to about 45 bucks per month... and then I'll likely give it away to a family member.
>
> That's the way I look at all my purchases: I don't look at the initial cost, but at how much it costs per month and how long I expect to use it.
>
> If I had purchased a 4070, it would have cost me 760 bucks initially in May 2023, which works out to about 32 bucks per month after 24 months. But then again, I'd be more likely to purchase a replacement after 24 months, as it wouldn't last as long given that I play at 4K.
>
> Depending on the cost of the replacement, it would cost me slightly less than, about the same as, or more than the 4090 used for 4 years.
>
> But maybe I'm the only one who approaches it this way.

Yeah, future-proofing is more important than cost at purchase, because keeping a card twice as long is worth a 20% or 40% premium.
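For reference, the cost-per-month framing above is easy to reproduce. Here's a minimal sketch using the figures from the quoted post; the ~$2,160 price for the 4090 is inferred from the stated ~$90/month over nearly two years, and resale value is assumed to be zero since the cards were handed down rather than sold.

```python
# Amortized monthly cost of a GPU: purchase price spread over months of use.
# Prices and lifespans come from the quoted post; the 4090 price is inferred
# from its ~$90/month figure, and resale value is assumed to be zero.

def cost_per_month(price_usd: float, months_used: int) -> float:
    return price_usd / months_used

print(f"2080 Ti: ${cost_per_month(1300, 48):.0f}/month")  # ~$27 over 4 years
print(f"4090:    ${cost_per_month(2160, 48):.0f}/month")  # ~$45 if kept 4 years
print(f"4070:    ${cost_per_month(760, 24):.0f}/month")   # ~$32 over 2 years
```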
I "cheaped" out on a computer back in the days by going with people's comments that an i5 processor would be more than enough for gaming. It was the i5's from when the first generation came out. Forgot which GPU I had in it though.Yeah future proofing is more important than cost at purchase, because keeping a card twice as long is worth a 20% or 40% premium.
I made the mistake of buying a 7900 XTX at release and rapidly found it obsolete in anything with ray tracing; it was a flagship card that struggled to run those games. I swapped in a 4080S and couldn't be happier. It turns out the 20% "savings" on the AMD card was completely short-sighted.
> Yeah, future-proofing is more important than cost at purchase, because keeping a card twice as long is worth a 20% or 40% premium.

Well, as it turns out, most gamers are of the opinion that raytracing doesn't produce enough ROI for the cost of admission... and they are right. Rasterization may be considered "obsolete" because it's not the new hotness (it's, what, 30 years old at least?), but that's beside the point. Two things can be true at the same time: raytracing is the future AND the hardware is currently not worth the price of admission.