Cost Per Frame: Best Value Graphics Cards in Early 2025

What an amazing effort to compile all of this. Kudos.

Sadly, no amount of lamenting about high prices does anything about the reality that cards are still barely able to stay in stock. The market says they're worth way more than MSRP.
 
The used market is asking as much as, or more than, original MSRP for RDNA 3 and Ada cards.
RDNA 2 and Ampere card prices have stagnated for more than two years now.
New cards are at least 20% over MSRP, if you can find one at all.

What a great time to buy GPUs! Right?
 
Pricing is absurd right now, and AMD's stance of "we can't do anything, we just sell the chip" is BS. Plenty of companies control their products' pricing. AMD could easily step in and say this is too much, make their own reference cards, or lower the chip cost to the AIBs with a clause that they won't gouge prices the way they can with NVIDIA. They're not going to gain any market share at these prices.
 
I'm just not buying used or new GPUs from AMD or NVIDIA at all anymore. Intel is where my next GPU purchase will be.
 
Too bad there are no good GPU prices nowadays...
prices keep increasing day by day, especially for the older generations.
 
This only accounts for initial cost. Unless you’re a kid and your parents pay for your electricity, there is also a cost to run it. I’m more concerned with total cost of ownership than just initial cost.
 
Personally, I don't find cost per frame that interesting. I'd rather buy the best, or close to the best, and skip one or sometimes two generations.

My previous GPU was a 2080 Ti that I used for about four years; at 1,300 bucks, that works out to about 27 bucks per month. I could have sold it, but I gave it to my brother to use in his computer.

I purchased the 4090 in May 2023, so I'm nearing two years of usage, which is about 90 bucks per month thus far. If I buy another PC in two years, it would be at about 45 bucks per month... and then I'll likely give it away to a family member.

That's the way I look at all my purchases: I don't look at the initial cost, but at how much it costs per month and how long I expect to use it.

If I had purchased a 4070, it would have cost me 760 bucks in May 2023, or about 32 bucks per month after 24 months. But then again, I'd be more likely to purchase a replacement after those 24 months, as it wouldn't last as long given that I play at 4K.

Depending on the cost of the replacement, that route would cost me slightly less, about the same, or more than the 4090 does when used for four years.

But maybe I'm the only one who approaches it this way.
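For anyone who wants to plug in their own numbers, here is a minimal sketch of the per-month framing above. The 2080 Ti and 4070 prices are the ones stated in the comment; the 4090 purchase price is not given, so the roughly $2,000 figure below is an assumption inferred from the quoted monthly costs, and the zero resale value reflects the cards being given away.

```python
# Amortized monthly cost of a GPU over its period of use.
# Prices and periods follow the comment above; the 4090 price (~$2,000)
# and the zero resale value are assumptions, not stated figures.

def monthly_cost(purchase_price: float, months_used: int, resale_value: float = 0.0) -> float:
    """Cost per month, net of whatever the card is eventually resold for."""
    return (purchase_price - resale_value) / months_used

print(f"2080 Ti:  ${monthly_cost(1300, 48):.0f}/month over four years")     # ~$27
print(f"RTX 4090: ${monthly_cost(2000, 22):.0f}/month after ~22 months")    # ~$91
print(f"RTX 4090: ${monthly_cost(2000, 48):.0f}/month if kept four years")  # ~$42
print(f"RTX 4070: ${monthly_cost(760, 24):.0f}/month over 24 months")       # ~$32
```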
 
Let's wait two more months before we talk about value in the GPU market. I have my eye on a PowerColor Red Devil 9070, but it's 780 euros right now; if it drops to 640-670, I'll probably buy it as my final GPU.
 
Curious why you didn't include Intel Battlemage, because that's what I'm looking at.
I hate to break it to you, but Intel is not currently making Battlemage chips. As far as I've been told, Intel is using all of its TSMC capacity to make CPUs/APUs, and their dGPUs are being put on the back burner. I do know that their dGPUs are in "active development," so Intel does intend to keep making GPUs; it's just that they aren't currently making any new ones.
 
That doesn't make any sense; they already released the card, and I can't imagine them suddenly stopping production. Maybe we won't get the better versions, but I'd definitely like to get the $250 version that was reviewed here two months ago.
 
I, too, would love to get my hands on one of Intel's cards. The cards are out, but they aren't being restocked, which might be why you're having difficulty finding one. I'd love to get a B580, but I've been told it could be a while before they come back into stock. AMD, Nvidia, and now Intel are all relying on TSMC, which is creating a chip shortage.

It sucks, as my laptop has an Arc chip in it and, to be honest, I'm very happy with its performance.
 
The B580 simply isn't in the same performance class as the other cards being discussed here. Future higher end models would be a better comparison, if Intel actually makes any. More to the point, the B580 is also unavailable at MSRP, and that's if you can find one at all.
 
I don't care if it's in the same class; I care about cost per frame, and frankly I refuse to spend more than $300 on a video card. It's absurd to spend more. I did the math on inflation, and the "midrange" is still about $100 more expensive than it should be even after adjusting: 60-class cards should be around $270-330, and the same goes for AMD's midrange, inflation included. Right now the only one offering a real midrange is Intel, which is why my last card was an RX 5600 XT I got on eBay; I refuse to pay these stupid prices. I happily play on my 29-inch LG 60 Hz 2560x1080 monitor and have no interest in upgrading it, since it does what I need it to do. I tried a 4K monitor and had to set the scaling to 200% to read anything, which negated its advantages.
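To make the "adjusted for inflation" claim concrete, here is a rough sketch that carries a 2016 midrange launch MSRP forward using approximate yearly US CPI figures. Both the card chosen (GTX 1060) and the CPI values are illustrative assumptions, not numbers from the comment.

```python
# Carry a historical launch MSRP forward through yearly inflation.
# CPI rates below are approximate annual US figures for 2017-2024,
# included only for illustration.

def adjust_for_inflation(price: float, yearly_rates: list[float]) -> float:
    """Compound a price through a sequence of yearly inflation rates."""
    for rate in yearly_rates:
        price *= 1.0 + rate
    return price

cpi_2017_to_2024 = [0.021, 0.024, 0.018, 0.012, 0.047, 0.080, 0.041, 0.029]
adjusted = adjust_for_inflation(249, cpi_2017_to_2024)  # GTX 1060 launched at $249 in 2016
print(f"$249 in 2016 is roughly ${adjusted:.0f} today")  # ~$325, near the $270-330 band above
```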
 
Yeah, well, if we start factoring the cost of electricity into the equation (which I do and most people don't), then gaming has been dead for like 10 years. There are NO 4K or 1440p cards on the market with at least 10GB of VRAM (because 8GB is already seemingly insufficient) that consume less than 150W. Hell, I just ran a benchmark on my own GPU (an RX 7800 XT I bought months ago), and at 1080p high in Cyberpunk 2077 the GPU package power is around 260W. Granted, it's undervolted to keep the temps down, but still...

Which is to say: if you want to engage in the enthusiast-class GPU market, or even slightly below that, while taking dollars per kilowatt-hour into account, just forget about it. That's a losing battle.
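For the running-cost side of the equation, a minimal sketch along these lines turns a wattage figure into dollars per year. The roughly 260W draw is the figure quoted above; the hours per day and electricity price are assumptions to replace with your own.

```python
# Yearly electricity cost for a GPU drawing `watts` while gaming.
# The 260 W draw follows the comment above; 3 hours/day and $0.15/kWh
# are assumptions, not stated figures.

def yearly_energy_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Kilowatt-hours consumed per year, multiplied by the electricity price."""
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

cost = yearly_energy_cost(260, 3, 0.15)
print(f"~${cost:.0f} per year in electricity")  # ~$43/year, roughly $170 over four years
```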
 
Yep, it is, but that's why I'm on my 5600 XT and playing older games. I don't need to play the new stuff anymore; besides, it all sucks.
 
I wonder what percentage of your readers have any interest in these cards over $500. I also wonder how many, like me, are gaming at 1080p, which doesn't get a look-in in your review. I'll be hanging on to my 3070 and AW2521HFA for a few more years, it seems. I'm used to playing in performance mode (Fortnite), and I don't think I'd see any advantage after spending $1K on a new GPU.
 
Zero interest in replacing my current 3070/5700X3D with anything, since I'm mostly playing older titles, and for the recent ones I lower the settings until I get at least 60fps.
Cards in the $400-500 range? Only if 4080 or 4090 performance reaches that price point.
 
Raster-only analysis is deliberately useless in 2025, as is ignoring upscaling.

TechSpot has that axe to grind, and how actual gamers play is safely ignored. Steam tells us that 4K is rare, actual gamers use upscaling, and ray tracing is slowly becoming mandatory.
 
Yeah, future-proofing is more important than the purchase price, because keeping a card twice as long is worth a 20% or even 40% premium.

I made the mistake of buying a 7900 XTX at release and rapidly found it obsolete in anything with ray tracing; it was a flagship card that struggled to run those games. I swapped in a 4080 Super and couldn't be happier. It turns out the 20% "savings" on the AMD card was completely short-sighted.
 
I "cheaped" out on a computer back in the days by going with people's comments that an i5 processor would be more than enough for gaming. It was the i5's from when the first generation came out. Forgot which GPU I had in it though.

The computer struggled even at the slightest multitasking, that being a YouTube video and a game at the same time. The computer would constantly freeze/stutter.

I replaced the computer within a year with a twice more expensive one with i7-processor and 1080TI card and the computer lasted me far longer.

For some things cheaping out is not a great idea, sure not everyone can afford it but then again some things can last ages.

Same with desk chairs, why make yourself suffer from back or even *** pain going for some cheap trash (including racer model chairs that bend your arms forward), rather than buying something that will last you 10+ years and is actually comfortable and good for your back.
 
Well, as it turns out, most gamers are of the opinion that ray tracing doesn't produce enough return for the cost of admission... and they are right. Rasterization may be considered "obsolete" because it's not the new hotness (it's, what, 30 years old at least), but that's beside the point. Two things can be true at the same time: ray tracing is the future AND the hardware is currently not worth the price of admission.

A GIGABYTE GeForce RTX 4070 Windforce OC V2 12GB costs $750 on Newegg discounted (it's normally $900), because Nvidia doesn't make a 10GB card and you need more than 8GB to run at better than 30 fps on Medium. It's not just "kind of expensive"; it's too expensive for the little benefit it brings to the table. The fact that some games now mandate ray tracing (games that would be served just as well by rasterized mesh shading) doesn't suddenly make it "worth it". Those games are forcing users' hands precisely because the cost of ray tracing is prohibitive and publishers know it. Nvidia knows it too, but it needs gamers to feel compelled to buy in. Classic chicken-and-egg scenario.

So the fact that the RX 7900 XTX has markedly worse ray-tracing performance alongside spectacular rasterization performance isn't considered a problem by most of those gamers. And if you need ray tracing only because the game won't even boot otherwise, that's not the win for Nvidia that you think it is.
 