Cost Per Frame: Best Value Graphics Cards in Early 2023

Cost per frame clearly is not a good way to show which card to buy. The RX 6500 XT and the RTX 3050 are not the sweet spot for cheap-ish gaming, regardless of the numbers.
I was about to post something just like this. There needs to be a "quality" factor. If a GPU can't do 60 fps reliably, why would I buy it?
 
Cost per frame clearly is not a good way to show which card to buy. The RX 6500 XT and the RTX 3050 are not the sweet spot for cheap-ish gaming, regardless of the numbers.
I was about to post something just like this. There needs to be a "quality" factor. If a GPU can't do 60 fps reliably, why would I buy it?

Ok, so you have two metrics on the graphs...

1) Cost per frame (lower is better)
2) Average frame rate - that's your quality factor.

Pretty much self-explanatory.
 
Wait, hold up a minute, you expect all reviewers to go back and update their reviews every time the price changes in your region?
No, under normal circumstances it's not needed, so I don't expect reviewers to go back every time they do a review. Are you new to this or something? I've been on tech forums since 2008 and I've never seen the market turned on its head like this. I wouldn't expect reviewers to do such a thing under normal circumstances because normal circumstances have never warranted it. I'm only pointing out that it would be a good idea for the RX 6000 / RTX 30 generation of cards, and that's because of the silicon shortage's perfect storm: the combination of COVID, the second great mining craze, the release of new CPUs from both AMD and Intel, the release of new GPUs from both AMD and Nvidia, and the release of both the new PlayStations and Xboxes, ALL AT THE SAME TIME.

This situation meant that the prices of the cards in question didn't settle into stable price brackets like they always had before. I don't know how long you've been paying attention to this, but during this time, for the first time ever, a video card's MSRP became completely meaningless! So meaningless, in fact, that Nvidia released the RTX 3080 12GB without an MSRP! It was the first video card ever released without an MSRP, and it was done for two reasons. The first was that since nobody could get a card at MSRP anyway, it was pointless. The other was Nvidia being arrogant, because they knew that ANY available card at that performance level would sell out instantly.

Since these reviews were based on MSRPs that turned out to be invalid, the reviews themselves are invalid. Now don't get me wrong, I don't blame Steve for the fact that MSRPs weren't valid, because in his entire career doing this, MSRPs were always trustworthy and you could expect that your initial day-one review would stand the test of time. I don't think that the current RX 7000 generation of cards will need to be revisited because those MSRPs are actually turning out to be fairly reliable. The GeForce MSRPs, on the other hand, not so much.

These days, the idea that the RTX 3050 is a 33% better buy than the RX 6600 XT would be considered patently insane, but back when the review was made, perhaps it really was that way. Now, if Techspot were to just go back and make some (very) minor adjustments to the articles, history would look far more fondly on Techspot than on anyone who didn't do the same. Things like this show a dedication that people respect and an admirable mental flexibility. It wouldn't even have to be Steve or Tim who does it, because anyone with even a passing understanding of statistics could fix the numbers just by adding a paragraph at the end of the review explaining that prices have settled far away from their MSRPs and thus the hierarchy has changed.
What are you smoking?
If you're blind to the fact that this is a unique situation as opposed to a historically typical one, it's YOU who's smoking something, not me.
 
Ok, so you have two metrics on the graphs...

1) Cost per frame (lower is better)
2) Average frame rate - that's your quality factor.

Pretty much self-explanatory.
True, the metrics are on the graph, but the graph doesn't reflect the valuation of both metrics. That's what I'm asking for: weigh $/frame against average frame rate. If the average frame rate is below a certain number (e.g. 60 fps), then the card doesn't belong on the chart at all.
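For what it's worth, the filter I'm describing is trivial to compute. Here's a minimal sketch in Python; the card names, prices, and frame rates are made-up placeholders for illustration, not real benchmark data:

```python
# Sketch of the proposed chart filter: apply the quality cutoff first,
# then rank the surviving cards by cost per frame (lower is better).
# All names, prices, and fps figures are placeholders, not benchmarks.

MIN_FPS = 60  # the "quality factor" cutoff

cards = [
    # (name, street price in USD, average fps at the tested settings)
    ("Card A", 200, 48),  # fails the cutoff, so it never reaches the chart
    ("Card B", 280, 72),
    ("Card C", 400, 95),
]

qualified = [c for c in cards if c[2] >= MIN_FPS]

for name, price, fps in sorted(qualified, key=lambda c: c[1] / c[2]):
    print(f"{name}: ${price / fps:.2f} per frame at {fps} fps average")
```

The point is that the cutoff is applied before ranking, so a sub-60 fps card stays off the chart entirely instead of sitting at the top of the value list.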
 
No, under normal circumstances it's not needed, so I don't expect reviewers to go back every time they do a review. Are you new to this or something? I've been on tech forums since 2008 and I've never seen the market turned on its head like this. I wouldn't expect reviewers to do such a thing under normal circumstances because normal circumstances have never warranted it. Blah Blah Blah...
So you do expect reviewers to update their reviews every time the price changes in your region, because it's not a normal circumstance.
I'm just going to leave it here; I'm sure other commenters will read this and can make up their own minds about the ridiculousness of your expectations.

What's funny is that Techspot does regularly put out GPU pricing updates and articles that put GPUs head-to-head over time to show the difference in value. A lot of other reviewers don't do that.

Also, I don't mean to be rude, but if I read a review for a product I want, let's use the 6950 XT as an example, and all seems great in the review and the MSRP seems fair, I go off and read a few more reviews, and if it's generally what I'm after, I then go shopping for it. If the price is way too high or I can't find it for MSRP, I just don't buy it.

You make it sound like you read just Techspot's review and buy products regardless of price just because a review told you it's a good product, and you now seem angry that the original review didn't warn you the prices were so high.
 
So, say I am Nvidia or AMD or any other company that sells hardware...
Question 1: How many desktop PCs are there on the planet?
Answer 1: Probably millions, or a billion.
Question 2: How many GPUs are sold every year?
Answer 2: "Overall, there were 6.9 million discrete GPUs shipped in Q3 of 2022. For the year-over-year period, 14 million GPUs were shipped, a 42% decline. That includes just GPUs from the big names: AMD, Intel, and Nvidia. The GPU market hit similar doldrums right before the pandemic, shipping just 7.4 million units." (30 Dec 2022)
That's just the first answer on Google!!
So, very quickly, you have all your answers on why GPUs are so expensive!! Demand is much higher than supply, so...
Nvidia and AMD or anyone else can sell anything they want at whatever price they want, and we as gamers can't do sh-it to change that!! Buy once and keep it for 10 years, that's the only solution!
 
I'm happy with my 3060 Ti that I paid £530 for in August 2021. At the time I wanted a 6700 XT, but they were costing £700 and most 3060 Tis were selling for £650.
18 months later, I can say I'm glad I got the 3060 Ti, in terms of its feature set and its superior encoder, which has been extremely useful for playing wireless VR with Virtual Desktop on my Quest 2.
 
I'm still salty that I got the RX 6600 for $300. I initially thought I'd just use it for a stopgap 1080p120 setup, but I wound up getting a 1440p165 monitor, so it became even more worthless for me. This stopgap is beginning to look more and more like a long-term setup, with neither GPU company willing to release a strong mid-range card at mid-range prices, and Intel unable to get their drivers in order.
 
So, say I am Nvidia or AMD or any other company that sells hardware...
Question 1: How many desktop PCs are there on the planet?
Answer 1: Probably millions, or a billion.
Question 2: How many GPUs are sold every year?
Answer 2: "Overall, there were 6.9 million discrete GPUs shipped in Q3 of 2022. For the year-over-year period, 14 million GPUs were shipped, a 42% decline. That includes just GPUs from the big names: AMD, Intel, and Nvidia. The GPU market hit similar doldrums right before the pandemic, shipping just 7.4 million units." (30 Dec 2022)
That's just the first answer on Google!!
So, very quickly, you have all your answers on why GPUs are so expensive!! Demand is much higher than supply, so...
Nvidia and AMD or anyone else can sell anything they want at whatever price they want, and we as gamers can't do sh-it to change that!! Buy once and keep it for 10 years, that's the only solution!
That's kinda why I'm glad that the RX 6800 XT has 16GB of VRAM. The Leafs might win the Cup before I need to upgrade my video card. Hell, I could still use my R9 Fury today if it had more than 4GB of VRAM, because the rasterisation engine is potent enough that the lack of VRAM becomes the limiting factor.

Just wait and see what happens to all those people who bought the RTX 3080. Having a card that you know has a potent enough GPU but not enough VRAM is one of the most gear-grinding experiences you can ever have. The RTX 3080 with only 10GB of VRAM is guaranteed to do that to everyone who bought one. That's only 2GB more than my RX 5700 XT, a previous-generation level-7 gaming card! I expect that the RTX 3080 will last about half as long, because long before 16GB isn't enough, 10GB will be borderline unusable, and then they'll have to give Jensen even more of their money. That's called "hook, line and sinker!" :laughing:
 
As mentioned above, right now an (overclocked) 6950 XT for 699 USD is the sweet spot (raw RT performance aside), even for 4K.
Yep. It just goes to show you how bad things have become when a card that was a refresh of the RX 6900 XT, a card that was a terrible value even at MSRP, is now the sweet spot.
 
I'm still salty that I got the RX 6600 for $300. I initially thought I'd just use it for a stopgap 1080p120 setup, but I wound up getting a 1440p165 monitor, so it became even more worthless for me. This stopgap is beginning to look more and more like a long-term setup, with neither GPU company willing to release a strong mid-range card at mid-range prices, and Intel unable to get their drivers in order.
Well, why on Earth would you buy a 1440p165Hz monitor when you knew that you had an RX 6600? That problem is pretty much self-inflicted.

On the positive side, you got one of (if not the) best value cards of that generation so you sure could've done worse. The other thing is, despite your monitor being 1440p, depending on the size, 1080p might look almost exactly the same. I've gamed at 1080p on a 55" 4K panel and it looked fine. Hell, I played ACE COMBAT 7 and it looked glorious. I was shocked to find out later that, as a PS4 port, AC7's max resolution was only 720p.

In any case, don't worry, you'll be ok. You'll see that both resolutions will look just fine on your screen.
 
Well, why on Earth would you buy a 1440p165Hz monitor when you knew that you had an RX 6600? That problem is pretty much self-inflicted.

On the positive side, you got one of (if not the) best value cards of that generation so you sure could've done worse. The other thing is, despite your monitor being 1440p, depending on the size, 1080p might look almost exactly the same. I've gamed at 1080p on a 55" 4K panel and it looked fine. Hell, I played ACE COMBAT 7 and it looked glorious. I was shocked to find out later that, as a PS4 port, AC7's max resolution was only 720p.

In any case, don't worry, you'll be ok. You'll see that both resolutions will look just fine on your screen.
Found a great sale, so I pulled the trigger. I'm not going to base my lifetime monitor decisions on an unrelated GPU purchase months prior. And I don't know anybody who uses only one GPU for the lifetime of a monitor.

If GPU pricing never truly improves, then I simply won't be playing new games on my PC going forward, which isn't ideal, but I'll live. If it does eventually return to a semblance of sanity, I can just get a new GPU that fits my needs better. The current GPU will go into my backup build, or I'll otherwise keep it as a backup, because you just never know when a tech component can fail or have issues.
 
Found a great sale, so I pulled the trigger. I'm not going to base my lifetime monitor decisions on an unrelated GPU purchase months prior. And I don't know anybody who uses only one GPU for the lifetime of a monitor.
There's nothing wrong with that, but it doesn't change the fact that on most monitors under 40", you can't really tell the difference between 1080p and 1440p.
If GPU pricing never truly improves, then I simply won't be playing new games on my PC going forward, which isn't ideal, but I'll live. If it does eventually return to a semblance of sanity, I can just get a new GPU that fits my needs better. The current GPU will go into my backup build, or I'll otherwise keep it as a backup, because you just never know when a tech component can fail or have issues.
Don't worry, it will get better. It has to, or the market will disappear, and that would badly hurt Nvidia (since that's all they do).
 