bexwhitt
Cost per frame clearly is not a good metric for deciding which card to buy. The RX 6500 XT and the RTX 3050 are not the sweet spot for cheap-ish gaming, regardless of the numbers.
I was about to post something just like this. There needs to be a "quality" factor. If a GPU can't do 60 fps reliably, why would I buy it?
"Wait, hold up a minute, you expect all reviewers to go back and update their reviews every time the price changes in your region?"

No, under normal circumstances it's not needed, so I don't expect reviewers to go back every time they do a review. Are you new to this or something? I've been on tech forums since 2008 and I've never seen the market turned on its head like this. I wouldn't expect reviewers to do such a thing under normal circumstances because normal circumstances have never warranted it. I'm only pointing out that it would be a good idea for the RX 6000 / RTX 30 generation of cards, and that's because of the silicon shortage's perfect storm: COVID, the second great mining craze, the release of new CPUs from both AMD and Intel, the release of new GPUs from both AMD and nVidia, and the release of the new PlayStations and Xboxes ALL AT THE SAME TIME.
"What are you smoking?"

If you're blind to the fact that this is a unique situation as opposed to historically typical, it's YOU that is smoking something, not me.
"Ok, so you have two metrics on the graphs...
1) Cost per frame (lower is better)
2) Average frame rate > that's your quality factor.
Pretty much self-explanatory."

True, the metrics are on the graph, but the graph doesn't weigh the two against each other. That's what I'm asking for: take $/frame against average frame rate. If the average frame rate is below a certain number (e.g. 60 fps), the card doesn't belong on the chart at all.
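The filter-then-rank idea being argued here can be sketched in a few lines of Python. The card names are from the thread, but the prices and fps figures below are made-up placeholders, not benchmark results:

```python
# Hypothetical GPU data: (name, price in USD, average fps).
# Figures are illustrative only, not real benchmarks.
gpus = [
    ("RX 6500 XT", 170, 45),
    ("RTX 3050", 280, 55),
    ("RX 6600", 230, 75),
    ("RX 6950 XT", 699, 140),
]

MIN_FPS = 60  # quality cutoff: cards below this are excluded outright

# Keep only cards that clear the fps floor, then rank by cost per frame.
qualified = [(name, price / fps) for name, price, fps in gpus if fps >= MIN_FPS]
qualified.sort(key=lambda entry: entry[1])

for name, cost_per_frame in qualified:
    print(f"{name}: ${cost_per_frame:.2f} per frame")
```

With the cutoff applied first, the two sub-60-fps cards never appear on the chart at all, no matter how cheap their frames are.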
"No, under normal circumstances, it's not needed so I don't expect reviewers to go back every time they do a review. Are you new to this or something? I've been on tech forums since 2008 and I've never seen the market turned on its head like this. I wouldn't expect reviewers to do such a thing under normal circumstances because normal circumstances have never warranted it. Blah Blah Blah..."

So you do expect reviewers to update their reviews every time the price changes in your region, because it's not a normal circumstance.
That's kinda why I'm glad that the RX 6800 XT has 16GB of VRAM. The Leafs might win the cup before I need to upgrade my video card. Hell, I could still use my R9 Fury today if it had more than 4GB of VRAM, because the rasterisation engine is potent enough that the lack of VRAM becomes the limiting factor.

So I am Nvidia or AMD or any other company that sells hardware...
Question 1: How many desktop PCs are on the planet?
Answer 1: Probably millions, or billions.
Question 2: How many GPUs are sold every year?
Answer 2: Overall, there were 6.9 million discrete GPUs shipped in Q3 of 2022. For the year-over-year period, 14 million GPUs were shipped, a 42% decline. That includes just GPUs from the big names: AMD, Intel, and Nvidia. The GPU market hit similar doldrums right before the pandemic, shipping just 7.4 million units. (30 Dec 2022)
That's just the first answer on Google!
So, very quickly, you have your answer as to why GPUs are so expensive: demand far exceeds supply, so...
Nvidia and AMD or any other company can sell anything they want at whatever price they want, and we as gamers can't do sh-it to change that! Buy once for 10 years, that's the only solution!
"As mentioned above, right now an (OC'ed) 6950 XT for 699 USD is the sweet spot (w/o raw RT performance) - even for 4K."

Yep. It just goes to show you how bad things have become when a card that was a refresh of the RX 6900 XT, a card that was a terrible value even at MSRP, is now the sweet spot.
"I'm still salty I got the RX 6600 for $300. I initially thought I'd just use it for a stopgap 1080p120 setup, but I wound up getting a 1440p165 monitor so it became even more worthless for me. This stopgap is beginning to look more and more like a long-term setup, with neither GPU company willing to release a strong mid-range card at mid-range prices, and Intel unable to get their drivers in order."

Well, why on Earth would you buy a 1440p 165Hz monitor when you knew that you had an RX 6600? That problem is pretty much self-inflicted.
"Well, why on Earth would you buy a 1440p165Hz monitor when you knew that you had an RX 6600? That problem is pretty much self-inflicted."

Found a great sale, so I pulled the trigger. I'm not going to base a long-term monitor decision on an unrelated GPU purchase months prior. And I don't know anybody who uses only one GPU per monitor's lifetime.
On the positive side, you got one of (if not the) best value cards of that generation so you sure could've done worse. The other thing is, despite your monitor being 1440p, depending on the size, 1080p might look almost exactly the same. I've gamed at 1080p on a 55" 4K panel and it looked fine. Hell, I played ACE COMBAT 7 and it looked glorious. I was shocked to find out later that, as a PS4 port, AC7's max resolution was only 720p.
In any case, don't worry, you'll be ok. You'll see that both resolutions will look just fine on your screen.
"Found a great sale, so I pulled the trigger. I'm not going to base my lifetime monitor decisions on an unrelated GPU purchase months prior. And I don't know anybody that only uses one GPU lifetime per monitor."

There's nothing wrong with that, but it doesn't change the fact that on most monitors under 40", you can't really tell the difference between 1080p and 1440p.
"If GPU pricing never truly improves then I simply won't be playing new games on my PC going forward, which isn't ideal but I'll live. If it does eventually return to a semblance of sanity, I can just get a new GPU that fits my needs better. The current GPU will go into my backup build, or I will otherwise keep it as a backup, because you just never know when a tech component can fail/have issues."

Don't worry, it will get better. It has to, or the market will disappear, and that would badly hurt nVidia (since GPUs are all they do).