The Best GPUs: Early 2023 Update

The true enemy is 4K resolution. Without 4K we wouldn't even need high-end graphics.
:)
Although I haven't tried 4K gaming, I tend to think it's useless. 2K is already perfectly fine for gaming. I dunno, maybe I'd change my mind if I tried it, but I doubt it. 2K at 165 Hz looks plenty fine to me as it is.
Without 4K, the extra horsepower could go towards improving graphics instead of resolution.
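To put rough numbers on that trade-off: 4K pushes exactly 2.25x the pixels of 1440p every frame, which is where the extra horsepower goes. A minimal sketch of the arithmetic (using the standard UHD and QHD pixel grids):

```cpp
#include <cstdio>

// Pixels per frame at the two resolutions being compared.
int main() {
    const long long qhd = 2560LL * 1440;  // "2K"/1440p: 3,686,400 px
    const long long uhd = 3840LL * 2160;  // 4K UHD:     8,294,400 px
    std::printf("1440p: %lld px\n4K:    %lld px\nratio: %.2fx\n",
                qhd, uhd, (double)uhd / qhd);
    return 0;
}
```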
 
GPUs are becoming more and more unrealistically priced. This super price hike is just part of the "elite" movement. I will NEVER spend more than $800 on a GPU, especially knowing that in 2 years that card will be considered obsolete. Nvidia will eventually kill their own sales by assuming that the majority of gamers and content creators will keep buying cards that are 10% better for 60% more. It's a pathetic strategy that stems from pure GREED. Unless people stop buying these highly overrated and overpriced pieces of Chinese metal, it will never end. Look at the price of other hardware that hasn't skyrocketed; that alone tells you it's all bullcrap!
 
This is probably the case, and it's why I never buy the most expensive computer parts from Amazon, apart from SSDs and small components. In fact, I never buy GPUs brand new any more, and the same goes for CPUs, BTW.
I do buy new parts, but I don't buy them online anymore. Now, I'm fortunate in that I have three Canada Computers and two Memory Express locations within a 20-minute drive of home. That way I know I'm not buying from a scalper or someone who's going to try to sell me a fake part.
 
Currently, the best GPU is the one you already have in your PC.
When I bought my PC, I got a power supply big enough to add a second GPU: 1250 watts, since at the time (11 or so years ago, maybe a few more now) the ~$1200+ graphics card in it was basically the best you could get without going into the Quadros or whatever they were. I never got the second card, and 5-6 years later Microsoft released a new version of DirectX with probably a dozen lines of code that checked an arbitrary feature level (read: age) of your card and simply wouldn't run if it didn't pass, despite the card running the previous latest triple-A games with all settings maxed. I doubt the new games truly required hardware beyond a card that could run last month's top games on max settings, but they were supposedly so much more awesome that they wouldn't run on the same card even at minimal settings. Money hoarders. (Like the story of Bill Gates selling DOS to IBM for $1,000,000, then going back to his college and buying it for $10,000 from the fellow student (I'd say friend, but that'd be a blatant lie) who actually wrote it. If only the guy's parents had had a bit of money (not even as much as Bill's parents apparently had) and he'd said no. Bill had already sold it to IBM, contracts signed and all, so he'd have been in a tad of trouble; but $10k back then especially was a lot to the guy, and obviously Bill didn't open with "I sold it for a million to IBM already and signed contracts, so the cash will be gushing in for years; do you think I could buy it from you for $10k?")
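For what it's worth, that kind of compatibility gate really is only a few lines. A minimal sketch of how a game might refuse to launch below a minimum Direct3D feature level (shown with the D3D12 API for illustration; the D3D11-era equivalent via D3D11CreateDevice works the same way):

```cpp
#include <d3d12.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

int main() {
    // Probe the default adapter for the minimum feature level the game
    // demands. Passing nullptr for the output device means "just test,
    // don't actually create a device".
    HRESULT hr = D3D12CreateDevice(
        nullptr,                  // default adapter
        D3D_FEATURE_LEVEL_12_0,   // the arbitrary gate
        __uuidof(ID3D12Device),
        nullptr);                 // probe only

    if (FAILED(hr)) {
        std::puts("Required feature level not supported; refusing to run.");
        return 1;
    }
    std::puts("Feature level check passed.");
    return 0;
}
```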
So I bought a new $1200 GPU when they pulled that scam on tons of people. This time, though, that wasn't anywhere close to the most expensive gaming card you could get.
And a $1,600 RTX 4090 "flagship"? Even the RTX 6000 Ada Generation with 48GB of GDDR6 ECC is $6,800 (but limit 5 per customer).
Gotta get a new workstation soon, and I was hoping for an H100 80GB, but they're "estimating" $33k. They go in one of those slick custom data units that are *way* faster than PCIe 4.0: bi-directional, and seemingly anywhere-to-anywhere, concurrently (CPU to a GPU, RAM to drive storage, and two other GPUs to each other, all at once), with each transfer limited only by the slower of its two endpoints. Essentially it barely touches the PCIe bus at all. But 8 of those H100s... I don't know how much the custom cases are, but even at only $30k (unlikely), that's still $300k. I'll probably see if I can get a case with the screaming-fast data transfer and *maybe*, if I'm lucky, one with only two A100 80GB cards. Tom's says an Nvidia A100 80GB can be purchased for $13,224, whereas an A100 40GB can cost as much as $27,113 at CDW; it depends on the reseller. And even those probably won't be available from Nvidia for some months(?)
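That interconnect is presumably NVLink on the SXM-style boards. A back-of-envelope comparison, using Nvidia's published figures (note the NVLink number is aggregate across all links, so it isn't a strict apples-to-apples match for one PCIe direction):

```cpp
#include <cstdio>

// Per-direction PCIe throughput ~= transfer rate (GT/s) x lanes x
// encoding efficiency / 8 bits per byte. PCIe 4.0 encodes 128b/130b.
int main() {
    double pcie4_x16 = 16.0 * 16 * (128.0 / 130.0) / 8.0;  // ~31.5 GB/s
    double nvlinkH100 = 900.0;  // H100 SXM NVLink aggregate, GB/s
    std::printf("PCIe 4.0 x16: %.1f GB/s per direction\n", pcie4_x16);
    std::printf("H100 NVLink:  %.0f GB/s aggregate (~%.0fx)\n",
                nvlinkH100, nvlinkH100 / pcie4_x16);
    return 0;
}
```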

My ~6-year-old $1200 (CAD) card is far, far from even a half-decent GPU these days; it's second from the bottom of the CUDA-compatible list. (Prices above are mostly USD, though I should get the 25% discount, at least.)
 
I finally purchased the Gigabyte "Eagle" 6700 XT; it should be delivered today. I'm on 2K resolution with a Corsair RM650 PSU on AM5 (7600/650M). Over the years I've used both ATI and Nvidia GPUs (ATI, yes, back in 2006 lol). Never had issues with either. I'll be upgrading from a 1650 Super. The only issue I faced with Nvidia is that if you have a DAC connected over USB you might get serious coil-whine-type noise when the GPU is under load (gaming, mostly). It's a known Nvidia issue. I hope I'll be rid of it now that I'm moving back to AMD. I was able to work around it by using TOSLINK instead of USB, but that caps the sample rate at 96 kHz while the DAC is rated for 384 kHz. It also meant I had to use an old sound card just to get TOSLINK.
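For the curious, the gap shows up clearly in raw bitrate. A quick sketch of the arithmetic, assuming 24-bit stereo PCM (actual S/PDIF framing adds some overhead on top of this):

```cpp
#include <cstdio>

// Raw PCM bitrate = sample rate x bit depth x channels.
// S/PDIF over TOSLINK is commonly limited to 24-bit/96 kHz stereo,
// far below what a 384 kHz-capable USB DAC will accept.
int main() {
    const int bitDepth = 24, channels = 2;
    const int rates[] = {96000, 384000};
    for (int rate : rates) {
        double mbps = (double)rate * bitDepth * channels / 1e6;
        std::printf("%6d Hz -> %5.2f Mbit/s raw PCM\n", rate, mbps);
    }
    return 0;
}
```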
 
GPUs are becoming more and more unrealistically priced. This super price hike is just part of the "elite" movement. I will NEVER spend more than $800 on a GPU, especially knowing that in 2 years that card will be considered obsolete. Nvidia will eventually kill their own sales by assuming that the majority of gamers and content creators will keep buying cards that are 10% better for 60% more. It's a pathetic strategy that stems from pure GREED. Unless people stop buying these highly overrated and overpriced pieces of Chinese metal, it will never end. Look at the price of other hardware that hasn't skyrocketed; that alone tells you it's all bullcrap!
COVID and the mining craze could also be reasons for this recent hike. Hopefully things will settle down soon.
 
Although I haven't tried 4K gaming, I tend to think it's useless. 2K is already perfectly fine for gaming. I dunno, maybe I'd change my mind if I tried it, but I doubt it. 2K at 165 Hz looks plenty fine to me as it is.
Without 4K, the extra horsepower could go towards improving graphics instead of resolution.
That's exactly my reason for never upgrading to a 4K display. With 4K, GPU requirements go crazy high. 2K seems like the sweet spot (that's my poor-man gaming brain talking) hehe.
 
I finally purchased the Gigabyte "Eagle" 6700 XT; it should be delivered today. I'm on 2K resolution with a Corsair RM650 PSU on AM5 (7600/650M).
If I were to buy a Gigabyte, that's what I'd get as well. I never saw the sense in paying an extra $100 for a card that looks fancy and/or has (a maximum of) 5% better performance than the standard model. People who buy AORUS, ROG Strix, Nitro or Merc cards are nuts. The only time I ever had a top-end model was my Sapphire R9 Fury Nitro and that was because Newegg was selling it for literally half the price of the RX 580 and it was the only model they had.
Over the years I've used both ATI and Nvidia GPUs (ATI, yes, back in 2006 lol).
Nothing wrong with that. Hell, my first ATi card (EGA Wonder) was back in 1988! :laughing:
 
COVID and the mining craze could also be reasons for this recent hike. Hopefully things will settle down soon.
If only it were that simple. Corporate greed is why this recent hike occurred. The two players in the game are testing the waters to see just how stupid and driven by consumerism people today are. They want to know exactly how deep they can go when they bend you over before you scream in pain.

Some people can take it much deeper than others. I am not one of those people.
 
That's exactly my reason for never upgrading to a 4K display. With 4K, GPU requirements go crazy high. 2K seems like the sweet spot (that's my poor-man gaming brain talking) hehe.
I actually like using my 4K panel, but it's limited to 60Hz, and that's where I think the line should be drawn. Visually, 2160p and 1440p aren't that different, and if you get used to one, the other remains perfectly enjoyable. OTOH, if you get used to the lower input lag at 120Hz, going down to 60Hz can be a major killjoy (or so I've been told). This is why I do NOT want a display that's faster than 60Hz. A person is much happier when they're perfectly satisfied with less. I'm sure 120Hz gaming is better, but I can't crave what I don't know. I'm perfectly happy at 60Hz, and I equate 120Hz gaming to an expensive addiction. ;)
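The responsiveness gap is simple arithmetic: the time between frames is the reciprocal of the refresh rate. A quick sketch:

```cpp
#include <cstdio>

// Frame time in ms = 1000 / refresh rate in Hz. Going from 60 Hz to
// 120 Hz halves the wait between frames (and the minimum input lag).
int main() {
    const int rates[] = {60, 120, 165};
    for (int hz : rates) {
        std::printf("%3d Hz -> %5.2f ms per frame\n", hz, 1000.0 / hz);
    }
    return 0;
}
// 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 165 Hz -> 6.06 ms
```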
 
I actually like using my 4K panel, but it's limited to 60Hz, and that's where I think the line should be drawn. Visually, 2160p and 1440p aren't that different, and if you get used to one, the other remains perfectly enjoyable. OTOH, if you get used to the lower input lag at 120Hz, going down to 60Hz can be a major killjoy (or so I've been told). This is why I do NOT want a display that's faster than 60Hz. A person is much happier when they're perfectly satisfied with less. I'm sure 120Hz gaming is better, but I can't crave what I don't know. I'm perfectly happy at 60Hz, and I equate 120Hz gaming to an expensive addiction. ;)
I did the same lol. My 2K display is 60 Hz :)
 
That's easy: the RX 7900 XTX. Either one of those cards is super-overkill, so I'll just take the cheaper one.

A reference 7900 XTX that is loud and hot, plus the risk of a faulty vapour chamber causing insane hotspot temps? No thanks.

I'd take a 4080 over a 7900 XTX any day when prices are close. The 7900 XTX needs to be a custom card and at least 200 dollars cheaper to make me consider it over a 4080.

The 4080 has way better features, uses 50-60 watts less, and has much better ray tracing perf.
 
A reference 7900 XTX that is loud and hot, plus the risk of a faulty vapour chamber causing insane hotspot temps? No thanks.

I'd take a 4080 over a 7900 XTX any day when prices are close. The 7900 XTX needs to be a custom card and at least 200 dollars cheaper to make me consider it over a 4080.

The 4080 has way better features, uses 50-60 watts less, and has much better ray tracing perf.
The vapor chamber issue was found to be isolated to a limited number of AMD reference model cards, and if the issue arises with one that you have, AMD will replace it.
 
The vapor chamber issue was found to be isolated to a limited number of AMD reference model cards, and if the issue arises with one that you have, AMD will replace it.
Sounds like something you would love to deal with after seeing insane hotspot temps on your brand-new card.

Why buy a reference card when it's worse than a custom? It's like 40 dB under load, plus there's still the risk of a faulty card.

With AMD's inferior feature set, the price should be way lower. Nvidia put the 4080 at $1,199 because they can. Nvidia has said they can lower prices at any time.

The 4080 will be at $899 or $999 before year's end for sure (4080 Ti incoming at $1,199-1,299).

This will force AMD to put the 7900 XTX at $799 or below. AMD can't ask the same price as Nvidia when their feature set is lacking. Only a true AMD fanboy can claim AMD is on par with Nvidia when it comes to features, because they are not even close.

DLSS, DLAA, DLDSR, NVENC, ShadowPlay, RTX IO, just to name a few; AMD doesn't have a single superior feature at this point. They don't even have an answer to DLSS 3, DLAA, and DLDSR, which are amazing features. DLDSR is highly underrated and can transform older games, and it needs zero game support to work: it renders at a higher resolution and downscales with the help of the Tensor cores, so combined with DLSS you get the fps boost on top of way better graphics. This is great for 1080p and 1440p gamers. Downscaling from 4K acts like a crazy-good AA solution.

DLSS can easily replace AA in many games, delivering fewer jaggies and higher perf.

AMD's GPU business is doing worse than ever right now. AMD seems to not even bother competing (selling off last-gen cards at a discount instead of releasing the 7800 series, for example).

You can act like AMD GPUs are doing well all you want, but they are not.
The Steam HW Survey puts Nvidia at 85% dGPU market share right now. It was 80% about 1 year ago and 75% 2 years ago. AMD is doing worse and worse. They need GOOD VALUE CARDS ASAP to stay competitive. I am talking about PURE RASTER PERF here, because AMD is years behind on RT and features.

Intel is coming fast for the low-to-mid-end GPU segment. When Intel moves GPU manufacturing to its OWN NODES (Intel 4 and 20A, eventually 18A), Intel will be able to undercut AMD with ease (TSMC is milking AMD); hell, maybe even Nvidia's low-to-mid end will have a hard time... However, Nvidia pretty much owns the high-end segment for now. Nothing comes close to the 4090 in 4K+ gaming.
 
Sounds like something you would love to deal with after seeing insane hotspot temps on your brand-new card.

Why buy a reference card when it's worse than a custom? It's like 40 dB under load, plus there's still the risk of a faulty card.

With AMD's inferior feature set, the price should be way lower. Nvidia put the 4080 at $1,199 because they can. Nvidia has said they can lower prices at any time.

The 4080 will be at $899 or $999 before year's end for sure (4080 Ti incoming at $1,199-1,299).

This will force AMD to put the 7900 XTX at $799 or below. AMD can't ask the same price as Nvidia when their feature set is lacking. Only a true AMD fanboy can claim AMD is on par with Nvidia when it comes to features, because they are not even close.

DLSS, DLAA, DLDSR, NVENC, ShadowPlay, RTX IO, just to name a few; AMD doesn't have a single superior feature at this point. They don't even have an answer to DLSS 3, DLAA, and DLDSR, which are amazing features. DLDSR is highly underrated and can transform older games, and it needs zero game support to work: it renders at a higher resolution and downscales with the help of the Tensor cores, so combined with DLSS you get the fps boost on top of way better graphics. This is great for 1080p and 1440p gamers. Downscaling from 4K acts like a crazy-good AA solution.

DLSS can easily replace AA in many games, delivering fewer jaggies and higher perf.

AMD's GPU business is doing worse than ever right now. AMD seems to not even bother competing (selling off last-gen cards at a discount instead of releasing the 7800 series, for example).

You can act like AMD GPUs are doing well all you want, but they are not.
The Steam HW Survey puts Nvidia at 85% dGPU market share right now. It was 80% about 1 year ago and 75% 2 years ago. AMD is doing worse and worse. They need GOOD VALUE CARDS ASAP to stay competitive. I am talking about PURE RASTER PERF here, because AMD is years behind on RT and features.

Intel is coming fast for the low-to-mid-end GPU segment. When Intel moves GPU manufacturing to its OWN NODES (Intel 4 and 20A, eventually 18A), Intel will be able to undercut AMD with ease (TSMC is milking AMD); hell, maybe even Nvidia's low-to-mid end will have a hard time... However, Nvidia pretty much owns the high-end segment for now. Nothing comes close to the 4090 in 4K+ gaming.
I wasn't being combative or saying Nvidia sucks. You clearly have a fetish for them, and it shows in all your posts. That's fine; you like them and seem to want to constantly defend them and bash AMD for whatever the underlying reason is. I don't care.

I was simply stating the facts about the vapor chamber issue: it was found in a limited number of reference models, and AMD will replace an affected card for you. I was just offering peace of mind, noting that AMD has identified the issue and offered a solution for anyone who might be experiencing it, nothing more.

I've had a few issues with GPUs in the past; the most recent was when my 3080 Ti died on me after just about 2 months of owning it. No card or company is perfect; sometimes they have issues. Driver releases on both sides have even had awful episodes where they've killed cards.

Get over it. Move on. Your desire to pound into people that Nvidia is superior, for whatever reasons you list, is getting old.
 
I wasn't being combative or saying Nvidia sucks. You clearly have a fetish for them, and it shows in all your posts. That's fine; you like them and seem to want to constantly defend them and bash AMD for whatever the underlying reason is. I don't care.

I was simply stating the facts about the vapor chamber issue: it was found in a limited number of reference models, and AMD will replace an affected card for you. I was just offering peace of mind, noting that AMD has identified the issue and offered a solution for anyone who might be experiencing it, nothing more.

I've had a few issues with GPUs in the past; the most recent was when my 3080 Ti died on me after just about 2 months of owning it. No card or company is perfect; sometimes they have issues. Driver releases on both sides have even had awful episodes where they've killed cards.

Get over it. Move on. Your desire to pound into people that Nvidia is superior, for whatever reasons you list, is getting old.
I'm considering AMD, but I'm a die-hard Nvidia fanboy? Makes sense :laughing:

I am simply saying that AMD needs to lower prices, because 9 out of 10 buyers won't even consider AMD unless they offer superior value, and I am not just talking about 5-10% more for the same money.

AMD lacks features and overall performance in less popular titles, especially early-access titles. This is why some people WON'T EVEN CONSIDER an AMD GPU. This is pure fact. AMD spends its limited time (a way smaller software dev team than Nvidia's) optimizing for GPU BENCHMARKS and POPULAR AAA TITLES that get benchmarked a lot. Nvidia pretty much always performs better in less popular titles and early-access games for this exact reason: the developer KNOWS 85% of players will be using an Nvidia GPU. So why spend time optimizing for AMD and Intel GPUs when they share the remaining 15%? (Source: Steam HW Survey.)

AMD needs way better performance per dollar because of their lacking features and subpar experience in a lot of games. They can't demand Nvidia prices, because they are not Nvidia.

The 7900 XT and 7900 XTX are getting cheaper and cheaper, because AMD knows this too.

I would not be surprised if the 7900 XTX hits 700-800 dollars by Q4 this year, and that's when I will pick one up. If they insist on keeping it at the 900-1,000 dollar mark, I will go for a 4080 Ti at 1,200-1,300 dollars instead without thinking twice.
 
What about Intel Arc graphics cards?
As the focus of the article is “Given a specific budget, which graphics card should you buy?”, none of the Arc series can really be recommended, as there are better choices at every budget. However, if you want a card for a very specific purpose, the A750 is ideal for encoding video streams.
 