CPU and GPU Availability and Pricing Update: March 2021

Hell no, and there's nothing that could ever get me to sign up. They still fight "wars" like it's the 19th century instead of just bombing the piss out of anyone that screws with us until they can't screw with anyone anymore and walking away. Troops on the ground should NOT be necessary in the 21st century. The only reason they still do it is so certain people can make billions on rebuilding while soldiers get used as cops and die every day for no reason.
Fair enough. Then in ALL fairness, you are free to do with your vote what you want and pick isolationist leaders.

Now, for my promised answer. The US learned a hard lesson from World War II, and from Pearl Harbor specifically: forward presence means that when blood is spilled, it's in somebody else's backyard, not yours. Keeping the wars on foreign shores means fewer of them here.

Is that reasonable?
 
No, it is not.
And here in Europe it costs more than 450€.
For an 8 core it is ridiculous. I know that criticizing AMD here is not popular, but I really hope they take a hit in market share for being greedy.

And just to clarify again: I’m writing from a 5800X.
Oh, please. Until Ryzen launched, a high-performance 8-core CPU was priced at double that. 450€ is high under the circumstances, but it's far from "ridiculous".
 
The whole company "Nvidia" is a joke
Limiting MY graphics I pay for with my HARD earned money, NO THANK YOU, AMD all the way, NVIDIA can go suck BIG FOOTS left toe
I hear Sasquatch likes foot play :p

Honestly though, I don't care AMD or Nvidia, just gimme the features I want and gimme them fast. Right now, Nvidia is scratching that itch, but 2 years from now I might have an RDNA4 GPU in my machine. It just depends.

Bias for or against primary hardware designers like AMD, Intel or Nvidia only costs you in the long run. Whether you acknowledge that cost or justify it in your own mind with whatever trending reason supports your bias, it doesn't change the reality, which is that you cannot go wrong with ANY of those 3 brands' highest-end or even mid-range goodies.

NOBODY can claim brand X CPUs can game fine while brand Y CPUs cannot; that just isn't a factor at all. Not even a NOTICEABLE difference can be found outside of deep benchmarking. The same really goes for GPUs, though since feature sets vary between GPU brands more than CPU brands in noticeable ways, one brand or the other might have an advantage to be exploited to my benefit.

Some people have really strong biases; I used to be one of you. I used to be a PURE AMD CPU fanatic, and ANYTHING but Nvidia. But then the century turned. I grew up, and I make better tech choices today that last me for years. AMD is still a part of that equation, but no longer the ONLY part of it.
 
It's not just a matter of features. nVidia has bloated their drivers to the extreme over the years, and there's just too much convoluted junk in their UIs. Problem is AMD isn't doing much better in that regard. My main issue with nVidia though is the same reason I dislike Intel. You can google it and see all the anti-competitive, underhanded **** both companies did to keep AMD down over the years. I've always built with AMD CPUs, and probably always will. I don't even like Intel's naming conventions, if you can even call that numbered garbage naming.
 
Parts of that I can agree with absolutely, parts of it felt irrelevant, and parts are pure opinion, but all completely fair. As a former AMD fanboy for a bit over a decade before I began allowing Intel into my builds, all of the anti-competitive stuff you mentioned WAS a big deal. 15+ years ago. AMD has done quite well for itself ever since.

Nvidia took a while to earn its way into my builds as well, for similar reasons. However, the GeForce 3 was my first because ATI was slow to the show with the Radeon 8500, which wasn't worth waiting for anyway in the end. The GeForce 4 held its ground well until ATI released R300, and I owned every high-end variant of that chip, from R300 (9700 Pro) to R350 (9800 Pro), and even R360 (9800XT).

This is where I learned my lesson. Nvidia released the first true leaf-blower card to try to answer R300 and failed miserably, then bounced back brilliantly with the 6800 Ultra and 7800 GTX, and the 8800 GTX was a pure beast. Going back and forth serves MY needs, not some company that doesn't give 2 flips about me.
 
HOWEVER, I have to admit GREAT gratitude to AMD over the years, not just for their general effect on controlling prices in the industry and fighting for more open-source approaches, but also on a more focused point: AMD used to send me a LOT of samples. I even got banned from samples for 2 years after posting a review of the Outsideloop Afterburner GFD that required ripping the Slot A cartridge open to use. They forgave me, and I was in on Thunderbird's launch, the first purple-core Dresden chips, and even the first Athlon MP 1.2 GHz pair on a Tyan Thunder (combo) shipped to anybody, beta BIOS and all.

I do have a historic love for AMD, but I won't be ruled by that when they fall behind in key categories for me. Right now, they are ahead for CPU, behind for GPU, just putting it as unbiased as possible. The GPU lead is only a feature lead, not a real performance lead, but it's a lead that affects me directly. 2 gens from now I will re-evaluate who has what and what satisfies MY needs at that time.
 
Oh, please. Until Ryzen launched, a high-performance 8-core CPU was priced at double that. 450€ is high under the circumstances, but it's far from "ridiculous".
Oh yes, and in 2000 an 8-core might have cost $2000... and been impossible to make in 1990, so what's your point?
In 2021 that price for an 8-core is ridiculous, no matter how hard the AMD apologists try...
 
I am not an AMD apologist. I buy whichever brand offers the best IPC-versus-clock-speed balance I can find. End of story. Intel owned that crown unopposed for 15 years, and I wouldn't look twice at AMD's offerings until they took that crown for themselves.

Ryzen 5000 took that crown so firmly that even Intel's attempted answer falls short overall, runs WAY hotter, and sucks 100 watts more, all of that only to fail to pass Zen 3 (déjà vu of Athlon 64 versus Pentium 4). However, up until that point, not even Zen 2 interested me, when my i7 7700K thoroughly trounced Ryzen 3000 chips in raw minimum gaming FPS (the single most important gaming-PC factor in my purchasing book, within budget constraints).

I disagree with your claim that it takes an AMD apologist to think 449€ is reasonable for an 8-core, 16-thread chip in 2021, when it's the fastest 8-core, 16-thread chip for gaming you can get and runs WAY cooler than the Intel alternative. The price premium over Intel is as absolutely warranted as it was when the situation was reversed and Intel had the IPC + clock speed lead.

Balancing clock and IPC is firmly in AMD's court right now, and will be for some time to come. A premium for that is warranted. I paid that premium when it was Intel giving me what I wanted, and will just as willingly do so when it's AMD. Premium is as premium does.

Edit, in reference to the point made above: I used Intel exclusively for 15 years, but for 10 years before that, exclusively AMD. Gimme IPC + clock speed balance with less heat and fewer watts sucked away, and I buy that chip, even if it costs more.

Edit 2 - Reworded to be less aggressive sounding, my apologies. Your opinion is not BS and is only fair as everybody should be able to express theirs (even if I think it WRONG! :p KIDDING!)
 
I guess I'm trying to say I could have snagged a dirt cheap Core i9 9900K and gotten great gaming performance for the dollar spent, but it would not have been the best 8 core/16 thread experience I can have and the premium is only around 50-60 bucks. Worth it, to me.

In the premium I paid, I get a faster CPU, faster subsystems with PCIe 4.0 to finally support my RTX 3090 more fully, faster M.2 access, and a cooler system overall. With the 140-150 or so watts the Ryzen 5800X puts out versus the 200+ a 7700K at full throttle can hit, that's less heat dumped into my radiator, as well as boost clocks held longer. Once my 7700K started reaching its peak temps, boost clocks poofed.

I never see over 65C at full load on the 5800X. According to logging, every time in games like Watch Dogs: Legion and Cyberpunk 2077, the Ryzen would hit its peak temps of around 65C and hold a steady 4950 MHz (PBO + Ryzen Master) as long as the CPU needed 100% (on one core, not overall!), and the clock stepping when the load was lower was FAR finer than anything I've ever seen on Intel. (Edit 1: temps don't stay at 65C ALL the time in games, only when core activity peaks; actual gaming temperatures range from 45C-60C on average.)

If the heaviest CPU core load was only 75% or so, clocks only ramped to around 4.4-4.6 GHz, but if any core hits 100% while the others are not hitting as hard, it leaps up to 4950 MHz and sticks there as long as that core is asking for 100% load.
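The behavior described above can be sketched as a toy model. To be clear, the thresholds and clock values below are just this post's observations on one chip, not AMD's actual PBO/boost algorithm:

```python
def boost_clock_mhz(core_loads):
    """Toy model of the boost behavior observed in this post.

    core_loads: per-core utilization percentages (0-100).
    Returns the clock (MHz) the busiest core would hold.
    NOTE: 4950/4600/4400 are observed figures from one 5800X
    with PBO + Ryzen Master, not AMD's real boost logic.
    """
    peak = max(core_loads)
    if peak >= 100:
        return 4950  # any core pegged at 100% -> full boost, held
    if peak >= 75:
        return 4600  # heavy but not pegged -> the 4.4-4.6 GHz band
    return 4400      # lighter load -> lower sustained clock

# One core pegged while the rest idle: full single-core boost.
print(boost_clock_mhz([100, 30, 20, 10, 5, 5, 0, 0]))  # 4950
# Heaviest core around 75%: clocks sit lower.
print(boost_clock_mhz([75, 60, 50, 40, 30, 20, 10, 0]))  # 4600
```

Real boost decisions also factor in temperature, current, and voltage limits; this sketch only captures the load-dependent pattern reported above.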

In every way that matters that I can measure, the premium of the Ryzen 5000 series is well deserved over any current Intel offering, except on PURE price consideration. And even then, NONE of Intel's current gen is on the table; it's the 9000-series Core chips that reign supreme for pure bang for buck, if you don't mind giving up a lot of modern tech to get it. The Ryzen 5000 feels like a "smarter" chip overall. It self-manages its clocks so much better, just from my perspective so far...

Edit 2 - PBO + Ryzen Master working so simply was a major value add. I don't want to manually overclock anything anymore; I'm older now, and 20 years of overclocking anything that could be overclocked wore me out on it. If the hardware has methods of self-overclocking with full stability, I will let it do that, but otherwise I don't want to bother. PBO + Ryzen Master work as advertised. Nice value add in my book.
 
That's fine; they can keep jacking prices up; I won't be buying. I'd rather be done with gaming than spend ridiculous amounts of money on something that isn't remotely worth the price. Not that PC gaming has been very impressive in the last 2 years in terms of quality anyway, but even a 780 Ti/290X can handle games @1080p low-medium, so it will be a good while before that exploitation takes effect in reality; the jumps in graphical horsepower have been diminishing for quite a while now, and things like RTX are just good ol' tapdancing as far as I'm concerned.
 
I guess I'm trying to say I could have snagged a dirt cheap Core i9 9900K and gotten great gaming performance for the dollar spent, but it would not have been the best 8 core/16 thread experience I can have and the premium is only around 50-60 bucks. Worth it, to me.


I never see over 65C full load on the 5800X.

You must have a "5800X special edition", because meanwhile, all over the world, 5800X owners (me included) are complaining about high temperatures (basically the only downside of a great CPU).

At 65° full load, I'd like to know how you are cooling the CPU.
I'm using top-of-the-line components (a Noctua NH-D15 and a Corsair 5000D Airflow case plus 3 Noctua fans), and under full load I can easily reach 77-78° in Cinebench.
Only by using CTR can I now keep it to a 70° maximum under full load...
 
First off, you just made my day!!! I was kinda worried that I was doing something wrong.

Secondly, that was my first or second day of owning and stressing it. No game EVER makes this get close to 70C; it typically maxes out around 65C, but Prime95 got it up to 71C and it stayed there for an hour straight. That was late last night's testing; my apologies for not coming back and updating that.

So the 65C comment is slightly outdated. However... your wording makes it clear I'm doing WAY better than I thought! YAY! Here I was worried because I have seen multiple people never exceed 60C on an AIO with the same chip and thought I was doing kinda poorly. You have made my day!
 
Oh, for cooling: a Corsair H100i Elite Capellix + Prolimatech PK-3, direct airflow from outside the case, and a good seating. Don't use the paste that comes pre-applied on AIOs; it's pure crap.

Edit: Also of note, the radiator is set up push/pull with fans on both sides of it, as well as extensive exhaust fans. Since the rad pulls in cool air from outside, that helps some. Really, I didn't do anything super special except take a little extra care that I felt I didn't take with my previous Intel setup, which cost it higher temps over its lifetime than I thought necessary.
 
That's fine; they can keep jacking prices up; I won't be buying. I'd rather be done with gaming than spend ridiculous amounts of money on something that isn't remotely worth the price. Not that PC gaming has been very impressive in the last 2 years in terms of quality anyway, but even a 780 Ti/290X can handle games @1080p low-medium, so it will be a good while before that exploitation takes effect in reality; the jumps in graphical horsepower have been diminishing for quite a while now, and things like RTX are just good ol' tapdancing as far as I'm concerned.
I get it, and that is fair. I could never agree with your opinion, but I do understand it at least. With the way things are right now, your frustration makes a LOT of sense, and you can't be blamed for it.
 
You must have a "5800X special edition", because meanwhile, all over the world, 5800X owners (me included) are complaining about high temperatures (basically the only downside of a great CPU).

At 65° full load, I'd like to know how you are cooling the CPU.
I'm using top-of-the-line components (a Noctua NH-D15 and a Corsair 5000D Airflow case plus 3 Noctua fans), and under full load I can easily reach 77-78° in Cinebench.
Only by using CTR can I now keep it to a 70° maximum under full load...

It really is getting old. The GPU manufacturers all do multiple versions of each card (2 fans, 3 fans, etc.), and it never matters. Always excessive heat. I'm not one to tear them apart and install new coolers; I just like to buy something that works.
 
Also of note, a seeming bug in Prime95 did not go away in this rebuild. Even on my older Intel i7 7700K, I've always had to run Prime95 in Windows 7 compatibility mode just to get it to launch. The Ryzen does the same, so I think it's a common software component I use across both builds causing Windows to try to shut out Prime95, or possibly just the ancient version of it that I'm using, LOL! (PEBKAC issues :p )
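For anyone hitting the same thing: Windows records per-app compatibility settings under the AppCompatFlags\Layers registry key, so the Windows 7 mode can be scripted instead of clicked through the exe's Properties dialog each time. A minimal sketch that just builds the `reg add` command string (the prime95.exe path is hypothetical, and the exact value format Windows writes can vary by version):

```python
def compat_mode_cmd(exe_path, layer="WIN7RTM"):
    """Build a 'reg add' command that marks exe_path to run in a
    compatibility layer for the current user.
    WIN7RTM = "Windows 7" compatibility mode."""
    key = r"HKCU\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"
    # Value name = full path to the exe; data = the layer string.
    return f'reg add "{key}" /v "{exe_path}" /t REG_SZ /d "~ {layer}" /f'

# Hypothetical install path -- adjust to wherever prime95.exe actually lives.
print(compat_mode_cmd(r"C:\Tools\prime95\prime95.exe"))
```

Running the printed command in an elevated prompt (or just checking the box in Properties > Compatibility) achieves the same thing.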
 
First off, you just made my day!!! I was kinda worried that I was doing something wrong.

Secondly, that was my first or second day of owning it and stressing it. No game makes this EVER get close to 70C and typically maxes out around 65C, but Prime95 got it up to 71C and it stayed there for an hour straight. This was late last night testing, my apologies for not coming back and updating that.

So the 65C comment is outdated slightly. However... Your wording makes it clear I'm doing WAY better than I thought! YAY! Here I was worried cause I have seen multiple people never exceed 60C on AIO with same chip and thought I was doing kinda poorly. You have made my day!
Ah, OK, so you are speaking about temperatures while gaming.
That's different from full load.
People claiming temperatures under 60° at full load are probably just lying.
 
It really is getting old. The GPU manufacturers all do multiple versions of each card (2 fans, 3 fans, etc.), and it never matters. Always excessive heat. I'm not one to tear them apart and install new coolers; I just like to buy something that works.
I didn't get your point. What is getting old? 🤨
 
Ah, OK, so you are speaking about temperatures while gaming.
That's different from full load.
People claiming temperatures under 60° at full load are probably just lying.
Well, until I tested, I didn't know that, LOL! For the first couple of days I mostly just gamed really heavily to get a feel for improvements in the area most important to me and the reason I upgraded at all.

Last night, though, the Prime95 itch manifested and needed scratching. I am not disappointed with 71C after an hour, with no further upward movement on the temp bar in HWiNFO64, all cores slammed. That is WAY better than the 7700K ever managed on half the core count and lower clocks.

71C is with PBO + Ryzen Master enabled, though no manual pushing added. Just whatever tiny "AutoOC" Ryzen Master bestowed on me (which does nothing useful in Prime95 when all cores are going, except add 50 whole MHz to the all-core clock).
 
71° at full load is a very good result.
While gaming, it is quite normal to see 5-6° less (it basically depends on the game).
 
" Cards like the RTX 3060 are being sold in Europe for the equivalent of $600, for a product with a $330 MSRP. It’s an absolute joke. `"

NOT TRUE !
As today 1.April 2021 RTX 3060 can be purchase for 775,- Euro aka 910 USD from only 1 shop in Germany ... Plus shipping of course... And no, not a April fools post...

Thats GPU market reality we are living right now here...
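The markup math is easy to check: 775€ at roughly 1.17 USD per EUR (an assumed approximate spring-2021 rate) is about $907, which is around 175% over the $330 MSRP, versus the ~82% markup in the article's $600 example. A quick sketch:

```python
def markup_percent(street_price, msrp):
    """Percent over MSRP that a street price represents."""
    return 100.0 * (street_price - msrp) / msrp

EUR_TO_USD = 1.17  # approximate spring-2021 exchange rate (assumption)

usd_price = 775 * EUR_TO_USD                   # 775 EUR in USD
print(round(usd_price))                        # ~907 USD
print(round(markup_percent(usd_price, 330)))   # ~175% over the $330 MSRP
print(round(markup_percent(600, 330)))         # the article's $600 example: ~82%
```

Either way you compute it, the German listing is far above even the article's "absolute joke" figure.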
 