AMD blames Intel's "horrible" CPUs for Ryzen 7 9800X3D shortages

From a real-world value perspective, it's an excellent way to future-proof your CPU.

Exactly this^.
People who built an AM4 gaming rig 5-7 years ago and have since dropped a 5800X3D into their old machines are NOT complaining about AMD's superior gaming technology (3D stacking).

The 5800X3D is still up there as a top-5 gaming CPU, which shows the slight premium for X3D is worth it for gamers building their rigs around AMD's 3D monsters. The added money is peace of mind, and reviews become meaningless because you just keep on gaming.
 
My 5800X3D is still kicking, so I totally understand the exceptional demand for X3D parts (for gaming). In the past, Intel CPUs somewhat suffered from cache restrictions, which is why a 6-core Ryzen 5600X performs at approximately the same level as an 8-core 10700K. I remember TechSpot doing an article comparing the 10900K to the 10700K by disabling cores and restricting frequencies; the i9 model still showed higher performance, believed to be due to its larger L3 cache.
Be aware that a firmware update for Intel's Ultra series is still pending, but the latency problems can probably only be solved by revising the architecture. Perhaps a highly cached X3D competitor could turn the tide.
No amount of firmware updates will ever repair a failed hardware product.
 
If you are talking about Raja Koduri, he got hired by Intel to lead their GPU department when they started to expand it to go discrete.

Are you saying he is a DEI hire because Intel GPUs are not doing well?

DEI is a major issue.

But I don't think DEI is really at play in Koduri's case. He's just not good at his job, but he is good at making it look like the fault of others.

He left AMD on very bad terms and, after joining the Intel GPU team very late in the day, was quietly removed after a short while.
 
Lol, and we blame AMD for what's going on with Nvidia. So everyone blames everyone. Who's Intel gonna blame then?
 
If you are a crafty budget gamer, be on the lookout for used AM5 Ryzen 7600X and 7700X CPUs.

If you can pick one up really cheap, it is well worth building a new AM5 rig around. That way you can spend bigger now on a proper mainboard and GPU, and in a year or so drop in a used 9800X3D you picked up cheap (again), knowing you are still (years later) in the top 1% of gaming CPUs.
 
Why should they ban any such language? "DEI hiring" is a proven fact.
DEI hiring is in fact a real and good thing. However, referring to someone as a "DEI hire" in the pejorative sense is a slur. I can't believe I have to explain this. I thought we were all critical thinkers on here? Guess not. Try calling your female boss a "DEI hire" at a corporation and tell me what punishment HR hands you.
 
DEI hiring is in fact a real and good thing. However, referring to someone as a "DEI hire" in the pejorative sense is a slur. I can't believe I have to explain this. I thought we were all critical thinkers on here? Guess not. Try calling your female boss a "DEI hire" at a corporation and tell me what punishment HR hands you.
Sorry, but DEI is a failed ideology.

It is based on something as fickle as someone's skin color, not on earning or merit. DEI is the opposite of judging someone by the content of their character rather than the color of their skin, something so many Americans fought for in order to uphold equal rights.

Diversity, Equity & Inclusion (DEI) is not a pejorative when some of these people have DEI on their doors and name tags and hold such offices, offices that are now failing and being shut down.

There is no defense for it.
That is why major retailers (McDonald's, Lowe's, Home Depot, Target, etc.) across America and elsewhere are pulling back and firing their DEI departments. Zuckerberg announced this himself just a few days ago.

Though I do agree that calling your boss or a co-worker a DEI hire to their face would be derogatory and mean.
 
But in the past, they were doing the exact opposite: they were releasing CPUs with zero or minimal L2 cache (Duron and Celeron). They never chased a solution for extra cache, which was clearly a mistake, because as we can see today, only the CPU with the extra cache (the advanced implementation with the cache connected on the back) sells well. So the Duron and Celeron were a horrible idea.
Opposite markets: the Celeron and Duron were the lowest-end, cheapest parts, while the X3D chips are the highest-end, priciest parts. What we can tell from this is that cache is expensive. It takes up die space, space that could either be omitted (to save cost) or spent on something else (more instruction sets, accelerators, more cores).

For most game engines the CPU caches are greatly beneficial. For a lot of other tasks the diminishing returns kick in much earlier, and the L3 cache could be much smaller with negligible effect.

X3D seems to be special in that it's a layer on top (or below, with the latest iteration) that, from what I understand, cannot be used for those alternatives, just for cache.

I remember those Celerons that didn't just lack an L3 cache but didn't even have an L2 cache. I had a Pentium 1 at 100 MHz back in the day and thought I was getting an upgrade in the form of a 266 MHz Celeron from an uncle. It turned out to perform worse in StarCraft than the P1 did (I ended up turning the in-game music off to gain a few FPS, which is a shame because that game has awesome music). They stopped making those pretty fast; the performance was atrocious.
 
Sorry, but DEI is a failed ideology.

It is based on something as fickle as someone's skin color, not on earning or merit. DEI is the opposite of judging someone by the content of their character rather than the color of their skin, something so many Americans fought for in order to uphold equal rights.

Diversity, Equity & Inclusion (DEI) is not a pejorative when some of these people have DEI on their doors and name tags and hold such offices, offices that are now failing and being shut down.

There is no defense for it.
That is why major retailers (McDonald's, Lowe's, Home Depot, Target, etc.) across America and elsewhere are pulling back and firing their DEI departments. Zuckerberg announced this himself just a few days ago.

Though I do agree that calling your boss or a co-worker a DEI hire to their face would be derogatory and mean.
It's only companies that lean right who are removing DEI. DEI will continue and grow even larger at IT companies. HPE is holding strong on its commitment to DEI and is doing just fine. It is not a failed ideology, since it's not an ideology at all; DEI is a discipline. You've clearly never worked for a company that adheres to the discipline.
 
While true, the 9800X3D is overkill, and most people will be GPU limited long before being CPU limited.

From a hobby perspective, the 9800X3D is awesome. From a real-world value perspective, it doesn't really offer that much.
How can you make assumptions about people you don't know? There is no way you can know how many people are playing games at 1080p competitively.

Opposite markets: the Celeron and Duron were the lowest-end, cheapest parts, while the X3D chips are the highest-end, priciest parts. What we can tell from this is that cache is expensive. It takes up die space, space that could either be omitted (to save cost) or spent on something else (more instruction sets, accelerators, more cores).
I don't know if you can determine how expensive something is to make based only on the price it is being sold at. You can have a theory about it, but I wouldn't write a post making it sound like a fact.
 
I don't know if you can determine how expensive something is to make based only on the price it is being sold at. You can have a theory about it, but I wouldn't write a post making it sound like a fact.
If you like facts:
  1. Fact: L1/L2 and non-X3D L3 cache make a die physically bigger.
  2. Fact: The bigger the die, the more you lose [lower yield] because you have to cut rectangles (the die shape) out of a circle (the wafer).
  3. Fact: The bigger the die, the higher the chance of a defect per chip [lower yield] (and you already get fewer dies per wafer because the die is bigger, so it's a double whammy).
  4. Fact: We know that most companies (Apple is an exception) get charged per wafer rather than per die, so bigger most definitely means more expensive (and it's safe to assume that Apple's special per-die deal is also largely priced according to the size of the die they want).

Points 2 & 3 are a large part of the reason* why companies are trying to make chiplets work: smaller dies are simply a lot more cost effective. We also know the cost per wafer TSMC charges for some of their processes at certain points in time, and what their typical yields would have been.
  • If you know the die size, you can calculate how many chips would typically be harvested from a single wafer using an online wafer calculator.
    If you know the yield ratio, multiply it by the DPW (dies per wafer); yield is estimated to be between 60-80% for TSMC's 3 nm.
  • If you know the wafer cost (TSMC expects 2 nm to be over $30,000 per wafer), divide it by the number of good dies and you know the cost per die.
For fun, you can confirm my findings yourself using such a wafer calculator: by default, the page I used lists 248 dies per wafer. Increase, say, the die width from 10 to 11 to pretend you added a bit more cache, and the DPW drops to 222.
Apply yield (assuming a yield at the low end of the 3 nm range):
248 * 0.6 = ~149
222 * 0.6 = ~133
If the wafer costs, say, $30,000:
$30,000 / 149 good dies = $201.34
$30,000 / 133 good dies = $225.56
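
For illustration, here is a minimal Python sketch of that same back-of-the-envelope math. The dies-per-wafer figures, the 60% yield, and the $30,000 wafer cost are just the assumed numbers from the example above, not actual TSMC pricing.

Code:
# Back-of-the-envelope cost-per-good-die estimate (illustrative numbers only).
# The DPW values come from the wafer-calculator example above; the 60% yield
# and $30,000 wafer cost are assumptions, not real TSMC figures.

def cost_per_good_die(dies_per_wafer: int, yield_rate: float, wafer_cost: float) -> float:
    """Spread the wafer cost across the dies that come out working."""
    good_dies = round(dies_per_wafer * yield_rate)  # defective dies are discarded
    return wafer_cost / good_dies

WAFER_COST = 30_000  # assumed cost per wafer in dollars
YIELD = 0.60         # low end of the quoted 60-80% range

for label, dpw in (("10 mm wide die", 248), ("11 mm wide die (extra cache)", 222)):
    print(f"{label}: {dpw} DPW -> ${cost_per_good_die(dpw, YIELD, WAFER_COST):.2f} per good die")

# Prints roughly:
# 10 mm wide die: 248 DPW -> $201.34 per good die
# 11 mm wide die (extra cache): 222 DPW -> $225.56 per good die

The roughly $24 gap per die is the kind of cost difference the extra cache area alone implies, and the stacked X3D cache die adds its own cost on top of that.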

I recommend the 'AdoredTV' channel on YouTube if you want to know more about this stuff, although sadly he seems to have stopped uploading. The information there is still very interesting.


* The other reason is modularity:
  1. It makes it easier to swap out parts while keeping the advantage of scale.
  2. It allows different parts to be made by different vendors (Intel's Arrow Lake combines TSMC and Intel parts).
  3. It allows parts that don't scale well on newer, more expensive nodes to stay on older nodes (which need something to do once they are no longer the latest process). Older nodes tend to have better yields and lower costs.
 