The Ryzen 7 5800X3D CPU gets a delid for improved temperatures

nanoguy

Staff member
In context: In recent years, delidding has gradually become a lost art due to improvements in modern IHS design, the use of better thermal interface materials, and idiot-proof tools for the delidding process. However, it's always a pleasant surprise when a processor like the Ryzen 7 5800X3D that marks the end of an era gets delidded by an enthusiast who believes the reward is well worth the risks.

Earlier this month, an anonymous overclocker delidded an engineering sample from the upcoming Ryzen 7000 CPU lineup and revealed the bare dies in all their glory along with the first new integrated heat spreader (IHS) design change in years.

Normally, this procedure is only done by enthusiasts looking to reduce operating temperatures without the use of exotic cooling hardware. For obvious reasons, the Ryzen 7000 series processor delid was only a tease meant to showcase how AMD is working around some of the problems of moving to an LGA socket for the new CPUs. At the same time, it looks like it will be a more daunting task than on any previous CPU due to the way capacitors are arranged on the interposer.

This week, another overclocker going by @Madness7771 on Twitter revealed he had delidded the Ryzen 7 5800X3D, the last processor of the AM4 era. There are no surprises in terms of what's under the IHS — a core complex die and SoC die are present, and one can also observe some nonconductive protective goop where a second core complex die would have been if AMD had decided to make a Ryzen 9 5900X3D.

What's interesting about this delid is that replacing the factory liquid metal with some of Thermal Grizzly's Conductonaut led to the CPU running 10 degrees Celsius cooler and maintaining higher clocks during gaming workloads. Madness says this was a worthwhile improvement as the 5800X3D would previously reach temperatures of up to 90 degrees during long gaming sessions in titles like Forza Horizon 5.

That said, delidding a modern processor produces more modest results than it used to while requiring a great deal of patience and finesse. Madness used classic tools like razor blades and a heat gun to do the job, and it goes without saying this procedure carries a high risk of damaging the delicate core complex die with the stacked 3D V-Cache.

Overall, the results are impressive, and there's little one can do to further improve how the Ryzen 7 5800X3D performs. Overclocking isn't officially supported due to design limitations, but there are signs that manufacturers like MSI may add some limited overclocking support on select high-end AM4 motherboards shortly.


 
Delidding makes sense if you're overclocking to the near limits of the chip, because that's where it started. 5800X3D has a 105w TDP, reduced clocks and nerfed OC'ing.

Not sure how well a 10C reduction will really help anything, but it satisfies an enthusiast's curiosity and passion for tinkering. Cool (lol), but meh.
 
For obvious reasons, the Ryzen 7000 series processor delid was only a tease meant to showcase how AMD is working around some of the problems of moving to an LGA socket for the new CPUs.
Maybe they could ask Intel for advice, since Intel has been using LGA sockets for about 2 decades. Or is AMD trying to claim LGA as an "innovation"? Or at the very least a, "new and improved" socket design.
At the same time, it looks like it will be a more daunting task than on any previous CPU due to the way capacitors are arranged on the interposer.
Judging only by the photo, the row of caps looks higher than the dies. Which means, (if I'm correct), that any heat sink will have to have the face machined smaller to avoid them.
 
Intel did not invent LGA sockets. Both Intel and AMD have been using them since 2006.
And AMD was still using PGA sockets as late as 2017 too. (all the Ryzen series?)
Don't ya just hate it when somebody sasses AMD?
According to Wiki, Intel's LGA 775 was introduced in 2004. If you disagree, take it up with Wiki:
 
And AMD was still using PGA sockets as late as 2017 too. (all the Ryzen series?)
Don't ya just hate it when somebody sasses AMD?
According to Wiki, Intel's LGA 775 was introduced in 2004. If you disagree, take it up with Wiki:

It appears that Intel has been using LGA since 2004 and AMD since 2006. Still, Intel didn't invent it. It was introduced to the world in 1996. So your first comment is wrong anyway.

I believe that AMD was using old pin sockets for desktop because they were cheaper.

Talking about CPU coolers, all Socket AM4 coolers will be compatible with AM5.
 
Maybe they could ask Intel for advice, since Intel has been using LGA sockets for about 2 decades. Or is AMD trying to claim LGA as an "innovation"? Or at the very least a, "new and improved" socket design.

Judging only by the photo, the row of caps looks higher than the dies. Which means, (if I'm correct), that any heat sink will have to have the face machined smaller to avoid them.

No, only you are suggesting that AMD is making any claims about LGA being an innovation.
 
I believe that AMD was using old pin sockets for desktop because they were cheaper.
OK, let's assume you're correct. Which raises the question, "why are they going with LGA now?" I tend to think it's because they likely heard too much whimpering about bent pins from the elite techies, when they adhered to the "one board fits all" design strategy and kept swapping out their CPUs.
Talking about CPU coolers, all Socket AM4 coolers will be compatible with AM5
As far as this goes, all the cooler manufacturers have to do is include a pair of 50-cent brackets to accommodate Intel 115x, 12xx, 17xx, and 20xx sockets. At least that's what it says on the brand new Noctua cooler box I'm holding in my hand at this very moment. (OK, so I did set it down when I started to type.)
 
No, only you are suggesting that AMD is making any claims about LGA being an innovation.
If there's a dirty, thankless job to be done, or even an erroneous point to be made, I'm your go-to guy. You should be thanking me. :p After all, I did provide you with the opportunity for that snarky rebuttal. 🤣

"I just bent the pins on my new Ryzen, and my mobo went up on flames when I turned the system on. What should I do"? I'd swear I read that somewhere, most likely at Quora. (Not on a stack of bibles though, It would most likely give me 3rd degree burns). :rolleyes:
 
Delidding makes sense if you're overclocking to the near limits of the chip, because that's where it started. 5800X3D has a 105w TDP, reduced clocks and nerfed OC'ing.

Not sure how well 10C reduction will really help anything, but satisfying an enthusiasts' curiosity and passion for tinkering. Cool (lol), but meh.
Every CPU being overclocked will hit a heat limitation.
 
Every CPU being overclocked will hit a heat limitation.
It's limited before you even start overclocking. 10C reduction does nothing to help get higher clocks on a chip that AMD says will be damaged if overclocked. Hence them disabling it at chip level.
 
There's little reason these days for a regular user to overclock, since all CPUs have "turbo states".
Delidding also matters less now that the IHS is soldered.
Additional overclocking on modern CPUs makes power usage and heat grow exponentially.
For 2-5% more speed you get >100% more heat and power.
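The disproportion above follows from the usual first-order model of dynamic CPU power, P ≈ C·V²·f: a small frequency bump typically needs a voltage increase to stay stable, and power scales with the *square* of voltage. A minimal sketch, using purely illustrative numbers (not measured 5800X3D figures):

```python
# Rough sketch of why a small overclock costs a lot of power.
# Dynamic CPU power scales roughly as P = C * V^2 * f, where C is an
# effective switched capacitance. All values below are hypothetical.

def dynamic_power(capacitance, voltage, freq_ghz):
    """Approximate dynamic power: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * freq_ghz

# Hypothetical stock operating point: 1.20 V at 4.5 GHz.
stock = dynamic_power(capacitance=5.0, voltage=1.20, freq_ghz=4.5)

# A ~4% clock bump that needs a voltage increase to stay stable.
oc = dynamic_power(capacitance=5.0, voltage=1.35, freq_ghz=4.7)

clock_gain = (4.7 / 4.5 - 1) * 100      # ~4.4% more clock
power_gain = (oc / stock - 1) * 100     # ~32% more power (and heat)

print(f"clock gain: {clock_gain:.1f}%, power increase: {power_gain:.1f}%")
```

Even this modest voltage bump makes power grow roughly seven times faster than clock speed, which is why the last few percent of frequency are so expensive thermally.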
 
No, only you are suggesting that AMD is making any claims about LGA being an innovation.
I don't know what he's talking about in regards to the socket either, but I do believe he thinks dude isn't using the original HS after replacing the stock TIM.
 
There's little reason these days for a regular user to overclock, since all CPUs have "turbo states".
Delidding also matters less now that the IHS is soldered.
Additional overclocking on modern CPUs makes power usage and heat grow exponentially.
For 2-5% more speed you get >100% more heat and power.
Delidding isn't for regular users. It's for enthusiasts who will go this far for little gain, except in this case it was only to stop the chip from reaching 90C under (heavy?) load, because manual OC'ing is already disabled at chip level by AMD.
 
OK, let's assume you're correct. Which raises the question, "why are they going with LGA now?" I tend to think it's because they likely heard too much whimpering about bent pins from the elite techies, when they adhered to the "one board fits all" design strategy and kept swapping out their CPUs.

As far as this goes, all the cooler manufacturers have to do is include a pair of 50-cent brackets to accommodate Intel 115x, 12xx, 17xx, and 20xx sockets. At least that's what it says on the brand new Noctua cooler box I'm holding in my hand at this very moment. (OK, so I did set it down when I started to type.)
They are moving to LGA because they can add more pins in the same space that way. It's easier to cram more of them on a motherboard than on the chip itself (and bent pins are "preferred" on mobos since most of them are cheaper than CPUs).

The increased I/O and power draw kinda forced AMD to do this (it might also be a few cents cheaper). Threadripper launched as LGA for the same reason.
 
Moving to another socket still doesn't address the initial design flaw: the CPU core is not centered under the IHS and cooler. But as with any "get a bigger hammer" fix, get a bigger cooler and it's fixed.
 
I don't know what he's talking about in regards to the socket either,
All I was trying to say was that over the years, I think I've heard more reports of PGA boards being damaged trying to install the CPU, than LGA boards. Do I have empirical data? No.

What's interesting to me is, the fact that Intel retained PGA sockets on all (?) their laptop CPUs, where the end user is much less likely to be "tampering" by changing CPUs.

As for the heat sink issue, the asymmetrical placement of the dies seems (to me at least) to require a specially machined heat sink face to make direct contact with the CPU die itself.

As to "Monster Mash's" modifications, (or whatever he calls himself), I question why he simply didn't drop in a better CPU. After all, whenever AMD products are discussed, the conversation always turns to "Intel is sh!t because they keep changing boards", whereas with AMD, any old CPU would fit.

With those things said, the photos are of a de-lidded 7000 series CPU. The conclusion I've drawn (be it correct or not) is that the IHS is there as much for mechanical stability as it is for cooling. I believe that many older CPUs had a single die centered on the board, and were therefore more conducive to using a standard HSF.
 
The increased I/O and power draw kinda forced AMD to do this (it might also be a few cents cheaper). Threadripper launched as LGA for the same reason
Right. And IIRC, the "Threadripper" has 4,000 pins, or lands, as the case may be.
(and bent pins are "preferred" on mobos since most of them are cheaper than CPUs).
So, can we at least come to a consensus, that PGA boards are easier to damage, but cheaper to replace?
 
Right. And IIRC, the "Threadripper" has 4,000 pins, or lands, as the case may be.

So, can we at least come to a consensus, that PGA boards are easier to damage, but cheaper to replace?
It was always a tradeoff of this kind :)
The board is easier to damage because of the thinner legs, but your expensive CPU is safer (Linus can safely drop it now).
 
All I was trying to say was that over the years, I think I've heard more reports of PGA boards being damaged trying to install the CPU, than LGA boards. Do I have empirical data? No.

My very anecdotal only experience with PGA sockets hasn't been boards or CPUs damaged while installing, that's easy peasy.

However after about the 4th time that CPU comes straight out of the socket while still attached to the cooler, you get used to it and just go about looking for and correcting any bent pins.
 
It's limited before you even start overclocking. 10C reduction does nothing to help get higher clocks on a chip that AMD says will be damaged if overclocked. Hence them disabling it at chip level.
I think as pathways narrow, we're going to see less and less "overclockability", due to the fact that the newer, narrower pathways lack the ability to carry heavy current loads.

Narrow processes are probably a trade off between higher efficiency, higher component density, and heat dissipation ability.

Off the top of my head, a couple of instances where processes possibly in excess of 90 nm are necessary: audio output transistors and high-power silicon controlled rectifiers.
 