AMD Zen 6 could hit 7 GHz and 24 cores in desktop CPUs

Yeah, Phenom II was competitive against Core 2, but by that point Nehalem was out and Sandy Bridge was just around the corner.

Really, from the E8600/Q6600 through the i7-920, the i7-2600K/i5-2500K, and on to the 4770K, Intel was on a roll. It was really after that that Intel went into cruise control, and it cost them long term.

Indeed, it was around Haswell's time that Intel started sleeping, thanks also to an uncompetitive AMD.

Sandy Bridge marked a turning point in CPU design, and today's CPUs, even the ARM ones, vaguely resemble it. We've got to hand it to AMD too, because Bulldozer, though it failed in practice, was quite original; and learning to tame its power draw taught them much that got poured into Zen.
 
I waited for the second gen AM5 for better DDR5 speed and prices, so I'll be interested to see how long AM5 sticks around. If the 11800x3D can drop in that would be awesome.

The 5800X3D is still great and in my primary gaming PC. The 9800X3D was mostly about the productivity gains on my primary work PC (but also my future gaming, as my last work PC becomes my living room gaming PC).

I'm still on my 5800X3D too; it's rock solid working 8 hours a day with a -100 mV undervolt. Using a premium tower air cooler, it gets to 74ºC under 100% all-core usage, no more than that. Paired with my 4080 I get more than enough FPS on almost everything I play at 1440p.

What else could I ask for?

I may upgrade only when it dies, perhaps to an 11th-gen Ryzen, or perhaps to the last and most mature generation on socket AM5.
 
They cut the channels in half with DDR5 and gave each module two of them. That makes it annoying: nobody knows what anyone else means when they talk about channels, since you can't tell whether people mean the actual channels or what they would have been had JEDEC kept things the DDR4 way.

The DDR5 channels are 40-bit channels, not 32-bit. The DDR4 channels were 72-bit channels. There are an extra 8 bits in each channel that are not used for data.

To make matters even more confusing, the DDR5 registered DIMMs seem to use the DDR4 channel arrangement.
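For concreteness, the widths being discussed work out like this; a rough sketch of my own, where the 40-bit figure applies to ECC/registered DDR5 (plain unbuffered DDR5 uses 32-bit subchannels):

```python
# Per-channel bit widths from the posts above (ECC variants).
DDR4_CHANNEL = {"data": 64, "ecc": 8}     # one 72-bit channel per DIMM
DDR5_SUBCHANNEL = {"data": 32, "ecc": 8}  # two 40-bit subchannels per DIMM

ddr4_total = sum(DDR4_CHANNEL.values())         # 72 bits
ddr5_total = 2 * sum(DDR5_SUBCHANNEL.values())  # 80 bits across both subchannels
ddr5_data = 2 * DDR5_SUBCHANNEL["data"]         # 64 data bits, same as DDR4
print(ddr4_total, ddr5_total, ddr5_data)        # 72 80 64
```

So a DDR5 module still carries 64 data bits in total, just split across two narrower channels with their own ECC bits.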

Cutting the channel width in half was actually a good thing. As for the naming, I see nothing badly wrong with it.

While a DDR4 channel is actually 72 bits wide, generally speaking 64 bits makes more sense to quote, since the 64 data bits are what determine the data transfer rate. You can calculate the transfer rate from the bits that actually carry data; there's no need to know how many ECC or other bits exist. That makes sense.

Then there's the channel problem. As said before, "single channel" refers to a bus whose data transfer width is 64 bits. That way a 128-bit memory bus is "dual channel" regardless of how many channels there actually are. It also helps determine how many memory modules are needed (at minimum).

Channel = 64 data transfer bits. A module carries 64 data transfer bits. Minimum modules needed = number of channels used. Transfer rate can be calculated from the bits that carry data.

Not exactly right, but just ignore the bits not used for data transfer, forget the internal channels etc., and it works fine.
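The counting rule above can be put in a short sketch (my own illustration, not from the thread): channels in the DDR4 sense = total data bits / 64, and peak transfer rate follows from the data bits alone, with ECC bits and internal sub-channels deliberately ignored.

```python
def ddr4_style_channels(total_data_bits: int) -> int:
    """'Channels' in the conventional sense: 64 data bits each."""
    return total_data_bits // 64

def peak_bandwidth_gb_s(rate_mt_s: float, total_data_bits: int) -> float:
    """Peak bandwidth in GB/s from transfer rate (MT/s) and data-bus
    width; ECC bits and internal sub-channels are ignored."""
    return rate_mt_s * 1e6 * (total_data_bits / 8) / 1e9

# A 128-bit data bus is "dual channel" and needs at least two modules.
print(ddr4_style_channels(128))        # 2
# DDR5-6000 on a 128-bit data bus: 6000e6 transfers * 16 bytes each.
print(peak_bandwidth_gb_s(6000, 128))  # 96.0 GB/s
```

The point of the convention is exactly this: both numbers fall out of the data width alone, which is why quoting 64-bit channels "works fine" even when the physical channels differ.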
 
How does it do on Spider-Man 2, if you have it? Spider-Man 1 is the most CPU-demanding title I've played - it's fine, but it feels like it's maybe nearing the max for the 5800X3D, so I'm curious how it handles 2, but all the online benchmarks have moved on. (I'm waiting on sane GPU pricing to upgrade my 3080 before buying SM2.)
 
Sounds like this will be a much bigger upgrade than 3000 to 5000 was on AM4. Having a 7800X3D, I'm excited to think I can put a 10800X3D in my system and get quite a few more years out of it. Of course, I'm nowhere close to needing more than a 7800X3D now; for gaming it's still fine and will be for quite some time.
 
At up to 250 watts per core, Tejas was dead on conception.

reddit.com/r/intel/comments/wiohhr/intels_abandoned_netburstonsteroids_tejas_and/

Intel didn't need Core or Core 2 to "save" them. Prescott's successor Tejas was quite ready when Intel simply decided the Pentium M was the better choice. Intel would have done just fine with Pentium 4 too. No idea where this "Core saved Intel" thing comes from; maybe Tejas's existence is just ignored.

Because at the same time AMD messed up at least two architectures, probably three, and Bulldozer was a very rushed one. No real competition was coming from AMD even if Intel had just continued with the Pentium 4 line.
 
No. You again forget that because Intel is Intel, many buy it regardless of whether something better exists. Prescott was a total failure if you just look at speed vs. heat, yet it sold very well.
 