I totally agree with you, and I might not want it either, but M.2 slots aren't a big deal for gamers because NVMe load times aren't much better than those of SATA SSDs.
Remember that big gaming storage test that Tim did a while back?
"What We Learned:
Why don't games benefit all that much from faster SSDs? Well, it seems clear that raw storage performance is not the main bottleneck for loading today's games. Pretty much all games released up to this point are designed to be run off hard drives, which are very slow; after all, the previous generation of consoles with the PS4 and Xbox One both used slow mechanical drives to store games.
Today's game engines simply aren't built to make full use of fast storage, and so far there's been little incentive to optimize for PCIe SSDs. Instead, the main limitation seems to be things like how quickly the CPU can decompress assets, and how quickly it can process a level before it's ready for action, rather than how fast it can read data off storage."
I agree, but it's not about buying new DDR4; it's about being able to use the DDR4 that you already own. I mean, let's be honest here: what percentage of gamers looking to buy this do you think are still using DDR3? I'd say it's statistically zero.
Most people still don't see it that way. Intel has been the #1 name in x86 CPUs since the creation of the 8086 back in 1978. You don't really think that AMD managed to change 45 years of market perception in only 6 years, do you? There are probably just as many people who have never had an AMD CPU as there are who have never had a Radeon GPU. These aren't cheap devices, and when people have to spend a lot of money, getting a familiar brand gives them peace of mind. Whether deserved or not, Intel hasn't relinquished that status. Also, don't forget how many losers are out there who think that LoserBenchmark is a reliable source of information...
To you and me, sure, I agree. To most of the world, no, Intel is still the more recognised brand. AMD's greatest successes have only come over the past six years, and by no means did they relegate Intel to near-irrelevance the way AMD was relegated during the FX years. Sure, AMD has been out-selling Intel, but only in the last three years or so. Don't be fooled into thinking that AMD is anywhere close to market parity with Intel, because it isn't, not on the consumer side or the server side:
"In the fourth quarter of 2022, 62.8 percent of x86 computer processor or CPU tests recorded were from Intel processors, up from the lower percentage share seen in previous quarters of 2021, while 35.2 percent were AMD processors. When looking solely at laptop CPUs, Intel is the clear winner, accounting for 76.7 percent of laptop CPU test benchmark results in the fourth quarter of 2022."
That's going REALLY far back, like early AGP-era.
Oh I agree, I think that AMD made the right move there. The problem was that Intel's reaction of lowering their motherboard prices was a very effective counter to that.
Sure, but not so much as to make it impossible that the majority of those sales went to the relatively few fanboys, very wealthy gamers and scalpers (scalpers are still a thing).
Show me decent sales numbers after they've been on the market for a month and then I'll say "Yeah, sales have been good".
That's not true at all. TSMC could easily produce more cache than AMD could sell, regardless of cache type. I don't know where you got that idea. There's nothing there that mentions a shortage; you are still only assuming.
I'm sorry but that is not a fact as I see it. That is still you making assumptions.
Absolutely. They're a business and they have to focus more on the more profitable side of the business and that's commercial. I couldn't agree more.
That's not a fact either, because manufacturers always make the most parts in the low and mid-ranges, since high-end parts don't sell nearly as well. This isn't just true in the semiconductor business; it's true of pretty much everything.
I'm starting to wonder if you understand what a fact is.
No, it doesn't add up, not even close. I still don't know how you managed to reach that conclusion. If there were a shortage, I can guarantee you that someone would've said something. Do you really think that you know more about what's going on than actual CPU journalists like Steve Burke, Steve Walton, Hilbert Hagedoorn, Paul Alcorn or W1zzard? All you did was read a short article that Steve Walton wrote, an article that did not mention any kind of shortage or even postulate that one existed, make one very presumptuous leap, and decide to call it fact. That's not how facts work.
The article does talk about a special node, but it doesn't say anything about a shortage, so you've still failed to offer evidence. If the cache is being made on a TSMC process dedicated to SRAM, that means there will be LOTS of it. Sure, AMD is using a lot of it (probably more than necessary), but there's nothing in that article that talks about limited supply, anywhere. In fact, if TSMC has a production line dedicated to it, that tells me the exact opposite. So you're still making assumptions, and I think they're incorrect.
AMD uses 7nm not only because SRAM doesn't really scale well beyond that, but because it's far less expensive. This wouldn't be the case if there were a scarcity issue because, as we've been made painfully aware over the last couple of years, scarcity means that prices skyrocket. I really doubt that there could be a scarcity issue at 7nm because, at the very least, both TSMC and Samsung are capable of producing it in large quantities. TSMC first put its 7nm process into volume production years ago, which means it is beyond mature at this point. Any major fab in the world could produce 7nm chips, including TSMC, Samsung and GlobalFoundries, as stated on its Wikipedia page:
7nm Process - Wikipedia
If there were a supply issue, all AMD would have to do is contact one of the other two to shore up supply, and they'd do it too, because they're not going to leave profit on the table if they don't have to. As you pointed out yourself, a shortage could negatively affect EPYC (and/or Radeon Instinct, if the cache is used there as well). That's something AMD would never accept if it didn't have to, and it doesn't have to.
At the end of the day, I'm still forced to believe that AMD's insistence on producing X3D versions of both R9 CPUs while refusing to produce an X3D version of their R5 CPU was greedy, stupid and ultimately anti-consumer. I don't say this just to rag on AMD because you know that I am far more fond of ragging on Intel and nVidia. I'm also not personally affected by what AMD does with Zen4 because I already have an R7-5800X3D and therefore have no interest in Zen4. What I see is AMD shooting themselves in the foot because AM5 isn't so good that it can't be passed up for LGA 1700.
Sure, LGA 1700 is a dead platform, but you can buy an LGA 1700 motherboard that supports DDR4 for about HALF the price of the cheapest B650 board, the Gigabyte B650M DS3H. That's EXTREMELY attractive because it not only effectively nullifies the longevity advantage of the AM5 platform, but the ability to keep using DDR4 also makes it the better economical choice. A lot of shrewd gamers were waiting, hoping for a six-core Zen4 X3D CPU (because why wouldn't they?), but took the Intel route immediately upon discovering that there would be no R5-7600X3D.
Why would they do that? Simply because they didn't think that paying $450 for an octo-core CPU they didn't need, on an expensive B650 motherboard, plus at least 16GB of DDR5, was worth it, and they were right. Just look at these two scenarios:
AM5 build:
CPU: R7-7800X3D - $450
Motherboard: Gigabyte B650M DS3H - $150
RAM: Patriot Signature Line 16GB DDR5-4800 - $59
Total: $659

LGA 1700 build:
CPU: i5-13600 - $250
Motherboard: ASRock H610M-HDV - $80
RAM: $0 (existing DDR4 reused)
Total: $330

Estimated gaming performance difference:
215 ÷ 185 = 1.16, i.e. roughly a 16% advantage for the 7800X3D.
With the vast majority of people not caring one way or the other about which CPU maker is in their rig, who is going to want to pay literally double the platform cost for a gaming performance increase of less than 20%?
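If you want to sanity-check that "literally double" claim, here's a quick sketch using the prices listed above and the two performance index numbers from the comparison (215 and 185, which are estimates, not measured results):

```python
# Rough platform cost vs. performance comparison using the figures quoted above.
# All prices and performance index numbers come from the post; they are estimates.
am5 = {"CPU (R7-7800X3D)": 450, "Board (B650M DS3H)": 150, "DDR5 16GB": 59}
lga1700 = {"CPU (i5-13600)": 250, "Board (H610M-HDV)": 80, "DDR4 (reused)": 0}

am5_cost = sum(am5.values())       # 659
lga_cost = sum(lga1700.values())   # 330

perf_ratio = 215 / 185             # ~1.16x in favour of the 7800X3D
cost_ratio = am5_cost / lga_cost   # ~2.0x in favour of LGA 1700

print(f"AM5 total: ${am5_cost}, LGA 1700 total: ${lga_cost}")
print(f"Performance ratio: {perf_ratio:.2f}x, cost ratio: {cost_ratio:.2f}x")
```

So the AM5 build really does cost about twice as much for roughly 16% more gaming performance.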
To be honest, not even the R5-7600X3D would have been worth it for gamers from a platform cost/performance standpoint (at the $350 price point that I envisioned), but it has that hype and "cool factor" that often overrides gamers' intelligence in the same way that nVidia's marketing does.
AMD's new platform is already not a great value, and not having an R5-7600X3D part at $350 only makes things worse.