Endymio
"Peak clock speed is what matters, not sustained."

Peak speed matters in some circumstances, sustained in others. The debate is like arguing over which is better: a hammer or a screwdriver. Both have their place.
Have you been following AMD's Ryzen naming scheme for the past 3 years? This is consistent with that scheme. Not saying the scheme makes sense, as it certainly doesn't.
AMD's desktop CPUs use the first number to indicate which Zen revision they are on. However, AMD's APUs (laptop and desktop chips with integrated graphics) use the next number up; a rough sketch of the offset follows the list below. Why? Because stupid.
Zen
Desktop: Ryzen 1xxx
HEDT: Threadripper 1xxx
Desktop APU: Ryzen 2xxx
Laptop APU: Ryzen 2xxx
Zen+
Desktop: Ryzen 2xxx
HEDT: Threadripper 2xxx
Desktop APU: Ryzen 3xxx
Laptop APU: Ryzen 3xxx
Zen2
Desktop: Ryzen 3xxx
HEDT: Threadripper 3xxx
Desktop APU: Ryzen 4xxx
Laptop APU: Ryzen 4xxx
I assume Zen3 will follow suit.
Did I mention STUPID yet? No, I'm not even going into the AF models... more stoopid...
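To spell out that offset, here is a rough sketch (Python; purely illustrative, the mapping is just my reading of the list above):

# Rough sketch of the series offset described above (illustrative only,
# not any official AMD mapping): for a given Zen revision, desktop CPUs
# and HEDT parts share a leading digit, while APUs use the next digit up.

DESKTOP_SERIES_DIGIT = {  # Zen revision -> leading digit of desktop/HEDT models
    "Zen":  1,   # Ryzen 1xxx, Threadripper 1xxx
    "Zen+": 2,   # Ryzen 2xxx, Threadripper 2xxx
    "Zen2": 3,   # Ryzen 3xxx, Threadripper 3xxx
}

def apu_series_digit(zen_revision: str) -> int:
    """APUs (desktop and laptop) carry the desktop digit plus one."""
    return DESKTOP_SERIES_DIGIT[zen_revision] + 1

if __name__ == "__main__":
    for rev, digit in DESKTOP_SERIES_DIGIT.items():
        print(f"{rev}: desktop Ryzen {digit}xxx, APU Ryzen {apu_series_digit(rev)}xxx")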
Now you should understand that it is the buyers who are stupid.
Zen2 desktop CPUs are 3xxx (out July 2019), so if Zen2 APUs (July 2020) were also 3xxx, then stupid buyers would assume AMD is selling one-year-old stuff.
So the naming is not stupid; it's very wise and makes sense. The problem is that buyers (and you; sorry, couldn't resist) are stupid.
The first problem with that is that the CPU architecture is 1 year old stuff as APUs are released a year after desktop CPUs with the same architecture. That said, treating the numbering as Ryzen model years is pretty close but still not perfect.
2017: Ryzen 1xxx
2018: Ryzen 2xxx
2019: Ryzen 3xxx
2020: Ryzen 4xxx
Problems with that: the Ryzen 3 3100 and 3300X were released in 2020, so they should be Ryzen 4000 processors. And the Ryzen 1600AF and 1200AF were released in 2019 and 2020, so they should be 3xxx and 4xxx processors, respectively.
Under the same logic, customers buying a 2020 laptop with a Ryzen 4800H APU are under the (wrong) impression of buying a new architecture, while it is the same (good) architecture present in their "old" Ryzen 7 3700X, for instance.
Partially true. First, APUs have a slightly different architecture compared to the CPUs. Another thing is that the 3000-series APU models came out in 2019, so buying a 4000-series means buying a modern APU instead of last year's stuff. We could accept that there are different customers for CPUs and APUs. That's a huge difference vs Intel, because Intel has a GPU integrated on every consumer-class CPU (with few exceptions).
Not perfect logic, but keeping a naming scheme both logical and not too confusing at the same time is pretty much impossible.
Who cares what it's named!? At least the naming scheme isn't as bad as AMD's GPUs; I swear they just use a random name generator for those!
A 100 MHz boost to clocks isn't going to topple Intel's gaming dominance. They need architecture improvements as well. Hopefully that comes, because I want a new gaming CPU that is a sizeable upgrade over the 4790K I currently own. At the moment a 3700X would give me up to about 20 fps more, and that just isn't worth it.
Intel's gaming "dominance" isn't caused by architecture but by clock speed and old game engines.
I don't care; it is not a problem for me or for you... but it is confusing for a non-technical customer.
I know a couple of colleagues that had the same issue last year buying a notebook. They were under the impression that a Ryzen 3750H was the notebook version of the very good Ryzen 3700X...
If one accounts for clock differences, there is evidence to suggest that Intel's architecture is still better suited for games than AMD's - albeit by a relatively small degree.
4GHz CPU Battle: Ryzen 3900X vs. 3700X vs. Core i9-9900K (www.techspot.com)
Games, unfortunately, don't lend themselves particularly well to parallel processing. While path finding, audio, netcode, physics, and shader compiling can all be distributed across multiple threads, the final render execution is still single threaded. And since the performance of any game is always determined by the slowest variable, the overall frame rate will always be controlled by a single core.
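To put rough numbers on that single-core bottleneck, here is a back-of-the-envelope Amdahl's law sketch (Python; the 40% serial share is my own assumption, not a measured figure):

# Amdahl's law sketch with assumed numbers (not measurements): if a fixed
# serial share of each frame runs on one thread, extra cores stop helping quickly.

def speedup(serial_fraction: float, cores: int) -> float:
    """Overall speedup with `cores` workers when `serial_fraction`
    of the frame time cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

if __name__ == "__main__":
    serial = 0.4  # assumption: 40% of frame time is single-threaded render work
    for cores in (4, 6, 8, 12, 16):
        print(f"{cores:>2} cores: {speedup(serial, cores):.2f}x faster than one core")
    # Climbs from about 1.8x at 4 cores to only about 2.3x at 16 cores,
    # which is why per-core speed still controls the frame rate.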
Oh, they "knew" about 3700X. Let's do some comparison:
3700X: 8 cores / 16 threads, 12 nm + 7 nm chiplets, PCIe 4.0, no integrated graphics, 65 W, 32 MB L3 cache
3750H: 4 cores / 8 threads, 12 nm, PCIe 3.0, integrated graphics, 35 W, 4 MB L3 cache
Not to mention, there was no 4-core Ryzen 3000-series CPU available last year.
So anyone who knows even the basic basics of the 3700X can very easily spot the differences. For those who have never heard of the 3700X, it is not a problem either.
So your colleagues fell into a common trap: they believed they knew something, but actually they just knew that the 3700X is AMD's new chip. With only that information, making assumptions like "the 3750H is a version of the 3700X" is just, well, yeah, exactly. It is... stupid.
Sometimes having no information at all is much better than having some, because people who have some information think they have much more than they actually do.
I think it will take longer than one year to render quad-core and 6-core CPUs obsolete for gaming. And this is a good thing: we don't want to need to buy an 8-core CPU just to get the most out of a graphics card. When Red Dead Redemption 2 released, it was criticised for being too demanding and people claimed it was badly optimised, etc. However, if the same thing happened on a CPU, would you welcome it?

You are absolutely right, but we are speaking about a 5-10% difference in the best-case scenario. And I pointed out another factor: old game engines.
Most of those engines were written with Intel (and 2/4 cores at best) in mind. Now the situation is changing, and the new consoles are using AMD architecture with an 8C/16T configuration and a relatively low clock speed (around 3.5 GHz).
I wouldn't be surprised if that article looked different in one year (testing new games, obviously).
I understand that a lot of people here have made investments in high core count CPUs and I understand you want them utilised. But for the industry and the profitability of the companies who sell the games, they don't want to have to make users buy expensive multi-core CPUs.
Oh, and I get that the consoles have 8 cores. The last consoles had 8 cores, albeit much weaker ones. It doesn't mean there is a revolution coming.

Yes, the PS4's Jaguar CPU technically was an 8-core, but it really was more of a "2x 4-core", and Sony let developers use just 6 of them (they added support for the seventh at a later stage, but developers basically never used it).
Arguing with you is so difficult.
You HAVE THE TRUTH. Your words are a dogma. Everyone else is just stupid...
They are no experts, but they are engineers, most probably even more brilliant than you (since you called them stupid). Not being tech enthusiasts, they didn't know every detail of AMD's architecture, but they did know about Ryzen CPUs and how good this generation is. And because Intel basically uses the same architecture in a 9700K and a 9750H (with differences in core count and clock speed), they were expecting something similar when they faced a Ryzen 7 3700X and a Ryzen 7 3750H.
But they were smart enough to ask me for advice before buying.
Is it so difficult for a fanboy to admit AMD is playing a trick with those names?
Or do you think that AMD calling its CPU "3750H", mimicking Intel's nomenclature, was a coincidence?
So because Intel has one kind of logic, AMD should use exactly the same? Even Intel's logic is flawed when you compare just model numbers between a desktop CPU and a mobile one.
AMD's naming scheme borrows a lot from Intel: 3 is cheap, 5 is better, 7 is even better and 9 is best. The problem is that when you have something like 3700, the most obvious choice for a second model is 3750. That sounds much better than 3729.
So you think the name "3750H" was a coincidence?
We are speaking about the model number, not the class of the product.
The Ryzen 3750H should have been a "2750H", because it wasn't a Zen2 product.
You know it very well, but you HAVE TO defend the "Holy Brand".
Intel has a lot of flaws, but a 10750H and a 10700K are both Comet Lake.
AMD no longer has the core count advantage to itself. Intel matches core counts across most of the range, up to the i9, which has 10 cores against 12.
It will be interesting, as consoles move to 8-core Zen2 chips at 3.8 GHz, to see how developers utilise them; we are already seeing some PC games perform better with more cores. But Intel matches core counts and still has better single-core speed, so beating that gaming performance won't be easy. It would be nice if AMD did, though, and sold those CPUs at a cheaper price than Intel.
OK, AMD is the best. And the 3750H? It's not just that, between 3700 and 3800, 3750 is the best choice.
No, it shouldn't, because then stupid people would think it is a 2018 product, not a 2019 one. Just as you said: your stupid colleagues thought the 3750H had something to do with the 3700X. If it had been named 2750H, they would think it is basically a renamed Zen+ part (launched 2018) and that they were buying old stuff, while it actually launched in 2019.
Comet Lake has exactly the same architecture that is used on the 6th-generation CPUs. So basically Intel has renamed 6th-generation products as 10th-generation products so that people believe they are buying a modern architecture. Instead they are buying 2015's Skylake.
Using your logic, 10750H and 10700K should be 6750H and 6700K.
"I'd rather wait for Alder Lake, which presumably sports a higher overclockable frequency."

That would be a long wait...