AMD Ryzen 5 4500U Review: Mid-Range Zen 2 Beats Intel's Best

Yeah, AMD motherboards don't overvolt CPUs at all, please...
Also, you didn't watch the video. You have no idea what is going on with the power limit... Intel didn't set it to 250W 24/7, motherboards did, to keep the CPU at max boost all the time. That's not Intel's spec.

I have known for a long time what you are trying to explain. You just don't seem to understand that maximum power consumption does not depend on time at all. Whether it's sustained 24/7 for three years or hit once for a millisecond, a 250W draw is huge compared to AMD CPUs.
 
So? You have no money for electricity, or what? I mean, not to be rude, but why does that matter at all? It works. It works surprisingly well.
If you want low power draw, get a laptop or a weaker CPU, not a 5GHz 20-thread one. And yes, whether it's 250W or 350W or 5000W doesn't matter, AMD isn't reaching 5GHz stable no matter what. See? Irrelevant points just to make an argument, and off topic at that, just like yours.
 

You are trying to say Intel's huge power consumption is the motherboard makers' fault. But the reality is that Intel itself specced the maximum consumption at 250 watts, which is huge.

However, the overall architecture of Broadwell through Comet Lake is approaching its fundamental design limits. For example, Anandtech tested the core-to-core latency (normally a strength of Intel's chips) on the 10900K and found a bigger-than-expected increase in latency going from 8 cores to 10. They also found, as we did, that under full load the PL2 draw is enormous:

[Chart: CPU package power consumption under full load]

As for PL2 power draw, by spec it should be limited; mobo manufacturers ignored those limits and voila, blame Intel again. Need I link the video?

Intel spec: 250W
Measured: 254W

Problem?
 
No problem, you are misrepresenting the point. There was never a problem. Intel's spec is ~90 seconds of that draw, then it clocks down and powers down. Problem? I don't know anyone with a 10900K who has a problem. They all overclock it and use over 300W anyway. 250W is not huge, except by AMD ~4GHz standards anyway.
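To make the boost-duration behaviour described above concrete, here is a minimal, illustrative sketch of how the PL1/PL2/Tau limits behave at stock versus on a board that removes them. It is not Intel's actual algorithm (real silicon budgets power with an exponentially weighted moving average, not a hard timer), the ~90-second figure is taken from the post above rather than a datasheet, and 125 W is simply the 10900K's rated TDP used as PL1.

# Simplified sketch of Intel's turbo power limits (PL1 / PL2 / Tau).
# Illustrative only: real hardware uses a moving average of package power,
# and motherboards are free to override or remove these limits entirely.

def package_power(t_seconds, pl1=125.0, pl2=250.0, tau=90.0, board_unlimited=False):
    # Approximate package power (W) t_seconds into an all-core load.
    if board_unlimited:                      # many boards ship with limits removed,
        return pl2                           # so PL2 becomes the steady-state draw
    return pl2 if t_seconds < tau else pl1   # at spec: ~tau seconds of PL2, then PL1

for t in (10, 60, 120, 600):
    print(t, package_power(t), package_power(t, board_unlimited=True))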
 

I see you are misrepresenting the point. We said Intel's PL2 power draw is huge. And that is:

Intel 10 core: ~250W
AMD 16-core Max (AMD has no PL2): ~145W

High power consumption used to be a problem just a few years ago ;)

So when talking about how "good" Intel's 14nm process is, power consumption is simply abysmal.
 
You are comparing apples and oranges. That's the problem. 14nm vs 7nm, 5GHz vs 4GHz, and it's ~220W, not 145W... doesn't matter, apples, oranges.
edit: my bad, yes, max power is 145W. Mind the gap...
 

So you are saying Intel's power consumption does not suck because AMD has 7nm tech and Intel has 14nm. Using that logic, there is no reason for Intel to ever go below 14nm, as then we could not compare power consumption against AMD CPUs. Eh? So if in 2022 Intel still has 14nm and AMD has, say, 5nm, we cannot compare them because it's apples and oranges :facepalm:

AMD primarily designed Zen2 to be a server CPU (designed for low clock speeds), and TSMC's 7nm tech is primarily made for moderate clock speeds and quite low power consumption. No surprise Intel's super hot 14nm++++++++++++++++++++ has higher clock speeds but awful power consumption. It's just funny how some try to downplay power consumption now, when AMD got bashed about it for years during the construction-equipment (Bulldozer) era.

Base clocks are: Intel 3.7 GHz, AMD 3.5 GHz. Not much difference there. Just look at the graph above: CPU consumption is 145W, and that is the CPU only, not the whole system.
 
Now you are expanding this to infinity. I don't care about this enough to spend any more time on a pointless discussion about power usage. I personally never cared about power consumption, and I guess no one who ever bought an Intel K CPU did. To me, the 9700K is faster than the 3950X, because I look at the things that matter to me. And Cinebench is not one of them.
 
I see you are misrepresenting the point. We said Intel's PL2 power draw is huge. And that is:

Intel 10 core: ~250W
AMD 16-core Max (AMD has no PL2): ~145W

High power consumption used to be a problem just a few years ago ;)

So when talking about how "good" Intel's 14nm process is, power consumption is simply abysmal.

Power consumption on Intel parts is high just because they run at super high frequencies. Just check out the reviews for the 10700, 10500 and 10300 on TechPowerUp. They consume similar power to AMD Zen 2 parts like the 3700X/3600/3300X while having similar performance (a few percent less in MT and a few extra in gaming).
So that is why people say the power consumption of Intel parts is still reasonable, and at least for that extra power you get top-notch performance.
Don't get me wrong, AMD Zen 2 parts are amazing. Great performance and efficiency. BUT they employ the best there is right now: 7nm TSMC, a brand new uArch (Zen 2), chiplets, Infinity Fabric. And still, Intel with 5-year-old parts on a tweaked 5-year-old process can match them. Not in every segment, sure, since they have an IPC deficit of ~5%, which requires higher frequencies, but for non-K parts Intel is still able to do a lot with their very old parts. Do you get it now?
 
Power consumption on Intel parts is high just because they run at super high frequencies. Just check out the reviews for the 10700, 10500 and 10300 on TechPowerUp. They consume similar power to AMD Zen 2 parts like the 3700X/3600/3300X while having similar performance (a few percent less in MT and a few extra in gaming).

Low-power parts have reasonable power consumption, but when it comes to K parts, it's much higher. Those non-K models are cherry-picked parts and there are very few reasons to get them, because outside gaming AMD is better in every aspect.

When it comes to gaming, K models have a slight lead in old, low-threaded games. But then power consumption comes into play.

So basically, in the cases where power consumption matters (non-K models), AMD is much better. And where it "doesn't matter", Intel has much higher consumption.

So that is why people say the power consumption of Intel parts is still reasonable, and at least for that extra power you get top-notch performance.
Don't get me wrong, AMD Zen 2 parts are amazing. Great performance and efficiency. BUT they employ the best there is right now: 7nm TSMC, a brand new uArch (Zen 2), chiplets, Infinity Fabric. And still, Intel with 5-year-old parts on a tweaked 5-year-old process can match them. Not in every segment, sure, since they have an IPC deficit of ~5%, which requires higher frequencies, but for non-K parts Intel is still able to do a lot with their very old parts. Do you get it now?

Chiplets are a huge drawback when it comes to desktop parts, especially when it comes to power consumption. Infinity Fabric is very complicated as it contains multiple buses; however, it's needed mostly because of the chiplet design. Also, Zen2 was designed to be a server chip first, and it was designed for low clock speeds. Considering those, Zen2 on desktop is quite awesome. Current Zen2 is basically the worst-case scenario for desktop (and the best case for servers). Perhaps Zen3 will bring something better for desktop too.

Intel can "match" when they can offer only 10 cores while AMD can offer 16, yeah.

So Intel can "match" only with a process made for desktop parts and an architecture that is not server-first, while "offering" huge power consumption, far fewer cores and a technically worse chip (no PCIe 4.0, and fewer lanes). That's actually not surprising at all. I'm actually surprised how good Zen2 looks on desktop; I expected much worse.

Looking at the server side, where Zen2 was primarily aimed, AMD is so far ahead that I wonder who is stupid enough to still buy Intel servers.
 
I don't fully agree with you. What I do agree with is, yes, Intel has pretty bad power consumption for their top-of-the-line K parts, but what do you expect at 5+GHz? AMD is some half a GHz to a GHz away in frequency (thanks to better IPC), and that, as you know, is extremely important since power consumption doesn't scale linearly with frequency, and especially not with voltage.
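To put rough numbers on that scaling argument: dynamic CPU power goes roughly as P ~ C * V^2 * f, and the extra voltage needed for the last few hundred MHz is what makes 5 GHz parts so thirsty. A minimal sketch, with made-up but plausible voltage figures rather than measured data:

def rel_dynamic_power(freq_ghz, volts, base=(4.0, 1.00)):
    # relative dynamic power vs a 4.0 GHz / 1.00 V baseline, using P ~ C * V^2 * f
    f0, v0 = base
    return (volts / v0) ** 2 * (freq_ghz / f0)

print(rel_dynamic_power(4.0, 1.00))   # 1.00x  baseline
print(rel_dynamic_power(4.5, 1.15))   # ~1.49x power for a 12.5% higher clock
print(rel_dynamic_power(5.0, 1.30))   # ~2.11x power for a 25% higher clock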
But their non-K parts are still technically impressive (I repeat myself so you don't get me wrong: AMD parts are a better buy), in that they can match Zen 2 parts (which, sure, have their sources of inefficiency with IF and chiplets) that are built on a very good process and a brand new uArch. It is a testament to just how good the 14nm base was and how scalable the Skylake uArch was and still is. Or maybe it is a sign that AMD should push things a bit further with Zen 3, since Zen 2 is good, but not amazing.
Non-K parts beat their AMD counterparts in gaming, that is a fact.
As for core-to-core matching, there is only so much Intel can do with a monolithic die, and the 10980XE shows just that: it requires lots of voltage and frequency to keep up with the 3950X's advantage in IPC, so the power consumption problem becomes even more of a problem.
Make no mistake, what Intel has now is a stopgap. The fact that this stopgap still performs decently is what I was talking about.
As for servers, believe me, there are a lot more things that matter when you buy networking infrastructure besides the CPU. And Intel holds a pretty big advantage in support, quality of integration and software, compatibility, ease of upgrade, etc. AMD has some pretty damn good parts in servers, but as you can see they haven't really taken off yet since, like in laptops, AMD doesn't get involved nearly enough to make the transition to AMD a smooth ride.
 
I don't fully agree with you. What I do agree with is, yes, Intel has pretty bad power consumption for their top-of-the-line K parts, but what do you expect at 5+GHz? AMD is some half a GHz to a GHz away in frequency (thanks to better IPC), and that, as you know, is extremely important since power consumption doesn't scale linearly with frequency, and especially not with voltage.
But their non-K parts are still technically impressive (I repeat myself so you don't get me wrong: AMD parts are a better buy), in that they can match Zen 2 parts (which, sure, have their sources of inefficiency with IF and chiplets) that are built on a very good process and a brand new uArch. It is a testament to just how good the 14nm base was and how scalable the Skylake uArch was and still is. Or maybe it is a sign that AMD should push things a bit further with Zen 3, since Zen 2 is good, but not amazing.

Like I said, Zen2 for desktop is simply an emergency edition. Nothing like what it should have been. Backtracking three years.

AMD had, and still has, a WSA (wafer supply agreement) with GlobalFoundries. At that time GlobalFoundries was developing a 7nm high-performance process (no reason to directly compete with TSMC, since TSMC had superior capacity anyway), and since GF didn't have enough capacity for AMD, they publicly said that AMD may use other foundries too. So AMD used TSMC for Epyc chiplets and Vega 20.

Now, what was Ryzen supposed to be? Since the GF process was supposed to be high-performance, it's not hard to guess Ryzen was going to be a GF part. Chiplets? They make no sense there: while AMD had a tight supply of TSMC 7nm, with GF it was "we must use everything they offer". Additionally, laptop parts are monolithic, so AMD must have had a monolithic design ready anyway (just remove the GPU).

GF screwed up both its 20nm and 14nm processes. What if they screwed up 7nm too? AMD surely thought about this. They surely needed a backup solution. That is the 12nm chipset/IO die. With that, AMD could at least use the chip as the X570 chipset, but it also serves as the backup solution for a chiplet-based Ryzen.

To conclude: Zen2 Ryzen was probably supposed to be a monolithic design on a high-performance process with a desktop cache design. Now it's a chiplet design on a mid-performance process with a server cache design. Additionally, the Zen2 architecture was aimed at low clocks (source: AMD lead engineer). No wonder it does not look so good on desktop. Zen2 Ryzen is basically everything a desktop chip should NOT be.

Non-K parts beat their AMD counterparts in gaming, that is a fact.

I found a similar situation when testing Battlefield 1. Performance was smooth with the Ryzen processors while every now and then the quad-core 7700K had a small hiccup. These were rare but it was something I didn't notice when using the 1800X and 1700X. But as smooth as the experience was, it doesn't change the fact that gamers running a high refresh rate monitor may be better served by a higher clocked Core i7-6700K or 7700K.


While Intel's quad-core offers better FPS, gaming performance is not necessarily better except at very high frame rates.

As for core-to-core matching, there is only so much Intel can do with a monolithic die, and the 10980XE shows just that: it requires lots of voltage and frequency to keep up with the 3950X's advantage in IPC, so the power consumption problem becomes even more of a problem.
Make no mistake, what Intel has now is a stopgap. The fact that this stopgap still performs decently is what I was talking about.

That is because it's being compared to AMD's worst-case desktop CPU.

As for servers, believe me, there are a lot more things that matter when you buy networking infrastructure besides the CPU. And Intel holds a pretty big advantage in support, quality of integration and software, compatibility, ease of upgrade, etc. AMD has some pretty damn good parts in servers, but as you can see they haven't really taken off yet since, like in laptops, AMD doesn't get involved nearly enough to make the transition to AMD a smooth ride.

I have heard these many times. On servers, power consumption matters, CPU performance matters, etc. Now that AMD has a three-digit advantage, it suddenly comes down to "support" or other strange things that sound more like "we like Intel" brainwashing.

Looking at the laptop side, there are no excuses: most laptops are for casual use and users won't notice at all whether there is an Intel or an AMD CPU inside. Still, Intel sells much better.

Because many people are still brainwashed toward Intel. As time goes on, more and more of that brainwashing will go away. Right now that is not a big problem for AMD, because 7nm is in tight supply, thanks to GlobalFoundries.
 
And after reading all of THAT..... (what did this devolve into?)

...my 4500U laptop arrives in a few days. I guess I'm just not convinced by the random angry Intel person up there.
 
I have heard these many times. On servers, power consumption matters, CPU performance matters, etc. Now that AMD has a three-digit advantage, it suddenly comes down to "support" or other strange things that sound more like "we like Intel" brainwashing.

Looking at the laptop side, there are no excuses: most laptops are for casual use and users won't notice at all whether there is an Intel or an AMD CPU inside. Still, Intel sells much better.

Because many people are still brainwashed toward Intel. As time goes on, more and more of that brainwashing will go away. Right now that is not a big problem for AMD, because 7nm is in tight supply, thanks to GlobalFoundries.
The brainwashing is brand perception. It is something that Intel has built over the past 10 years, when AMD had mediocre offerings (at least now, if you buy a laptop with an Intel CPU, it is not that far off a Renoir one), while Intel was pushing for slimmer chassis and better battery life (remember the Sandy Bridge mobile launch? Or Haswell?). This is something you don't gain back in one generation, and especially not with how AMD tries to do it. Instead of sponsoring OEMs to build premium designs and very high-quality laptops, they still keep this "AMD is the cheapest option" image attached to them. And don't come telling me that OEMs are bribed by Intel to refuse AMD, because I don't buy it. AMD just doesn't invest sufficient money and manpower into OEM partnerships.

As for servers, give me one reason why every single company in this world didn't jump on Zen 2 servers. Go and ask system engineers/managers and they will tell you what I did.
 
The brainwashing is brand perception. It is something that Intel has built over the past 10 years, when AMD had mediocre offerings (at least now, if you buy a laptop with an Intel CPU, it is not that far off a Renoir one), while Intel was pushing for slimmer chassis and better battery life (remember the Sandy Bridge mobile launch? Or Haswell?). This is something you don't gain back in one generation, and especially not with how AMD tries to do it. Instead of sponsoring OEMs to build premium designs and very high-quality laptops, they still keep this "AMD is the cheapest option" image attached to them. And don't come telling me that OEMs are bribed by Intel to refuse AMD, because I don't buy it. AMD just doesn't invest sufficient money and manpower into OEM partnerships.

Exactly. It's mostly about brand. And buying something worse for a higher price just because of "brand blah blah" is just plain stupid.

How do you know it's AMD's fault? There don't seem to be any Ryzen 7 4800U laptops with a better GPU than the integrated one. So does AMD say OEMs cannot use anything other than the integrated GPU? Or is it just that OEMs don't want to use them yet? The problem is that OEMs seem to make the final decision here. It would be very nice to know the exact reason.

As for servers, give me one reason why every single company in this world didn't jump on Zen 2 servers. Go and ask system engineers/managers and they will tell you what I did.

Because server buyers are just plain stupid, repeating the same mistake over and over again.

Before 2000: server buyers scream that Intel charges too much for server chips.

2003: AMD launches Opteron, which clearly beats Intel everywhere. Server buyers still buy Intel because ...

AMD decided to ditch the whole server business since the stupid buyers would rather buy Intel.

Before Epyc: server buyers scream that Intel charges too much for server chips.

2017: Epyc is launched, somewhat better and cheaper than Intel.

2019: Zen2 Epyc is better than Intel almost everywhere. Intel CPUs are plagued by security flaws. Server buyers still buy Intel because ...

What happens next? AMD decides to ditch the whole server business since stupid buyers would rather buy Intel, and then server buyers scream that Intel charges too much for server chips?

So first server buyers complain about Intel's high prices, and when there is an alternative, they still buy Intel and soon complain about Intel's high prices again. Complaining about a situation they caused themselves. *****s.
 
Exactly. It's mostly about brand. And buying something worse for a higher price just because of "brand blah blah" is just plain stupid.
Agree 100% on that, but this is how things work in real life.

How do you know it's AMD's fault? There don't seem to be any Ryzen 7 4800U laptops with a better GPU than the integrated one. So does AMD say OEMs cannot use anything other than the integrated GPU? Or is it just that OEMs don't want to use them yet? The problem is that OEMs seem to make the final decision here. It would be very nice to know the exact reason.
It seems you don't really understand how OEMs make a laptop, do you? An OEM needs to be convinced by the manufacturer that a product is worth a design win (read: money, time and engineers from the OEM).
Then the manufacturer needs to provide the OEM with everything it needs: design documentation, power specifications, possible configurations and, most importantly, field engineers. So it is fair to say that if OEMs don't currently have some specific configuration that you want from them, it is AMD's fault, because they didn't invest the resources and money to convince and help the OEM to make it a reality. And in any case, laptop design and construction is not like a PC, where you get some components, strap them together and boom, it works. Also, many of the current high-end laptops with discrete GPUs are built around Nvidia and Intel designs (read: platforms created in a joint effort by Nvidia/Intel/OEM engineers), so the OEM cannot, and cannot fairly be expected to, reuse them with a different piece of hardware. So if AMD wants something similar, they need to create the blueprints for the OEMs. So yeah, it is AMD's fault.

AMD decided to ditch the whole server business since the stupid buyers would rather buy Intel.
I agree with you on the past era, that Intel bribed OEMs to buy more Intel CPUs. Intel has paid for its faults.
Moving to the current era, AMD didn't decide to ditch the server business. As far as I know, in 2010-2015 they still had server parts. Their problem was mainly management related, with the buyout of ATI being a very risky move and selling the fabs also bringing them to their knees. AMD has made a lot of mistakes in the past, let's admit this. I sincerely admire what they do now; they wouldn't be here now if they hadn't passed through all these transformative steps.
In any case, back to servers: Intel has been good in this market since the Core 2 era, slowly gaining a LOT of traction because they had very good power consumption with good performance in each generation. What you don't see is the most important thing, and that is quality of service. During all these years Intel has created strong partnerships with big companies and small businesses, and they provide a very good technical and support package (sure, not that good performance-wise today, but between 2010 and 2016 they took even IBM down). AMD is a case of good performance and power but lacking in support and quality. Believe me, since I have friends working in this domain, and while things are starting to change for the better for AMD, they are still a long way from matching their competitor.

The bottom line is that changes need time. AMD can't become Intel overnight. I mean, they've barely had their first very good product, which is Zen 2. And Zen 2's success is partly its technical prowess, but also partly because Intel has had major issues. You can only say you are a winner when your opponent is firing on all cylinders and giving the best they've got, but in all honesty Intel is fighting AMD with 2015-era parts.
As for the trajectory of AMD, they need to keep executing on the hardware side. They also need to improve their relations with OEMs and step up their game in providing assistance, quality documentation and engineering support to OEMs if they want the same kind of good stuff that Intel has. There is no way around that.
 
Agree 100% on that, but this is how things work in real life.

Of course, but that just proves that it depends on something other than quality and such.

It seems you don't really understand how OEMs make a laptop, do you? An OEM needs to be convinced by the manufacturer that a product is worth a design win (read: money, time and engineers from the OEM).
Then the manufacturer needs to provide the OEM with everything it needs: design documentation, power specifications, possible configurations and, most importantly, field engineers. So it is fair to say that if OEMs don't currently have some specific configuration that you want from them, it is AMD's fault, because they didn't invest the resources and money to convince and help the OEM to make it a reality. And in any case, laptop design and construction is not like a PC, where you get some components, strap them together and boom, it works. Also, many of the current high-end laptops with discrete GPUs are built around Nvidia and Intel designs (read: platforms created in a joint effort by Nvidia/Intel/OEM engineers), so the OEM cannot, and cannot fairly be expected to, reuse them with a different piece of hardware. So if AMD wants something similar, they need to create the blueprints for the OEMs. So yeah, it is AMD's fault.

Well, it does seem now that my information was old. There are plenty of laptops with discrete graphics, with something like a GeForce RTX 2060 or so. However, it seems that the RTX 2070 and higher are exclusively paired with Intel CPUs. And how is this AMD's fault? OEMs are able to pair an RTX 2060 with a Ryzen 7 4800U but are unable to pair a Ryzen 7 4800U with an RTX 2070? Simply replacing a 2060 with a 2070 is a fairly trivial task when there is already knowledge of how to pair an AMD CPU with a 2060. And there is.

It simply seems that Intel has some kind of exclusive deal with OEMs. Or OEMs don't want to use higher-end RTX chips with AMD CPUs. Since Intel would probably be in trouble after all the court cases if they used tricks like this, it seems the OEMs made this decision. At least so far. I expect there to be high-end RTX parts paired with AMD CPUs in the future. It just makes me wonder why there aren't any right now.

I agree with you on the past era, that Intel bribed OEMs to buy more Intel CPUs. Intel has paid for its faults.
Moving to the current era, AMD didn't decide to ditch the server business. As far as I know, in 2010-2015 they still had server parts. Their problem was mainly management related, with the buyout of ATI being a very risky move and selling the fabs also bringing them to their knees. AMD has made a lot of mistakes in the past, let's admit this. I sincerely admire what they do now; they wouldn't be here now if they hadn't passed through all these transformative steps.
In any case, back to servers: Intel has been good in this market since the Core 2 era, slowly gaining a LOT of traction because they had very good power consumption with good performance in each generation. What you don't see is the most important thing, and that is quality of service. During all these years Intel has created strong partnerships with big companies and small businesses, and they provide a very good technical and support package (sure, not that good performance-wise today, but between 2010 and 2016 they took even IBM down). AMD is a case of good performance and power but lacking in support and quality. Believe me, since I have friends working in this domain, and while things are starting to change for the better for AMD, they are still a long way from matching their competitor.

We could say AMD ditched the server business after 2012, as there were no new models until Epyc came out in 2017. Yes, they still sold chips, but there wasn't anything new. Not even rebrands.

After AMD released the Athlon MP and (Athlon 64 based) Opteron in the early 2000s, it took Intel many years to even come close to matching AMD. Still, AMD couldn't sell those superior CPUs. It really doesn't matter what the situation is right now, or whether AMD is behind in some kind of "support" things and such. The problem is that server buyers complain about Intel's high prices, and when there is a better alternative, they still buy Intel, and when the alternative goes away (Intel, of course, keeps prices down only while it's around), they complain about Intel's high prices. So essentially they complain about a situation they caused themselves. Doesn't sound wise to me.

The bottom line is that changes need time. AMD can't become Intel overnight. I mean, they've barely had their first very good product, which is Zen 2. And Zen 2's success is partly its technical prowess, but also partly because Intel has had major issues. You can only say you are a winner when your opponent is firing on all cylinders and giving the best they've got, but in all honesty Intel is fighting AMD with 2015-era parts.
As for the trajectory of AMD, they need to keep executing on the hardware side. They also need to improve their relations with OEMs and step up their game in providing assistance, quality documentation and engineering support to OEMs if they want the same kind of good stuff that Intel has. There is no way around that.

Zen was a good part, very good considering its pricing. Zen2 is simply awesome.

It's just that I cannot see any major problems with AMD's "support" to OEMs and such. Since Zen2 is going to power multiple supercomputers, it really makes me wonder what this mystical "support" is that Intel can give and AMD cannot. After all, supercomputers are much harder to get running than a simple server is. So it seems, again, that many OEMs and server buyers are making bad excuses for why they should still buy Intel. Luckily, this time all of that doesn't seem to slow AMD down too much. If it does, and AMD ceases server development again, they will soon be complaining about how high Intel's prices are again.

So basically, desktop buyers try to keep competition alive, while OEMs and server buyers seem to concentrate on complaining instead of doing something.
 
You are stuck in your thinking, I see, and it seems that even when I bring you a palpable argument you are still beating around the bush with "OEMs don't embrace AMD" and "OEMs don't want to do AMD's work".
OEMs are customers for AMD. Get that. A customer expects to be treated well, expects the involvement of the seller (AMD), and expects to get a product to market that will sell well at high prices, be relatively easy to design and make, have big margins, etc.
AMD is providing only a small part of this compared to Intel, so if you are not willing to understand that, then we're done talking.
OEMs have absolutely no obligation to any manufacturer to do anything. There have been lots of companies with very nice pieces of hardware that were ignored just because they didn't know how to sell their products. If you don't understand that, it is your problem, because in this domain it is just as much about marketing as it is about hardware.
 
You are stuck in your thinking, I see, and it seems that even when I bring you a palpable argument you are still beating around the bush with "OEMs don't embrace AMD" and "OEMs don't want to do AMD's work".
OEMs are customers for AMD. Get that. A customer expects to be treated well, expects the involvement of the seller (AMD), and expects to get a product to market that will sell well at high prices, be relatively easy to design and make, have big margins, etc.
AMD is providing only a small part of this compared to Intel, so if you are not willing to understand that, then we're done talking.

What you don't seem to understand is that OEMs have been complaining about Intel's pricing and supply issues since the Coppermine days (1999). Right now Intel has serious supply problems and pricing is still very high. And what do OEMs do? They complain about Intel's pricing and supply issues. So they are complaining about the issues but are not prepared to do anything about them.

I work for a company. That company needs a somewhat steady supply of materials. If our main supplier has supply problems, whether in quality or quantity, what does the company do? Just complain about the problems? No, it starts to seek other sources.

When it comes to OEMs, you would think they are happy with the situation, because they are not doing anything to solve the problems. Except they aren't happy. And neither are customers.

I already gave a good example: the RTX 2060 and RTX 2070 mobile versions. Both use a PCIe 3.0 x16 interface. So there is absolutely nothing hard about making an RTX 2070 laptop when there is already an RTX 2060 laptop in place.

It just seems you are just like the OEMs. They are making very stupid excuses for why they should still use only Intel, despite constantly complaining about Intel's pricing and supply issues.

OEMs have absolutely no obligation to any manufacturer to do anything. There have been lots of companies with very nice pieces of hardware that were ignored just because they didn't know how to sell their products. If you don't understand that, it is your problem, because in this domain it is just as much about marketing as it is about hardware.

Of course OEMs can use what they want. But then it's better to stop complaining about how Intel does this and that wrong. How about solving the problem?

AMD does not seem to have big problems selling their products in the retail sector. Remember that AMD cannot sell CPUs without motherboards, so motherboard makers must have a close partnership with AMD. AMD was also able to secure multiple supercomputer deals.

So it really makes me wonder what these "marketing problems" you are talking about are. They must be very severe, since Intel has serious problems too. It just seems more and more that there are no valid reasons, just excuses. Needless to say, it makes OEMs look more and more like morons.
 
You don't understand why because you oversimplify how a laptop is designed and built. I already told you why OEMs prefer Intel but you seem to read but not understand, so I'll stop here. Believe what you want.
 

You didn't tell me anything. Just the same "AMD sucks blah blah" excuses I have heard many times. Nothing that proves any of my points wrong.

You probably didn't notice that the RTX 2060 mobile and RTX 2070 mobile have the same TDP. So you are basically saying that nobody uses the RTX 2070 with AMD because AMD does not give enough support. Support for what? For integrating another GPU with the SAME bus AND the same TDP into a laptop motherboard?

So much "support" needed there indeed *nerd*
 
Where exactly did I say "AMD sucks"??? Please quote those words.
I think you have a fixation and you simply cannot have an objective conversation with someone, so you just repeat the same thing over and over. That is why I say: believe what you want, and let's end this discussion here.
I will leave this here: https://www.igorslab.de/en/why-own-...lue-conspiracy-no-is-insights-and-commentary/
If you are still beating around the bush with your own opinions, then I don't know, go and take a job at an OEM and start making whatever you want. But in the meantime I prefer to believe a professional with inside knowledge rather than some random dude on the TechSpot forums.
 
They're in the article; it shows the same outcome when you look at the results for the tests using integrated graphics only. However, if you want further data, you may have to wait a while, as there aren't many 4500U reviews out there. Anandtech have done the 4700U:


[Chart: Anandtech Ryzen 7 4700U benchmark results]

From the specs alone, one would expect the 4700U to outperform the 1065G7:

Core i7-1065G7 vs Ryzen 5 4700U
Cores: 4 vs 8
Threads: 8 vs 8
Base clock: 1.3 vs 2.0 GHz
Boost: 3.5/3.9 vs 4.1 GHz
L3 cache: 6 MB vs 12 MB

But the 2700U performs, in that particular test, just as well as the 1065G7, and the specs are comparable too:

Core i7-1065G7 vs Ryzen 5 2700U:
Cores: 4 vs 4
Threads: 8 vs 8
Base clock: 1.3 vs 2.2 GHz
Boost: 3.5/3.9 vs 3.8 GHz
L3 cache: 6 MB vs 4 MB


Yes, they are - the Core i7-1065G7 is on their 10nm node:


Who are you referring to with 'They' - AMD or Intel?


Intel were claiming approximately 100 million transistors per square mm, 3 years ago:


They typically don't state transistor counts for their products, so unless Intel changes tack with the forthcoming 10 nm lineup, all one will be able to do is make some educated guesses. As things currently stand, Renoir is roughly 62 million transistors per square mm; Matisse is just over 50; Pinnacle Ridge around 25.
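For anyone wanting to sanity-check those density figures, here is the back-of-envelope arithmetic using commonly cited, approximate transistor counts and die areas; treat the inputs as ballpark public numbers rather than exact values, and note that Matisse is counted as the Zen 2 chiplet only.

# Rough density check: transistors / die area, using widely quoted approximate figures.
dies = {
    "Renoir (7nm)":          (9.8e9, 156),   # ~9.8B transistors, ~156 mm^2
    "Matisse CCD (7nm)":     (3.9e9, 74),    # Zen 2 chiplet only, IO die excluded
    "Pinnacle Ridge (12nm)": (4.8e9, 192),
}
for name, (transistors, area_mm2) in dies.items():
    print(f"{name}: {transistors / area_mm2 / 1e6:.0f} M transistors per mm^2")
# Prints roughly 63, 53 and 25 - in line with the figures quoted above.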

I'd expect Intel's 10 nm node to be at least as dense as Renoir, but possibly scaled back from the initial targets - they aimed for an enormous increase in density over their original 14 nm node, which has almost certainly been part of the issues they've been experiencing with the newer process.
I commend your patience.
 