AMD Ryzen 9 6900HS Review: Can it beat Alder Lake?

It all comes down to pricing... can I get a 12700 for the same price as the 6900? If so, that's an epic fail for AMD... if the AMD laptop comes in 10-20% cheaper, then we have a product...

Anyone wanting the "best laptop" will be going Intel this round... but you don't sell too many of those.
 

Not really: nowadays most people want very slim and light laptops. For that you have to go iGPU. AMD offers an impressive CPU with the best iGPU, which delivers decent gaming. Every other chip is worse.

Intel offers the usual: the best CPU performance when you pull "unlimited" power, paired with a weak iGPU, which means on a laptop I either limit the speed and get decent battery life, or get good speed and horrible battery life. AMD seems to have a very balanced chip.

I hope that most vendors put these AMD chips in balanced laptops. I also don't use Adobe products due to very poor code optimization and Intel-only optimization.
Apple delivers that balance on the M1 laptops. Even my Mac Mini M1 is very snappy and draws very little energy. And I use Adobe alternatives for video and photo and I'm very happy.
 
Bodes well for future Zen 4/4+ laptops, except maybe for Adobe performance. I predict they should exceed Alder Lake's and maybe even Raptor Lake's average performance by at least 5-10%.
Based on what? Wishful thinking?
Anyone paying for that CPU and using the iGPU is a fool... if you aren't going discrete, you should be getting a cheaper CPU as well... an i5 or Ryzen 5...
 
Those power numbers are really impressive. The most interesting part, though, is how close AMD is getting to the M1 Pro. Apple's magic ARM chip, as it turns out, may just be benefiting from being on 5nm (many of us already knew this), and I can't wait to see what Zen 4 on 5nm brings to the table.
Based on what? Wishful thinking?
Based on the graphs, maybe? Zen 3+ is dramatically more efficient than Zen 3 was, and Zen 4 will be on the still more efficient 5nm node. In the majority of laptop cases Zen 3+ will be superior, simply because most laptops cannot sustain a 45-watt power draw today, let alone 75 watts.

Anyone paying for that CPU and using the iGPU is a fool... if you aren't going discrete, you should be getting a cheaper CPU as well... an i5 or Ryzen 5...
Getting an i5 or Ryzen 5 means getting a slower and weaker iGPU. Did you not think that someone buying this for the iGPU might, well, want the iGPU?

Shocking, I know, but not all of us want a big, hot dGPU in a big, heavy laptop when all we really need is a better iGPU for casual gaming. My 4800H laptop has no dGPU and works great for most games I play on the go, like Tropico or Civ, and has the added benefit of working out of the box with Linux, unlike dGPU setups. If you are using the iGPU, the higher TDP of the H-series chips allows for higher, more consistent GPU boosting. The 4800H and 4700U are the same chip, but the 4700U only maintains 1200-1300 MHz GPU clocks while the H manages 1600 MHz consistently; the real-life result is that the H is anywhere from 20-45% faster depending on the title. And when you are not gaming, the H is a stone's throw away from the U in terms of real-world battery life.
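For anyone curious how those clocks line up with the 20-45% figure, here is a rough back-of-envelope sketch. The clock numbers come from the comment above; the assumption that iGPU performance scales roughly linearly with sustained clock is mine, and real games will deviate (memory bandwidth, thermals, CPU limits).

```python
# Back-of-envelope: how much faster the 4800H's iGPU "should" be than the
# 4700U's if performance scaled linearly with sustained GPU clock.
# Clock figures are from the comment above; linear scaling is an assumption.

u_clocks_mhz = (1200, 1300)  # sustained iGPU clock reported for the 4700U
h_clock_mhz = 1600           # sustained iGPU clock reported for the 4800H

for u in u_clocks_mhz:
    speedup_pct = (h_clock_mhz / u - 1) * 100
    print(f"4800H vs 4700U sustaining {u} MHz: ~{speedup_pct:.0f}% faster")

# Prints roughly 23-33%, which lands inside the quoted 20-45% range;
# bandwidth-bound titles fall below it, clock-bound titles can exceed it.
```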

Granted, I think it would be nice if AMD made a Ryzen 5 with the big iGPU for people like me, but they don't make one, so we have no other choice. A Ryzen 5 with the 680M and LPDDR5 RAM would be very interesting indeed.
Even if you are OK with a thicker laptop (I myself miss the days of 0.9-1.0 inch thick laptops with good cooling systems, like the Dell E6440 or Lenovo L440), cooling 75 watts of Intel CPU is a loud and difficult affair. That's just too much for a laptop's cooling system, especially if paired with a dGPU.
 
AMD is like Windows. Every other version is a pass. This is getting ridiculous now. Is this how you tell the world you're back? What a complete joke.
Who shat in your cereal this morning?
Zen 3+...
The "good if priced well" chip.
Zen 3+, the "good if you are using a laptop" chip.

Alder Lake, the "good if you have a nuclear power plant in your living room" chip.
 

Anyone paying for that CPU and using the iGPU is a fool... if you aren't going discrete, you should be getting a cheaper CPU as well... an i5 or Ryzen 5...
Anyone? Wow. Not everyone is a gamer...
I think it's 100% safe to say most laptops are used for basic work tasks that almost any laptop can handle.
 
I'd say this is another win for AMD. The performance is good for the price and power usage. Intel still has the mobile performance crown, but only at the cost of a crap-ton of power.

The 12700H matches the 6900HS in performance per watt when both run at low power (35 to 45 W), according to the review, while Intel has the advantage in power efficiency when both run at higher power (above 45 W).
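In case the metric itself is unclear, here is a minimal sketch of how a performance-per-watt comparison works. The scores and wattages below are hypothetical placeholders chosen only to mirror the shape of that claim, not figures from the review.

```python
# Minimal performance-per-watt sketch.
# Benchmark scores and package powers are HYPOTHETICAL placeholders,
# not the review's numbers; only the method is the point.

def perf_per_watt(score: float, package_watts: float) -> float:
    """Benchmark score divided by sustained package power."""
    return score / package_watts

chips = {
    # name: (hypothetical score, sustained package power in watts)
    "6900HS @ 35 W": (13000, 35),
    "12700H @ 35 W": (13000, 35),   # parity at low power
    "6900HS @ 65 W": (15600, 65),
    "12700H @ 75 W": (18750, 75),   # Intel ahead per watt at high power
}

for name, (score, watts) in chips.items():
    print(f"{name}: {perf_per_watt(score, watts):.0f} points per watt")
```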

 
On the CPU side, yes, but AMD has a much better GPU and more connectivity on board despite a smaller die. AMD could just put in more CPU cores and less GPU to easily surpass Intel on CPU performance.
 
People who think that Apple's efficiency comes only from a process advantage, and not from the fact that ARM uses far less power to run, are extremely delusional. Apple's M1 doesn't consume as much power at max load as these Ryzen or Intel parts; in fact it consumes 39 W at the wall at max, which is less than these Ryzen parts when boosting, yet it's still faster. But the main difference, as shown in the graph here, is that at idle the Ryzen part is using 28 W and the M1 is using 7 watts. That's what's going to make the difference on your battery.

https://www.extremetech.com/extreme...maxs-power-efficiency-should-rattle-intel-amd
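To put those idle numbers in runtime terms, here is a rough sketch under this commenter's figures. The 28 W and 7 W whole-system idle values are the ones quoted above; the battery capacity is a made-up example, and wall-power measurements will differ from actual on-battery draw.

```python
# Rough battery-life estimate from idle system power.
# 28 W and 7 W are the whole-system idle figures quoted in the comment above;
# the 76 Wh battery is a hypothetical example, and real on-battery idle draw
# will differ from at-the-wall measurements.

battery_wh = 76.0  # hypothetical battery capacity

for name, idle_watts in [("Ryzen system", 28.0), ("M1 system", 7.0)]:
    hours = battery_wh / idle_watts
    print(f"{name}: ~{hours:.1f} h at a constant {idle_watts:.0f} W idle")

# Roughly 2.7 h vs 10.9 h under these assumptions, which is why the idle
# gap rather than the peak-load gap would dominate perceived battery life.
```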

Anyone who thinks that shrinking from 6nm to 5nm will make your idle power consumption drop from 28 W to 7 W is in for a big disappointment. If you are one of these people, then please go learn and understand the differences between ARM computing and x86 computing, because you obviously don't understand them currently.

There is a reason why both Google and MS are scrambling to get their own ARM designs out.
 
But they didn't...
Exactly. AMD prioritizes the GPU, Intel the CPU. Simple. But saying Intel has better performance while ignoring the GPU...

Looking at the chart, idle package power is 0.2 watts for Apple vs 1.08 watts for Intel. That makes a 0.88 W difference. Pretty huge, isn't it? That's all the difference "ARM" makes there at idle.
Again, 28 W vs 7 W are whole-system consumption figures. The difference at idle is less than one watt when looking at the CPU/SoC only.

Google and Microsoft are only designing ARM chips because they have no access to x86. Simple.
 
Not if you want or need the CPU power from it...
If you are just doing "basic work tasks" as you stated - you don't need the 6900... Intel's 12500 or a Ryzen 6500 will be fine...

If you are buying a top-of-the-line CPU (see 12900 or 6900), then not spending on a dGPU is foolish... Kind of like buying a Ferrari and putting $40 tires on it...
 
You're putting words in my mouth and using illogical scenarios. Please stop.
 
Show me in this review where it says the 6900 would be better off with a dGPU.

I'll wait...
lol... now who's putting words in people's mouths... I'm simply stating that PRICE is key - read my first posts... if this is priced as a top-tier CPU (i.e., the same as the 12700 or 12900), then it is a terrible choice, as those wipe the floor with it.

If it is cheaper, then depending on how much cheaper, it is a good purchase... but those expecting top-tier performance will be sadly disappointed - as this review was quite clear in showing with its plentiful graphs :)
 