AMD Ryzen 7 7700X Review: Faster than Core i9?

That seems a little premature given that 13th gen is out, or nearly so. If Intel's claimed 30-40% performance increases for 13th gen are anywhere near accurate, it will easily outperform the 7700X at the same price point (i7-13700K) or deliver similar performance (i5-13600K) for 20% less cost ($400 vs $320). Of course, retail discounting could negate the Intel value prop, so we will have to see.
You know the higher perf increase is for MT, due to the additional E-cores and higher clock speeds?

E-cores will do exactly zero for gaming perf.

That said, I agree that we should wait for reviews.
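
The value math above is at least easy to sanity-check with napkin arithmetic. A minimal sketch of the perf-per-dollar argument; the prices come from the post, but the relative-performance figure is a placeholder until reviews land:

```python
# Napkin math for the value argument above. Prices are from the post
# ($400-class 7700X vs. a $320 13600K); the relative-performance
# number is a placeholder, NOT a measured result.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance points per dollar spent."""
    return relative_perf / price

# Hypothetical scenario: the 13600K matches the 7700X (perf = 1.0).
print(perf_per_dollar(1.00, 400))  # 7700X  -> 0.00250
print(perf_per_dollar(1.00, 320))  # 13600K -> 0.00313, ~25% more per dollar
```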
 
I can't see any real reason to change my 12700K (on an MSI MAG B660 Tomahawk WiFi DDR4) for an AM5 solution, particularly the 7700X.

Yes, in some benchmarks the 7700X is faster, but in others it isn't, and even with only a 280mm AIO my benchmark results come in slightly above the DDR4 figures in the report.

I must admit the 7950X is very impressive in productivity; if I were using my PC for business rather than for hobbies, I think that would be the choice, and Intel's Raptor Lake could struggle against it.

My PC is used for a mix of productivity and games as hobbies, hence originally going for the 12700 (I bought the K on offer, cheaper than the non-K version!). This "middle ground" is really going to get interesting over the next few weeks: 12700K, 5800X3D, 13700K and 7700X, plus DDR4 vs DDR5, and I think it's going to be a close call.

Let's be honest, they are all good CPUs, and it might take more than one or two GPU upgrades before any of them is a serious limit in gaming.
 
I can't see any real reason to change my 12700K (on an MSI MAG B660 Tomahawk WiFi DDR4) for an AM5 solution, particularly the 7700X.

Yes, in some benchmarks the 7700X is faster, but in others it isn't, and even with only a 280mm AIO my benchmark results come in slightly above the DDR4 figures in the report.

I must admit the 7950X is very impressive in productivity; if I were using my PC for business rather than for hobbies, I think that would be the choice, and Intel's Raptor Lake could struggle against it.

My PC is used for a mix of productivity and games as hobbies, hence originally going for the 12700 (I bought the K on offer, cheaper than the non-K version!). This "middle ground" is really going to get interesting over the next few weeks: 12700K, 5800X3D, 13700K and 7700X, plus DDR4 vs DDR5, and I think it's going to be a close call.

Let's be honest, they are all good CPUs, and it might take more than one or two GPU upgrades before any of them is a serious limit in gaming.
I'm expecting the real deal (the 3D ones) will be announced right after Intel releases Raptor Lake, and this time probably a whole line of CPUs instead of one (e.g. 7600X3D, 7900X3D, etc.). Bait and switch, whether we like it or not.
 
The 12700K + DDR5 beats the 7700X in the two most demanding games on the list (Spider-Man and Cyberpunk 2077).

The question is: why should I care about light games that run well even on a weak CPU? Winning in these types of games is pointless.


Counter-Strike should not have been tested


Sounds like the 7700X will have a hard time against the 13700K... The 13700K will win easily in productivity and likely match it in gaming (and the 13700K probably wins in the most demanding games).
 
What are you talking about? AMD managed to increase performance without raising the wattage ceiling in Zen 3.

The point of AM5's higher TDP capability is to push the CPU to its limit, where AMD is confident it will still be reasonably within spec. Granted, we don't know how that will affect the CPU in the long term. That being said, the point is theoretically to allow more wattage headroom for more performance gain. You, as the user, are still able to limit the wattage yourself if you wish to do so, but the system is smart enough to do it for you.

Anyway, by your logic, Intel is the best at releasing poorly engineered products, since their mantra for the past few years, ever since Skylake's release in 2015, has been to keep increasing performance by raising clock speeds and thus wattage and overall power consumption. It's weird that AMD gets bad-mouthed by you when the competitor has literally done worse. Not that AMD is any better, but it's strange that you're even throwing out a strawman.
As the end user you shouldn't have to limit voltage to account for a poorly engineered product; if you choose to overlook that deficiency, that's on you.

I never brought up Intel?...
 
The high thermals are due to the new, insanely thick IHS. Look up YouTube videos of people delidding Zen 4 and causing the max temps to drop by 20°C, down into the 70s. Zen 4 is not really limited by power consumption, since it only uses slightly more power than Zen 3 and uses WAY less power than the comparable Intel chips. Look at this review's power consumption figures: the 7700X could add 100 watts to its max power consumption and still be power-consumption competitive with Intel, according to the charts.

[Chart: power consumption]
You should read my first post; I already mentioned the extra-thick IHS as the culprit for the increase in thermals.

Let me get this right: it's okay for AMD to increase their power consumption and now have chips that run into thermal limits "by design", but if the other brand does so, it's not okay? Sounds an awful lot like a double standard.

Going by the chart above, 5600X to 7600X is a 75 watt increase, hardly a "slight" power bump, and compared to the 12600K it actually uses more power.

There are no 5700X results in the chart to compare against the 7700X; however, it draws more power than a 5900X, a 12-core chip, and only 20 watts less than the 12700K. Sure, add 100 watts to that and now it'll be more competitive? What are you talking about?

5950X to 7950X is a 130 watt increase, again not what I would call a "slight" increase in power but quite a significant one, and now it's only 7 watts away from the 12900K.

Unfortunately, based solely on these charts we don't get true CPU power consumption, as these are platform-load figures; having moved to DDR5, the apparent CPU power draw could be affected. It would be interesting to have the CPUs' power consumption isolated.
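
One rough way to get closer to isolated CPU power from platform-level numbers is to subtract an idle baseline measured on the same board and memory. A minimal sketch; the wattage figures below are placeholders, not numbers from this review:

```python
# Rough CPU-power isolation from total platform measurements: subtract
# the idle platform draw measured on the same board/RAM combination.
# The wattage figures below are placeholders, not review data.
def est_cpu_power(platform_load_w: float, platform_idle_w: float,
                  cpu_idle_w: float = 20.0) -> float:
    """Estimate CPU package power under load.

    platform_load_w - platform_idle_w removes the board/RAM/VRM baseline;
    cpu_idle_w (an assumed figure) adds back what the CPU drew at idle.
    """
    return platform_load_w - platform_idle_w + cpu_idle_w

print(est_cpu_power(platform_load_w=330, platform_idle_w=75))  # ~275 W
```

This still folds VRM-efficiency differences into the estimate, so it's only a ballpark, but it gets closer than raw platform load.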
 
Imagine what AMD can do with the 7800X3D. The gaming benchmarks will be much better, especially the single-core results, which matter more in CPU-bottleneck scenarios. I just hope they price it right, since the new motherboards are way too expensive.
 
You know the higher perf increase is for MT, due to the additional E-cores and higher clock speeds?

E-cores will do exactly zero for gaming perf.

That said, I agree that we should wait for reviews.
I would love to see benchmarks that include everything you might be running while gaming, e.g. turning on Discord, maybe some CPU/GPU monitoring tools, etc. Whatever you might have up and running while gaming. Then, I think MT would become more important. Raw game benchmarks are interesting, but I think we all know that people run other things while gaming. I know I do.
 
As the end user you shouldn't have to limit voltage to account for a poorly engineered product; if you choose to overlook that deficiency, that's on you.

I never brought up Intel?...
Yes, you actually don't need to limit your voltage or TDP if you're not comfortable with tweaking your system, because the system is fully aware of your CPU's temperature and the maximum heat dissipation of your CPU's cooling solution. And of course, no one with an electrical engineering degree specializing in semiconductor design who is worth their salt would call it a "poorly engineered product".

I brought up Intel because, if you actually bothered to look at Intel's behavior over the past 5 years, their solution to AMD's Ryzen CPUs has been to increase voltage, increase wattage, anything for more power to raise their products' performance — but you don't have to reply to this; it's simply my rant based on the same logic as your opinion.
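
For what "the system does it for you" means in practice: the boost algorithm continuously clamps clocks against its power and temperature limits, so the chip protects itself without user tuning. A toy model of that control loop; the 142 W PPT and 95°C Tjmax match AMD's published figures for the 7700X, but the linear scaling is a made-up simplification of the real (far more complex) Precision Boost behavior:

```python
# Toy model of a boost governor clamping clocks against its limits.
# The linear power/thermal scaling is invented for illustration only;
# the real algorithm (Precision Boost) is far more sophisticated.
from dataclasses import dataclass

@dataclass
class Limits:
    ppt_w: float    # package power tracking limit (watts)
    tjmax_c: float  # maximum junction temperature (deg C)

def allowed_clock(power_draw_w: float, temp_c: float,
                  target_clock_mhz: float, lim: Limits) -> float:
    """Scale the clock down proportionally if any limit is exceeded."""
    scale = 1.0
    if power_draw_w > lim.ppt_w:
        scale = min(scale, lim.ppt_w / power_draw_w)
    if temp_c > lim.tjmax_c:
        scale = min(scale, lim.tjmax_c / temp_c)
    return target_clock_mhz * scale

# 142 W PPT and 95 C Tjmax (published 7700X limits); the telemetry
# reading of 160 W at 97 C is a hypothetical over-limit sample.
lim = Limits(ppt_w=142, tjmax_c=95)
print(allowed_clock(power_draw_w=160, temp_c=97,
                    target_clock_mhz=5400, lim=lim))  # clamped below 5400
```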
 
I would love to see benchmarks that include everything you might be running while gaming, e.g. turning on Discord, maybe some CPU/GPU monitoring tools, etc. Whatever you might have up and running while gaming. Then, I think MT would become more important. Raw game benchmarks are interesting, but I think we all know that people run other things while gaming. I know I do.
Good thing you asked, because TechSpot actually did test that.


TL;DR — unless you're rendering 4K videos in the background, a few Chrome tabs with YouTube, Discord, Spotify, or any other light application won't hurt your performance by a noticeable margin.
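
For anyone who wants to try this kind of test themselves, the idea is simple: pin a couple of busy-loop workers in the background, then time a fixed foreground workload with and without them. A minimal sketch; the arithmetic loop is a stand-in for a game, not a real benchmark:

```python
# Minimal sketch of "does background load hurt my foreground workload?".
# The spin() workers stand in for Discord/monitoring tools, and work()
# stands in for a game loop; neither is a real benchmark.
import multiprocessing as mp
import time

def spin(stop_after_s: float) -> None:
    """Busy-loop worker simulating a background CPU hog."""
    end = time.perf_counter() + stop_after_s
    while time.perf_counter() < end:
        pass

def work(iterations: int = 5_000_000) -> float:
    """Fixed foreground workload; returns elapsed seconds."""
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += i * i
    return time.perf_counter() - start

if __name__ == "__main__":
    baseline = work()
    workers = [mp.Process(target=spin, args=(10.0,)) for _ in range(2)]
    for w in workers:
        w.start()
    loaded = work()
    for w in workers:
        w.join()
    print(f"baseline {baseline:.2f}s, with background load {loaded:.2f}s")
```

On a CPU with spare cores the two timings should be nearly identical, which is exactly the TL;DR above.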
 
Good thing you asked, because TechSpot actually did test that.


TL;DR — unless you're rendering 4K videos in the background, a few Chrome tabs with YouTube, Discord, Spotify, or any other light application won't hurt your performance by a noticeable margin.
Good article; I would love to see it expanded to other CPU/GPU combos.
 
With Intel stating that they are going to raise prices and AMD lowering theirs, I'm not feeling compelled to support Intel at any time in the future. In fact, I'm planning four builds in the next year or so as replacements for four computers in my home that do various things. Three are ancient AMD builds, and one is an Intel Ivy Bridge X that I upgraded from a Sandy Bridge X (an upgrade path, surprisingly for Intel). I won't be going Intel with any of the new builds.
 
Yes, you actually don't need to limit your voltage or TDP if you're not comfortable with tweaking your system, because the system is fully aware of your CPU's temperature and the maximum heat dissipation of your CPU's cooling solution. And of course, no one with an electrical engineering degree specializing in semiconductor design who is worth their salt would call it a "poorly engineered product".

I brought up Intel because, if you actually bothered to look at Intel's behavior over the past 5 years, their solution to AMD's Ryzen CPUs has been to increase voltage, increase wattage, anything for more power to raise their products' performance — but you don't have to reply to this; it's simply my rant based on the same logic as your opinion.
So then what would you call the decision to increase the thickness of the IHS to maintain quasi-compatibility with existing heatsinks, causing the CPUs to run hotter than necessary?
 
Good, detailed review; it gets a little murky at the end when it comes to the recommendation, however.

As a platform upgrade this still makes no sense right now, not until Intel lays its cards on the table, nor when considering the gaming results. Sure, it's among the fastest CPUs when paired with an RTX 3090 Ti at 1080p; however, that test is nearly irrelevant, as I don't see gaming at 1080p with an RTX 3090 Ti as something anyone in their right mind would do.

I understand the attempt at removing GPU bottlenecks to let the CPU run as fast as it can, but at the end of the day, someone with any last-gen CPU shouldn't see these results and think the upgrade is worth anything more than bragging rights. After all, running CS:GO at over 500 FPS will make all the difference, right?

Then there's the IHS issue: will AMD figure out a way to rectify this with the next generation of CPUs on AM5, or will this extra-thick IHS be standard for the lifespan of the socket? As much as AMD wants to claim 95 degrees is the normal operating temperature, it just doesn't sit right with me and feels like a cop-out for the need to maintain compatibility with existing heatsinks (or marginal compatibility, as it turns out).

Glad to have some competition in the CPU world regardless; looking forward to seeing how things unfold with Raptor Lake.
I assure you, some of the tests were actually GPU-bound even at 1080p with a 3090 Ti. This is my 12900K running Spider-Man at the same settings, but with DLSS on to remove the GPU bottleneck:

 
You should read my first post; I already mentioned the extra-thick IHS as the culprit for the increase in thermals.

Let me get this right: it's okay for AMD to increase their power consumption and now have chips that run into thermal limits "by design", but if the other brand does so, it's not okay? Sounds an awful lot like a double standard.

Going by the chart above, 5600X to 7600X is a 75 watt increase, hardly a "slight" power bump, and compared to the 12600K it actually uses more power.

There are no 5700X results in the chart to compare against the 7700X; however, it draws more power than a 5900X, a 12-core chip, and only 20 watts less than the 12700K. Sure, add 100 watts to that and now it'll be more competitive? What are you talking about?

5950X to 7950X is a 130 watt increase, again not what I would call a "slight" increase in power but quite a significant one, and now it's only 7 watts away from the 12900K.

Unfortunately, based solely on these charts we don't get true CPU power consumption, as these are platform-load figures; having moved to DDR5, the apparent CPU power draw could be affected. It would be interesting to have the CPUs' power consumption isolated.
Let me get this right: AMD increases their power consumption and heat to what Intel typically has in their lineup, but now it's suddenly bad when AMD does it, and you call it reaching a thermal limit and a "poorly engineered prototype"? What do you call most of the higher-end Intel chips from the last 3 generations, then, which were almost all power hungry and a bunch of which also ran very hot?

You're right that the power increase from the 5000 series is more than slight. When I said add 100 watts and it'll still be competitive with Intel, I'm talking about how the 7700X is now the king of gaming, beating out the i9-12900K, which uses 115 watts more. In productivity it seems almost comparable too, rivaling it in Cinebench, rivaling it in 7-Zip decompression, losing a bit in Blender, losing in Corona, winning in Adobe After Effects, Adobe Photoshop, etc. So the 7700X is better in gaming and almost as good as the 12900K in productivity benchmarks while using 115 watts less.
 
Let me get this right: AMD increases their power consumption and heat to what Intel typically has in their lineup, but now it's suddenly bad when AMD does it, and you call it reaching a thermal limit and a "poorly engineered prototype"? What do you call most of the higher-end Intel chips from the last 3 generations, then, which were almost all power hungry and a bunch of which also ran very hot?

You're right that the power increase from the 5000 series is more than slight. When I said add 100 watts and it'll still be competitive with Intel, I'm talking about how the 7700X is now the king of gaming, beating out the i9-12900K, which uses 115 watts more. In productivity it seems almost comparable too, rivaling it in Cinebench, rivaling it in 7-Zip decompression, losing a bit in Blender, losing in Corona, winning in Adobe After Effects, Adobe Photoshop, etc. So the 7700X is better in gaming and almost as good as the 12900K in productivity benchmarks while using 115 watts less.
Wait, what? WHAT? Are you saying that the 7700X is comparable to the 12900K in productivity? LOL, get outta here.
 
Let me get this right: AMD increases their power consumption and heat to what Intel typically has in their lineup, but now it's suddenly bad when AMD does it, and you call it reaching a thermal limit and a "poorly engineered prototype"? What do you call most of the higher-end Intel chips from the last 3 generations, then, which were almost all power hungry and a bunch of which also ran very hot?

You're right that the power increase from the 5000 series is more than slight. When I said add 100 watts and it'll still be competitive with Intel, I'm talking about how the 7700X is now the king of gaming, beating out the i9-12900K, which uses 115 watts more. In productivity it seems almost comparable too, rivaling it in Cinebench, rivaling it in 7-Zip decompression, losing a bit in Blender, losing in Corona, winning in Adobe After Effects, Adobe Photoshop, etc. So the 7700X is better in gaming and almost as good as the 12900K in productivity benchmarks while using 115 watts less.
My original argument, and still the basis of my "poorly engineered prototype" comment, is the overly thick IHS, which doesn't allow heat to be extracted from the CPU fast enough. This is a known issue, and you yourself brought it up as well. Am I being harsh by calling it a poorly engineered prototype? Probably. Nevertheless, it was a poor choice on AMD's part, and my opinion remains that had they not decided to force compatibility with AM4 coolers, the chips would all run cooler and even perform better, as demonstrated when der8auer went direct-die and netted increased performance and lower temperatures.
 
With Intel stating that they are going to raise prices and AMD lowering theirs, I'm not feeling compelled to support Intel at any time in the future. In fact, I'm planning four builds in the next year or so as replacements for four computers in my home that do various things. Three are ancient AMD builds, and one is an Intel Ivy Bridge X that I upgraded from a Sandy Bridge X (an upgrade path, surprisingly for Intel). I won't be going Intel with any of the new builds.
I'm running an Alder Lake rig atm and lovin it.
 
Wait, what? WHAT? Are you saying that the 7700X is comparable to the 12900K in productivity? LOL, get outta here.
Reread what I wrote. I said the 7700X is "almost comparable" because it loses in a bunch of productivity benchmarks, ties in others, and even wins a few.

Get outta here if you can't read correctly.
 
I'm most interested in performance relative to power consumption.

The 12900K's power consumption is outrageous. The 7600X consumes significantly more power than the 5600X.

The 5600X is a good value, but not as good of a value as the 5600, which is 15% cheaper and offers nearly the same level of performance.

The real winner of this test was DDR5-6400 memory.
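
Since efficiency is the lens here, score-per-watt is the number to compare rather than raw scores. A minimal sketch; the scores and package-power figures below are hypothetical placeholders, not taken from this review:

```python
# Performance-per-watt comparison. The scores and package-power figures
# below are placeholders, not measurements from this review.
def points_per_watt(score: float, package_power_w: float) -> float:
    return score / package_power_w

chips = {
    "CPU A": (24_000, 230),  # (benchmark score, watts) - hypothetical
    "CPU B": (19_000, 140),
}
for name, (score, watts) in chips.items():
    print(f"{name}: {points_per_watt(score, watts):.1f} points/W")
```

By this metric a chip can lose on raw score and still win comfortably on efficiency, which is the point being made about the 12900K.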
 
Reread what I wrote. I said the 7700X is "almost comparable" because it loses in a bunch of productivity benchmarks, ties in others, and even wins a few.

Get outta here if you can't read correctly.
But that's... true for everything. Is the 12900K comparable to the 7950X? Of course not, it gets wiped. But according to Phoronix (a 300-benchmark suite), the 12900K wins in LOTS of them.

The 7700X is comparable to the 12700K in productivity, and even loses to it. The 13700K will be in a completely different league.
 
So then what would you call the decision to increase the thickness of the IHS to maintain quasi-compatibility with existing heatsinks, causing the CPUs to run hotter than necessary?
Source on how a thicker IHS causes higher CPU temperatures?

Or did I read your sentence wrong?
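
For what it's worth, the basic mechanism is plain conduction: for a fixed heat flow P through a slab of thickness t and area A, Fourier's law gives ΔT = P·t/(k·A), so every extra millimetre of copper between die and cooler adds a temperature delta. A rough estimate with assumed numbers (the die area, extra thickness, and power below are ballpark guesses):

```python
# Rough conduction estimate for extra IHS thickness (Fourier's law:
# dT = P * t / (k * A)). All inputs are ballpark assumptions, and this
# ignores the solder/TIM interfaces that delidding also removes.
K_COPPER = 400.0  # thermal conductivity of copper, W/(m*K)

def delta_t(power_w: float, extra_thickness_m: float, area_m2: float) -> float:
    """Steady-state temperature rise across an extra slab of copper."""
    return power_w * extra_thickness_m / (K_COPPER * area_m2)

# Assumed: ~140 W flowing through ~70 mm^2 of hot die area, with ~1 mm
# of additional IHS copper versus a thinner design.
print(f"{delta_t(140, 1e-3, 70e-6):.1f} C")  # ~5 C from the slab alone
```

The slab alone only accounts for a few degrees; the larger delidding gains reported come from also removing the solder interface, so treat this as a lower bound on the thickness penalty.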
 
I would love to see benchmarks that include everything you might be running while gaming, e.g. turning on Discord, maybe some CPU/GPU monitoring tools, etc. Whatever you might have up and running while gaming. Then, I think MT would become more important. Raw game benchmarks are interesting, but I think we all know that people run other things while gaming. I know I do.
That's something many of us would have liked to see ever since the 8C Ryzen 1000 chips were released and AMD consistently had a core-count advantage.

Not sure how reviewers could start including this in their reviews with a straight face now. That said, I would welcome it, and reviewers should also run these tests on older-gen processors.
 
That's something many of us would have liked to see ever since the 8C Ryzen 1000 chips were released and AMD consistently had a core-count advantage.

Not sure how reviewers could start including this in their reviews with a straight face now. That said, I would welcome it, and reviewers should also run these tests on older-gen processors.
There is no difference in performance unless you are running rendering or similar in the background. The only thing that makes a difference is latency; some running apps increase it (for example, Steam or iCUE), but that drops performance equally regardless of how many cores your CPU has.
 