Ryzen 7 7800X3D vs Core i7-13700K: Top $400 Gaming CPU?

"Considering all these factors, if we were to build a new gaming PC with a platform budget of around $700, we would unquestionably snap up the 7800X3D."

Yes, for gaming only. But if you want an all-rounder, then the i7-13700K, hands down. It trades blows depending on your selection of games, but it's marginally slower, which amounts to what? A few fps? You won't ever notice. It runs hotter, OK, but as long as it's stable, I don't care. 100W more is not that big of a deal. But then... productivity. Just look at the numbers (google it!): it's trashing the 7800X3D from all sides. So I feel Intel has the better package if you want a PC for everything, not exclusively gaming.

The 13700K drawing 50-150W more power than the 7800X3D in heavy games while performing a generation behind is simply unacceptable. In productivity, AMD is very competitive, sometimes a little behind, save for Cinebench R23, but that is a synthetic benchmark, not real-world.

My real concern with AMD, and maybe a dealbreaker, is that boot times with Zen 4 are horrendous; some people are reporting 30-60 seconds. Until this is fixed I have no interest in upgrading my 5800X (which cold boots in ~9 seconds). Because of this I'm passing on both of these chip generations.
 
I was responding specifically to what the other person said when they claimed "100W wasn't a big deal." Aside from that, did you read the article where it showed the total system power usage difference, and that 100W was the average? If I were a "The Last of Us Part 1" fan, it seemed closer to 200W. Also, my hypothetical comparison of using it 8 hours a day wasn't about idle use; I was considering gaming or other high-workload usage. Please go fanboy somewhere else if you don't even read the article in reference.
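To put that hypothetical into numbers, a quick sketch; the 100W delta and 8 hours/day come from the posts above, while the $0.15/kWh electricity rate is my own assumption:

```python
# Rough yearly cost of a 100 W average system power delta at 8 h/day of heavy use.
delta_w = 100            # W, assumed average difference under load (from the hypothetical above)
hours_per_day = 8        # assumed heavy-use hours per day
rate_per_kwh = 0.15      # $/kWh, assumed rate; substitute your local one

kwh_per_year = delta_w / 1000 * hours_per_day * 365
print(f"Extra energy: {kwh_per_year:.0f} kWh/year, "
      f"cost: ${kwh_per_year * rate_per_kwh:.0f}/year")  # ~292 kWh, ~$44/year
```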
Results in TLOU are invalid; I'd bet a paycheck he measured power draw during shader compilation.
 
I'm sure there is a difference between using crap cores and performance cores for browsing.

A 100-watt difference from the CPU alone makes a huge difference when it comes to cooling. Add a GPU into the mix and it becomes even harder. You could probably build a passively cooled system with either CPU, but with AMD it's miles easier.

A 100-watt difference is pretty easy to test; no need to even add 100 watts. I have pretty silent cooling, including an over-1-kilogram CPU cooler and 5 case fans (4×140 mm, 1×120 mm) with automatic RPM adjustment. When the CPU consumes around 90 watts, I cannot hear the fans ramping up. Let the CPU consume around 160 watts and soon this system is not silent any more. That's "only" a 70-watt difference with no GPU load. The 13700K can consume around 250 watts, btw.

This was about using crap cores vs. performance cores. I doubt there's no difference.
Nonsense, the 13700K is smoother in browsing while using much less power. It's also way quieter because Intel CPUs are better at heat transfer. The 7800X3D consumes a truckload of power for just browsing and simple tasks, and it's noisy and whiny when running heavier tasks. Yikes.
 
Reread. The power delta is between 50 and 100W across the games they listed.
Except not a single other reviewer got a power delta that high. While the Shadow of the Tomb Raider numbers are improbable (PCWorld, for example, got only a 30W difference vs Steve's 100W), they are still believable; the Last of Us power delta is impossible even theoretically. A 200W delta alone is higher than the 13700K's peak gaming power consumption, which is about 170W, meaning that either: a) the 13700K in TLOU must consume at least 250W (realistically closer to 270W), which is its max power consumption in power-virus apps, or b) the 7800X3D generates electricity instead of consuming it.
So it's Steve's absurdly high numbers versus a dozen other reviewers' numbers.
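A quick sanity check of that argument, as a sketch using the figures claimed in this thread (the 170W gaming peak and 200W delta are the commenter's numbers, not measurements):

```python
# Sanity check: can a 200 W gaming power delta exist if the 13700K peaks at ~170 W in games?
peak_13700k_gaming_w = 170  # W, claimed typical 13700K gaming peak (thread figure)
claimed_delta_w = 200       # W, delta reported for The Last of Us Part 1

implied_7800x3d_w = peak_13700k_gaming_w - claimed_delta_w
print(f"Implied 7800X3D draw: {implied_7800x3d_w} W")  # -30 W: physically impossible

# For the delta to be real, the 13700K would instead need to exceed its own gaming peak:
assumed_7800x3d_floor_w = 50  # W, assumed minimum 7800X3D draw while gaming
print(f"13700K would need: {claimed_delta_w + assumed_7800x3d_floor_w} W")  # 250 W, power-virus territory
```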
 
Nonsense, the 13700K is smoother in browsing while using much less power. It's also way quieter because Intel CPUs are better at heat transfer. The 7800X3D consumes a truckload of power for just browsing and simple tasks, and it's noisy and whiny when running heavier tasks. Yikes.
Basically you're claiming that crap cores are faster than the cores on the 7800X3D. I highly doubt that.
Raptor Lake refresh is coming in August - October.
As a CPU upgrade it will be basically worthless. Like always with Intel.
 
Basically you're claiming that crap cores are faster than the cores on the 7800X3D. I highly doubt that.
They are not crap cores. If you're using it for multi-core productivity workloads, it will easily do a better job than the 7800X3D, since it has more cores and more threads. Outside of gaming, the 13700K also works better and is cheaper, since it has more efficient cores; normal usage like browsing or media consumption won't consume much power, and in software that renders across multiple cores it works better. So it depends on the usage: if your priority is productivity and gaming is secondary, the i7-13700K would be the better option, and that high-speed RAM would be very useful in productivity too.
 
Raptor Lake refresh is coming in August - October.
So they up the frequency a little bit by pushing 300W of power through the CPU, which in turn requires massive cooling and likely ends up causing thermal throttling? Is the R&D department on garden leave?
 
How to start a typical holy war on tech forums.
A basic tutorial, step by step.
Step 1: make sure you are posting in an Intel/AMD or Nvidia/AMD related topic.
Step 2: your reply will benefit greatly if you're the first.
Step 3: ignore the topic subject.
Step 4: add more replies once the flame war has started.
 
I'm sure there is a difference between using crap cores and performance cores for browsing.

A 100-watt difference from the CPU alone makes a huge difference when it comes to cooling. Add a GPU into the mix and it becomes even harder. You could probably build a passively cooled system with either CPU, but with AMD it's miles easier.
The 100W difference was for the system. Likely that was mostly CPU differences, but I'm sure the higher performing DDR5 used with Intel added a few watts. Easier or not, you can silently cool either config.
A 100-watt difference is pretty easy to test; no need to even add 100 watts. I have pretty silent cooling, including an over-1-kilogram CPU cooler and 5 case fans (4×140 mm, 1×120 mm) with automatic RPM adjustment. When the CPU consumes around 90 watts, I cannot hear the fans ramping up. Let the CPU consume around 160 watts and soon this system is not silent any more. That's "only" a 70-watt difference with no GPU load. The 13700K can consume around 250 watts, btw.

This was about using crap cores vs. performance cores. I doubt there's no difference.
I built an Intel system for my grandson. It's very quiet: a 240mm AIO for the CPU, plus 3 intake and 1 exhaust case fans (in addition to the 2 on the AIO). It's quiet under load; he would have to push it a lot harder for it to become 'noisy'. As for your comment about "crap" cores, it doesn't make your argument any stronger. It just makes it look like you're a fan boi.
 
I personally don't care about power consumption; I only care about stability, and that's why Intel CPUs will always be my first choice. Same for GPUs: Nvidia all the way, or maybe Intel if they have something good in the future. 15 years of Intel + Nvidia and I had no issues. I wanted to try Ryzen, so I had an AMD CPU + GPU platform for 2 years. The webcam and headset kept disconnecting during meetings and I had to unplug and replug them to get them working, weird green dots when using multiple monitors and watching YouTube videos (a known driver issue, never fixed back then), monitors going off and on for a second at random times, the PC not booting with RAM at XMP speed, all with the latest BIOS updates and everything up to date.
 
If you already have a 4c/8t or better CPU from at least the i7 3000 series, you are wasting money if it's just for gaming.

Even a Bloomfield i7 900 series (which does take a bigger hit and doesn't support AVX) can play anything except one game I know of, due to AVX (Warzone 2.0).

With the current economy, you have plenty of horsepower in those older 3000+ series i7 CPUs (and even i5s for most games).

You will get 60+ FPS average at 1440p Medium-Ultra with those older CPUs, even with a GTX 1080 Ti.
 
Is this review going to address the Ryzen 7000 issues, or are they not an issue? From what I've heard, I could grow a new beard before a 7800X3D PC starts up, and then I have to worry about it burning to death. On top of that, I'm hearing of even worse DDR5 support than Intel's, AND general stability and performance issues. There are attractive things about the 7800X3D, but I want to actually use my PC headache-free for gaming and general-purpose use. From what I've read, I can't trust AMD to deliver that experience, whereas I can, in fact, trust Intel. Are the reviews going to address this? I'll go with the 13700K for my build this year. I need my PC to just work.
 
Some of the complaints about Intel consuming more power are exaggerated. In fact, AMD has very poor idle power draw. It boosts too easily.

Idle power draw from the wall with 5 apps open:
5900X + Gigabyte X570 Aorus Master = 120-150W
12900K + Asus ROG Strix Z690-A D4 = 70-100W

Tech Notice also confirms Ryzen 7000 idle is not good.

I switched from the 5900X to a 12900K. All the problems are gone.
 
Also, the 13700K idles and does simple tasks (like web browsing) at a way lower wattage than the 7800X3D. Intel browses the web using just 5 to 10 watts; the 7800X3D needs 20+.

Sure thing bro, when I browse the web I also have the rest of my computer and the monitor turned on, so with my 7800X3D my total consumption is probably around 160W; with a 13700K it would be 135W. Oh, the huge difference...
 
They are not crap core if you're using it for multi-core supported productivity it will easily do a better job than that of 7800x3d as it got more threads and more cores other than gaming 13700k works better and cheaper as it got more efficient cores normal usage like browsing something or media consumption won't consume much power and also in softwares which have rendering with multi-core it works better so it depends on the usage if your priority is productivity and gaming is secondary i7 13700k would be a better option and also those high speed ram would be very useful in productivity
Crap cores are exactly right: https://www.intel.com/content/www/us/en/architecture-and-technology/avx-512-overview.html

Intel CPUs with crap cores have AVX-512 support entirely disabled. So yes, even Intel itself admits those cores are trash 🤦‍♂️

Also, if you are using crap cores for "light" work, they are still slower than performance cores doing the same work, and that difference is noticeable.
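For what it's worth, you can verify the AVX-512 point on your own machine; a minimal sketch for Linux (it assumes /proc/cpuinfo is available, and on hybrid 12th/13th-gen Intel parts the avx512f flag is simply absent because the feature is fused off):

```python
# Checks /proc/cpuinfo (Linux) for the avx512f foundation flag.
# Hybrid Alder/Raptor Lake CPUs ship with AVX-512 disabled, so the flag won't appear.
def has_avx512() -> bool:
    try:
        with open("/proc/cpuinfo") as f:
            return any("avx512f" in line for line in f if line.startswith("flags"))
    except FileNotFoundError:
        return False  # no /proc/cpuinfo outside Linux

print("AVX-512 supported:", has_avx512())
```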

The 100W difference was for the system. Likely that was mostly CPU differences, but I'm sure the higher performing DDR5 used with Intel added a few watts. Easier or not, you can silently cool either config.
Like I said, you can, but with AMD it's much easier.

I built an Intel system for my grandson. It's very quiet: a 240mm AIO for the CPU, plus 3 intake and 1 exhaust case fans (in addition to the 2 on the AIO). It's quiet under load; he would have to push it a lot harder for it to become 'noisy'. As for your comment about "crap" cores, it doesn't make your argument any stronger. It just makes it look like you're a fan boi.

Does "a lot harder" in this case mean "full load"? And does "noisy" mean "much more noise than at idle"? Like I said, you can build a silent system with Intel, but it's much harder. Also, using an AIO water cooling system is pretty risky if something goes wrong.

As for the crap cores, like I stated earlier: Intel heavily promoted AVX-512, but because of those trash cores it has disabled it entirely. So yes, according to Intel itself, those cores are crap.

Some of the complaints about Intel consuming more power are exaggerated. In fact, AMD has very poor idle power draw. It boosts too easily.

Idle power draw from the wall with 5 apps open:
5900X + Gigabyte X570 Aorus Master = 120-150W
12900K + Asus ROG Strix Z690-A D4 = 70-100W

Tech Notice also confirms Ryzen 7000 idle is not good.

I switched from the 5900X to a 12900K. All the problems are gone.

150 watts sounds pretty far from idle. Also, AMD has other solutions if you are that concerned about idle usage; you could use hibernation or similar if idle consumption is an issue.
 
It's rare nowadays that such an obviously superior product goes against its direct apples-to-apples competitor.
It's not obviously superior, just way overhyped. No meaningful OC, way overpriced ($449 for just 8 cores is a joke), awful MT performance (the $449 7800X3D is 30% slower in MT than the $319 13600K, on par with the old $249 12600K), bad memory support (there's no point in DDR5-6000, as it's about as fast in gaming as DDR4-4000), mediocre performance in most office and productivity apps, and inconsistent performance in games: +20% in some, zero in others, and it even manages to lose some games to the older and cheaper 13700K. The high idle power consumption almost negates the realistic 30-50W delta in games: game for an hour, then browse for an hour, and your saved watts are gone. And there are BIOS problems that, I can bet, will stay unsolved for years. It's a good but not great gaming CPU, and mediocre at best overall. Another bet: its reign will be short-lived, and it will be selling at a 30% discount after the Raptor Lake refresh releases, like it was with the 5800X3D.
 
Crap cores are exactly right: https://www.intel.com/content/www/us/en/architecture-and-technology/avx-512-overview.html

Intel CPUs with crap cores have AVX-512 support entirely disabled. So yes, even Intel itself admits those cores are trash 🤦‍♂️

Also, if you are using crap cores for "light" work, they are still slower than performance cores doing the same work, and that difference is noticeable.
Imagine how crap AMD's cores are when they're losing in MT performance to crap cores. LMAO
 
Sure thing bro, when I browse the web I also have the rest of my computer and the monitor turned on, so with my 7800X3D my total consumption is probably around 160W; with a 13700K it would be 135W. Oh, the huge difference...
That's a 22% difference. What's the difference in gaming? With a 4090 and a monitor, you are drawing close to 700W, versus what, 750W for the 13700K? The difference is 6%. Wow.
 
That's a 22% difference. What's the difference in gaming? With a 4090 and a monitor, you are drawing close to 700W, versus what, 750W for the 13700K? The difference is 6%. Wow.
Well, my math is apparently not that good in the morning; there was only supposed to be a 15W difference. :p But if a 13700K uses 100W more when gaming and a 7800X3D uses 15W more when browsing, then you need a gaming-to-browsing ratio of more than 1:6 to make the 13700K the more efficient choice.
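The break-even point follows directly from those two deltas (a sketch; both wattage figures are the assumptions used in this exchange, not measurements):

```python
# Break-even browsing:gaming ratio from the deltas assumed above.
gaming_delta_w = 100    # W, extra draw of the 13700K system while gaming (assumption)
browsing_delta_w = 15   # W, extra draw of the 7800X3D system while browsing (assumption)

# Energy balance at break-even: gaming_hours * 100 == browsing_hours * 15
breakeven = gaming_delta_w / browsing_delta_w
print(f"Browsing hours per gaming hour to break even: {breakeven:.1f}")  # ~6.7
# Below ~6.7 hours of browsing per hour of gaming, the 7800X3D uses less total energy.
```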
 