The Ryzen 7 7800X3D is the Zen 4 3D V-Cache CPU that gamers should all be interested in: it's fast and extremely power efficient. Moreover, at $450 the 7800X3D is just $50 more than the 7700X.
@Techspot I know benchmarks are done with as few background programs as possible to ensure maximum game performance and an apples-to-apples comparison, but this is getting further from my real use case, where I am playing a game but also chatting on Discord, streaming my game, and watching two streams on my second monitor, plus 20 tabs open in Firefox. Even though the gaming performance of the two CPUs is the same, I would assume the 7950X3D would preserve its gaming performance better than the 7800X3D in this multitasking gaming scenario. Is there a way to quantify this from the benchmarks you show in your article?
> 100-150 W less power draw is not a joke when you're in Europe and paying 40 cents per kilowatt hour.

Nope, the math simply does not work. At 8 hours per day, 365 days per year, running at full blast the ENTIRE time on that one specific super heavy workload, it's a $117 difference in electricity. In that scenario, you are either making money off of that work (in which case $117 per year isn't even peanuts, or you should be buying something larger to do more work in the same time), or you are running insane workloads as a hobby, in which case you are going to waste far more on buying the hardware in the first place.
> Nope, the math simply does not work. […]

The 7800X3D is a $450 CPU.
If the price of electricity is a concern, you should not be buying $500+ processors. Period.
That's a you problem.
> The 7800X3D is a $450 CPU.

Again, you are looking at a $450 CPU and worrying about $60 in electricity over 5 years.
The math works fine. At 4 hours a day it's $60 a year if the difference is calculated at 100 W, the minimum delta at load; probably more, looking at various tests. I typically keep a CPU in any given machine for at least 5 years, so that's over $300 over the life of the CPU in that use case. It pays for an upgrade.
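For anyone who wants to sanity-check those numbers, here is the arithmetic spelled out. The 100 W delta, the hours per day, and the $0.40/kWh rate are the assumptions from this thread, not measured figures:

```python
# Electricity cost difference between two CPUs, using this thread's
# assumptions: a 100 W draw delta under load and $0.40/kWh (EU-level rates).
def yearly_cost(delta_watts, hours_per_day, price_per_kwh):
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(yearly_cost(100, 4, 0.40))      # ~58.4/year -> "about $60 a year"
print(yearly_cost(100, 8, 0.40))      # ~116.8/year -> the "$117" figure above
print(yearly_cost(100, 4, 0.40) * 5)  # ~292 over 5 years; "over $300" needs a
                                      # delta slightly above 100 W
```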
Living costs are high in Europe right now; I am quite sure people will take such a saving and be perfectly happy about it. I would, since I just ordered a 7800X3D and not a 13900K.
I would remind you that what works or doesn't is not the same for everyone.
> Both AMD and Intel seem to be heading towards big and small cores in the future […]

So... how? What are the parameters?
If that is a concern for you, you do not need a $450 gaming CPU. Gaming is a wasteful hobby. Go get an i3 for $100; it plays all the same games.
It's still not a selling point. You're spending another $350 over an i3 to save a hypothetical $60 over 5 years versus an i9. That is terrible math.
> So... how? What are the parameters?

It's up to Techspot, they are the professionals... My off-the-cuff idea would be to use a stress tester with a "compatible workload" where you can set the number of threads/ops per second requested, and see how much work you can squeeze out of a CPU with no more than a 10% drop in FPS in a game. This would give me a rough idea of the "leeway" left in a CPU while gaming. I wouldn't test every game, maybe 2 or 3 with moderate to high CPU usage to create a compare and contrast. Don't ask me for more than this, I am here because I don't know.
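To make the idea a bit more concrete, here is a rough sketch of what such a tunable background load could look like. Everything here, the names and the defaults, is hypothetical, just to illustrate the knobs I mean; you would sweep the worker count and rate upward while logging FPS:

```python
# Sketch of a tunable background load: N worker processes, each throttled
# to roughly a target number of "operations" per second. Run it alongside
# a game benchmark and see how much load fits before FPS drops ~10%.
# All names and defaults are made up for illustration.
import argparse
import time
from multiprocessing import Process

def worker(ops_per_sec):
    interval = 1.0 / ops_per_sec
    while True:
        start = time.perf_counter()
        sum(i * i for i in range(1000))  # one unit of busy work
        spare = interval - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)  # throttle to the requested rate

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Tunable background CPU load")
    parser.add_argument("--workers", type=int, default=4)
    parser.add_argument("--ops-per-sec", type=int, default=5000)
    args = parser.parse_args()

    for _ in range(args.workers):
        Process(target=worker, args=(args.ops_per_sec,), daemon=True).start()
    print(f"Running {args.workers} workers; Ctrl+C to stop.")
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        pass
```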
> Again, you are looking at a $450 CPU and worrying about $60 in electricity over 5 years. […]

Actually, if they are worried about power consumption, the 7600X came close in power usage, and at 1440p and 4K the real-world performance delta will be negligible, which saves you hundreds of dollars.
> It's up to Techspot, they are the professionals... My off-the-cuff idea would be to use a stress tester with a "compatible workload" […]

If TS is daring enough to take on the backlash of crybabies out there to try and do this, that's on them. I know another big tech site said they wouldn't do this because of all the millions/billions of different parameters out there, that you couldn't make everyone happy, and that they refused to be a sounding board for those who cry/complain about tests not being done with their software and usage.
> I personally wouldn't want TS or any other tech site to be punished more by the public; they're criticized enough already just from doing these reviews, never mind trying to do a review using a "compatible workload".

It's certainly not easy. I'm sure TS receives many requests, so even if mine is read I expect a decision based on cost versus benefit, like all businesses. Thanks for weighing in.
Both AMD and Intel seem to be heading towards big and small cores in the future, and at least part of this is to ensure maximum performance of the primary application without causing stutters in the background apps. If a review site is able to quantify it, I would be interested in reading about it.
> Also nothing scales past 6 cores, and even the most multi-threaded games have 85% of their workload on 2 cores. The remaining 15% can scale up to 6 and then there is nothing beyond that.

The game I am building this system for, Star Citizen, uses all 8 P-cores of my 12700K @ 5.2 GHz at 100% in certain locations. If I run additional streaming background apps, there is another 20% FPS penalty. My 4090 is not the bottleneck, as its utilization is 40-50% in those CPU-bottlenecked scenes. Can you tell me if the 7800X3D or 7950X3D would be better or equal for my use? Or does my situation not exist because nothing uses more than 6 cores?
> Actually, if they are worried about power consumption, the 7600X came close in power usage, and at 1440p and 4K the real-world performance delta will be negligible […]

You should be building a system to do the things that you want/need to do now AND for the next few years. While gaming at higher resolutions now will usually keep you GPU-limited, after a few years newer games will be CPU-limited as well. Why not spend a little bit more now to push back the time when the CPU limit appears at higher resolutions?
For example:
AMD Ryzen 5 7600X + MSI B650-P Pro WiFi + G.Skill Flare X5 16GB DDR5-5600 build combo: $399.99 at Micro Center ($533.96 value, save $133.97)
Plus, you still have an upgrade path, and the money saved can be applied to a better GPU. Linus Tech Tips paired the 7800X3D with a 6700 XT, while the money saved could get you a 6950 XT at $549.99 with a CPU combo at Micro Center and see almost double the performance in rasterization.
> You should be building a system to do the things that you want/need to do now AND for the next few years. […]

Ah yes, the 7700X is also a good option.
Also, 2023 games are already wanting 32GB of RAM. Building at 16GB is likely to be unwise.