It's Unnecessary, But AMD Is Basically Lying About CPU Performance

Because I'm a nice guy, I didn't even mention accelerated degradation due to electromigration, since Intel disregarded their own safety limits and allowed the chips to run at ridiculous voltages and temperatures indefinitely just to gain an edge over AMD.

As for the 74W consumption figure, nice try but no banana. Maybe if you play StarCraft I at 1024x768 with a frame cap.
Did you forget the Ryzen 7000 115C fiasco? Intel didn't disregard anything. I think you're misinterpreting mobo manufacturers ignoring Intel's stated limits as Intel's doing, which isn't really new; they have to convince you their board is better than the others. The wattage argument is kind of funny too, when 350W GPUs are the ideal pairing for the CPUs in question.
 
I mean, if you cherry-pick the very best results when using the fastest GPU available, then yeah, you might see a whole 24%?

It's not cherry-picked; it's an average of a bunch of games. You'd know that if you'd looked at the source at the bottom of my post.

For more than 50% more cost? How does that make any sense? Also, the 12600K. The X3D isn't faster by enough to make it the better pick for future generations. The next-gen AMD stuff (non-X3D) will be faster.

For many people the improvements in 1% lows and frametime consistency are what matter most and make all the difference, even if the averages are merely 24% better. Paying $100 more on an entire system for that improvement is worth it. AMD has already said that next gen will not quite match this gen's X3D in gaming. However, there will be next gen's X3D parts, of course.

It's your money; buy an X3D if you like. AMD's marketing is targeting gamers. But the numbers are clear to me: these chips are not worth the premium you pay for them unless you have something like a watercooled 4090 and a 1080p monitor.

For you and lots of people. Also for me when I had a 5600 and a 6600 XT. Based on sales, the X3Ds seem to be excellent options for a great many other people.

If you are running, say, a 4070 or something, you will really not get the benefit of an X3D over a basic i5 or R5. It's just a waste of money.

I now use a 6800 XT and could easily see the difference when upgrading from an OC'd R5 5600 to a 5800X3D in CPU-limited games. Not much in the average fps, as I usually run VSync-capped at 80, 100, or 144 fps depending on the game, but the improvement in the 1% lows/frametime smoothness was very welcome. That tells me that at least one more GPU tier down would still show a welcome difference, and that's before lowering quality. Some people really like their fps and are happy to play at Medium or performance-optimized settings, so even a $300 GPU will reap good frametime/1% low benefits from a top-tier CPU.
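For anyone wondering what the 1% low numbers being discussed here actually are: the common approach is to average the slowest 1% of frametimes and convert back to fps. A minimal sketch (my own illustration; exact conventions vary between reviewers and capture tools):

```python
def one_percent_low(frametimes_ms):
    """Average fps over the slowest 1% of frames (at least one frame)."""
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                # worst 1% of the sample
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                       # ms per frame -> fps

# A mostly smooth 10 ms (100 fps) run with a few 25 ms stutters:
frames = [10.0] * 97 + [25.0] * 3
print(round(1000.0 / (sum(frames) / len(frames)), 1))  # average fps: 95.7
print(round(one_percent_low(frames), 1))               # 1% low fps: 40.0
```

This is why the average can look barely affected while the 1% low, which tracks the stutters you actually feel, moves a lot.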
 
DDR5-6400 is only a bit cheaper than 7200 but also performs very badly compared to 7200 MT/s. I use 8200 MT/s CL38 sticks, and I see a 23 fps difference at 5K between basic 6400 and my 8200 ...

23 fps in what? Cities Skylines, where that would be a 100% fps improvement, or CS2, where it would be a 5% improvement? A raw fps number contains no information.
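To make the point concrete: the same absolute fps gain is a completely different relative gain depending on the baseline. A trivial illustration (the game fps numbers here are made up):

```python
def relative_gain(baseline_fps, gain_fps):
    """Express an absolute fps gain as a percentage of the baseline."""
    return 100.0 * gain_fps / baseline_fps

# +23 fps on a 23 fps sim-heavy game is a doubling:
print(relative_gain(23, 23))   # 100.0 (% uplift)
# +23 fps on a 460 fps esports title is barely noticeable:
print(relative_gain(460, 23))  # 5.0 (% uplift)
```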
 
X3D isn't for gaming if the X3D performs worse in 99.9% of games ... you lose so much performance in optimized games despite the larger cache size, so the only advantage is on cache misses. But again, we're already using DOD patterns to avoid cache misses by aligning data in memory, so meh.

That did not make any sense. X3D is much faster in gaming, which is the reason to buy an X3D CPU. Read a few CPU reviews and check it out for yourself.
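For context on the "DOD pattern" the quoted post mentions: data-oriented design keeps hot fields contiguous in memory (struct-of-arrays) instead of grouping all of an object's fields together (array-of-structs), so a pass over one field walks memory sequentially. A toy sketch of the two layouts, illustrative only since Python hides real memory layout (engines like Unity's ECS do this natively):

```python
from dataclasses import dataclass

# Array-of-structs: each entity's fields travel together, so a pass that
# only needs `health` still drags unrelated fields through the cache.
@dataclass
class Entity:
    x: float
    y: float
    health: float

aos = [Entity(float(i), float(i), 100.0) for i in range(4)]
total_aos = sum(e.health for e in aos)

# Struct-of-arrays: each field lives in its own contiguous array, so the
# same pass touches only the data it actually needs.
soa = {
    "x": [float(i) for i in range(4)],
    "y": [float(i) for i in range(4)],
    "health": [100.0] * 4,
}
total_soa = sum(soa["health"])

print(total_aos, total_soa)  # same answer either way: 400.0 400.0
```

Both layouts compute the same thing; the difference is purely in how the data streams through the cache, which is exactly the territory where a large L3 matters less or more.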
 
That did not make any sense. X3D is much faster in gaming, which is the reason to buy an X3D CPU. Read a few CPU reviews and check it out for yourself.
Lots of people believe that. Then I show them a memory-tuned stock 12900K and they come back to reality :laughing:
 
23 fps in what? Cities Skylines, where that would be a 100% fps improvement, or CS2, where it would be a 5% improvement? A raw fps number contains no information.

23 fps in general ... in some games you can get over 35 fps, another game will just give 16 fps ... most new games use 8K-16K textures, so the more bandwidth you have, the better the performance you get.



That did not make any sense. X3D is much faster in gaming, which is the reason to buy an X3D CPU. Read a few CPU reviews and check it out for yourself.

A few CPU reviews? I did my own tests with Unity, and so did my co-worker ...

You can read our post if you want ...
https://forum.unity.com/threads/burst-and-avx512.1350899/
 
2 CPUs have the exact same price.

CPU A is much faster in gaming than CPU B, but I don't know that because reviewers, instead of testing with a 4090, are testing with a 6600. So I buy CPU B, and down the line, when I upgrade my GPU, I realize my CPU is holding it back.

So I bought the wrong CPU for my use case, because the CPU review, instead of testing the actual CPU, was testing the 6600.

Got it now?
The only problem here is that in the real world, looking at the "same price" option is not the only scenario. Much more often, people compare CPUs from different price points, and so your rather belaboured argument collapses at that point.

And it's much more probable that you'll lose money and buy the "wrong CPU" because a 4090 at 1080p showed awesome gains for some high-end model, but the moment you stick it next to your mid-range GPU at 1440p, you'll be GPU-limited and much better off with the cheaper solution. And a couple of years "down the line", things and prices will be completely different, so all this careful future-proofing will be worthless.

The answer here, of course, is to test with both a 4090 AND some real-world GPU. But this rather simple solution seems to be a bit too much to comprehend for many hardcore zealots here.
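For what it's worth, the bottleneck logic both sides keep arguing about can be caricatured in one line: observed fps is roughly the minimum of what the CPU and the GPU can each deliver. A deliberately simplistic model with made-up numbers:

```python
def observed_fps(cpu_fps, gpu_fps):
    """Toy model: the slower component caps the framerate."""
    return min(cpu_fps, gpu_fps)

fast_cpu, slow_cpu = 200, 120   # hypothetical CPU-limited framerates
weak_gpu, strong_gpu = 90, 180  # hypothetical GPU-limited framerates

# With a weak GPU both CPUs look identical (the review "tests the GPU"):
print(observed_fps(fast_cpu, weak_gpu), observed_fps(slow_cpu, weak_gpu))      # 90 90
# With a strong GPU the CPU difference finally shows:
print(observed_fps(fast_cpu, strong_gpu), observed_fps(slow_cpu, strong_gpu))  # 180 120
```

Real systems are much messier than a single `min()`, but this is why CPU reviews pair the test chips with the fastest GPU available, and also why those deltas shrink on a mid-range card.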
 
The only problem here is that in the real world, looking at the "same price" option is not the only scenario. Much more often, people compare CPUs from different price points, and so your rather belaboured argument collapses at that point.
And how would they know what exactly they are paying the extra for unless they actually see the CPU stretch its legs?

You don't need to test with a "real world" GPU; a user can check a GPU review to see how fast their GPU is, then look at the CPU review to determine what CPU they need. It ain't hard.
 
2 CPUs have the exact same price.

CPU A is much faster in gaming than CPU B, but I don't know that because reviewers, instead of testing with a 4090, are testing with a 6600. So I buy CPU B, and down the line, when I upgrade my GPU, I realize my CPU is holding it back.

So I bought the wrong CPU for my use case, because the CPU review, instead of testing the actual CPU, was testing the 6600.

Got it now?
I get it, mate, but I doubt anyone with the money to upgrade to that level will go with a CPU of that class. This is what I mean.
 
Poor Steve. He tried; this article was brilliant and breaks down exactly why all reviewers test like they do, yet the comments section is still filled with nonsense GPU-limited testing requests.

I don’t understand how a tech website attracts so many people who aren’t technical enough to understand such a well written article…
 
Poor Steve. He tried; this article was brilliant and breaks down exactly why all reviewers test like they do, yet the comments section is still filled with nonsense GPU-limited testing requests.

I don’t understand how a tech website attracts so many people who aren’t technical enough to understand such a well written article…
Steve's opinion about how to test CPUs is not the one and only right way, as stated multiple times in the comments.
 
Steve's opinion about how to test CPUs is not the one and only right way, as stated multiple times in the comments.
Can you point me to another site that does its reviews of CPUs using a GPU limit, please?

I would love to find a review site that does it the way you’ve explained here in the comments.
 
I'm like: AMD, stop giving free advertisements to Nvidia and release a 5090 competitor to show off with your flagship gaming CPU. AMD: quick, show off a mid-level CPU with an RX 6600 to make it look as good as the competition. 🤪
 
Can you point me to another site that does its reviews of CPUs using a GPU limit, please?

I would love to find a review site that does it the way you’ve explained here in the comments.

You see, there are certain "standards" used in virtually all reviews, like no background tasks, a clean Windows install, etc. I have been paid real money to do GPU reviews, so I know these things well enough. The reason for this is to make sure the results have no anomalies and can be replicated.

However, in real-life situations you very rarely have a clean Windows install, no programs running in the background, all settings at default except those being tested, no internet connection, etc.

What this means is that benchmarks rarely reflect real-world situations. In this case, the GPU is very rarely maxed out, because reviewers want to show differences between CPUs. But again, having results is one thing; what those results actually tell you is another thing entirely.

As for your question, not many sites do that, for obvious reasons. However, that still doesn't mean GPU-maxed-out results are flawed, a lie, or anything of the sort. There is more than one way to test hardware.
 
As for your question, not many sites do that, for obvious reasons. However, that still doesn't mean GPU-maxed-out results are flawed, a lie, or anything of the sort. There is more than one way to test hardware.
Right… so there's more than one way to test, but you can't point me to any? Not a single source on the internet that tests CPU gaming performance using low-end GPUs?

Have you ever wondered why no one tests like that? Not a single source? Have you ever thought about setting up your own site and doing those reviews yourself?
 
23 fps in general ... in some games you can get over 35 fps, another game will just give 16 fps ... most new games use 8K-16K textures, so the more bandwidth you have, the better the performance you get.

A few CPU reviews? I did my own tests with Unity, and so did my co-worker ...

You can read our post if you want ...
https://forum.unity.com/threads/burst-and-avx512.1350899/
"Faster"? AMD is only faster when compared to a low-bandwidth Intel CPU ... lol, any **700K is faster than an X3D processor.

Here's more reading if you want ...

https://forum.unity.com/threads/amd-5800x3d-cache-and-ecs-dots-performance.1221540/

Those look like fun tech dives, but overall gaming FPS comparisons are more informative when considering a gaming CPU:

[Attached chart: relative-performance-games-2560-1440.png]
 
Right… so there's more than one way to test, but you can't point me to any? Not a single source on the internet that tests CPU gaming performance using low-end GPUs?

Have you ever wondered why no one tests like that? Not a single source? Have you ever thought about setting up your own site and doing those reviews yourself?

As said, it's standard practice to use a high-end GPU for CPU testing, so you probably won't see much else.

I don't have to wonder, because the reasons are simple: review practices try to avoid other components being bottlenecks. If another component is the bottleneck, there is not much point in testing, of course. But again, every system has at least one bottleneck, and tbh the majority of gaming systems are probably bottlenecked by the GPU, not the CPU. That's why AMD's testing actually makes much more sense than standard practice.
 
As said, it's standard practice to use a high-end GPU for CPU testing, so you probably won't see much else.

I don't have to wonder, because the reasons are simple: review practices try to avoid other components being bottlenecks. If another component is the bottleneck, there is not much point in testing, of course. But again, every system has at least one bottleneck, and tbh the majority of gaming systems are probably bottlenecked by the GPU, not the CPU. That's why AMD's testing actually makes much more sense than standard practice.
AMD's testing makes more sense if you are trying to fool consumers into buying an inferior product
 
AMD's testing makes more sense if you are trying to fool consumers into buying an inferior product

It actually makes sense since generally people who buy Intel CPUs are much stupider than people who buy AMD CPUs.

You can easily see this by comparing retail CPU sales and OEM CPU sales. AMD dominates "intelligent people" retail sales and Intel "stupid people" OEM sales.
 
It actually makes sense since generally people who buy Intel CPUs are much stupider than people who buy AMD CPUs.

You can easily see this by comparing retail CPU sales and OEM CPU sales. AMD dominates "intelligent people" retail sales and Intel "stupid people" OEM sales.
Right, and when it comes to GPUs only stupid people buy AMD; that's why Nvidia has like 90% of the DIY market.
 
Those look like fun tech dives, but overall gaming FPS comparisons are more informative when considering a gaming CPU:

[Attached chart: relative-performance-games-2560-1440.png]

The problem with such a review is that they are using Vulkan games with full AVX-512 support and 6000-6400 MT/s RAM, so of course AMD will perform a bit better than Intel if you disable AVX-512 on Intel and cut its memory support in half.

AMD throttles with memory above 6400 MT/s, so the real question is: why test with Vulkan games? Why not allow the Intel CPU to use 100% of its power?

If you want to test a race between two cars, would you prevent one car from using its top gear? I hope not ... using the maximum memory frequency on AMD but only an average memory frequency on Intel isn't really fair play ... there are like 250 Vulkan games in the world, so yeah, an AMD processor can perform better only if you limit the Intel processor and use such games; otherwise, in the other 99.9% of games AMD has no chance.
 
0.o..

I think all Steve has PROVEN is that there was no reason for LGA1700 or even the 14900K, when AM4's 5800X3D is within a few frames of the latest i9 in all of these tests...!

Imagine all the people who bought a 12th-gen CPU and have no upgrade path other than a 14900K... they're getting hosed!

LGA1700 is dead...

 