The Best CPU for the Money: AMD FX vs. Intel Budget Shootout

AMD's technology is so far behind Intel's that the only thing they can do is pack in as many cores as possible at higher clocks and bring the price down to barely match an entry-level i3. I'm afraid that if the competition is gone, Intel will control the market as it wishes, and it's always the consumer who pays the price.
agreed mate ;-)
 
My main comment about these game benchmarks is that these are all DX11 games. Yes, that's what's available right now, I get that. But the important thing to know is that DX11 can only feed the graphics card rendering information through a single CPU core. That's right: your 4-, 6-, or 8-core CPU is feeding that massively powerful, hungry GPU through a skinny little straw of just one of your CPU cores.

But DX12 allows the GPU to be fed by as many CPU cores as you have. AMD designed the Bulldozer/Piledriver/Steamroller/Excavator architecture for DX12 and multi-thread-optimized software. Trouble is, Microsoft dragged their feet and have only just NOW released the operating system, Windows 10, with the kind of DirectX that AMD designed their chips for four years ago. The same goes for the Radeon GCN graphics core, which was equipped with a hardware-based scheduler for the asynchronous shader usage we're only now about to get in upcoming DX12 games. AMD designs their hardware to work well with what's coming down the pipeline, not just for what's available today. Remember that when you hear people crap-talk AMD and bitch about how they need to 'catch up' with Intel. The irony of it.
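The single-feeder vs. many-feeders idea can be sketched in plain Python. To be clear, this is a toy analogy and not actual Direct3D code (all the names here are made up for illustration): in the "DX12-style" model, each worker thread records its own command list independently, and only the final submission to the queue happens in one place.

```python
import threading

# Toy analogy, not real Direct3D code: under DX11-style submission one thread
# records every draw call; under DX12-style submission each worker thread
# records its own command list and the queue executes them together.

def record_commands(worker_id, num_draws):
    """Each worker builds its own command list independently of the others."""
    return [f"draw:{worker_id}:{i}" for i in range(num_draws)]

def dx12_style_submit(num_workers, draws_per_worker):
    lists = [None] * num_workers

    def worker(i):
        lists[i] = record_commands(i, draws_per_worker)

    threads = [threading.Thread(target=worker, args=(i,))
               for i in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Rough analogue of submitting all recorded command lists at once:
    # flatten every worker's list into one stream for the "GPU queue".
    return [cmd for lst in lists for cmd in lst]

queue = dx12_style_submit(num_workers=4, draws_per_worker=3)
print(len(queue))  # 12 commands reached the queue, recorded on 4 threads
```

The point of the sketch is only that recording work scales out across cores while submission stays serialized, which is the claimed difference between the two APIs.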

When you look at the 7-Zip benchmark, you're seeing the kind of highly optimized, integer-heavy software workload AMD built Bulldozer to handle, and games using DX12 will be more like this than the DX11 games used in this review. Just something to bear in mind when you're thinking a dinky dual-core i3 is going to be a better value in the years to come than an AMD 6- or 8-integer-core CPU. It's not going to work that way.
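The reason a compression benchmark scales with core count is that the input can be split into chunks compressed independently. A minimal sketch, using Python's zlib as a stand-in for 7-Zip's actual LZMA code (which this is not), shows the work division:

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

# Sketch only: zlib stands in for LZMA here. The point is that each chunk is
# an independent, integer-heavy job, so more cores means more chunks at once.
def compress_chunks(data, num_threads=4):
    chunk_size = max(1, len(data) // num_threads)
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # zlib releases the GIL on large buffers, so threads run in parallel.
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        return list(pool.map(zlib.compress, chunks))

data = b"an integer-heavy, embarrassingly parallel workload " * 1000
compressed = compress_chunks(data, num_threads=4)
print(sum(len(c) for c in compressed) < len(data))  # True: output is smaller
```

Workloads shaped like this are exactly where a chip with more (slower) integer cores can beat a chip with fewer (faster) ones.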
 
That's right: your 4-, 6-, or 8-core CPU is feeding that massively powerful, hungry GPU through a skinny little straw of just one of your CPU cores.
If that were true, your 4/6/8-core would never get above 25%/17%/12% CPU usage. Dude, think about what you're saying. My 4-core i7 frequently goes up to 50% CPU. That's the equivalent of 2 cores with Hyper-Threading, or 3 cores without Hyper-Threading.
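Those percentages follow from simple arithmetic, assuming the OS reports CPU usage as a percentage of all logical cores combined:

```python
# If only one core could ever be busy, total CPU usage would cap at
# 100% divided by the number of cores.
def one_core_ceiling(cores):
    return 100.0 / cores

print(one_core_ceiling(4))  # 25.0 -> a 4-core chip would cap at 25%
print(one_core_ceiling(6))  # ~16.7
print(one_core_ceiling(8))  # 12.5

# A 4-core i7 with Hyper-Threading exposes 8 logical threads, so 50% total
# usage means about 4 busy threads, i.e. roughly 2 fully loaded physical cores.
busy_threads = 0.50 * 8
print(busy_threads)  # 4.0
```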
 
Just something to bear in mind when you're thinking a dinky dual-core i3 is going to be a better value in the years to come than an AMD 6- or 8-integer-core CPU. It's not going to work that way.
Yeah, whatever. Meanwhile in the real world...

[attached benchmark chart: ashesheavy-r9390x.png]

If that were true, your 4/6/8-core would never get above 25%/17%/12% CPU usage. Dude, think about what you're saying. My 4-core i7 frequently goes up to 50% CPU. That's the equivalent of 2 cores with Hyper-Threading, or 3 cores without Hyper-Threading.
I wouldn't spend too much time with anubis44. He (or she) regularly spams forums and comments sections on AMD's behalf. Must be a slow day at the clickbait sites.
 
I think you'll find that your AMD gaming performance more than doubles when you use your AMD CPU with AMD GPU(s). Nvidia always works better with Intel; it's not made for AMD CPUs. People buy Intel and Nvidia for the same reason they buy Apple products: they think cost equals quality. I think you'll find that an 8-core AMD FX 5 GHz CPU with four $150 graphics cards running in CrossFire, plus liquid cooling (since your top-grade AMD parts will melt without it), will outdo the top-level Intel CPU paired with, say, a $1,500 Nvidia GPU. The AMD rig is faster and less than half the cost. Yes, AMD is the cheaper investment up front, but that CPU and four GPUs with everything else are going to suck up 1,500 watts of power. Leave that thing running for a year and you've just burned through $1,300 of electricity at 15 cents per kWh (the cost where I live). I know electric bills all too well; when you have 25 of those puppies mining Litecoin and you're paying $4k a month in electricity, it's got to be profitable. I don't mine anymore, since the difficulty's so high I'd lose money on electricity costs, but I made ten times my investment.
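The electricity figure is worth a quick sanity check. A minimal calculation, assuming the quoted 15 cents/kWh: a constant 1,500 W running 24/7 actually comes to nearly $2,000 a year, so the ~$1,300 figure fits a machine running closer to 16 hours a day.

```python
# Annual electricity cost for a rig with a constant power draw.
def annual_cost(watts, cents_per_kwh, hours_per_day=24):
    kwh = watts / 1000 * hours_per_day * 365
    return kwh * cents_per_kwh / 100  # dollars

print(round(annual_cost(1500, 15)))                    # 1971 -> ~$1,971 at 24/7
print(round(annual_cost(1500, 15, hours_per_day=16)))  # 1314 -> ~$1,300 at 16 h/day
```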
 
I think you'll find that your AMD gaming performance more than doubles when you use your AMD CPU with AMD GPU(s).
May I ask what are you smoking?

The PCIe spec, the x86 spec, and AMD's x64 spec don't magically operate at double the clock depending on what card is in your PCIe slot. Unless a *program* is written specifically optimised for an AMD CPU + AMD GPU (which, mind you, are independent parts anyway when you're talking programming), "more than doubles" is just made up.
 
May I ask what are you smoking?

The PCIe spec, the x86 spec, and AMD's x64 spec don't magically operate at double the clock depending on what card is in your PCIe slot. Unless a *program* is written specifically optimised for an AMD CPU + AMD GPU (which, mind you, are independent parts anyway when you're talking programming), "more than doubles" is just made up.

The funny part being that Nvidia's GPUs perform better on AMD's CPUs than AMD's own GPUs do, at least in DX11 titles, since they don't come with the added driver overhead. Of course, AMD's GPUs also perform better on Intel CPUs, but we all know that by now, right?

Hopefully AMD can change all that soon.
 