AMD Mantle Performance: Thief & Battlefield 4

To me this review is completely unfounded and misleading. I found +78% for minimum fps in Thief at ultra 1920x1080 with an FX-8350 + 290, and that's a great improvement. So your "frames per second" figure is what, exactly? Minimum, average, maximum? If you print a single number and don't explain it, it doesn't mean anything at all, so I don't trust you, I trust myself. As for BF4, frame times are totally smooth compared to D3D, with only some random spikes, and those are expected teething issues. Why don't you mention any of this?

It’s a sad day when you learn that your CPU was robbing you of 80% of your performance.
You're completely wrong there.
How can a CPU possibly be robbing performance, and how could that be fixed by a piece of software?
The whole point, confirmed by what can already be seen of DX12, is that DX11 loads the CPU heavily, and does it in a terrible way.
Under DX11, multithreading is pretty hard to use, and draw calls load the CPU a lot.
You seem to believe that is not shocking in 2014. Neither AMD nor Intel can improve IPC much anymore, and CPU gains have been obtained by adding more cores for close to 10 years now.
DX11 was robbing performance, nothing else.
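To put a concrete (if simplified) picture on the draw-call point above, here is a minimal toy model in plain C++. It is not real D3D11 or Mantle code, and the call counts and per-call costs are invented purely for illustration; it only demonstrates that when every draw call carries a fixed CPU cost, a thinner per-call cost (the Mantle/DX12-style pitch) directly shrinks the CPU side of the frame.

```cpp
// Toy model of per-draw-call CPU overhead (not real D3D11 or Mantle code).
// Each "draw call" burns a fixed amount of fake driver work on the CPU.
// The cost values are invented; only the relative difference matters.
#include <chrono>
#include <cstdio>

constexpr long kDrawCallsPerFrame = 10000;

volatile long sink = 0;  // volatile so the fake work is not optimized away

static void fake_driver_work(long units) {
    for (long i = 0; i < units; ++i) sink += i;
}

static double frame_cpu_time_ms(long costPerCall) {
    auto t0 = std::chrono::steady_clock::now();
    for (long d = 0; d < kDrawCallsPerFrame; ++d)
        fake_driver_work(costPerCall);           // one "draw call"
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
    double thick = frame_cpu_time_ms(2000);  // thick runtime/driver path
    double thin  = frame_cpu_time_ms(200);   // thin runtime/driver path
    std::printf("thick API: %.1f ms of CPU per frame\n", thick);
    std::printf("thin  API: %.1f ms of CPU per frame\n", thin);
}
```

On a CPU with weaker per-core performance the "thick" figure grows accordingly, which is this thread's own argument for why the FX parts gain the most when the per-call overhead is cut.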
Worse, you said:
Looking at things realistically if you own an AMD CPU for gaming you are unlikely to be using an FX-8350 with the R9 290X.
Why, just why? How do you choose your components when building a rig?
I personally have a price limit, and the GPU is that limit 99.9% of the time. Why would I spend more than 350€ on a 4770K? Even with a 290X, you are never CPU-limited with an 8350 in any real-life situation, and now less than ever.
What I mean is that I will buy the cheapest possible CPU, since it is not the limiting factor, in order to buy the most powerful GPU possible, raise the frame rate and play at ultra settings. And that is exactly what Mantle lets me do: use a cheap 8350 so I can buy an overpowered GPU, and end up with a rig that is faster and cheaper than a 4770K + R9 290. A 4770K is 350€ here and an 8350 is 150€, so I save 200€, and I can get an R9 290 for 320€. So 4770K + R9 290 is 670€, while 8350 + R9 290X is 550€: cheaper and faster. 4770K + R9 290X is 720€; 8350 + 2 x R9 290 is 790€: a bit more expensive, and a lot faster.

Buying a 290 and a 4770K is actually stupid now. You just spend money on an overpowered CPU to overcome a DX11 limitation, and that prevents you from buying a 290X.
By the way, here in Europe I can get a 290X for just 399€, with BF4 included.
 
In other words, Mantle is useless unless you're a doofus system builder who likes to pair a $400+ graphics card with a cheap $100 processor.

They said the 6350 and the 8350 performed the same. Granted, I have the 6300, but that performs the same as the 6350 with any overclock at all. I also have the R9 280X, which is now $400 in places. Am I a doofus for only getting a $100 CPU when even a CPU-intensive game like BF4 only restricts my GPU by two frames per second? In every game I play, I get the same fps as the benchmarks I see on here or Tom's Hardware, within about that margin.

You have to realize that people build PCs for different things. Not everyone needs to play ARMA III, so why would anyone who just games need an i7? Yeah, it's better sometimes, kind of, but I absolutely got the most out of my budget. And now with developers optimizing better for the CPU, I will need a new motherboard before I get bottlenecked.
 
In other words, Mantle is useless unless you're a doofus system builder who likes to pair a $400+ graphics card with a cheap $100 processor.

They said the 6350 and the 8350 performed the same. Granted, I have the 6300, but that performs the same as the 6350 with any overclock at all. I also have the R9 280X, which is now $400 in places. Am I a doofus for only getting a $100 CPU when even a CPU-intensive game like BF4 only restricts my GPU by two frames per second? In every game I play, I get the same fps as the benchmarks I see on here or Tom's Hardware, within about that margin.

You have to realize that people build PCs for different things. Not everyone needs to play ARMA III, so why would anyone who just games need an i7? Yeah, it's better sometimes, kind of, but I absolutely got the most out of my budget. And now with developers optimizing better for the CPU, I will need a new motherboard before I get bottlenecked.

I was talking about the FM2+ series CPUs they were using with the 290X.
 
You're completely wrong there.

How can a CPU possibly be robbing performance, and how could that be fixed by a piece of software?

The whole point, confirmed by what can already be seen of DX12, is that DX11 loads the CPU heavily, and does it in a terrible way.

Under DX11, multithreading is pretty hard to use, and draw calls load the CPU a lot.

You seem to believe that is not shocking in 2014.

Seems hard to be completely wrong when the benchmark results in our article and others on the Internet back up exactly what I am saying. You can bang on all you want about DX11 being inefficient, and while that might be true, in reality it makes very little difference.

How can you look at the Thief results with the Radeon R9 290X and draw any other conclusion? At 2560x1600 the Core i3-4130 averaged 42fps, while the FX-8350 managed 33fps, and then 42fps with Mantle. It's the same story at the lower resolutions as well; in fact, things just get worse for the FX-8350 using DX11, while the pokey little Core i3-4130 is able to get the most out of the R9 290X at 1920x1200.

Neither AMD nor Intel can improve IPC much anymore, and CPU gains have been obtained by adding more cores for close to 10 years now.

DX11 was robbing performance, nothing else.

Not sure how to respond to this, have you been hiding under a rock for the past 7 years? AMD’s core efficiency is very weak when compared to Intel’s; that is the entire problem!

Worse, you said:
Why, just why? How do you choose your components when building a rig?

I personally have a price limit, and the GPU is that limit 99.9% of the time. Why would I spend more than 350€ on a 4770K? Even with a 290X, you are never CPU-limited with an 8350 in any real-life situation, and now less than ever.

What I mean is that I will buy the cheapest possible CPU, since it is not the limiting factor, in order to buy the most powerful GPU possible, raise the frame rate and play at ultra settings. And that is exactly what Mantle lets me do: use a cheap 8350 so I can buy an overpowered GPU, and end up with a rig that is faster and cheaper than a 4770K + R9 290. A 4770K is 350€ here and an 8350 is 150€, so I save 200€, and I can get an R9 290 for 320€. So 4770K + R9 290 is 670€, while 8350 + R9 290X is 550€: cheaper and faster. 4770K + R9 290X is 720€; 8350 + 2 x R9 290 is 790€: a bit more expensive, and a lot faster.

Buying a 290 and a 4770K is actually stupid now. You just spend money on an overpowered CPU to overcome a DX11 limitation, and that prevents you from buying a 290X.

By the way, here in Europe I can get a 290X for just 399€, with BF4 included.

Who said anything about buying the Core i7-4770K? Again, your rant seems misinformed. As always, we recommend the Core i5 for gaming; certain models can be had for the same price as the FX-8350 and deliver much better performance overall. I am willing to bet I have tested significantly more games for CPU scaling than you have, and the evidence is compelling. Do yourself a favour and read our last dozen or so gaming performance articles.

Moreover, getting back to the Core i3 example I just gave in Thief, how can you rant about the FX-8350/R9 290 combo being intelligent when a Core i3, which costs much less than the FX-8350, delivers better gaming performance? Hell, even in Battlefield 4 the Core i3 matched the FX-8350 with both DX11 and Mantle.
 
Seems hard to be completely wrong when the benchmark results in our article and others on the Internet back up exactly what I am saying. You can bang on all you want about DX11 being inefficient, and while that might be true, in reality it makes very little difference.

How can you look at the Thief results with the Radeon R9 290X and draw any other conclusion? At 2560x1600 the Core i3-4130 averaged 42fps, while the FX-8350 managed 33fps, and then 42fps with Mantle. It's the same story at the lower resolutions as well; in fact, things just get worse for the FX-8350 using DX11, while the pokey little Core i3-4130 is able to get the most out of the R9 290X at 1920x1200.
Do you know what an API is?
Please explain how, by reading your results, the CPU can be the issue. And how can a slow CPU be magically made faster just by using another piece of software? Seriously?
Your tests just show how inefficient DirectX 11 is at using the CPU, and how DirectX 11 is unable to use a multicore processor.
Otherwise, Mantle would not have been able to give any improvement.
Not sure how to respond to this, have you been hiding under a rock for the past 7 years? AMD’s core efficiency is very weak when compared to Intel’s; that is the entire problem!
No, that's not the problem. Since I was not hiding under a rock during the last years, I saw that nobody gets much improvement in per-core performance anymore, and that it is a lot easier and more efficient to add more cores. Single-core CPUs almost don't exist anymore, and mainstream CPUs have at least 4 cores. Software that cannot use 4 cores is the issue, no matter what. AMD was hit first, because they chose the multicore route first, but Intel CPUs have exactly the same problem. Having OK performance is not satisfying when just one core of the four in a powerful CPU is used at 90%.
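A similarly simplified sketch of the "one core at 90%" point: the same fixed budget of CPU work is run once on a single thread and once split across four worker threads. On a quad-core or better CPU the split run finishes in roughly a quarter of the time, which is the kind of headroom a multithread-friendly API can expose and a single-threaded submission path cannot. Again this is plain C++ with made-up numbers, not engine or API code.

```cpp
// Toy illustration of single-threaded vs multithreaded CPU work (not engine code).
// The same total amount of work is done by 1 thread, then split across 4 threads.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

std::atomic<long> sink{0};  // published result so the loops are not optimized away

static void cpu_work(long units) {
    long local = 0;
    for (long i = 0; i < units; ++i) local += i;
    sink += local;
}

static double run_ms(int threads, long totalUnits) {
    auto t0 = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (int t = 0; t < threads; ++t)
        pool.emplace_back(cpu_work, totalUnits / threads);
    for (auto& th : pool) th.join();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
    const long work = 400000000;  // arbitrary per-frame CPU budget
    std::printf("1 thread : %.1f ms\n", run_ms(1, work));
    std::printf("4 threads: %.1f ms\n", run_ms(4, work));
}
```

Whether a real game gets anywhere near that scaling depends on the engine and the API actually allowing work to be built on several threads, which is exactly what is being argued here.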

Who said anything about buying the Core i7-4770K? Again, your rant seems misinformed. As always, we recommend the Core i5 for gaming; certain models can be had for the same price as the FX-8350 and deliver much better performance overall. I am willing to bet I have tested significantly more games for CPU scaling than you have, and the evidence is compelling. Do yourself a favour and read our last dozen or so gaming performance articles.
I am sorry to inform you that you're not the only one on Earth to do CPU reviews. Google is your friend.
Also, you have no idea who I am, what my job is, or what I studied at university. Do yourself a favour and don't try that "I know better than you, just shut the **** up" thing.
Moreover, getting back to the Core i3 example I just gave in Thief, how can you rant about the FX-8350/R9 290 combo being intelligent when a Core i3, which costs much less than the FX-8350, delivers better gaming performance? Hell, even in Battlefield 4 the Core i3 matched the FX-8350 with both DX11 and Mantle.
Maybe because the main issue with CPUs on gaming rigs seems to be DirectX 11. Maybe because Mantle just shows that. Recent OpenGL benchmarks show that. Hell, even Microsoft recognized it, saying that this is mainly what they will work on for DirectX 12. So choosing a CPU right now just to work around a software limitation, a limitation that may no longer exist in a few months, is, from my point of view, not really a good choice from anything but a very short-term view.
If I buy a new rig now, I will choose something that will perform well next year, when per-core performance will no longer mean much and when I will be able to fully use my GPU.

My rant was against DirectX 11, mainly. And I still can't understand how, by reading YOUR results, you can think otherwise.
 
don't try that "I know better than you, just shut the **** up" thing.
Why not? That is the way it appears to me. I don't see compelling evidence to change our minds. Oh, and don't come here stating an argument and then suggest Google is your friend. If you are going to state an argument, the way that usually works is that you bring the facts, or at least links.
 
Why not? That is the way it appears to me. I don't see compelling evidence to change our minds. Oh, and don't come here stating an argument and then suggest Google is your friend. If you are going to state an argument, the way that usually works is that you bring the facts, or at least links.
Because it's a childish argument.
I studied electronics and computer architecture at university, and I have worked for ten years in electronics and computing. My current job is integrating custom-developed software within AIX clusters. Even if it is more about servers and server-related hardware, I am not exactly a newbie, especially not about multicore software capabilities. The main difference between him and me is that I do not pretend to be the king of computer hardware.
I said that Google is his friend because any half-literate imbecile could find plenty of literature on the internet using Google. He is the journalist, not me; he is supposed to know this. I could take some time to find some links for him, but I am not sure it would even be useful when his own benchmark results suggest I am right.
Just give me an answer to this one question: how could a piece of software possibly make a slow and deficient CPU fast and competitive, from a technical point of view? Then I would happily agree that AMD CPUs are the issue.
 
Just give me an answer to this one question: how could a piece of software possibly make a slow and deficient CPU fast and competitive, from a technical point of view? Then I would happily agree that AMD CPUs are the issue.
That is not what he is suggesting. He (and I) are suggesting that AMD CPUs are deficient and that the piece of software is designed to take the load off the CPU and give it to the GPU, which in turn makes it appear as if the software makes the deficient CPU more efficient.
 
That is not what he is suggesting. He (and I) are suggesting that AMD CPUs are deficient and that the piece of software is designed to take the load off the CPU and give it to the GPU, which in turn makes it appear as if the software makes the deficient CPU more efficient.
But you did not answer the question.
What you describe seems to be fixed by using Mantle, which is, as far as I know, a piece of software.

So, how can a piece of software fix a bad CPU design?
And I tried to explain that performance difference, but I got no answer on this either: what was wrong with my reasoning?
 
It doesn't fix anything. It is AMD's BS response to Intel's more efficient CPUs.

Thanks for your efforts here, cliffordcooley, but I have given up. This conversation no longer interests me. He/she is completely off topic with their comments and is missing the point. There is clearly a large ego at work here, so let’s move on.
 
DX11 8350 vs i3 ??? I think your benchmarks may need looking at, as not even the i5 matches the 8350.
Awesome, which board of those listed did the Russian site benchmark the LGA 2011 socket CPU with... the LGA1150, 1155, 1156, or 1366?
Just to note, the i3 Steve used is a Haswell (3.4GHz); the i3 the Russians used is the two-generations-older 2100 (3.1GHz). There is a reasonable performance difference there.
Just an aside: the same Russians were one of the few that had the R9 295X2 being spanked by a GTX 780 Ti SLI setup at 4K and 1600p:
http--www.gamegpu.ru-images-stories-Test_GPU-Videocards-Radeon_R9_295X2_-tests-bf4_4k.jpg
http--www.gamegpu.ru-images-stories-Test_GPU-Videocards-Radeon_R9_295X2_-tests-bf4_2560.jpg
 
Well, personally I went from 69fps on average in BF4 to never going under 75 (in SP); the average would land around 85 (it went between 75 and 110). Thief gave me a nice boost as well. No surprise perhaps, since I use the 8320 (@4.6GHz). Three of the biggest engines in gaming have incorporated Mantle or are doing so as we write. Don't think the devs would bother if all they saw was a 5% increase. But the MAJOR, say again, MAJOR thing Mantle brought was the elimination of screen tearing. I can play both Thief and BF4 without v-sync on my 42" LCD TV (75Hz), something that was impossible in DX. Can't stand tearing. Hate it. That is what I love about Mantle. The extra fps are just icing on the cake, and I hope they optimize it further. Thief runs even smoother now with the official 14.4 driver. Just saying... =)
 
Thanks for the feedback. Your situation isn't surprising, as we know the FX-8320 performs fairly poorly in those two games. Those with Intel Core processors probably don't suffer from the screen tearing in DX11 like you do, so it's not a DX11 issue. I know both those games appear silky smooth for me using, say, a Core i5 processor on 60Hz up to 144Hz screens.

What I suspect is that because your minimum and maximum frame rates are so far apart, due to the CPU, you see quite a lot of tearing. Using Mantle stabilizes and improves your frame rate and therefore removes the tearing you were seeing.

In any case Mantle is a good deal for you and other AMD CPU users.
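For what it's worth, a quick back-of-the-envelope conversion of the figures reported above (the 75Hz panel, the 69fps pre-Mantle average, and the 75-110fps Mantle range, taken at face value from the comment) is consistent with that reading: under Mantle the reported frame times stay at or below one refresh interval, while the DX11 average does not.

```cpp
// Converts the refresh rate and frame rates reported in the comments above
// into per-frame times. Purely arithmetic; no measurement is performed.
#include <cstdio>

int main() {
    const double refresh_hz = 75.0;                  // 42" TV mentioned above
    const double refresh_ms = 1000.0 / refresh_hz;   // ~13.3 ms per scan-out

    const double dx11_avg_fps   = 69.0;              // reported DX11 average
    const double mantle_min_fps = 75.0;              // reported Mantle range
    const double mantle_max_fps = 110.0;

    std::printf("refresh interval : %.1f ms\n", refresh_ms);
    std::printf("DX11 avg frame   : %.1f ms\n", 1000.0 / dx11_avg_fps);
    std::printf("Mantle frames    : %.1f - %.1f ms\n",
                1000.0 / mantle_max_fps, 1000.0 / mantle_min_fps);
}
```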
 