AMD Mantle Performance: Thief & Battlefield 4

Many people seem to miss the real benefit of this API: it basically allows you to pair a stronger GPU with a lesser CPU. If you have a limited budget and want the best gaming performance, you will aim to spend as much of that money as possible on the GPU while not being bottlenecked by the CPU the remaining money can buy. That is exactly the point the conclusion skips. It says it is not very probable that someone matches an 8350 with a 290X, and that an 8350 with a 280X is more likely. Why? If the 8350 didn't bottleneck the 290X in DX11, I am sure many people would prefer that pairing over a 4770 and a 280X. This is what the API is supposed to offer us consumers. By the way, I'm not implying that Mantle has achieved this, nor am I an AMD fanboy. I've got a 3770 and a 290, and I want the best performance for the least money.
 
Many people seem to miss the real benefit of this API: it basically allows you to pair a stronger GPU with a lesser CPU. ... I've got a 3770 and a 290, and I want the best performance for the least money.

So you're secretly trying to tell us you're a "poor man's fanboy"? xD
 
Well, they didn't. I was mostly referring to the single-player tests at HardOCP as well, and to the frame time variance, which was very smooth in BF4 with Mantle online. But of course everyone's tests are going to be different, because all hardware acts differently and scenarios will change with even the smallest of details.
You'd also need to take into account which maps are used for benching, and of course Steve is using the single-player campaign whilst Kyle and Co. are using multiplayer.
 
You'd also need to take into account which maps are used for benching, and of course Steve is using the single-player campaign whilst Kyle and Co. are using multiplayer.
True; I too was referring to multiplayer with my setup, since the campaign is more or less nothing special.

It's hard to test multiplayer since the game has no built-in benchmark, and there is a huge difference between how multiplayer and single player behave.
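For what it's worth, the minimum/average figures and the "frame time variance" mentioned above all fall out of the same kind of frame-time capture that tools like FRAPS produce. A minimal sketch of that arithmetic in C++; the file name and the one-value-per-line format are assumptions for illustration:

#include <algorithm>
#include <cmath>
#include <fstream>
#include <iostream>
#include <vector>

int main() {
    // Assumed input: one frame time in milliseconds per line (a FRAPS-style dump).
    std::ifstream in("frametimes.csv");
    std::vector<double> ms;
    for (double t; in >> t; ) ms.push_back(t);
    if (ms.empty()) return 1;

    double sum = 0.0, worst = 0.0;
    for (double t : ms) { sum += t; worst = std::max(worst, t); }
    double avg = sum / ms.size();

    // Variance of the frame times (the "smoothness" being argued about).
    double var = 0.0;
    for (double t : ms) var += (t - avg) * (t - avg);
    var /= ms.size();

    std::cout << "avg fps: " << 1000.0 / avg << "\n"
              << "min fps: " << 1000.0 / worst << "\n" // slowest frame = lowest fps
              << "frame time std dev (ms): " << std::sqrt(var) << "\n";
}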
 
I feel it was rushed. It could possibly have been better if they had waited.
He's just trying to get a reaction out of people; ignore the troll.

But anyway, @JC713, as stated it removes most of the CPU overhead and adds to the stability/smoothness of the game (now that it works properly), especially in the case of BF4. Before, on 14.1, it crashed constantly, but now on multiplayer it's definitely more stable than before and keeps a more constant FPS (at least for me).

But in short, since it's working it's got its advantages, and of course time will bring some improvements (at least I hope).
 
Hilarious anti-AMD 'review' as always.
1) The disgusting claim that no-one would pair a 290 with an AMD processor. Clearly, the 290 is the biggest thing to have happened to 'value' AAA gaming in a long time, simply because the 28nm process has ended up kicking around for far, far longer than either Nvidia or AMD anticipated. So, many people have a perfectly good PC with a non-i5/i7 processor that they would like to upgrade with a simple, long-lasting act, and putting in a 290 is such an act.

2) The disgusting implication that Intel CPUs are inherently better for gaming. Mantle proves the exact opposite. Mantle does NOT make a CPU work 'harder'. Mantle (in BF4 and Thief 2014) simply changes the way the CPU talks to the GPU. So why does Mantle make games go faster? Because Intel paid Microsoft to create a driver model that exploits LEGACY modes in the CPU that deal with obsolete forms of connectivity between the CPU and the bus, legacy modes that Intel processes faster than AMD. Mantle dumps the artificially crippled DX driver layer and replaces it with a modern bus communication system.
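For reference, the uncontroversial core of the claim above is draw-submission overhead: a DX11-class driver validates and translates state for every draw call on a single thread, while Mantle-style APIs let the application record command buffers ahead of time and replay them cheaply. A toy sketch of that difference in C++; every type and function here is invented for illustration and is not Mantle's real API:

#include <vector>

// Everything below is made up to illustrate the idea, not a real graphics API.
struct DrawCall { int mesh; int material; };
struct CommandBuffer { std::vector<DrawCall> calls; };

// Stand-ins for the work a DX11-class driver does per draw call.
void validateAndTranslate(const DrawCall&) { /* state checks, hazard tracking */ }
void submitToGpu(const DrawCall&) { /* kick off one draw */ }
void submitToGpu(const CommandBuffer&) { /* replay a pre-built buffer */ }

// DX11-style immediate mode: the per-draw driver work is paid every frame,
// on the single thread that owns the device context.
void drawFrameImmediate(const std::vector<DrawCall>& scene) {
    for (const DrawCall& dc : scene) {
        validateAndTranslate(dc);
        submitToGpu(dc);
    }
}

// Mantle/DX12-style: record command buffers up front (and on any thread),
// then replay them each frame for a near-constant CPU cost.
CommandBuffer recordScene(const std::vector<DrawCall>& scene) {
    CommandBuffer cb;
    for (const DrawCall& dc : scene) {
        validateAndTranslate(dc); // cost paid once, at record time
        cb.calls.push_back(dc);
    }
    return cb;
}

void drawFrameRecorded(const CommandBuffer& cb) {
    submitToGpu(cb); // cheap: the expensive work already happened
}

The record-once, replay-many split is also why the gains show up mainly when a slow CPU is paired with a fast GPU, which matches the results in the review.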

It is like how Intel, despite inventing SSE (Intel's answer to AMD's earlier 3DNow!), instructed tech sites to benchmark CPUs using obsolete x87 floating-point instructions, because Intel had a clear lead over AMD when running obsolete code. Nvidia, in its highly crippled CPU implementations of PhysX, purposely compiled against the insanely slow x87 op-codes until very recently. Why? Because it made PhysX artificially run like garbage on AMD systems.

Thief 2014 is an Nvidia title, insofar as it uses the purposely crippled Unreal 3 engine. Epic works hand-in-glove with Nvidia and Intel to ensure the Unreal 3 engine ships in a form that performs badly on AMD CPUs and GPUs. Unreal 4 is currently adding record amounts of Nvidia 'middleware' that will run on Nvidia GPUs and (strange as it may seem) the AMD-powered Xbox One and PS4, but NOT on PCs with AMD GPUs.

No one is denying that the end result of all these very dirty games is that a gamer is better off with an i7 system and a 780 Ti, if they can possibly afford such a combination. But despite the lying statements of sites like this one, many ambitious gamers will be adding a 290 to an existing, less ideally specced PC and hoping this one-time investment will serve them well across the next three years.

The current Mantle initiative, and the coming Mantle-based DX12, will certainly solve the current problem of AMD processors being artificially crippled. But the dirty tricks from Nvidia, inserting Nvidia-only features into PC game code, look like they are going to get a whole lot worse. The Witcher 3 people, for instance, have just announced that they have taken a massive pay-off from Nvidia to make the AMD-GPU PC version of The Witcher 3 vastly inferior to the Nvidia PC and console versions.
 
Hilarious anti-AMD 'review' as always. 1) The disgusting claim that no-one would pair a 290 with an AMD processor. ...
Umm, dude, TechSpot just gave their results and opinions regarding what they have seen, using the configurations more gamers are accustomed to. While some of what you say is true, more gamers are going to pair an i5 with a 290/290X, because the price points are pretty close together and it performs excellently for gaming.

Most of what was said is that removing the CPU from the mix helps in many cases, but most PC gamers are running i5s or i7s in this day and age, especially with AMD's (at least as of now) dropping of the AM3+ platform.

Fact is, cool down, because they gave their viewpoint and provided facts. This is also coming from a site that normally rates AMD GPU hardware higher than Nvidia's, and higher than other sites do. They are far from biased.
 
It's about the lowest framerate, so 140 compared to 100 is in reality a 40% increase.
Yes, I can see that. But is the 100 or 140 considered the reference point, when they are both considered drops in frame rate? In this case it seems the established reference point is the 200 max frame rate.
 
It's about the lowest framerate, so 140 compared to 100 is in reality a 40% increase.
Yes, I can see that. But is the 100 or 140 considered the reference point, when they are both considered drops in frame rate? In this case it seems the established reference point is the 200 max frame rate.
The reference point for most gamers today is achieving at least 60 fps in any conditions, so the 100 or 140 minimum framerate doesn't really mean anything as a reference point. It only shows us that a specific piece of hardware had a 40% increase in minimum framerate when using Mantle.
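The disagreement here is just over the denominator. A quick worked example in C++, using the figures quoted in this exchange purely for illustration:

#include <iostream>

// Percent increase relative to a baseline: (new - old) / old * 100.
double pctIncrease(double oldVal, double newVal) {
    return (newVal - oldVal) / oldVal * 100.0;
}

int main() {
    // Minimum framerates quoted above: 100 fps (DX11) vs 140 fps (Mantle).
    std::cout << pctIncrease(100.0, 140.0) << "%\n"; // 40: baseline is the old minimum
    // Measured against the 200 fps maximum instead, the same 40 fps gap
    // is only 20 percentage points:
    std::cout << (140.0 - 100.0) / 200.0 * 100.0 << "%\n"; // 20
}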
 
"Core i3 through i7 owners can hope to see a 5% boost (a few fps) with Mantle but that's not enough to recommend it over the more mature DirectX 11 API for now."

Sorry, but what are you "not recommending" here?

Are you recommending that developers not use an API? (Doing that based on this review would be hilarious.)

That users not enable Mantle on their systems?

That they not consider Mantle as a feature when choosing their next video card?

If you believe, based on that statement, that we are recommending developers not use Mantle, then yes, that is hilarious. Given the results, we recommend that gamers, particularly those playing BF4, stick with DX11. The small performance improvement (if you receive one at all) isn't worth the stability trade-off. BF4 isn't the most stable game out there, but we found it was more stable with DX11. Frankly, when it comes to a next-gen video card, as long as you have a decent CPU, Mantle shouldn't be a factor.

On a side question: why is there no testing done at 1920x1080? I know a lot of gamers who run at 1080 resolution, but I never see benchmarks for it on TechSpot. Most budget gamers run at this resolution, so shouldn't it be used? Will this ever be an option for testing?

1920x1200 should be close enough to 1920x1080 for you to work out HD performance; we are talking around 11% more pixels.
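The exact figure, for anyone who wants to check the arithmetic:

#include <iostream>

int main() {
    // 1920x1200 vs 1920x1080: how much more work per frame?
    double wuxga = 1920.0 * 1200.0; // 2,304,000 pixels
    double fhd   = 1920.0 * 1080.0; // 2,073,600 pixels
    std::cout << (wuxga / fhd - 1.0) * 100.0 << "% more pixels\n"; // ~11.1
}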

I think everyone's really missing the point of Mantle... it's not really meant for R9 290 owners; in fact, it's not really designed for discrete graphics owners at all. AMD knows (oh, you better believe they KNOW) their CPUs will never catch up to Intel's. Plain and simple, Intel has far too many resources. It'd be like Sears trying to overtake Walmart.

Since the new generation of consoles has released, games will become more CPU intensive and graphically demanding (happens every generation). For AMD to compete, they need to think outside the box. AMD discrete GPUs pair well with Intel chips, so they make some money there, but the market is focusing on low-power, low-heat, low-noise, cheap APUs that do it all. Here comes Mantle to the rescue. Kaveri wasn't the knockout I was hoping for, but I bet AMD's next APUs keep the same Steamroller cores and beef up the GPU. Mantle will allow games to run well with the weak CPU and strong GPU, the same as an equivalent GPU paired with an Intel chip (I'd bet a 1024-stream-processor APU is in the works).

The moral of the story is: if you have a high-end discrete GPU, pair it with an Intel chip (a Core i5 is the best bang for your buck), but if you want to buy a cheap APU that'll handle your gaming in the living room (Steambox), AMD, like with this generation of consoles, will be your best option. The hardware isn't there yet, but it will be sooner or later.

Please check the Radeon R7 260X results. The situation is even worse with APUs; I know because we tested them (the data wasn't shown because it was useless, just as the R7 260X data is). When it comes to APUs, believe me, they are not CPU limited in games; Mantle has nothing to offer here for the most part. Mantle is indeed designed for R9 290 owners, R9 290 owners with slow CPUs to be precise.

There is a reason AMD didn't show any Mantle APU performance in their reviewer guides. They did show an APU offering huge Mantle performance gains when paired with a GeForce GTX 770 :)

Many people seem to miss the real benefit of this API: it basically allows you to pair a stronger GPU with a lesser CPU. ... If the 8350 didn't bottleneck the 290X in DX11, I am sure many people would prefer that pairing over a 4770 and a 280X. ...

You realize the FX-8350 costs $200, while the most expensive Core i5, the 4670K, costs $40 more at $240. Meanwhile, the i5-4570 can be had for $200. Either way there is a $40 price gap at most.

The Radeon R9 280X costs $350 and the Radeon R9 290X an eye-watering $600. Seems like it's cheaper to invest in a faster CPU than a faster GPU :S

Hilarious anti-AMD 'review' as always.

Thank you for the nickel.
 
Please check the Radeon R7 260X results. The situation is even worse with APUs... they are not CPU limited in games... Mantle is indeed designed for R9 290 owners, R9 290 owners with slow CPUs to be precise.

APUs are not where they need to be at the moment; however, AMD has proven they can strap an R7 265 (a Radeon 7850, basically) to their APU in the PS4. Since it's already been done, I'm sure a desktop variant is in the works. Once that level of GPU, and beyond, is achieved, the CPU will become a limiting factor.
 
APUs are not where they need to be at the moment; however, AMD has proven they can strap an R7 265 (a Radeon 7850, basically) to their APU in the PS4. ...

Maybe; that's assuming CPU performance won't have changed once they do that, and you can bet it will ;)

Plus, your PS4 example is just that, a PS4 example. If it were economical for AMD to offer an APU sporting a Radeon R7 265 for desktop computers, you can bet they would have done it.

Fact is, no one is going to invest in an expensive APU that has Piledriver performance with a gutsy GPU. Unlike the PS4, there are options on PC, and buying a Core i3/R7 265 combo just makes more sense right now.

AMD APUs only make sense if they are cheap and aren't going to be coupled with a discrete GPU. If they cost more than a Core i3 there isn't any point, and if you are going to buy a more powerful GPU then again there isn't any point.

Finally, before Archer11 comes back waving his 'Hilarious anti-AMD' flag, I should inform you that I have three HTPCs in my house and all three use an AMD APU.
 
Maybe; that's assuming CPU performance won't have changed once they do that, and you can bet it will ;) ... If it were economical for AMD to offer an APU sporting a Radeon R7 265 for desktop computers, you can bet they would have done it.
I think most of the hold-up on doing something with that much power comes down to RAM. As seen before, the 7850K (and its predecessors) is very dependent on high-speed RAM to perform at all in the gaming world. This is why I believe AMD is holding off on releasing such an APU until we have DDR4 or better.

Just my thoughts of course, a grain of salt in the ocean.
 
I think most of the hold-up on doing something with that much power comes down to RAM. ...

That is a good, valid point. It is also worth considering that AMD is a business, and a strong part of that business is their GPU sales. They would be doing nothing short of shooting themselves in the foot if they took one of their $200 GPUs, stuck it in an APU, and sold it for $200 or less. Any APU today that featured something as powerful as the R7 265 would have to cost over $200, and that's a lot for an AMD APU.
 
I like the new TechSpot 3.0 and I'm quickly becoming used to it, but this is the first review I've read in this format, and I found that I can no longer quickly navigate to the 'final thoughts' without scrolling through all the other pages. If it's a subject or product I find interesting, that's not a problem, as I will read the entire review; but not every review or subject tickles my fancy, and sometimes I just want to read the introduction and the closing. This is one of those.
I'd appreciate it if the devs would include a 'quickly navigate to page' option.

This feature has been moved to the index that you can find towards the top and on the right-hand side of every feature article. Hope this helps :)
 
Kinda reads like a hatchet piece, or at least a little biased. It does what it is supposed to do, which is alleviate the CPU bottleneck. It will truly be more useful in RTS and other games that are by nature CPU intensive. For this reason I will be interested in the performance difference in Star Citizen. Of course, you'll probably just say that AMD CPUs performed horribly without it, not that they were improved with it (like you did with Thief here)... AMD knows they are woefully behind Intel; that's why they do this, and knowing that, you should be a little more impressed.
 
That is a good, valid point. ... Any APU today that featured something as powerful as the R7 265 would have to cost over $200, and that's a lot for an AMD APU.
True; they would probably have to drop everything below whatever their next top APU has and charge accordingly for that to work, which is of course hard given the lower CPU performance and the slow DDR3 RAM.

I look forward to this year, mostly because Haswell-E will support DDR4, and soon enough I'm sure the APUs will follow suit.
 
Long story short, it's a piece of software that you can use to gain at least a small boost in gaming. I don't see anything wrong with that.
 
Kinda reads like a hatchet piece, or at least a little biased. It does what it is supposed to do, which is alleviate the CPU bottleneck. It will truly be more useful in RTS and other games that are by nature CPU intensive. For this reason I will be interested in the performance difference in Star Citizen. Of course, you'll probably just say that AMD CPUs performed horribly without it, not that they were improved with it (like you did with Thief here)...

Nvidia's PhysX does what it is supposed to as well, but we never recommended anyone invest in a GeForce just for this technology.

Saying it will be more useful in RTS games is just an assumption. While I agree it has more potential to be useful there, we have yet to see that. Yes, it can alleviate CPU overhead, but if the CPU is intended to be used, I am not sure Mantle can help. The CPU has a lot of specific jobs it has to do in an RTS game, and Mantle cannot change that, so will it really have any impact here? Time will tell.
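To make that concrete, here is a toy frame-time model in C++ (all the numbers are invented for illustration, not measurements): a thin API can only shrink the draw-submission slice of each frame, so the game-logic slice puts a hard floor under frame times no matter which API is used.

#include <iostream>

int main() {
    // Hypothetical per-frame CPU costs, in milliseconds.
    double gameLogic = 20.0;    // AI, pathfinding, simulation: untouched by the API
    double submitDx11 = 10.0;   // draw-call overhead under a DX11-style driver
    double submitMantle = 2.0;  // the same submission work under a thin API

    std::cout << "DX11:   " << 1000.0 / (gameLogic + submitDx11) << " fps\n";   // ~33
    std::cout << "Mantle: " << 1000.0 / (gameLogic + submitMantle) << " fps\n"; // ~45
    // Even with zero submission cost, the frame can never beat
    // 1000 / gameLogic = 50 fps, so the heavier the game logic,
    // the smaller the headline gain.
}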

So you are saying we will just be truthful and state the obvious about AMD CPU performance in Star Citizen? I hope so; I don't want to start lying to you all now.

AMD knows they are woefully behind Intel; that's why they do this, and knowing that, you should be a little more impressed.

I should be impressed that AMD has created an API that allows consumers to use their AMD CPUs with their high-end AMD GPUs and achieve maximum performance, or close to it, in a handful of very select titles? If that was the game plan all along, I think the word you are after is bamboozled rather than impressed.

Moreover, if that really is the case, that AMD is helping out gamers who invested in their CPUs and are missing out on performance, I would be more impressed if they provided the CPU hotfix known as Mantle to all users, AMD and Nvidia alike. It sucks if you bought an AMD CPU with an Nvidia GPU.

Long story short, it's a piece of software that you can use to gain at least a small boost in gaming. I don't see anything wrong with that.

You are right, there isn't anything wrong with that. We just don't think it is going to revolutionize gaming like AMD says it will.
 
To me this review is completely unfounded and misleading... I found +78% in minimum fps in Thief at ultra, 1920x1080, with an FX-8350 + 290, and that's a great improvement... So your "frames per second" is what? Min, avg, max? If you write just a single number and don't explain it, it doesn't mean anything at all... so I don't trust you, I trust myself... Then, BF4 is totally smooth in frame time compared to D3D; there are only some random spikes... because those are expected teething issues... why don't you mention something like this?
 
To me this review is completely unfounded and misleading... I found +78% in minimum fps in Thief at ultra, 1920x1080, with an FX-8350 + 290, and that's a great improvement...

It's a sad day when you learn that your CPU was robbing you of 80% of your performance.

FYI - Our performance numbers align almost exactly with other Thief Mantle reviews that were published around the same time.
 