TechSpot

AMD Mantle Performance: Thief & Battlefield 4

By Steve
Mar 24, 2014
  1. Many people seem to miss the real benefit of this API. It basically allows you to pair a stronger GPU with a lesser CPU. If you have a limited budget and want the best gaming performance, you will aim to spend as much of that money as possible on the GPU while not being bottlenecked by the CPU the remaining money can buy. The main point the conclusion skips is exactly this. It says that it is not very probable that someone matches an 8350 with a 290X, and that 8350 + 280X is more likely. Why? If the 8350 didn't bottleneck the 290X in DX11, I am sure many people would prefer that pairing over a 4770 and 280X. This is what this API is supposed to offer us consumers. By the way, I'm not implying that Mantle has achieved this, nor am I an AMD fanboy. I have a 3770 and a 290 and I want the best performance for the least money.
  2. wastedkill

    wastedkill TS Maniac Posts: 1,144   +241

    So you're secretly trying to tell us you're a "poor man's fanboy"? xD
  3. dividebyzero

    dividebyzero trainee n00b Posts: 4,836   +667

    You'd also need to take into account which maps are used for benching, and of course, Steve is using the single player campaign, whilst Kyle and Co are using multiplayer.
  4. GhostRyder

    GhostRyder TS Evangelist Posts: 2,206   +518

    True, since I too was referring to multiplayer with my setup, as the campaign is more or less nothing special.

    It's hard to test multiplayer since the game has no built-in benchmark and there is a huge difference between the way multiplayer and single player behave.
  5. JC713

    JC713 TS Evangelist Posts: 6,939   +899

    Steve: Yeah, and the charts need a TechSpot watermark to make them more official :D.

    I feel it was rushed. It could possibly be better if they waited.
  6. mosu

    mosu TS Enthusiast Posts: 303

    It's about the lowest framerate, so 140 compared to 100 is in reality a 40% increase.
  7. GhostRyder

    GhostRyder TS Evangelist Posts: 2,206   +518

    He's just trying to get a reaction out of people, ignore the troll.

    But anyway, @JC713, as stated it removes most of the CPU overhead and adds to the stability/smoothness of the game (now that it works properly), especially in the case of BF4. Before, on 14.1, it crashed constantly, but now in multiplayer it's definitely more stable than before and keeps a more consistent FPS (at least for me).

    But in short, since it's working it has its advantages, and of course time will see some improvements (at least I hope).
  8. Archer11

    Archer11 TS Rookie

    Hilarious anti-AMD 'review' as always.
    1) The disgusting claim that no-one would pair a 290 with an AMD processor. Clearly, the 290 is the biggest thing to have happened to 'value' AAA gaming in a long time, simply because the 28nm process has ended up kicking around for far, far longer than either Nvidia or AMD anticipated. So, many people have a perfectly good PC with non-i5/i7 processors that they would like to upgrade with a simple, long-lasting act, and putting in a 290 is such an act.

    2) The disgusting implication that Intel CPUs are inherently better for gaming. Mantle proves the exact opposite. Mantle does NOT make a CPU work 'harder'. Mantle (in BF4 and Thief 2014) simply changes the way in which the CPU talks to the GPU. So, why does Mantle make games go faster? Because Intel paid Microsoft to create a driver model that exploits LEGACY modes in the CPU that deal with obsolete forms of connectivity between the CPU and the bus - legacy modes that Intel processes faster than AMD. Mantle dumps the artificially crippled DX driver layer, and replaces it with a modern bus communication system.

    It is like how Intel, despite inventing SSE (Intel's version of AMD's earlier 3DNow!), instructed tech sites to benchmark CPUs using obsolete x87 floating-point instructions, because Intel had a clear lead over AMD when running obsolete code. Nvidia, in its highly crippled CPU implementations of PhysX, purposely compiled against the insanely slow x87 op-codes until very recently. Why? Because it made PhysX artificially run like garbage on AMD systems.

    Thief 2014 is an Nvidia title, in so far as it uses the purposely crippled Unreal 3 engine. Epic work hand-in-glove with Nvidia and Intel to ensure the Unreal 3 engine ships in a form that performs badly on AMD CPUs and GPUs. Unreal 4 is currently adding record amounts of Nvidia 'middleware' that will run on Nvidia GPUs and (strange as it may seem) the AMD powered Xbox One and PS4, but NOT on PCs with AMD GPUs.

    No-one is denying that the end result of all these very dirty games is that a gamer is better off with an i7 system and a 780TI, if they can possibly afford such a combination. But despite the lying statements of sites like this one, many ambitious gamers will be adding a 290 to an existing, less ideally specced PC, and hoping this one-time investment will serve them well across the next three years.

    The current Mantle initiative, and coming Mantle based DX12 will certainly solve the current problem of AMD processors being artificially crippled. But, the dirty tricks from Nvidia, inserting Nvidia-only features into PC-game code, look like they are going to get a whole lot worse. The Witcher 3 people, for instance, have just announced that they have taken a massive pay-off from Nvidia to make the AMD-GPU-PC version of Witcher 3 vastly inferior to the Nvidia-PC, and console versions.
    mosu likes this.
  9. GhostRyder

    GhostRyder TS Evangelist Posts: 2,206   +518

    Umm dude, TechSpot just gave their results and opinions based on what they have seen and the setups most gamers are accustomed to. While some of what you say is true, more gamers are going to pair an i5 with a 290/290X because the price points are pretty close together and it performs excellently for gaming.

    Most of what was said was that removing the CPU bottleneck from the mix helps in many cases, but most PC gamers are running i5s or i7s in this day and age, especially with AMD dropping (at least as of now) the AM3+ platform.

    Fact is, cool down, because they gave their viewpoint and provided facts, and this is coming from a site that normally rates AMD GPU hardware higher than Nvidia's, and higher than many other sites do. They are far from biased.
  10. cliffordcooley

    cliffordcooley TechSpot Paladin Posts: 5,962   +1,482

    Yes, I can see that. But is the 100 or 140 considered the reference point, when they are both considered drops in frame rate? In this case it seems the established reference point is the 200 max frame rate.
  11. mosu

    mosu TS Enthusiast Posts: 303

    The reference point for most gamers today is achieving at least 60 fps in any conditions, so the 100 or 140 minimum framerate doesn't really mean anything as a reference point. It only shows us that specific hardware saw a 40% increase in framerate when using Mantle.
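    For anyone following along, here is the arithmetic both readings imply, as a quick Python sketch (the 100, 140 and 200 figures are simply the ones quoted in this thread, not new measurements):

        # Minimum frame rates quoted in this discussion
        dx11_min = 100    # minimum fps under DX11
        mantle_min = 140  # minimum fps under Mantle
        peak = 200        # the 200 fps maximum used as the other reference point

        gain = (mantle_min - dx11_min) / dx11_min * 100   # 40.0 -> 40% higher minimum with Mantle
        drop_dx11 = (peak - dx11_min) / peak * 100        # 50.0 -> DX11 minimum sits 50% below the peak
        drop_mantle = (peak - mantle_min) / peak * 100    # 30.0 -> Mantle minimum sits 30% below the peak
        print(gain, drop_dx11, drop_mantle)

    Either way you slice it, the absolute improvement is the same 40 fps; only the baseline you divide by changes the percentage.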
     
  12. Steve

    Steve TechSpot Staff Topic Starter Posts: 1,351   +439 Staff Member

    If you believe, based on that statement, that we are recommending developers not use Mantle then yes, that is hilarious. Given the results we recommend that gamers, particularly those playing BF4, stick with DX11. The small performance improvement (if you receive one at all) isn’t worth the stability trade-off. BF4 isn’t the most stable game out there, but we found it was more stable with DX11. Frankly, when it comes to a next-gen video card, as long as you have a decent CPU Mantle shouldn’t be a factor.

    1920x1200 should be close enough to 1920x1080 for you to work out HD performance; we are talking around 10% more pixels.

    Please check the Radeon R7 260X results. The situation is even worse with APUs, I know because we tested them (the data wasn't shown because it was useless, just as the R7 260X data is). When it comes to APUs, believe me, they are not CPU limited in games; Mantle has nothing to offer here for the most part. Mantle is indeed designed for R9 290 owners, R9 290 owners with slow CPUs to be precise.

    There is a reason AMD didn't show any Mantle APU performance in their review guides. They did show an APU offering huge Mantle performance gains when paired with a GeForce GTX 770 :)

    You realize that the FX-8350 costs $200 while the most expensive Core i5, the 4670K, costs $40 more at $240. Meanwhile the i5-4570 can be had for $200. Either way there is a $40 price gap at most here.

    The Radeon R9 280X costs $350 and the Radeon R9 290X an eye-watering $600. Seems like it's cheaper to invest in a faster CPU than a faster GPU :S
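    To put those numbers side by side, a quick Python sketch using only the prices and resolutions quoted in this post (March 2014 US pricing):

        # CPU prices quoted above (USD)
        fx_8350, i5_4670k, i5_4570 = 200, 240, 200
        # GPU prices quoted above (USD)
        r9_280x, r9_290x = 350, 600

        cpu_gap = i5_4670k - fx_8350   # 40  -> at most $40 to step up to the fastest Core i5
        gpu_gap = r9_290x - r9_280x    # 250 -> $250 to step up from a 280X to a 290X

        # The resolution note: 1920x1200 versus 1920x1080
        extra_pixels = (1920 * 1200) / (1920 * 1080) - 1   # ~0.11, i.e. roughly 11% more pixels
        print(cpu_gap, gpu_gap, extra_pixels)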

    Thank you for the nickel.
    Last edited: Mar 25, 2014
  13. theBest11778

    theBest11778 TS Enthusiast Posts: 151   +32

    APUs are not where they need to be at the moment, however AMD has proven they can strap an R7 265 (basically a Radeon 7850) to their APUs in the PS4. Since it's already been done I'm sure a desktop variant is in the works. Once that level of GPU, and beyond, is achieved the CPU will become the limiting factor.
  14. Steve

    Steve TechSpot Staff Topic Starter Posts: 1,351   +439 Staff Member

    Maybe, but that's assuming that once they do, CPU performance won't have changed, and you can bet it will ;)

    Plus your PS4 example is just that, a PS4 example. If it were economical for AMD to offer an APU sporting a Radeon R7 265 for desktop computers you can bet they would have done it.

    Fact is, no one is going to invest in an expensive APU that has Piledriver performance with a gutsy GPU. Unlike the PS4, there are options on PC, and buying a Core i3/265 combo just makes more sense right now.

    AMD APUs only make sense if they are cheap and aren't going to be coupled with a discrete GPU. If they cost more than a Core i3 there isn't any point, and if you are going to buy a more powerful GPU then again there isn't any point.

    Finally, before Archer11 comes back waving his ‘Hilarious anti-AMD’ flag, I should inform you that I have three HTPCs in my house and all three use an AMD APU.
  15. GhostRyder

    GhostRyder TS Evangelist Posts: 2,206   +518

    I think most of the holdup on doing something with that kind of power comes down to RAM. As seen before, the 7850K (and its predecessors) are very dependent on high-speed RAM to perform at all in the gaming world. This is why I believe AMD is holding off on releasing such an APU until we have DDR4 or better.

    Just my thoughts of course, a grain of salt in the ocean.
  16. Steve

    Steve TechSpot Staff Topic Starter Posts: 1,351   +439 Staff Member

    That is a valid point. It is also worth considering that AMD is a business and a strong part of that business is their GPU sales. They would be doing nothing short of shooting themselves in the foot if they took one of their $200 GPUs and stuck it in an APU selling for $200 or less. Any APU today that features something as powerful as the R7 265 would have to cost over $200, and that's a lot for an AMD APU.
  17. Scorpus

    Scorpus TechSpot Staff Posts: 601   +72 Staff Member

    This feature has been moved to the index that you can find towards the top and on the right hand side of every feature article. Hope this helps :)
  18. Julio Franco

    Julio Franco TechSpot Editor Posts: 6,527   +315

    @Skidmarksdeluxe @yukka

    Page navigation is actually more accessible than before (but perhaps not as evident), see the orange "Index" button on the right hand side.

    [attached image: article index]
  19. Kinda reads like a hatchet piece, or at least a little biased. It does what it is supposed to do, which is alleviate the CPU bottleneck. It will truly be more useful in RTS or other games that are by nature CPU intensive. For this reason I will be interested in the performance difference in Star Citizen. Of course you'll probably just say that AMD CPUs performed horribly without it and not that they were improved with it (like you did with Thief here)... AMD knows they are woefully behind Intel, that's why they do this, and knowing that you should be a little more impressed.
  20. GhostRyder

    GhostRyder TS Evangelist Posts: 2,206   +518

    True, for that to work they would probably have to drop everything below whatever their next top APU offers and price it accordingly, which is of course hard because of the lower CPU performance and the slow DDR3 RAM.

    I look forward to this year, mostly because Haswell-E will support DDR4, and soon enough I'm sure the APUs will follow suit.
  21. Nobina

    Nobina TS Booster Posts: 380   +79

    Long story short, it's a piece of software that you can use to gain at least a small boost in gaming. I don't see nothing wrong with that.
  22. cliffordcooley

    cliffordcooley TechSpot Paladin Posts: 5,962   +1,482

    Sorry @Nobina, please forgive me.
    When inverted, those words mean you do see something wrong. I can tell by your first sentence that is not what you meant.
  23. Steve

    Steve TechSpot Staff Topic Starter Posts: 1,351   +439 Staff Member

    Nvidia’s PhysX does what it is supposed to as well but we never recommended anyone invest in a GeForce just for this technology.

    Saying it will be more useful in RTS games is just an assumption. While I agree it has more potential to be useful there, we have yet to see that. Yes, it can alleviate CPU overhead, but if the CPU is intended to be used I am not sure Mantle can help. The CPU has a lot of specific jobs it has to do in an RTS game and Mantle cannot change that, so will it really have any impact here? Time will tell.
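    As a rough way to picture that, here is a toy model (the numbers are purely illustrative assumptions, not benchmarks): Mantle can shrink the driver/draw-call portion of a frame's CPU time, but the simulation work an RTS leans on (pathfinding, AI, unit logic) stays exactly where it was.

        # Toy frame-time model: simulation cost plus draw-call submission cost.
        # The 50% submission reduction for Mantle is an assumption for illustration only.
        def frame_ms(sim_ms, submit_ms, api_factor):
            return sim_ms + submit_ms * api_factor

        shooter_dx11 = frame_ms(4.0, 8.0, 1.0)    # 12.0 ms
        shooter_mantle = frame_ms(4.0, 8.0, 0.5)  # 8.0 ms  -> big win where submission dominates

        rts_dx11 = frame_ms(20.0, 8.0, 1.0)       # 28.0 ms
        rts_mantle = frame_ms(20.0, 8.0, 0.5)     # 24.0 ms -> smaller win where simulation dominates

        print(shooter_dx11, shooter_mantle, rts_dx11, rts_mantle)

    In other words, the API can only give back the time it was costing you in the first place.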

    So you are saying we will just be truthful and state the obvious about AMD CPU performance in Star Citizen? I hope so; I don't want to start lying to you all now.

    I should be impressed that AMD has created an API that allows consumers to use their AMD CPUs with their high-end AMD GPUs and achieve maximum performance, or close to it, in a handful of very select titles? If that was the game plan all along, I think the word you are after is bamboozled rather than impressed.

    Moreover, if that really is the case and AMD are helping out gamers who invested in their CPUs and are missing out on performance, I would be impressed if they provided the CPU hotfix, aka Mantle, to all users, AMD and Nvidia alike. Sucks if you bought an AMD CPU with an Nvidia GPU.

    You are right, there isn't anything wrong with that. We just don't think it is going to revolutionize gaming like AMD says.
    Last edited: Mar 26, 2014
  24. To me this review is completely unfounded and misleading... I found +78% for minimum fps in Thief at Ultra 1920x1080 with an FX-8350 + 290 and it's a great improvement... so your "frames per second" is what? Min, avg, max? If you write just a single number and don't explain it, it doesn't mean anything at all... so I don't trust you, I trust myself... then, BF4 is totally smooth regarding frametime compared to D3D, there are only some random spikes... 'cause they are expected teething issues... why don't you mention something like this?
  25. Steve

    Steve TechSpot Staff Topic Starter Posts: 1,351   +439 Staff Member

    It's a sad day when you learn that your CPU was robbing you of 80% of your performance.

    FYI - Our performance numbers align almost exactly with other Thief Mantle reviews that were published around the same time.

