Mass Effect 3 Tested, Benchmarked

Why did we use more AMD processors than Intel processors? Well, again, that's simple: we have more AMD processors than Intel processors. AMD are great with review sample supply; they send out lots and let us keep them. Intel sends out just the flagship model and usually wants it back within a few weeks.

Wow, so you admit to accepting bribes? Can't we get a reviewer with some integrity around here?

Keep up the good work Steve. :grinthumb
 
Wow, so you admit to accepting bribes? Can't we get a reviewer with some integrity around here?

Keep up the good work Steve. :grinthumb

haha yes my Bulldozer review was a glowing report for AMD :)

Sandy what?
 
Ultraman1966 said:
Guest said:
Now start bashing MW3 for not providing a new graphics engine, unlike BF3, which requires a DX11 card for features that a DX9 card can handle too :D
Yes, I would love to see tessellation in DX9...
Some DX9/10 AMD Radeon HD cards (2xxx-4xxx) have a built-in tessellation unit, and it's also possible to get limited tessellation (up to a factor of 16, compared to 64 on DX11 cards) by using the compute shaders introduced with DX10 cards.
Ultraman1966 said:
More to the point, DX11 isn't about new eye candy but increasing efficiency so that more can be delivered for less.
In theory it should be more efficient, but in practice current drivers don't allow multithreaded rendering, so it's still largely serial. As another guest pointed out, current DX11 cards don't have enough horsepower for heavy DX11 tessellation, so it has to be used with care.
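For what it's worth, an application can actually ask the driver whether real multithreaded rendering is available. Below is a minimal C++ sketch (not from the article; it assumes the standard D3D11 SDK headers) that queries support for concurrent resource creation and native command lists. If DriverCommandLists comes back FALSE, deferred-context rendering is emulated by the D3D11 runtime and execution stays effectively serial:

```cpp
// Minimal sketch: ask the D3D11 driver whether it supports multithreaded
// rendering (concurrent resource creation and native command lists).
#include <d3d11.h>
#include <cstdio>

int main() {
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &level, nullptr)))
        return 1;

    D3D11_FEATURE_DATA_THREADING caps = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING, &caps, sizeof(caps));
    printf("Concurrent creates:   %s\n", caps.DriverConcurrentCreates ? "yes" : "no");
    printf("Driver command lists: %s\n", caps.DriverCommandLists ? "yes" : "no");

    device->Release();
    return 0;
}
```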

Another annoying thing about DX10/11 is that it's limited to Windows Vista+, which pushes developers to stick with good old DX9 for the time being, because most gamers are on DX9 cards or running DX10/11 cards on Windows XP (check Valve's hardware survey: store.steampowered.com/hwsurvey). Of course, with OpenGL 3 or 4 this limit fades away, and developers can use the full feature set of DX10/11 cards on Windows XP without any issues.
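As a side note on the Vista+ point: on Vista and later, the D3D11 API can also drive DX9- and DX10-class hardware through feature levels, so a single code path can span all three hardware generations (though that does nothing for XP users). A minimal sketch, again assuming the standard D3D11 headers:

```cpp
// Minimal sketch: create a D3D11 device and report the highest feature
// level the GPU supports, from DX11-class (11_0) down to DX9-class (9_1).
#include <d3d11.h>
#include <cstdio>

int main() {
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,  D3D_FEATURE_LEVEL_9_1,
    };
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL got;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 wanted, 5, D3D11_SDK_VERSION,
                                 &device, &got, nullptr)))
        return 1;

    printf("Highest supported feature level: 0x%X\n", got);
    device->Release();
    return 0;
}
```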
 
Actually, if you look at the current Steam HW survey you'll see that DX10+ GPU and Vista+ combos are in the majority now. No, the reason they don't bother with DX11 these days is that most games are made with consoles in mind, which makes financial sense but is disappointing for those looking for progression.
 
Julio said:
@hahahanoobs, there's no bias whatsoever; we may change the test setup between articles so it's not always the same. In fact, most of our articles in the past year or two exclusively use high-end Intel CPUs because AMD doesn't have much to offer at the top of the range.

No side should have substantially more products tested than the other. As for not having a lot of high-end AMD parts, that shouldn't matter, because most of your readers aren't using the high-end stuff, so that's all the more reason to show them. Which is my point: CPUs and GPUs both react differently to each game and/or its engine.

The framerate continued upward all the way to 4.5GHz on the FX chip, and those people may want to overclock their FX to 4.5GHz now to get those extra frames. I'm just saying, it would be nice to see how an Intel chip would scale. Maybe the Intel chip only needs 4.2GHz to get that same 95fps, and owners of that chip or generation of chip may be comfortable going that high for this game, whereas if they thought they needed to go to 4.5GHz they wouldn't bother overclocking at all, because it's much harder to reach that clock, perhaps because their cooling isn't up for it.

It's information that is valuable to some, myself included, and it doesn't have to be there to show who has the better chip, but to show what each CPU and GPU is capable of doing.

Also, if you go back over a few other reviews you will find people saying we are biased against Nvidia rather than in favor of them; it's just a matter of perception I guess.

And that is why I asked for the parts tested to be EVEN.
Use CPUs from BOTH sides to test clock and core scaling.
And to use an EVEN number of GPUs in the graphics tests. (They were even here, but I also mean in general and always.)
How could this possibly be a bias or poor method of testing?

@ hahahanoobs - I think you have to be a little more realistic here. We tested 24 graphics cards, the 24 that we had on hand. I shouldn't need to tell you that this is a shipload of graphics cards, and that means a shipload of testing. In order for us to deliver these articles in a timely manner, so that they are still relevant, we have to be realistic about what we test.

I tested the AMD and Nvidia cards that I had on hand. There are two issues with your request for more Nvidia cards. The first is that Nvidia simply doesn't offer the range of cards that AMD does, partly because AMD has updated their range and Nvidia is yet to do so. We certainly included all the key players from the GeForce GTX 500 series.

The other issue is that Nvidia's board partners are reluctant to send lower-end cards because they perform poorly in most modern games. So we are stuck with a heap of GTX 580, 570 and 560 cards and that's about it.

As for the CPU scaling performance, again, be realistic: we cannot test multiple processors here, and why should we? It's not required, as in the graph below we show a range of AMD and Intel processors. Furthermore, the scaling performance of the AMD FX-8150 paints the full picture, as does the screenshot of the Windows 7 task manager. We give the reader a great deal of information, so much so that they should be able to work out how a Core i5 or Core i7 processor will scale in comparison.

Why did we use more AMD processors than Intel processors? Well, again, that's simple: we have more AMD processors than Intel processors. AMD are great with review sample supply; they send out lots and let us keep them. Intel sends out just the flagship model and usually wants it back within a few weeks.

Still, having said that, we covered the LGA2011, LGA1366 and LGA1155 platforms, leaving out just the obsolete LGA1156 platform, so what more can you ask for? Yes, there were no Core i3 processors included, but that is because we don't have any; Intel don't really sample those, and when they do we have to return them after the product review.

Your confusion about our comments regarding the performance of certain cards seems to be a case of looking for something where there isn't anything. We are not reviewing the graphics cards; we are simply looking at the data from 24 different cards and making a few comments. We are not going to discuss the performance vs. price ratio of every card at every tested resolution. In the conclusion we make a few recommendations about what is needed for playable performance, and that is it.

Finally, on another note, thanks to everyone who posted feedback and support; we appreciate it.


I had no problem with the number of video cards you tested, only with what you wrote about them in the 2560x1600 summary, which I actually misread. You said anything over a 6870 will give playable framerates, when I took it as you comparing two $300 nVIDIA cards to one $200 AMD card that didn't reach 60fps. My apologies.

As for the CPU side: not having the hardware because it wasn't given to you is kind of a cop-out. How hard would it be, for example, to get a donation, buy, or sell some of those AMD chips, and pick up an i3-2130 and an i5-750 with motherboards to conduct a more complete test, using hardware that is more likely to be used by your readers?


Feedback denied... I get it. I still had to try. :)

EEatGDL said:
hahahanoobs said:
Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.

Question 1: Why do you test so many AMD CPUs versus Intel CPUs (4 Intel vs. 8 AMD)?
Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200) that only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?

I'm not crying bias, but it seems you favour AMD CPUs and nVIDIA GPUs. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record, I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.

Because of previous experiences with many other games. Actually, there are a lot of AMD fanboys who still think testers don't want to show the potential of Bulldozer; that's why Steve decided to OC it this time to see how high it goes. We all know by now (Newegg TV, LinusTechTips, etc.) that the i5 beats Bulldozer, so there's no need to use an i7 either for a CPU scaling test.

But yeah, Steve could have tested more cards as usual; frankly, I consider running these tests for ME3 a waste of time, like the one with Duke Nukem. Before reading the test I already knew the FPS would be extremely high without effort [I ran the game maxed out with a GT 540M @ 60 FPS average].

I asked for an Intel CPU to be tested next to the FX in the scaling test.

My complaint was about what was written on one of the pages about video cards, not the number of cards tested, but I misread it, and apologized.

@ hahahanoobs
Your confusion about our comments regarding the performance of certain cards seems to be a case of looking for something where there isn't anything. We are not reviewing the graphics cards; we are simply looking at the data from 24 different cards and making a few comments. We are not going to discuss the performance vs. price ratio of every card at every tested resolution. In the conclusion we make a few recommendations about what is needed for playable performance, and that is it.

Finally, on another note, thanks to everyone who posted feedback and support; we appreciate it.

Show me where I argued with the performance results.

"Folks looking to run ME3 maxed on a high-res 2560x1600 display will need at least a GTX 570 or GTX 480 to hit 60fps. That said, anything above the HD 6870 should deliver sufficiently smooth gameplay."

What I did notice the second time reading it: you say a 570 or 480 is needed to get 60fps, yet you bypass the 7850 @ 59fps, the 6950 @ 58 and the 5870 @ 57 to write about needing a 6870 @ 52, or higher, for playable framerates. Are you saying an AMD card at 52fps is equal to 60fps on an nVIDIA card at the same resolution? LOL, I can't help it; it just doesn't make sense why you would summarize it like that.

Once again, I NEVER questioned the results, only the writing. ;)
 
And that is why I asked for the parts tested to be EVEN.
And to use an EVEN number of GPUs in the graphics tests.
How could this possibly be a bias or poor method of testing?

Because you wrongly assume Steve has an even number of CPUs/GPUs on hand. As he said, some parts go back, others don't. What's more, some generations launch ahead of others.

So instead of determining whether each card is a good fit for the review, you want him to arbitrarily add anything -- as long as AMD and Nvidia have the same number of products listed.

That makes no sense. He tests the parts that are available and are the most relevant to the specific review. If that means AMD or Nvidia has a few more products in the graphs, so be it.
 
The HD 7870 will definitely be my next GPU upgrade if it drops a bit in price in the next few months. I'll still hold a tiny bit of hope for Kepler, but Nvidia, you gotta start bringing the heat soon.
 
@ hahahanoobs - I am curious as to which Nvidia cards we should have included to even it out. Or are you just suggesting we delete half a dozen relevant AMD graphics cards? Again, you do understand that AMD has released their next-generation cards and Nvidia has not, tipping the number of relevant cards in AMD's favor.

It is silly to suggest that the same number of cards from each manufacturer should be tested. It's a little like saying that if I were to test a range of smartphones I could only look at one Samsung phone and one HTC phone because Apple only has a single phone.

No side should have substantially more products tested than the other. As for not having a lot of high-end AMD parts, that shouldn't matter, because most of your readers aren't using the high-end stuff, so that's all the more reason to show them. Which is my point: CPUs and GPUs both react differently to each game and/or its engine.

The framerate continued upward all the way to 4.5GHz on the FX chip, and those people may want to overclock their FX to 4.5GHz now to get those extra frames. I'm just saying, it would be nice to see how an Intel chip would scale. Maybe the Intel chip only needs 4.2GHz to get that same 95fps, and owners of that chip or generation of chip may be comfortable going that high for this game, whereas if they thought they needed to go to 4.5GHz they wouldn't bother overclocking at all, because it's much harder to reach that clock, perhaps because their cooling isn't up for it.

It's information that is valuable to some, myself included, and it doesn't have to be there to show who has the better chip, but to show what each CPU and GPU is capable of doing.

I'm sorry, I did chuckle at that text block for a little while. Either you did not read my previous post or you simply did not understand it. Again, what I said was: "As for the CPU scaling performance, again, be realistic: we cannot test multiple processors here, and why should we? It's not required, as in the graph below we show a range of AMD and Intel processors"…

So did you look at the graph below? If you did, you would know that "to get that same 95fps" (which was actually 91fps) the Intel processors don't need to be clocked as high as even the 4.2GHz you suggested. In fact, even 4GHz would be too high. For 95fps you would have to heavily underclock the Core i5-2500K, Core i7-2600K or Core i7-3960X, as all of them rendered well over 100fps at their default clock speeds.
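To make that back-of-the-envelope reasoning concrete, here is a hypothetical sketch of how a reader could estimate the clock needed for a target framerate from two points on a scaling graph. The numbers below are placeholders, not figures from the article; plug in your own measurements:

```cpp
// Hypothetical sketch: estimate the clock speed needed for a target fps by
// linearly interpolating between two measured scaling points.
#include <cstdio>

int main() {
    // Two measured (GHz, fps) points for one CPU -- placeholder values only.
    double ghz_lo = 3.6, fps_lo = 80.0;
    double ghz_hi = 4.5, fps_hi = 95.0;
    double target_fps = 91.0;

    // Assume fps scales roughly linearly with clock between the two points.
    double slope = (fps_hi - fps_lo) / (ghz_hi - ghz_lo);   // fps per GHz
    double ghz_needed = ghz_lo + (target_fps - fps_lo) / slope;

    printf("~%.2f GHz needed for %.0f fps (linear estimate)\n",
           ghz_needed, target_fps);
    return 0;
}
```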

You might feel like your feedback is being denied, and as I said earlier, that is because you are being unrealistic. We have to pick and choose what we can test in order to deliver a timely article.

Finally, about your last comment regarding the 2560x1600 summary: I am not sure why I need to say it again, but we are only briefly discussing the results there; readers do not want to sift through tons of text to hear me blurt out exactly what they just saw above.

The writing makes sense. The GTX 570 and GTX 480 hit 60fps, and if you want an average of 60fps they are the cards that can do it. Yes, the 7850 was similar with 59fps, as were the 6950 and the 5870, and as I said, anything above the 6870 should deliver smooth gameplay. The 6870 averaged 52fps, so I am pretty sure that puts the 7850, 6950 and even the 5870 above it.

So then, as I understand it, you are displeased because an Intel Core i3 processor was not tested, an Intel processor was not used for scaling (which would have told us nothing interesting, by the way), there were more AMD graphics cards than Nvidia cards (for a good reason, as there are more current AMD cards on the market), and you didn't understand the comments summarizing the results. From all that I am taking away that next time I will have to source a Core i3 processor.
 
*facepalm...
Look at the numbers. First look at the numbers for games sold in the more popular of the PC eras. Now look at the numbers for the last 2 or so console cycles. Without consoles, games like Mass Effect would not even exist. We'd be stuck playing StarCraft and WoW 24/7.
 
I don't care how much people are complaining about the game - I'm loving the hell out of it. I'm normally not one for American RPGs. They usually don't pull me into the game and make me care about the characters on an emotional level, which is what makes me want to come back and play more. But the Mass Effect series has done that very well IMO, especially the 3rd one. I found myself several times stopping to think very hard about a choice in the game, not based on how the gameplay might change but based on my emotional attachment to the characters it affects, and the morality of my choice. I know to some that may seem rather odd for a video game, but I like to get that involved in an RPG.

As far as graphics go, I'm happy with them. They look nice to me, but the game can still be run on a reasonable computer (C2D E8400 @ 3.0GHz & 8800GTX 512MB). I know I need to upgrade my computer to keep up, but it's nice that I can still play some of the newer games on low to medium graphics settings.
 
noob said:
I'm just saying, it would be nice to see how an Intel chip would scale. Maybe the Intel chip only needs 4.2GHz to get that same 95fps, and owners of that chip or generation of chip may be comfortable going that high for this game, whereas if they thought they needed to go to 4.5GHz they wouldn't bother overclocking at all, because it's much harder to reach that clock, perhaps because their cooling isn't up for it.

Complete fail. Also... uh, 4.5GHz isn't hard at all to get to on a 2600K, and that's on AIR.
Air coolers are cheap as crap and you can easily go to 4.5-4.7GHz even in a mildly warm room.

And yeah, the Intel was way above the AMD at its stock clocks.
Anyway, if you meant some other Intel chip that wasn't even listed, then that's another fail for you, because you didn't specify.
One would only assume you meant the one listed.

One last note -
Anyone who overclocks their own CPUs would be able to infer what would happen on an Intel from the above data.
If you couldn't, then you honestly shouldn't even be trying to overclock at all.
 
Guest said:
The graphics may be DX9, but if you are playing Mass Effect for graphics superiority you are a *****. You play this game for the story. Besides, the graphics still look pretty good to me.

Agreed. No one is buying this game for cutting-edge graphics; the graphics are good enough. And when you're chatting to people it still looks very good.
 