Mass Effect 3 Tested, Benchmarked

Awesome to see another DX9-only title. Surely, with a multimillion-pound budget that dwarfs most games', they could've afforded to use technology from... well, 2008/09 instead of 2004 (DX9)? Fact is, this benchmark was a massive waste of time; you may as well have benched the last game and it would've probably yielded identical results...
 
Now start bashing MW3 for not providing a new graphics engine, unlike BF3, which requires a DX11 card for features a DX9 card can handle too :D
 
Catered to consoles /palm ... Consoles: ruining gaming since [insert year here]. Thanks for the article. Ultra is right, though.
 
I can just see it: 100 years from now, developers will have only just started using DX11...
 
Who thinks DX9 graphics are enough for PC gamers? I've been using DX11 cards for 2.5 years now and I still don't see much benefit from them...
Only a couple of titles truly show off amazing DX11 graphics, and even there you need a really powerful DX11 card to enjoy them.
IMO 80% of DX11 cards are incapable of handling real DX11 games...

If there were no consoles, we would already be enjoying DX12... DX13... etc.

I hope PCs and consoles will not go hand in hand in terms of graphics in the future!


Peter
 
Lol, if consoles didn't exist we would be enjoying a more advanced gaming experience and games would be a lot more fun with amazing graphics. I bet if consoles didn't exist we would be very close to hitting a CGI-type gaming experience!
 
As with previous entries, Mass Effect 3 is built using Unreal Engine 3. In other words, it's a DirectX 9 title.
But in March 2011 Unreal Engine 3 was updated to support DirectX 11, adding tessellation and displacement mapping, advanced hair rendering with MSAA, deferred rendering with MSAA, screen-space subsurface scattering, image-based lighting, billboard reflections, glossy reflections, reflection shadows, point light reflections, and bokeh depth of field.
Taken from Wikipedia.
So the problem isn't Unreal Engine; the real problem is lazy developers who didn't bother to update to the latest Unreal Engine to gain performance and a better-looking game.
 
The graphics may be DX9, but if you are playing Mass Effect for graphical superiority you are a *****. You play this game for the story. Besides, the graphics still look pretty good to me.
 
Agreed, pointless review.
There are few video settings in-game, and if you have to edit an .ini file, that's a little advanced for the average user.
Like COD3, this is a game best ignored... from a hardware point of view.
I.e., an Xbox port. In other words, it runs well on any old DX9 video card.
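
For anyone curious what that .ini step actually involves, here is a minimal Python sketch of the idea: back up the config file, then overwrite a couple of quality keys. The path and key names are only illustrative of UE3-style GamerSettings.ini files, not confirmed settings for Mass Effect 3, so adjust them to whatever your install actually uses.

```python
# Rough sketch of the "edit the .ini" tweak done with a script.
# The path and key names below are assumptions based on typical
# UE3-style GamerSettings.ini files; verify them for your install.
from pathlib import Path
import shutil

ini_path = (Path.home() / "Documents" / "BioWare" / "Mass Effect 3"
            / "BIOGame" / "Config" / "GamerSettings.ini")  # hypothetical location

overrides = {
    "MaxAnisotropy": "16",          # assumed texture-filtering key
    "MaxShadowResolution": "2048",  # assumed shadow-quality key
}

# Keep a backup before touching anything.
shutil.copy(ini_path, ini_path.with_name(ini_path.name + ".bak"))

lines = ini_path.read_text(encoding="utf-8", errors="ignore").splitlines()
seen = set()
for i, line in enumerate(lines):
    key = line.split("=", 1)[0].strip()
    if key in overrides:
        lines[i] = f"{key}={overrides[key]}"  # replace the existing value
        seen.add(key)

# Append any keys that were missing entirely.
lines += [f"{k}={v}" for k, v in overrides.items() if k not in seen]
ini_path.write_text("\n".join(lines) + "\n", encoding="utf-8")
print("Patched", ini_path)
```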

PS: Not commenting on gameplay; it may well be a good game in itself, but I won't be getting it until it's in the bargain bin. It will suffer like Crysis 2 did on the PC without its DX11 patch.

Irishgamer01
 
Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.

Question 1: Why do you test so many AMD CPUs versus Intel CPUs (4 Intel vs 8 AMD)?
Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200) that only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?

I'm not crying bias, but it seems you favour AMD CPUs and Nvidia GPUs. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record, I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.
 
I can't wait to spend $1,500+ on next generation 28nm high-end cards to max out console ports.

Doh.

* Puts $1,500 back into savings.

** The HD 6790 came out a long time ago: April 5, 2011.
http://www.gpureview.com/show_cards.php?card1=649&card2=
 
@hahahanoobs, there's no bias whatsoever; we may change the test setup between articles, so it's not always the same. In fact, most of our articles in the past year or two exclusively use high-end Intel CPUs because AMD doesn't have much to offer at the top of the range. Also, if you go back over a few other reviews you will find people saying we are biased against Nvidia rather than in favor of them. It's just a matter of perception, I guess.
 
...well that's a surprise; my 5770 can still keep up somewhat! Also great to know I've got the third-best processor around, even after more than six months.
 
Got this game running pretty smoothly on an Intel Core 2 Duo E6400 @ 2.8GHz and a half-gig 4870. Needless to say, it's not very demanding. The DX9 looks decent enough, though, and even though I wish all games were DX11-capable, it looks pretty nice.
 
Have you guys seen the disgusting terms and conditions of the Origin DRM?

http://www.rockpapershotgun.com/2011/08/24/eas-origin-eula-proves-even-more-sinister/

Quote from the TOS:
You agree that EA may collect, use, store and transmit technical and related information that identifies your computer (including the Internet Protocol Address), operating system, Application usage (including but not limited to successful installation and/or removal), software, software usage and peripheral hardware, that may be gathered periodically to facilitate the provision of software updates, dynamically served content, product support and other services to you, including online services. EA may also use this information combined with personal information for marketing purposes and to improve our products and services. We may also share that data with our third party service providers in a form that does not personally identify you. IF YOU DO NOT WANT EA TO COLLECT, USE, STORE, TRANSMIT OR DISPLAY THE DATA DESCRIBED IN THIS SECTION, PLEASE DO NOT INSTALL OR USE THE APPLICATION.

Luckily, whilst looking for the quote above, I found a way to stop (I hope) Origin from this MASSIVE invasion of privacy.

http://masseffect.livejournal.com/1262968.html
 
Guest said:
Have you guys seen the disgusting terms and conditions of the Origin DRM?

I really wanted to play this game on PC - I picked up the first two titles on Steam for pennies and was going to do one last full playthrough with all the decisions I really wanted, so that the last game would truly be the continuation of a story I built from the ground up (I play on Xbox and have explored all the storyline possibilities, etc.)... But then I learned it's never even coming to Steam.

Bought this one for Xbox as well in the end. It's kind of disappointing - developers need to stop fighting each other on these petty fronts and just work on doing what they do best.
 
Guest said:
Now start bashing MW3 for not providing a new graphics engine, unlike BF3, which requires a DX11 card for features a DX9 card can handle too :D
Yes, I would love to see tessellation in DX9... More to the point, DX11 isn't about new eye candy but about increasing efficiency, so that more can be delivered for less.
 
Guest said:
As with previous entries, Mass Effect 3 is built using Unreal Engine 3. In other words, it's a DirectX 9 title.
But in March 2011 Unreal Engine 3 was updated to support DirectX 11, adding tessellation and displacement mapping, advanced hair rendering with MSAA, deferred rendering with MSAA, screen-space subsurface scattering, image-based lighting, billboard reflections, glossy reflections, reflection shadows, point light reflections, and bokeh depth of field.
Taken from Wikipedia.
So the problem isn't Unreal Engine; the real problem is lazy developers who didn't bother to update to the latest Unreal Engine to gain performance and a better-looking game.
In fairness, development of Mass Effect 3 began back in 2010 or earlier, if reports are correct, so they never had a chance to implement it. Well, they did, but they couldn't be arsed...
 
Hmm, if consoles didn't exist, I'm not sure how games would sell or how anyone would be able to make such big-budget games. Don't be too hard on them.
 
hahahanoobs said:
Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.

Question 1: Why do you test so many AMD CPUs versus Intel CPUs (4 Intel vs 8 AMD)?
Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200) that only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?

I'm not crying bias, but it seems you favour AMD CPUs and Nvidia GPUs. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record, I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.

It's because of previous experience with many other games; there are a lot of AMD fanboys who still think testers don't want to show the potential of Bulldozer, which is why Steve decided to overclock it this time to see how far it goes. We all know by now (Newegg TV, LinusTechTips, etc.) that the i5 beats Bulldozer, so there's no need to use an i7 for a CPU scaling test either.

But yeah, Steve could have tested more cards as usual; still, I consider running these tests for ME3 a waste of time, like the one for Duke Nukem. Before reading the test I already knew the FPS would be extremely high without effort [I ran the game maxed out on a GT 540M at a 60 FPS average].
 
Expected results, considering it's the same engine with slight improvements, but a good review nevertheless, Steve.
 
EEatGDL said:
hahahanoobs said:
Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.

Question 1: Why do you test so many AMD CPUs versus Intel CPUs (4 Intel vs 8 AMD)?
Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200) that only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?

I'm not crying bias, but it seems you favour AMD CPUs and Nvidia GPUs. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record, I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.

It's because of previous experience with many other games; there are a lot of AMD fanboys who still think testers don't want to show the potential of Bulldozer, which is why Steve decided to overclock it this time to see how far it goes. We all know by now (Newegg TV, LinusTechTips, etc.) that the i5 beats Bulldozer, so there's no need to use an i7 for a CPU scaling test either.

But yeah, Steve could have tested more cards as usual; still, I consider running these tests for ME3 a waste of time, like the one for Duke Nukem. Before reading the test I already knew the FPS would be extremely high without effort [I ran the game maxed out on a GT 540M at a 60 FPS average].

You're not listening. What I'd like to see are CPU scaling/core tests from BOTH sides. Whether two flagships are used, I don't care. TechSpot may have used two CPUs in the past, but that was with a DIFFERENT game. I'd like to see consistency. We all know the difference between two tested CPUs isn't just about clock speed but about architecture, and I think it would be beneficial for everyone if the two different architectures were tested for EACH game, because each game is different.
 
@ hahahanoobs - I think you have to be a little more realistic here. We tested 24 graphics cards, 24 that we had on hand. I shouldn't need to tell you that this is a shipload of graphics cards, and that means a shipload of testing. In order for us to deliver these articles in a timely manner, so that they are still relevant, we have to be realistic about what we test.

I tested the AMD and Nvidia cards that I had on hand. There are two issues with your request for more Nvidia cards. The first is that Nvidia simply doesn't offer the range of cards that AMD does, partly because AMD has updated its range and Nvidia is yet to do so. We certainly included all the key players from the GeForce GTX 500 series.

The other issue is that Nvidia's board partners are reluctant to send lower-end cards because they perform poorly in most modern games. So we are stuck with a heap of GTX 580, 570 and 560 cards, and that's about it.

As for the CPU scaling performance, again, be realistic: we cannot test multiple processors here, and why should we? It's not required, as the graph below shows a range of AMD and Intel processors. Furthermore, the scaling performance of the AMD FX-8150 paints the full picture, as does the screenshot of the Windows 7 Task Manager. We give the reader a great deal of information, so much so that they should be able to work out how a Core i5 or Core i7 processor will scale in comparison.
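
As a rough illustration of that "work it out yourself" step, here is a toy Python model: treat frame time as whichever is slower, the CPU-side work (which shrinks as clock speed rises) or a fixed GPU-side floor. The numbers are made up purely for illustration and are not figures from the article.

```python
# Toy model of CPU clock scaling: the frame takes as long as the slower
# of the CPU-side work (scales inversely with clock) and the GPU-side
# work (fixed). All numbers are illustrative, not measured results.

def estimated_fps(clock_ghz, cpu_ms_at_3ghz=11.0, gpu_ms=14.0):
    """Estimate frames per second at a given CPU clock speed."""
    cpu_ms = cpu_ms_at_3ghz * (3.0 / clock_ghz)  # CPU frame time falls with clock
    frame_ms = max(cpu_ms, gpu_ms)               # bound by the slower component
    return 1000.0 / frame_ms

for clock in (2.0, 2.5, 3.0, 3.5, 4.0, 4.5):
    print(f"{clock:.1f} GHz -> ~{estimated_fps(clock):.0f} fps")
```

Once the CPU-side time drops below the GPU floor, the estimated frame rate plateaus, which is the kind of curve the clock scaling graph shows.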

Why did we use more AMD processors than Intel processors? Well, again, that's simple: we have more AMD processors than Intel processors. AMD is great with review sample supply; they send out lots and let us keep them. Intel sends out just the flagship model and usually wants it back within a few weeks.

Still, having said that, we covered the LGA2011, LGA1366 and LGA1155 platforms, leaving out just the obsolete LGA1156 platform, so what more can you ask for? Yes, there were no Core i3 processors included, but that is because we don't have any; Intel doesn't really sample those, and when they do we have to return them after the product review.

Your confusion about our comments regarding the performance of certain cards seems to be a case of looking for something where there isn't anything. We are not reviewing the graphics cards; we are simply looking at the data from 24 different cards and making a few comments. We are not going to discuss the performance vs. price ratio of every card at every tested resolution. In the conclusion we make a few recommendations about what is needed for playable performance, and that is it.

Finally, on another note, thanks to everyone who posted feedback and support; we appreciate it.
 