Mass Effect 3 Tested, Benchmarked

By on March 13, 2012, 12:53 AM

As you've undoubtedly heard, the third installment of BioWare's Mass Effect trilogy hit shelves last Tuesday. As one of the year's most anticipated launches, it's no surprise to see it carrying an aggregate review score of over 90. As with previous entries, Mass Effect 3 is built using Unreal Engine 3. In other words, it's a DirectX 9 title. As such, there's no fancy tessellation, additional dynamic lighting, depth of field or ambient occlusion. Although ME3 probably won't win any awards for being the best-looking PC game of 2012, fans have generally approved of the way previous titles looked, so they shouldn't be too disappointed this time.

Besides, there are some benefits to sticking with the UE3 engine. For starters, its performance must be highly optimized by now. The developers say these tweaks have allowed them to improve everything, from storytelling methods to overall graphics and cinematics. They also say the extra performance has let them display more enemies at once, making for a richer and livelier experience.

We've benchmarked Mass Effect 3 across three different resolutions with two dozen GPU configurations -- including AMD's new Radeon HD 7000 series. We'll also see how the performance scales when overclocking an eight-core FX-8150, along with benching a handful of other Intel and AMD processors.
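For readers curious how figures like these are typically produced, here is a minimal sketch (not TechSpot's actual tooling) of how per-frame render times from a benchmark pass become the average and minimum fps numbers quoted in reviews like this one; the sample data is made up.

```python
# Minimal sketch, not TechSpot's actual benchmark tooling: turn a list of
# per-frame render times (milliseconds) from one benchmark pass into the
# average and minimum fps figures a review would quote. Sample data is made up.
frame_times_ms = [14.2, 15.1, 13.9, 22.7, 16.4, 15.8, 14.9, 30.2, 15.3, 14.6]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)  # frames / total seconds
min_fps = 1000.0 / max(frame_times_ms)                         # slowest single frame

print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")
```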

Read the complete performance review.




User Comments: 36

Ultraman1966 said:

Awesome to see another DX9-only title; surely with a multimillion-pound budget which dwarfs most games they could've afforded to use technology from... well, 2008/09 instead of 2004 (DX9)? Fact is, this benchmark was a massive waste of time; you may as well have benched the last game and it probably would've yielded identical results...

Guest said:

Now start bashing MW3 for not providing a new graphics engine, unlike BF3, which requires a DX11 card for features a DX9 card can handle too :D

H3llion H3llion, TechSpot Paladin, said:

Catered to Consoles /palm ... Consoles, ruining gaming since [insert year here]. Thanks for the article, Ultra is right though.

Guest said:

I can just see it 100 years later and developers will have only just started using DX11...

Guest said:

Who thinks that DX9 graphics are enough for PC gamers? I've been using DX11 cards for 2.5 years now and I still don't see much benefit from them...

We have only a couple of titles that truly show us amazing DX11 graphics, and even there you need a really powerful DX11 card to enjoy them.

IMO 80% of DX11 cards are incapable of handling real DX11 games...

If there were no consoles we would be already enjoying DX12...DX13.....etc....

Hope that PCs and Consoles will not go hand-in-hand in terms of graphics in the future!!!

Peter

Guest said:

lol, if consoles didn't exist we would be enjoying a more advanced gaming experience and games would be a lot more fun with amazing graphics. I bet if consoles didn't exist we would be very close to hitting a CGI-type gaming experience!

Guest said:

When did they come out with a 6790 video card?

Guest said:

As with previous entries, Mass Effect 3 is built using Unreal Engine 3. In other words, it's a DirectX 9 title.

But in March 2011 Unreal Engine 3 was updated to support DirectX 11, adding tessellation and displacement mapping, advanced hair rendering with MSAA, deferred rendering with MSAA, screen space subsurface scattering, image-based lighting, billboard reflections, glossy reflections, reflection shadows, point light reflections, and bokeh depth of field.

Taken from Wikipedia.

Then the problem isn't Unreal Engine; the real problem is lazy developers, who did not bother to update to the latest Unreal Engine to gain performance and a better-looking game.

Guest said:

The graphics may be DX9, but if you are playing Mass Effect for graphics superiority you are a *****. You play this game for the story. Besides, the graphics still look pretty good to me.

Guest said:

Agreed, pointless review.

There are few video settings in-game, and if you have to edit the .ini file, that's a little advanced for the average user.

Like COD3, this is a game best ignored... from a hardware point of view.

I.e. an Xbox port. In other words, it runs well on any old DX9 video card.

PS. Not commenting on gameplay, it may well be a good game in itself, but I won't be getting it until it's in the bargain bin; it will suffer like Crysis 2 did on the PC without its DX11 patch

Irishgamer01

hahahanoobs hahahanoobs said:

Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.

Question 1: Why do you test so many AMD CPU's versus Intel CPU's (4 Intel vs 8 AMD)?

Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200) that only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?

I'm not crying bias, but it seems you favour AMD CPU's and nVIDIA GPU's. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.

Guest said:

I can't wait to spend $1,500+ on next generation 28nm high-end cards to max out console ports.

Doh.

* Puts $1,500 back into savings.

** HD6790 came out a long time ago. April 5, 2011

http://www.gpureview.com/show_cards.php?card1=649&card2=

Staff
Julio Franco Julio Franco, TechSpot Editor, said:

@hahahanoobs, there's no bias whatsoever, we may change the test setup between articles so it's not the same. In fact most of our articles in the past year or two, exclusively use high-end Intel CPUs because AMD doesn't have much to offer on the top range. Also, if you go back on a few other reviews you will have people saying we are biased against Nvidia and not in favor, it's just a matter of perception I guess.

Guest said:

...well that's a surprise; my 5770 can still keep up somewhat! Also great to know I've got the third-best processor around, even after more than six months.

EXCellR8 EXCellR8, The Conservative, said:

Got this game running pretty smooth on an Intel Core 2 Duo E6400 @ 2.8GHz and a half-gig 4870. Needless to say it's not very demanding. The DX9 looks decent enough, and even though I wish all games were DX11 capable, it looks pretty nice.

captainawesome captainawesome said:

Have you guys seen the disgusting terms and conditions of the Origin DRM?

[link]

Quote from the TOS:

You agree that EA may collect, use, store and transmit technical and related information that identifies your computer (including the Internet Protocol Address), operating system, Application usage (including but not limited to successful installation and/or removal), software, software usage and peripheral hardware, that may be gathered periodically to facilitate the provision of software updates, dynamically served content, product support and other services to you, including online services. EA may also use this information combined with personal information for marketing purposes and to improve our products and services. We may also share that data with our third party service providers in a form that does not personally identify you. IF YOU DO NOT WANT EA TO COLLECT, USE, STORE, TRANSMIT OR DISPLAY THE DATA DESCRIBED IN THIS SECTION, PLEASE DO NOT INSTALL OR USE THE APPLICATION.

Luckily, whilst looking for the quote above, I found a way to stop (I hope) Origin from this MASSIVE invasion of privacy.

http://masseffect.livejournal.com/1262968.html

Zilpha Zilpha said:

Have you guys seen the disgusting terms and conditions of the Origin DRM?

[link]

Quote from the TOS:

You agree that EA may collect, use, store and transmit technical and related information that identifies your computer (including the Internet Protocol Address), operating system, Application usage (including but not limited to successful installation and/or removal), software, software usage and peripheral hardware, that may be gathered periodically to facilitate the provision of software updates, dynamically served content, product support and other services to you, including online services. EA may also use this information combined with personal information for marketing purposes and to improve our products and services. We may also share that data with our third party service providers in a form that does not personally identify you. IF YOU DO NOT WANT EA TO COLLECT, USE, STORE, TRANSMIT OR DISPLAY THE DATA DESCRIBED IN THIS SECTION, PLEASE DO NOT INSTALL OR USE THE APPLICATION.

Luckily, whilst looking for the quote above, I found a way to stop (I hope) Origin from this MASSIVE invasion of privacy.

I really wanted to play this game on PC - I picked up the first two titles on Steam for pennies and was going to do one last full playthrough with all the decisions I really wanted so that the last game would truly be the continuation of a story I built from the ground up (I play on xbox and have explored all the storyline possibilities etc)... But then I learned it's never even coming to Steam.

Bought this one for xbox as well in the end. It's kind of disappointing - developers need to stop fighting each other on these petty fronts and just work on doing what they do best.

Ultraman1966 said:

Guest said:

Now start bashing MW3 for not providing a new graphics engine, unlike BF3, which requires a DX11 card for features a DX9 card can handle too

Yes, I would love to see tessellation in DX9... More to the point, DX11 isn't about new eye candy but increasing efficiency so that more can be delivered for less.

Ultraman1966 said:

Guest said:

As with previous entries, Mass Effect 3 is built using Unreal Engine 3. In other words, it's a DirectX 9 title.

But in March 2011 Unreal Engine 3 was updated to support DirectX 11, adding tessellation and displacement mapping, advanced hair rendering with MSAA, deferred rendering with MSAA, screen space subsurface scattering, image-based lighting, billboard reflections, glossy reflections, reflection shadows, point light reflections, and bokeh depth of field.

Taken from Wikipedia.

Then the problem isn't Unreal Engine; the real problem is lazy developers, who did not bother to update to the latest Unreal Engine to gain performance and a better-looking game.

In fairness, development of Mass Effect 3 began back in 2010 or earlier if reports are correct so they never had a chance to implement it. Well, they did but they couldn't be arsed...

Guest said:

Hmm, if consoles didn't exist, I'm not sure how well games would sell or how studios would be able to make such big-budget games. Don't be too hard on them.

EEatGDL said:

hahahanoobs said:

Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.

Question 1: Why do you test so many AMD CPU's versus Intel CPU's (4 Intel vs 8 AMD)?

Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200) that only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?

I'm not crying bias, but it seems you favour AMD CPU's and nVIDIA GPU's. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.

Because of previous experiences with many other games: there are actually a lot of AMD fanboys who still think testers don't want to show the potential of Bulldozer, which is why Steve decided to OC it this time to see how high it goes. We all know by now (Newegg TV, LinusTechTips, etc.) that the i5 beats Bulldozer, so there's no need to use an i7 for a CPU scaling test either.

But yeah, Steve could have tested more cards as usual; still, I consider simply running these tests for ME3 a waste of time, like the one with Duke Nukem. Before reading the test I already knew the FPS would be extremely high without effort [I ran the game maxed out with a GT 540M at a 60 FPS average].

Relic Relic, TechSpot Chancellor, said:

Expected results considering it's the same engine with slight improvements, but good review nevertheless Steve.

hahahanoobs hahahanoobs said:

EEatGDL said:

hahahanoobs said:

Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.

Question 1: Why do you test so many AMD CPU's versus Intel CPU's (4 Intel vs 8 AMD)?

Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200) that only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?

I'm not crying bias, but it seems you favour AMD CPU's and nVIDIA GPU's. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.

Because of previous experiences with many other games: there are actually a lot of AMD fanboys who still think testers don't want to show the potential of Bulldozer, which is why Steve decided to OC it this time to see how high it goes. We all know by now (Newegg TV, LinusTechTips, etc.) that the i5 beats Bulldozer, so there's no need to use an i7 for a CPU scaling test either.

But yeah, Steve could have tested more cards as usual; still, I consider simply running these tests for ME3 a waste of time, like the one with Duke Nukem. Before reading the test I already knew the FPS would be extremely high without effort [I ran the game maxed out with a GT 540M at a 60 FPS average].

You're not listening. What I'd like to see are CPU scaling/core tests from BOTH sides. Whether two flagships are used I don't care. TechSpot may have used two CPU's in the past, but that was with a DIFFERENT game. I'd like to see consistency. We all know the difference between two CPU's tested isn't just about clock speed, but the architecture, and I think it would be beneficial for all if the two different architectures were tested for EACH game, because each game is different.

Staff
Steve Steve said:

@ hahahanoobs - I think you have to be a little more realistic here. We tested 24 graphics cards, 24 that we had on hand. I shouldn't need to tell you that this is a ship load of graphics cards and that means a ship load of testing. In order for us to deliver these articles in a timely manner so that they are still relevant we have to be realistic in what we test.

I tested the AMD and Nvidia cards that I had on hand. There are two issues with your request for more Nvidia cards. The first issue being that Nvidia simply doesn't offer the range of cards that AMD does, that is also due to the fact that AMD has updated their range and Nvidia is yet to do so. We certainly included all the key players from the GeForce GTX 500 series.

The other issue is that Nvidia's board partners are reluctant to send lower-end cards because they perform poorly in most modern games. So we are stuck with a heap of GTX 580, 570 and 560 cards and that's about it.

As for the CPU scaling performance again be realistic, we cannot test multiple processors here and why should we? It's not required as in the graph below we show a range of AMD and Intel processors. Furthermore the scaling performance of the AMD FX-8150 paints the full picture as does the screenshot of the Windows 7 task manager. We give the reader a great deal of information, so much so that they should be able to work out how a Core i5 or Core i7 processor will scale in comparison.

Why did we use more AMD processors than Intel processors? Well again that's simple we have more AMD processors than Intel processors. AMD are great with review sample supply, they send out lots and let us keep them. Intel sends out just the flagship model and usually wants it back within a few weeks.

Still having said that we covered the LGA2011, LGA1366 and LGA1155 platforms leaving out just the obsolete LGA1156 platform so what more can you ask for? Yes there were no Core i3 processors included but that is because we don't have any, Intel don't really sample those and when they do we have to return them after the product review.

Your confusion about our comments regarding the performance of certain cards seems to be looking for something where there isn't anything. We are not reviewing the graphics cards; we are simply looking at the data from 24 different cards and making a few comments. We are not going to discuss the performance vs. price ratio of every card at every tested resolution. In the conclusion we make a few recommendations about what is needed for playable performance, that is it.

Finally on another note thanks to everyone that posted feedback and support we appreciate it.

Staff
Matthew Matthew, TechSpot Staff, said:

Why did we use more AMD processors than Intel processors? Well again that's simple we have more AMD processors than Intel processors. AMD are great with review sample supply, they send out lots and let us keep them. Intel sends out just the flagship model and usually wants it back within a few weeks.

Wow, so you admit to accepting bribes? Can't we get a reviewer with some integrity around here?

Keep up the good work Steve.

Staff
Steve Steve said:

Wow, so you admit to accepting bribes? Can't we get a reviewer with some integrity around here?

Keep up the good work Steve.

haha yes my Bulldozer review was a glowing report for AMD

Sandy what?

Siavash Siavash said:

Ultraman1966 said:

Guest said:

Now start bashing MW3 for not providing a new graphics engine, unlike BF3, which requires a DX11 card for features a DX9 card can handle too

Yes, I would love to see tessellation in DX9...

Some DX9/10 AMD Radeon HD cards (2xxx-4xxx) have a built-in tessellation unit, and it's also possible to get limited tessellation (up to a factor of 16, compared to 64 with DX11 cards) by using compute shaders introduced with DX10 cards.

Ultraman1966 said:

More to the point, DX11 isn't about new eye candy but increasing efficiency so that more can be delivered for less.

In theory it should be more efficient, but in practice current drivers don't allow multithreaded rendering, so it's still largely serial. As the other guest pointed out, current DX11 cards don't have enough horsepower for heavy DX11 tessellation, so it has to be used with care.

Another annoying thing about DX10/11 is being limited to Windows Vista+, which makes developers stick to good old DX9 for the time being, because most gamers are using DX9 cards or DX10/11 cards on Windows XP (check Valve's hardware survey: store.steampowered.com/hwsurvey). Of course, using OpenGL 3 or 4 this limit fades away, and developers are able to use the full features of DX10/11 cards on Windows XP without any issues.
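To make the tessellation-factor point above concrete, here's a toy sketch: only the 16x (DX10-class compute path) and 64x (DX11 hull/domain shaders) ceilings come from the comment; the distance falloff curve is an invented assumption, not taken from any real engine.

```python
# Toy sketch of a distance-based tessellation factor clamped to a hardware
# ceiling: roughly 16 via the DX10-class compute-shader path mentioned above,
# 64 with proper DX11 tessellation. The falloff curve is an invented assumption.
def tess_factor(distance: float, max_factor: float) -> float:
    raw = 64.0 / max(distance, 1.0)        # more geometry detail up close
    return min(max(raw, 1.0), max_factor)  # clamp to [1, hardware ceiling]

for d in (1.0, 4.0, 16.0):
    print(f"distance {d:5.1f}: DX10-path factor {tess_factor(d, 16.0):5.1f}, "
          f"DX11 factor {tess_factor(d, 64.0):5.1f}")
```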

Ultraman1966 said:

Actually, if you look at the current Steam HW survey you'll see that DX10+ GPU and Vista+ combos are in the majority now. No, the reason they don't bother with DX11 these days is because most games are made with consoles in mind, which makes financial sense but is disappointing for those looking for progression.

hahahanoobs hahahanoobs said:

Julio said:

@hahahanoobs, there's no bias whatsoever, we may change the test setup between articles so it's not the same. In fact most of our articles in the past year or two, exclusively use high-end Intel CPUs because AMD doesn't have much to offer on the top range.

No side should have substantially more products tested than the other. As far as not having a lot of high-end AMD parts, that shouldn't matter, because most of your readers aren't using the high-end stuff, so that should be more reason to show them. Which is my point: CPU's and GPU's both react differently to each game and/or its engine. The framerate continued upward all the way to 4.5GHz on the FX chip, and those people may want to overclock their FX to 4.5GHz now to get those extra frames. I'm just saying, it would be nice to know and see how an Intel chip would scale. Maybe the Intel chip only needs 4.2GHz to get that same 95fps, and owners of that chip or generation of chip may be comfortable going that high for this game, whereas if they thought they needed to go to 4.5GHz, they wouldn't bother overclocking at all, because it's much harder to reach that high a clock, maybe because their cooling isn't up for it. It's information that is valuable to some, myself included, and it doesn't have to be there to show who has a better chip, but to show what each CPU and GPU is capable of doing.

Also, if you go back on a few other reviews you will have people saying we are biased against Nvidia and not in favor, it's just a matter of perception I guess.

And that is why I asked for the parts tested to be EVEN.

Use CPU's from BOTH sides to test clock and core scaling.

And to use an EVEN number of GPU's tested in the graphics tests. (they were even here, but I also mean in general and always)

How could this possibly be a bias or poor method of testing?

@ hahahanoobs - I think you have to be a little more realistic here. We tested 24 graphics cards, 24 that we had on hand. I shouldn't need to tell you that this is a ship load of graphics cards and that means a ship load of testing. In order for us to deliver these articles in a timely manner so that they are still relevant we have to be realistic in what we test.

I tested the AMD and Nvidia cards that I had on hand. There are two issues with your request for more Nvidia cards. The first issue being that Nvidia simply doesn't offer the range of cards that AMD does, that is also due to the fact that AMD has updated their range and Nvidia is yet to do so. We certainly included all the key players from the GeForce GTX 500 series.

The other issue is that Nvidia's board partners are reluctant to send lower-end cards because they perform poorly in most modern games. So we are stuck with a heap of GTX 580, 570 and 560 cards and that's about it.

As for the CPU scaling performance again be realistic, we cannot test multiple processors here and why should we? It's not required as in the graph below we show a range of AMD and Intel processors. Furthermore the scaling performance of the AMD FX-8150 paints the full picture as does the screenshot of the Windows 7 task manager. We give the reader a great deal of information, so much so that they should be able to work out how a Core i5 or Core i7 processor will scale in comparison.

Why did we use more AMD processors than Intel processors? Well again that's simple we have more AMD processors than Intel processors. AMD are great with review sample supply, they send out lots and let us keep them. Intel sends out just the flagship model and usually wants it back within a few weeks.

Still having said that we covered the LGA2011, LGA1366 and LGA1155 platforms leaving out just the obsolete LGA1156 platform so what more can you ask for? Yes there were no Core i3 processors included but that is because we don't have any, Intel don't really sample those and when they do we have to return them after the product review.

Your confusion about our comments regarding the performance of certain cards seems to be looking for something where there isn't anything. We are not reviewing the graphics cards; we are simply looking at the data from 24 different cards and making a few comments. We are not going to discuss the performance vs. price ratio of every card at every tested resolution. In the conclusion we make a few recommendations about what is needed for playable performance, that is it.

Finally on another note thanks to everyone that posted feedback and support we appreciate it.

I had no problems with the number of physical video cards you tested in this test, only what you wrote about them in the 2560x1600 summary, which I actually misread. You said anything over a 6870 will give playable framerates, whereas I took it as you comparing two $300 nVIDIA cards to one $200 AMD card that didn't reach 60fps. My apologies.

As for the CPU side. Not having the hardware because it wasn't given to you is kind of a cop out. How hard would it be, for example, to get a donation, buy, or sell some of those AMD chips, and pick up an i3-2130 and i5 750 with motherboards to conduct a more complete test, using hardware that is more likely to be used by your readers?

Feedback denied... I get it. I still had to try.

EEatGDL said:

hahahanoobs said:

Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.

Question 1: Why do you test so many AMD CPU's versus Intel CPU's (4 Intel vs 8 AMD)?

Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200) that only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?

I'm not crying bias, but it seems you favour AMD CPU's and nVIDIA GPU's. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.

Because of previous experiences with many other games: there are actually a lot of AMD fanboys who still think testers don't want to show the potential of Bulldozer, which is why Steve decided to OC it this time to see how high it goes. We all know by now (Newegg TV, LinusTechTips, etc.) that the i5 beats Bulldozer, so there's no need to use an i7 for a CPU scaling test either.

But yeah, Steve could have tested more cards as usual; still, I consider simply running these tests for ME3 a waste of time, like the one with Duke Nukem. Before reading the test I already knew the FPS would be extremely high without effort [I ran the game maxed out with a GT 540M at a 60 FPS average].

I asked for an Intel CPU to be tested next to the FX in the scaling test.

My complaint was about what was written on one of the pages about the video cards, not the number of cards tested, but I misread it, and apologized.

@ hahahanoobs

Your confusion about our comments regarding the performance of certain cards seems to be looking for something where there isn't anything. We are not reviewing the graphics cards; we are simply looking at the data from 24 different cards and making a few comments. We are not going to discuss the performance vs. price ratio of every card at every tested resolution. In the conclusion we make a few recommendations about what is needed for playable performance, that is it.

Finally on another note thanks to everyone that posted feedback and support we appreciate it.

Show me where I argued the performance.

"Folks looking to run ME3 maxed on a high-res 2560x1600 display will need at least a GTX 570 or GTX 480 to hit 60fps. That said, anything above the HD 6870 should deliver sufficiently smooth gameplay."

What I did notice the second time reading it is that you say a 570 or 480 is needed to get 60fps, yet you bypass the 7850@59fps, 6950@58 and 5870@57 to write about needing a 6870@52 or higher for playable framerates. Are you saying an AMD card at 52fps is equal to 60fps on an nVIDIA card at the same resolution? LOL, I can't help it, it just doesn't make sense why you would summarize it like that.

Once again, I NEVER questioned the results, only the writing.
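Just to illustrate the 2560x1600 summary being argued about here, a quick sketch using only the averages quoted in this thread (taken at face value): it separates the cards that actually average 60fps from those that merely clear the HD 6870's 52fps.

```python
# Quick illustration of the 2560x1600 summary under discussion, using only the
# averages quoted in this thread (taken at face value). It splits cards that
# actually hit a 60fps average from those merely above the HD 6870's 52fps.
results = {
    "GTX 570": 60, "GTX 480": 60, "HD 7850": 59,
    "HD 6950": 58, "HD 5870": 57, "HD 6870": 52,
}

sixty_fps = [card for card, fps in results.items() if fps >= 60]
above_6870 = [card for card, fps in results.items() if fps > results["HD 6870"]]

print("Average 60fps or better:", ", ".join(sixty_fps))
print("Faster than the HD 6870:", ", ".join(above_6870))
```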

Staff
Matthew Matthew, TechSpot Staff, said:

And that is why I asked for the parts tested to be EVEN.

And to use an EVEN number of GPU's tested in the graphics tests.

How could this possibly be a bias or poor method of testing?

Because you wrongly assume Steve has an even number of CPUs/GPUs on hand. As he said, some parts go back, others don't. What's more, some generations launch ahead of others.

So instead of determining whether each card is a good fit for the review, you want him to arbitrarily add anything -- as long as AMD and Nvidia have the same number of products listed.

That makes no sense. He tests the parts that are available and are the most relevant to the specific review. If that means AMD or Nvidia has a few more products in the graphs, so be it.

venomblade said:

HD 7870 will be my definite next GPU upgrade, if it drops down a bit in price in the next few months. I'll still have a tiny bit of hope for kepler, but Nvidia you gotta start bringing the heat soon.

Staff
Steve Steve said:

@ hahahanoobs - I am curious as to what Nvidia cards we should have included to even it out? Or are you just suggesting we delete half a dozen relevant AMD graphics cards? Again you do understand that AMD has released their next generation cards and Nvidia has not, tipping the number of relevant cards in AMD's favor.

It is silly to suggest that the same amount of cards from each manufacturer should be tested. It's a little bit like saying if I were to test a range of Smartphones I could only look at one Samsung phone and one HTC phone because Apple only has a single phone.

No side should have substantially more products tested than the other. As far as not having a lot of high-end AMD parts, that shouldn't matter, because most of your readers aren't using the high-end stuff, so that should be more reason to show them. Which is my point: CPU's and GPU's both react differently to each game and/or its engine. The framerate continued upward all the way to 4.5GHz on the FX chip, and those people may want to overclock their FX to 4.5GHz now to get those extra frames. I'm just saying, it would be nice to know and see how an Intel chip would scale. Maybe the Intel chip only needs 4.2GHz to get that same 95fps, and owners of that chip or generation of chip may be comfortable going that high for this game, whereas if they thought they needed to go to 4.5GHz, they wouldn't bother overclocking at all, because it's much harder to reach that high a clock, maybe because their cooling isn't up for it. It's information that is valuable to some, myself included, and it doesn't have to be there to show who has a better chip, but to show what each CPU and GPU is capable of doing.

I'm sorry I did chuckle at that text block for a little while. Either you did not read my previous post or you simply did not understand it. Again what I said was "As for the CPU scaling performance again be realistic, we cannot test multiple processors here and why should we? It's not required as in the graph below we show a range of AMD and Intel processors"...

So did you look at the graph below? If you did you would know that "to get that same 95fps" (which was actually 91fps) the Intel processors don't need to be clocked as high as even the 4.2GHz you suggested. In fact, even 4GHz would be too high. For 95fps you would have to heavily underclock the Core i5-2500K, Core i7-2600K or Core i7-3960X, as all rendered well over 100fps at their default clock speeds.

You might feel like your feedback is denied and as I said earlier that is because you are unrealistic. We have to pick and choose what we can test to deliver a timely article.

Finally about your last comment regarding the 2560x1600 comments. I am not sure why I need to say it again but we are only briefly discussing the results here, readers do not want to sift through tons of text to hear me blurt out exactly what they just saw above.

The writing makes sense. The GTX 570 and GTX 480 hit 60fps and if you want an average of 60fps they are the cards that can do it. Yes the 7850 was similar with 59fps, as was the 6950 and the 5870 and as I said anything above the 6870 should deliver smooth game play. The 6870 averaged 52fps so I am pretty sure that puts the 7850, 6950 and even the 5870 above it.

So then as I understand it, you are displeased because an Intel Core i3 processor was not tested, an Intel processor was not used for scaling (which would have told us nothing interesting by the way), there were more AMD graphics cards than Nvidia cards (for a good reason as there are more AMD cards on the market) and you didn't understand comments summarizing the results. From all that I am taking away that next time I will have to source a Core i3 processor.
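As a back-of-the-envelope illustration of the clock-scaling point above, here's a sketch that assumes fps scales linearly with CPU clock while CPU-bound (an optimistic simplification; real scaling is usually sub-linear). The FX-8150's ~91fps at 4.5GHz comes from the discussion; the 110fps figure for a stock 3.3GHz Core i5-2500K is an assumption for illustration only.

```python
# Back-of-the-envelope sketch of the clock-scaling argument above. Assumes fps
# scales linearly with CPU clock while CPU-bound, which is an optimistic
# simplification; real scaling is usually sub-linear.
def clock_for_target(target_fps: float, measured_fps: float, measured_ghz: float) -> float:
    return measured_ghz * target_fps / measured_fps

# FX-8150 figure quoted in the discussion: roughly 91fps at 4.5GHz.
print(f"FX-8150 clock for 60fps:  ~{clock_for_target(60, 91, 4.5):.1f} GHz")

# Hypothetical Sandy Bridge figure (assumption for illustration only):
# a Core i5-2500K already rendering ~110fps at its stock 3.3GHz.
print(f"i5-2500K clock for 91fps: ~{clock_for_target(91, 110, 3.3):.1f} GHz")
```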

spaceship said:

*facepalm...

Look at the numbers. First look at the numbers for games sold in the more popular of the PC eras. Now look at the numbers for the last 2 or so console cycles. Without consoles, games like Mass Effect would not even exist. We'd be stuck playing StarCraft and WoW 24/7.

TorturedChaos, TechSpot Chancellor, said:

I don't care how much people are complaining about the game - I'm loving the hell outa it. I'm normally not one for American RPG's. They usually don't pull me into the game and make me care about the characters on an emotional level, which is what makes me want to come back and play more. But the Mass Effect series has done that very well IMO, especially the 3rd one. I found myself several times stopping to think very hard about a choice in the game, not based on how the gameplay might change but based on my emotional attachment to the characters it affects, and the morality of my choice. I know to some that may seem rather odd for a video game, but I like to get that involved in an RPG.

As far as graphics go, I'm happy with them. They look nice to me, but the game can still be run on a reasonable computer (C2D E8400 @ 3.0GHz & 8800GTX 512MB). So I'm happy with the graphics. I know I need to upgrade my computer to keep up, but it's nice I can still play some of the newer games on low to medium graphics settings.

Guest said:

I'm just saying, it would be nice to know and see how an Intel chip would scale. Maybe the Intel chip only needs 4.2GHz to get that same 95fps, and owners of that chip or generation of chip may be comfortable going that high for this game, whereas if they thought they needed to go to 4.5GHz, they wouldn't bother overclocking at all, because it's much harder to reach that high a clock, maybe because their cooling isn't up for it.

Complete fail. Also... uh, 4.5GHz isn't hard at all to get to on a 2600K, and that's on AIR.

Air coolers are cheap as crap and you can easily go to 4.5-4.7GHz even in a mildly warm room.

And yeah, the Intel was way above the AMD at its stock clocks.

Anyways, if you meant some other Intel chip that wasn't even listed then that's another fail for you, 'cause you didn't specify.

One would only assume you meant the one listed.

One last note -

Anyone who overclocks their own CPUs would be able to infer what would happen on an Intel from the above data.

If you couldn't, then you honestly shouldn't even be trying to overclock at all.

gingerbill said:

Guest said:

The graphics may be DX9, but if you are playing Mass Effect for graphics superiority you are a *****. You play this game for the story. Besides, the graphics still look pretty good to me.

Agreed. No one is buying this game for cutting-edge graphics; the graphics are good enough. And when you're chatting to people it still looks very good.
