Read the full article at:
https://www.techspot.com/review/507-mass-effect-3-performance-test/
Please leave your feedback here.
As with previous entries, Mass Effect 3 is built using Unreal Engine 3. In other words, it's a DirectX 9 title.
But in March 2011 Unreal Engine 3 was updated to support DirectX 11, adding tessellation and displacement mapping, advanced hair rendering with MSAA, deferred rendering with MSAA, screen-space subsurface scattering, image-based lighting, billboard reflections, glossy reflections, reflection shadows, point-light reflections, and bokeh depth of field.
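A note on the DX9/DX11 point above: whether tessellation is even available comes down to the Direct3D feature level the renderer can obtain from the hardware and runtime. Below is a minimal C++ sketch of such a check; it is an illustration against the public D3D11 API, not anything from BioWare's or Epic's code. Hull and domain shaders, and with them tessellation, require feature level 11_0.

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Ask the runtime for the highest feature level the hardware supports.
// Tessellation (hull/domain shader stages) requires D3D_FEATURE_LEVEL_11_0;
// a DX9-class renderer targets the 9_x levels and never sees those stages.
bool SupportsTessellation()
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
        nullptr, &got, nullptr);   // no device or context kept, just the level
    return SUCCEEDED(hr) && got >= D3D_FEATURE_LEVEL_11_0;
}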
Have you guys seen the disgusting terms and conditions of the Origin DRM?
http://www.rockpapershotgun.com/2011/08/24/eas-origin-eula-proves-even-more-sinister/
Quote from the TOS:
You agree that EA may collect, use, store and transmit technical and related information that identifies your computer (including the Internet Protocol Address), operating system, Application usage (including but not limited to successful installation and/or removal), software, software usage and peripheral hardware, that may be gathered periodically to facilitate the provision of software updates, dynamically served content, product support and other services to you, including online services. EA may also use this information combined with personal information for marketing purposes and to improve our products and services. We may also share that data with our third party service providers in a form that does not personally identify you. IF YOU DO NOT WANT EA TO COLLECT, USE, STORE, TRANSMIT OR DISPLAY THE DATA DESCRIBED IN THIS SECTION, PLEASE DO NOT INSTALL OR USE THE APPLICATION.
Luckily, whilst looking for the quote above, I found a way to stop (I hope) Origin from this MASSIVE invasion of privacy.
Yes, I would love to see tessellation in DX9... More to the point, DX11 isn't about new eye candy but about increasing efficiency, so that more can be delivered for less (see the deferred-context sketch after this exchange).
Guest said:
Now start bashing MW3 for not providing a new graphics engine, unlike BF3, which requires a DX11 card for features a DX9 card could handle too!
In fairness, development of Mass Effect 3 began back in 2010 or earlier if reports are correct, so they never had a chance to implement it. Well, they did, but they couldn't be arsed...
Guest said:
As with previous entries, Mass Effect 3 is built using Unreal Engine 3. In other words, it's a DirectX 9 title.
But in March 2011 Unreal Engine 3 was updated to support DirectX 11, adding tessellation and displacement mapping, advanced hair rendering with MSAA, deferred rendering with MSAA, screen-space subsurface scattering, image-based lighting, billboard reflections, glossy reflections, reflection shadows, point-light reflections, and bokeh depth of field.
Taken from Wikipedia.
Then the problem isn't Unreal Engine; the real problem was lazy developers, who did not bother to update to the latest Unreal Engine to gain performance and a better-looking game.
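On the earlier point that DX11 is about efficiency rather than new eye candy (the sketch promised above): the headline CPU-side gain in Direct3D 11 is multithreaded command recording via deferred contexts. A rough sketch, assuming a device and immediate context already exist; the function and variable names are invented for illustration, not taken from any shipping engine.

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Record draw calls on a deferred context (typically from a worker thread),
// then replay the finished command list on the immediate context. Spreading
// submission work across cores like this is something DX9 cannot do.
void RecordAndReplay(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return;

    // ... issue state changes and Draw() calls on `deferred` here ...

    ID3D11CommandList* commands = nullptr;
    if (SUCCEEDED(deferred->FinishCommandList(FALSE, &commands))) {
        immediate->ExecuteCommandList(commands, FALSE);
        commands->Release();
    }
    deferred->Release();
}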
hahahanoobs said:
Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.
Question 1: Why do you test so many AMD CPUs versus Intel CPUs (4 Intel vs 8 AMD)?
Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200), which only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?
I'm not crying bias, but it seems you favour AMD CPUs and nVIDIA GPUs. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record, I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.
EEatGDL said:
hahahanoobs said:
Once again a nice [PC Game] GPU & CPU Performance Test. The CPU clock scaling in this game surprised me the most... in a good way.
Question 1: Why do you test so many AMD CPUs versus Intel CPUs (4 Intel vs 8 AMD)?
Question 2: For the 1920x1080 graphics card test, you mention the 570 and 480 hitting 60fps (both $300+ cards), followed by the 6870 ($200), which only gets 52fps. Why not also at least mention the 6950 ($300) hitting 58fps?
I'm not crying bias, but it seems you favour AMD CPUs and nVIDIA GPUs. I believe it should be even for all sides. Same for any CPU core/clock scaling tests. For the record, I have a 2500K @ 4.6GHz paired with an unlocked 6950 2GB OC'd to 880/1375MHz.
Because of previous experience with many other games: there are actually a lot of AMD fanboys who still think testers don't want to show the potential of Bulldozer, which is why Steve decided to overclock it this time to see how far it goes. We all know by now (Newegg TV, LinusTechTips, etc.) that the i5 beats Bulldozer, so there's no need to use an i7 for the CPU scaling test either.
But yeah, Steve could have tested more cards as usual; still, I consider running these tests for ME3 a waste of time, like the one for Duke Nukem. Before reading the test I already knew the FPS would be extremely high without effort [I ran the game maxed out on a GT 540M at 60 FPS average].
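As a closing aside on where average-FPS figures like the ones quoted in this thread come from: a minimal C++ sketch that averages a frame-time log. It assumes a log where each line holds one frame's end time in milliseconds; the file name and format are assumptions for illustration, not TechSpot's actual tooling.

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Average FPS over a run is (frames - 1) / elapsed seconds.
int main()
{
    std::ifstream log("frametimes.txt");   // hypothetical file name
    std::vector<double> stamps;
    std::string line;
    while (std::getline(log, line)) {
        std::istringstream ss(line);
        double ms;
        if (ss >> ms)
            stamps.push_back(ms);          // one frame end-time per line, in ms
    }
    if (stamps.size() < 2) {
        std::cerr << "not enough samples\n";
        return 1;
    }
    double elapsed = (stamps.back() - stamps.front()) / 1000.0;
    std::cout << "Average FPS: " << (stamps.size() - 1) / elapsed << "\n";
    return 0;
}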