Evolve Benchmarked, Performance Review

Steve



Although it's almost exactly two years old now, you could still make a strong case for the visuals in Crysis 3. With that in mind we've been eagerly anticipating more games based on the latest build of CryEngine. Now, that wait is over: after being delayed from its original October 2014 release date, Evolve launched this week on PC, Xbox One and PS4.

Developed by Turtle Rock Studios (the creators of Left 4 Dead), Evolve is the company's latest squad-based co-op shooter and, unsurprisingly, it calls for some fairly beefy hardware: the recommended system specifications are an Intel Core i7-920 or AMD A8-3870K coupled with a GeForce GTX 760 or Radeon R9 280, as well as 6GB of RAM and 50GB of storage.

Read the complete article.

 
Yet another game where the 7970 performs about as well as a 780, and the 290X meets or beats a "Superior" 980.
 
4047MB @ 25x14 resolution, damn.

"From a driver perspective, the latest GeForce 347.52 WHQL release worked fine but the GeForce Experience software was crashing for much of our testing."

Select Custom during installation and uncheck everything but the driver and PhysX.
 
Interesting benchmarks. On one hand it shows the 970 truly shining @ 1080p and on the other it shows that the memory partition is a REAL issue and it does in fact cripple performance.
 
Select Custom during installation and uncheck everything but the driver and PhysX.

That has nothing to do with the crashing issue. The crashing was caused by Nvidia using an HTML table in the release notes for the latest driver; GeForce Experience couldn't parse it, so it would crash. The only fix was to wait until Nvidia fixed it on their end, because the notes come from the server side.
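As a toy illustration of that failure mode (a hypothetical parser, not GeForce Experience's actual code), a client that only expects a small whitelist of tags will blow up the moment the server-side notes contain a table:

```python
from html.parser import HTMLParser

# Hypothetical minimal release-notes reader: it whitelists a few tags,
# so an unexpected <table> raises and takes the client down with it.
# Purely an illustration of the failure mode, not GFE's real code.
class NotesParser(HTMLParser):
    ALLOWED = {"p", "ul", "li", "b", "i", "br"}

    def handle_starttag(self, tag, attrs):
        if tag not in self.ALLOWED:
            raise ValueError(f"unexpected tag in release notes: <{tag}>")

notes_from_server = (
    "<p>GeForce 347.52 WHQL</p>"
    "<table><tr><td>Fixed issues</td></tr></table>"  # the new HTML table
)

try:
    NotesParser().feed(notes_from_server)
except ValueError as e:
    print("client would crash here:", e)
```

Since the notes are fetched from Nvidia's server, the fix has to land there; nothing you reinstall locally changes what the parser is fed.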
 
The crashing was caused by Nvidia using an HTML table in the release notes for the latest driver; GeForce Experience couldn't parse it, so it would crash.

Yes, I can confirm that this information is correct.
 
After the flame war on the Intel Core i3 vs AMD 8230E or whatever it was, this just puts that more into perspective. A Core i5 at 3.20GHz was happily beating a 9000 series AMD clocked at 4.7GHz.
I don't care what synthetic benchmarks the AMD chip is better at; in the things that actually affect me (such as gaming performance), and therefore actually matter, the Intel chips are clearly better.
 
A Core i5 at 3.20GHz was happily beating a 9000 series AMD clocked at 4.7GHz.

Sadly it isn't the first time we have seen this. Still, AMD fans tell me this will all change once DX12 games arrive; I hope they are right. The CPU landscape will have also changed a lot by then though, so who knows ;)

I just love that "GeForce GTX 970 (3584 + 512MB)" part, it really made my day :D
Also, thanks for the test!

It only seems right now.
 
Still, AMD fans tell me this will all change once DX12 games arrive; I hope they are right.

The only problem is that the hundreds of current DX11 games will not magically update themselves for DX12, so if someone does not limit themselves to the very latest games, this won't help AMD CPUs as much as some fans hope. Not to mention that not every game will use DX12 (AFAIK it needs much more knowledge and precision), so the DX11.3 titles also have to be taken into account.
 
Interesting benchmarks. On one hand it shows the 970 truly shining @ 1080p and on the other it shows that the memory partition is a REAL issue and it does in fact cripple performance.
Yeah very interesting to see that in reality and not just in the news.

I hate games that favor Intel CPUs... It means the developers are lazy and refuse to optimize for the "inferior" platform. I heard the game has a ton of horrible IAP anyway, so it is already off my list.

Nonetheless, another awesome review Steve. Maybe introduce 4K testing and drop the 768p because barely anyone games on that resolution on a desktop.
 
My GTX 970 never goes above 32fps in this game. Why? I'm using an FX-8320 CPU in my PC.
 
This game looks pretty!
My old tired rig is starting to show its age.

...and the 290X meets or beats a "Superior" 980.
The 980 beats the 290 without all the baggage, in a game where AMD GPUs do well. That being said, the results are so close only fools will argue over 3-7 FPS.
 
Well, perhaps something is bottlenecking your graphics card? Like that AMD CPU there.
I too wanted to say I love that the 970 is now listed with its correct specs of 3.5GB + 0.5GB, even though I own one.

I really don't know what to do about the card. It outperforms the 290X at 1080p settings and uses under half the power to get there. If I got a refund I would only be able to get the 290X "radiator like the sun" edition at 300 watts... and if you get two 2GB 960s in SLI you still only have 2GB of VRAM, so the 970 is the sweet spot. Shown here, it doesn't suck even though it's effectively a 3.5GB card. I just have to accept it, right? :(
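A minimal sketch of why the SLI math works out that way (my own illustration, not from the review): with alternate-frame rendering each card keeps a full copy of the textures and buffers, so memory mirrors rather than pools.

```python
# Toy illustration: SLI (alternate-frame rendering) mirrors VRAM across
# cards rather than pooling it, so the usable amount is the smallest
# card's VRAM, not the sum.

def usable_vram_gb(cards_gb):
    # each GPU needs its own full copy of the working set
    return min(cards_gb)

print(usable_vram_gb([2.0, 2.0]))  # two 2GB GTX 960s in SLI -> 2.0GB usable
print(usable_vram_gb([4.0]))       # single GTX 970 -> 4.0GB (3.5GB fast)
```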

I will just bend over and let Nvidia have their way... however wrong it seems. I think someone needs to tell Nvidia you measure the shaft, not the nuts too...

The only other thing is the hate that this game has got. And rightly so. £120 for everything? 2K can go join Nvidia in the "you don't deserve any money, you greedy swines" corner. Who the hell? All this day-one announced content, some of it obtainable on launch day? The whole DLC and pay-to-win marketing has gone off the scale. And virtual skins... digital art, 10KB of data... for extortionate prices. My favorite is one of the reviews for CS:GO:
"Knife in real life: 10$
Knife in game: 400$

10/10"
I think people have lost the plot of what's acceptable. I think I am on the wrong side of the nuthouse walls.
 
Steve, thanks for adding the Pentium and Celeron to the test bench to put things in perspective. There's a huge lack of reference for those chips in game benchmarks, and sometimes you might be surprised by the price/performance in games that don't need 4 cores/threads, for people on extremely low budgets.

I have a Core i5 myself, and a friend plays on a Sandy Bridge Celeron that has worked pretty well for him. A while ago he moved from an LGA775 Pentium Dual-Core [don't remember model] to his current LGA1155 Celeron, keeping the GPU (GT 220), HDD, and everything else possible. His performance in games more than tripled and I was absolutely surprised.
 
Nice review @Steve, though it seems this game has some work ahead to make sure everything functions properly. I am actually thinking about getting it and trying it out at 4K on my system to see how well it handles.

The CPU landscape is surprising, though it's not surprising they focused on quad-core optimizations for this game, since a majority of users (probably an overwhelming majority) have some form of quad core.

The GPU results blew my mind, but it may just be an optimization issue. I was shocked to see the GTX 970 choke so hard at 1600p like that. I guess the game must really want an extreme amount of VRAM, or there is something else causing the problem, like optimizations needed on the driver/game side.
 
After the flame war on the Intel Core i3 vs AMD 8230E or whatever it was, this just puts that more into perspective. A Core i5 at 3.20GHz was happily beating a 9000 series AMD clocked at 4.7GHz.

Well, by all means, the FX chips excel when proper multithreading is used. The 8000 series have 8 cores, but the cache is shared by 2 cores at any given time; the i5, for example, has separate cache for each core. Evolve sees an i5 as an i5. It also sees an 8350 at 4GHz as an FX-4300 at 4GHz, unfortunately. Nobody ever argued that, in a 4-core-aware game, Intel will beat down on AMD.
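To sketch that point with a toy model (the per-core numbers below are hypothetical, purely illustrative, not benchmark data): if the engine only scales to four threads, cores beyond the fourth sit idle, and the comparison collapses to per-core speed, where the i5 wins despite its lower clock.

```python
# Toy model (hypothetical per-core performance units, not benchmark data):
# a game that only scales to N threads ignores any cores beyond N, so an
# 8-core FX ends up judged purely on per-core speed.

GAME_THREADS = 4  # assumed engine thread count

def game_throughput(cores, per_core_perf):
    usable = min(cores, GAME_THREADS)  # extra cores contribute nothing
    return usable * per_core_perf

print(game_throughput(cores=4, per_core_perf=1.00))  # Core i5 @ 3.2GHz -> 4.0
print(game_throughput(cores=8, per_core_perf=0.75))  # FX @ 4.7GHz     -> 3.0
```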

Normally you shouldn't be pairing a 980/290X/970 with an FX in the first place IMO, so it's still not a problem. DX12 will be AMD's savior.

Like always, it all depends on what GPU you are using. If you are using a 285/960 or lower, you'll get the same performance with an Intel or AMD GPU, since the GPU becomes the bottleneck.
 
Like always, it all depends on what GPU you are using. If you are using a 285/960 or lower, you'll get the same performance with an Intel or AMD GPU, since the GPU becomes the bottleneck.

same performance with an Intel or AMD CPU**, since the GPU becomes the bottleneck
 
I love how consistently the 3770K is still up at the top of benchmarks. It just reaffirms that I don't need to upgrade my CPU or my graphics for a good long time.
 
As far as I know, Mr GRyder, the 4GB of VRAM becomes an issue at just 1440p. Now I don't know if every game at that resolution demands all 4GB of VRAM, but I have seen tests where someone said the trouble starts at around 3.2GB of VRAM usage and you get stutter.
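For what it's worth, here's a rough model of why spilling past the fast segment hurts, assuming the widely reported 970 layout (~196GB/s for the 3.5GB segment, ~28GB/s for the 0.5GB segment; those figures are common approximations, not measurements from this review):

```python
# Rough toy model of the GTX 970's segmented memory. Bandwidth figures
# are the commonly cited approximations (assumptions, not measured here).

FAST_GB, FAST_BW = 3.5, 196.0  # GB, GB/s
SLOW_GB, SLOW_BW = 0.5, 28.0   # GB, GB/s

def avg_bandwidth(working_set_gb):
    """Average bandwidth streaming the working set once through both segments."""
    fast = min(working_set_gb, FAST_GB)
    slow = max(working_set_gb - FAST_GB, 0.0)
    seconds = fast / FAST_BW + slow / SLOW_BW
    return working_set_gb / seconds

for gb in (3.0, 3.5, 3.8, 4.0):
    print(f"{gb:.1f}GB working set -> ~{avg_bandwidth(gb):.0f}GB/s average")
```

Once even a few hundred megabytes land in the slow segment the average tanks, which lines up with stutter appearing well before you'd expect from a uniform 4GB card.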

Example: GeForce Experience "optimized" CS:GO to 2715x1527, over 1440p, squished into 1080p apparently for extra sexiness. After a while it got choppy in a few places. Connection? Or VRAM buffer stutter?
It works fine at 1080p though.

I do want to see how this game looks; when it's £10 for the lot, count me in.
 
Would it be possible to put AMD A8 and A10 APUs in the mix for future reviews, since we sell quite a few of them as mid-range machines?

Instead of the Phenoms, which have been off the market for ages now, it would be nice to get an idea of where the A8/A10s land in terms of CPU performance.

Also, I have to agree that in spite of all the (3.5GB + 500MB) stuff with the 970, it does still seem a pretty solid card for the price, even at 1600p.
 
"Those hoping to run Evolve with maximum eye candy should get by with relatively modest hardware when playing at 1366x768"

So... who games at 1366x768?
 
"Those hoping to run Evolve with maximum eye candy should get by with relatively modest hardware when playing at 1366x768"

So... who games at 1366x768?

http://store.steampowered.com/hwsurvey?platform=pc

Apparently a large bulk of PC gamers. 27.5% of all Steam gamers play at 1366x768 while 33.7% play at 1080p. The rest are scattered across various other far less popular resolutions.

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007617 600012666

A large portion of cheap desktop monitors run at 1366x768, as do many laptops, so it makes sense.
 
Interesting benchmarks. On one hand it shows the 970 truly shining @ 1080p and on the other it shows that the memory partition is a REAL issue and it does in fact cripple performance.
GTX 980, 1080p to 1440p: frame drop = 37fps
GTX 970, 1080p to 1440p: frame drop = 39fps
So...?
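The post above only gives the absolute deltas; a quick back-of-the-envelope check (the 1080p baselines below are hypothetical placeholders, since the review's figures aren't quoted here) shows why a near-identical drop in fps can still mean a different relative hit:

```python
# The deltas (37 and 39fps) come from the post above; the 1080p baseline
# figures are hypothetical placeholders just to illustrate the arithmetic.

def relative_drop(fps_1080p, delta):
    return 100.0 * delta / fps_1080p

print(f"GTX 980: {relative_drop(90, 37):.0f}% drop")  # hypothetical 90fps base
print(f"GTX 970: {relative_drop(80, 39):.0f}% drop")  # hypothetical 80fps base
```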
 