The Witcher 3: Wild Hunt Benchmarked, Performance Review

In your list of gpus used it lists Titan X but no info was released in the review. Oversight?

Excellent overall article Steve.
 
Ever think the reason AMD is pro open source is because what they've come up with isn't that great and would go nowhere if it was exclusive to AMD? HairWorks vs TressFX? That's like a gorilla fighting a penguin. Mantle? Inconsistent as [expletive]. MS and Khronos will take low-level APIs further than AMD ever could on their own.
That's a great anti-AMD rant, Mr. Troll. I just hope that you'll be really happy if AMD goes bankrupt and leaves the GPU market, Nvidia is left as the only player in that market, and you get slapped in the face by absurd prices because there's no competition, so you don't have any other option besides bending over. Isn't that what you guys want, to win the fabled GPU wars and acquire the right to pay absurd monopoly prices?
Also, I really, really do not understand how anyone in their right mind could ever dream of considering Nvidia's atrocious anticompetitive measures "good experience" and "what the customers want". I can tell you with certainty that any Nvidia user who isn't a petty loyalist does NOT want AMD GPUs to be deliberately crippled just to say "lol, I have a better card than you". Also, any Nvidia user with half a brain understands that Nvidia taking control from the developers, adding their own closed-source code and optimizations to the game that the developers themselves can't touch, and requiring them not to allow any other company a chance to optimize it for their hardware, is not only an incredibly bad move but also a slippery slope. What's next, will you be happy if we start seeing big releases that are contractually obligated to be exclusive to Nvidia GPUs? Would you be happy if AMD did the same and you got locked out of a game release?
If your argument is that Nvidia's anti-competitive practices are good and it's what the customers want, I don't know what planet or plane of oblivion you live in. AMD users aren't complaining that Nvidia tries to offer a better experience for its customers; they are complaining that Nvidia tries to stop other companies from offering a good experience to theirs. And if you think AMD should start doing the same to Nvidia users, so that everyone gets screwed in half of their games, you're out of your mind.

To be honest, I would happily pay double monopoly prices so I wouldn't have to listen to AMD fanboys go on and on about Mantle and FreeSync, etc.

But I doubt they would charge more anyway, as Intel has had a monopoly for years and their prices have stayed the same.
 
The AMD 280X is on a par with the GTX 780 when lagworks is disabled, lol.

The 280X is *faster* than the 780 when the tessellation override is used. Nvidia really screwed over Fermi and Kepler users compared to Maxwell.

This is not surprising.

Over the last year, the 280X/7970 GHz has slowly been erasing the lead the 780 had over it at launch.

And it's either because GCN is just aging better than Kepler, or NV is just not optimizing the drivers for Kepler anymore.

Choose whichever reason suits you best.
 
This is not surprising.

Over the last year, the 280X/7970 GHz has slowly been erasing the lead the 780 had over it at launch.

And it's either because GCN is just aging better than Kepler, or NV is just not optimizing the drivers for Kepler anymore.

Choose whichever reason suits you best.

I would go with AMD's GCN being better at rendering compute effects, while Fermi and Kepler have always been slower there. Since more and more games are getting compute-heavy with the advent of Xbox One and PS4 titles, I think GCN will do just fine compared to Nvidia's older architectures.
 
I just got done downloading the game (got it free with my 970 that I purchased recently). I haven't tried it yet, but apparently GFE thinks I can run the game with most of the settings cranked at 1200p, but I'm betting it will stutter like crazy. GFE also completely maxes out GTA V if I let it "optimize" the settings and I get some drops from 60 fps down to 30 in a lot of areas, which takes away from the experience. I just started using GFE when I got this card (wanted to see if it actually worked as it should) so I'm new to Nvidia's optimization profiles. I actually thought GFE would be modest with the settings for the newest games, but Nvidia apparently really does like to "crank that s#!t up", lol.
 
What exactly is the difference between TressFX and HairWorks?! I think nVidia and AMD should make a unified version of this... but hey, that ain't gonna happen :(.
Basically just different rendering tech designed to use what each vendor's GPUs excel at. TressFX, AFAIK, is more compute-focused, which is where GCN does better. HairWorks is heavily tessellation-based, which Maxwell and its ilk excel at and AMD does poorly at (hence, IMO, the GameWorks argument here doesn't hold merit, it's just business). Nvidia doesn't have GCN to include in their cards, and AMD can't do much about tessellation performance in its cards, so them working together is never gonna happen.

Which looks better? Hard to say since so few titles have used both. TressFX in Tomb Raider I felt was a bit floaty at times, but overall satisfactory. HairWorks in Far Cry 4 was poorly done as you could see layers separating hair placement (guessing Ubisoft's fault). HairWorks in Witcher actually does look very nice, but I would like to see the current TressFX version in a game to compare it to.
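To put rough numbers on the tessellation side of that comparison, here's a minimal sketch. It assumes a made-up round number of guide strands, uses the x64 default and the 8x/16x driver-override caps that are commonly reported for HairWorks in this game, and treats geometry output as growing linearly with the tessellation factor, which is a simplification rather than HairWorks' actual pipeline.

    # Very rough model of tessellation load for strand-based hair rendering.
    # guide_strands is a made-up round number; the x64 default and the 8x/16x
    # override caps are the values commonly reported for HairWorks in this game.
    guide_strands = 10_000

    for tess_factor in (64, 16, 8):
        segments = guide_strands * tess_factor  # simplified: output grows with the factor
        print(f"x{tess_factor} tessellation: ~{segments:,} generated segments per frame")

Capping the factor at x16 asks the tessellator for roughly a quarter of the geometry the x64 default does, which is a plausible reason the driver override recovers so much performance on GPUs with weaker tessellation throughput while the hair still looks nearly the same.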

I just got done downloading the game (got it free with my 970 that I purchased recently). I haven't tried it yet, but apparently GFE thinks I can run the game with most of the settings cranked at 1200p, but I'm betting it will stutter like crazy. GFE also completely maxes out GTA V if I let it "optimize" the settings and I get some drops from 60 fps down to 30 in a lot of areas, which takes away from the experience. I just started using GFE when I got this card (wanted to see if it actually worked as it should) so I'm new to Nvidia's optimization profiles. I actually thought GFE would be modest with the settings for the newest games, but Nvidia apparently really does like to "crank that s#!t up", lol.
I don't know what Nvidia is shooting for anymore with their own "playable settings" for a given card. 1440p at max settings (no AA) with HairWorks is well under 50 frames average, let alone in the woods, for my SLI 970s. For GTA V, I have a friend who is obsessed with getting the minimum frame rate to 60 on his 970; he decided to try GFE and I sure heard about it... many of the areas are very open and will degrade performance regardless of the settings.
 
Day 4: Still no official mention of AMD's tessellation override ridiculing Nvidia's Fermi and Kepler GPUs.
 
Hmmm, strange results.

I'm running:
32GB RAM
Intel Core i5-3570K
GeForce GTX 680 2GB

HairWorks off
Shadows set to high

Foliage Visibility Range set to high
Everything else maxed out running at 1080p

I'm getting 45 to 55 fps.

 
Seeing basically everything fail at 4K makes me so much more impressed with my Titan X SLI setup running at 1400/8000. I'm running the game 100% completely maxed out at 4K with ~50fps average frame rates depending on what's going on; minimum frame rates around 40. It's beautiful.
 
Seeing basically everything fail at 4K makes me so much more impressed with my Titan X SLI setup running at 1400/8000. I'm running the game 100% completely maxed out at 4K with ~50fps average frame rates depending on what's going on; minimum frame rates around 40. It's beautiful.

Yikes... so you can't even get 60 FPS with two Titan Xs in SLI? Once again, proof that 4K isn't really viable yet... I wonder if it gets to 60 FPS with three of them... or would you need a fourth?!
 
Seeing basically everything fail at 4K makes me so much more impressed with my Titan X SLI setup running at 1400/8000. I'm running the game 100% completely maxed out at 4K with ~50fps average frame rates depending on what's going on; minimum frame rates around 40. It's beautiful.

Yikes... so you can't even get 60 FPS with two Titan Xs in SLI? Once again, proof that 4K isn't really viable yet... I wonder if it gets to 60 FPS with three of them... or would you need a fourth?!

60fps is possible if I turn off AA. I'm happy with 50fps; the framerate is incredibly steady between 45-55.

From what I've seen, this game does not scale well at all with 3 or 4 cards, so I'm not sure performance would be much better - certainly wouldn't be worth an extra grand or two.
 
I surmised that the 'Uber' setting was an experiment when The Witcher 2 came out, so my quad R9 290X setup will do just splendidly at 4K+... happy gaming... ;)
 
Hi,

There's something that puzzles me about the CPU benchmarks; perhaps someone can clear it up for me. Essentially, an FX 9590 is an overclocked FX 8350, the only differences between them being the base frequency and turbo frequency. Thus an FX 9590 clocked at 4GHz should effectively be an FX 8350 with turbo disabled.

So I'm reading the following results:
FX 9590 @4.7 - 94 FPS avg
FX 8350 @4.0 - 89 FPS avg

So basically a 5 FPS drop for .7 GHz drop. So far so good. But then when you go to the CPU frequency scaling graph, the FX 9590 has almost no performance drop when scaling down from 4.5 GHz to 2.5 GHz.

At 4.0 GHz the FX 9590 shows 92 FPS on average, yet it is basically an FX 8350 with turbo disabled, so it should perform similarly to or lower than the 89 FPS of the FX 8350 (assuming the 8350 doesn't throttle down heavily, whereas the 9590 @4.0 has thermal control disabled).

And even more confusing, the FX 9590 @2.5 GHz scored 90 FPS avg and 78 FPS min. That's actually BETTER than the FX 8350 which is basically the same chip clocked at almost double the frequency.

How is this possible without the results being somehow wrong?

Thanks for the explanation and if this has been somehow explained above and I've missed it then nevermind me :)

Cheers,
Theo.
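For what it's worth, here's a quick back-of-the-envelope check of Theo's numbers. It's a minimal sketch that uses only the figures quoted above and the purely illustrative assumption that a CPU-bound game would scale linearly with clock speed; the point is to show why flat results like these usually indicate a GPU or engine bottleneck rather than bad data.

    # Sanity check: if the game were CPU-bound, average FPS should scale roughly
    # linearly with core clock. The FPS figures are the FX 9590 numbers quoted
    # above; the linear-scaling assumption is purely illustrative.
    observed = {4.7: 94, 4.0: 92, 2.5: 90}  # GHz -> measured average FPS
    base_ghz, base_fps = 4.7, 94

    for ghz, fps in observed.items():
        expected = base_fps * ghz / base_ghz  # what a purely CPU-bound title would do
        print(f"{ghz} GHz: measured {fps} fps, ~{expected:.0f} fps expected if CPU-bound")

At 2.5 GHz a CPU-bound title would fall to roughly 50 fps; measuring 90 fps means the CPU stops being the limiting factor well before that point, so a 2-3 fps gap between an FX 8350 and a downclocked FX 9590 sits comfortably within run-to-run variance.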
 
"Moreover, had CD Projekt Red left HairWorks out of the game, there's a good chance we would have ended up with a game that looks more like the one we saw in 2013's trailer, minus the lifelike hair of course." Really? We're blaming Nvidia for graphics downgrades too now? FFS!
 
Was HairWorks enabled for the benchmarks on the first few pages?

From the first page...

"The medium setting disables Nvidia's HairWorks, high only applies it to Geralt and Ultra uses HairWorks on everything with hair."
Lmao, why don't you try the all-ultra benchmark with HairWorks OFF? Oh, that's right, there goes your Nvidia cash, because now you get 60 fps on an AMD 290X and a GTX 770/780. Fking misleading piece of trash.
 
Lmao, why don't you try the all-ultra benchmark with HairWorks OFF? Oh, that's right, there goes your Nvidia cash, because now you get 60 fps on an AMD 290X and a GTX 770/780. Fking misleading piece of trash.

You got me. Nvidia paid us to say GameWorks sucked and that the game would have been better with TressFX. You are putting it all together Yuzez, life must be easy for you ;)
 
While I can understand the need for an attractive headline, make no mistake: The Witcher 3 is not the new Crysis.

Crysis is an 8-year-old game that puts modern games to shame even today in many respects. In contrast, TW3 looks really nice today, but I highly doubt it is going to be as future-proof as Crysis. Crysis is both a GPU and, more importantly, a CPU hog. It's really hard to hold 1080p 60 FPS all the time, even with high-end Intel CPUs.

The editor added that comment and I think you might be reading a little too much into it. I took it as more of a crack at how badly the game brings high-end hardware to its knees, which it obviously does.

As an FX-6300 user I am really happy with how optimized TW3 is: this should be the standard for modern open-world games. Rockstar, Ubisoft, Bethesda and Rocksteady should take note.

Well optimized for FX processors, or does it simply not require much CPU processing power? If the game were a heavy CPU user, you wouldn't find the FX processors near the top of the graph no matter how well the game was optimized.

The game simply doesn't use much CPU processing power; the FX-8350 only saw 40% load, while the Core i7 processor we tested with saw less than 10% load on average.

Have you seen that VRAM usage?? Granted, the textures are not top-notch, but neither are those of Watch Dogs, Shadow of Mordor, Project Cars and all the post-2013 games that started demanding more VRAM out of nowhere.

What I am trying to convey is that the game is very well optimized and runs really smoothly. Patches and mods are only going to make it more attractive.

Wait for a high-resolution texture pack, then you might see the game require more than 2GB of VRAM at 1080p.

So to run this at Ultra 1440p @ 60fps would require about 3 780s?

Don't risk it mate, bring a fourth :)

You should be banned from public exposure for posts like this.

This is a game that actually uses the CPU resources it has at its disposal PROPERLY. And CPU performance IS SCALED nicely.

If what you say were true, the G3258 would perform the SAME as an i7/FX-83xx, but even an i3 chokes compared to a non-OC FX 6300.

Unfortunately, people like you just can't comprehend some things.
 
Ever think the reason AMD is pro open source is because what they've come up with isn't that great and would go nowhere if it was exclusive to AMD? HairWorks vs TressFX? That's like a gorilla fighting a penguin. Mantle? Inconsistent as [expletive]. MS and Khronos will take low-level APIs further than AMD ever could on their own.
That's a great anti-AMD rant, Mr. Troll. I just hope that you'll be really happy if AMD goes bankrupt and leaves the GPU market, Nvidia is left as the only player in that market, and you get slapped in the face by absurd prices because there's no competition, so you don't have any other option besides bending over. Isn't that what you guys want, to win the fabled GPU wars and acquire the right to pay absurd monopoly prices?
Also, I really, really do not understand how anyone in their right mind could ever dream of considering Nvidia's atrocious anticompetitive measures "good experience" and "what the customers want". I can tell you with certainty that any Nvidia user who isn't a petty loyalist does NOT want AMD GPUs to be deliberately crippled just to say "lol, I have a better card than you". Also, any Nvidia user with half a brain understands that Nvidia taking control from the developers, adding their own closed-source code and optimizations to the game that the developers themselves can't touch, and requiring them not to allow any other company a chance to optimize it for their hardware, is not only an incredibly bad move but also a slippery slope. What's next, will you be happy if we start seeing big releases that are contractually obligated to be exclusive to Nvidia GPUs? Would you be happy if AMD did the same and you got locked out of a game release?
If your argument is that Nvidia's anti-competitive practices are good and it's what the customers want, I don't know what planet or plane of oblivion you live in. AMD users aren't complaining that Nvidia tries to offer a better experience for its customers; they are complaining that Nvidia tries to stop other companies from offering a good experience to theirs. And if you think AMD should start doing the same to Nvidia users, so that everyone gets screwed in half of their games, you're out of your mind.

To be honest, I would happily pay double monopoly prices so I wouldn't have to listen to AMD fanboys go on and on about Mantle and FreeSync, etc.

But I doubt they would charge more anyway, as Intel has had a monopoly for years and their prices have stayed the same.

Intel was charged, found guilty, and paid a nice fine because of their crappy business practices.

Nvidia has accumulated so much crap over the years, and even now AMD GPUs perform better for the same price unless Nvidia plays dirty.
 
You should be banned from public exposure for posts like this.

This is a game that actually uses the CPU resources it has at its disposal PROPERLY. And CPU performance IS SCALED nicely.

If what you say were true, the G3258 would perform the SAME as an i7/FX-83xx, but even an i3 chokes compared to a non-OC FX 6300.

Unfortunately, people like you just can't comprehend some things.

Honestly I can't even work out who you are talking to in that mess. I assume it's me.

You say the Core i3 chokes when compared to a non-overclocked FX-6300, and yet the Core i3 was faster when comparing minimum and average frame rates. Moreover, with a GeForce GTX 980 on hand, the Core i3 belted out 84fps, which wasn't a great deal slower than the uber-expensive processors.

What CPU resources does it use? It clearly doesn't use more than four cores, and it really doesn't matter if you have a 2nd-gen Core i5 or a 6th-gen Core i5...

I guess the game does scale well since a Core i5 processor will deliver the same performance at 2.5GHz and 4.5GHz using a high-end GPU at a low-ish resolution.

Anyway, I am not sure who it is that "just can't comprehend some things", so maybe you are right :)

Intel was charged, found guilty, and paid a nice fine because of their crappy business practices.

Nvidia has accumulated so much crap over the years, and even now AMD GPUs perform better for the same price unless Nvidia plays dirty.

That explains why they are broke today then ;)

http://www.cnbc.com/2015/07/15/intel-q2-earnings-2015.html
 