The Witcher 3: Wild Hunt Benchmarked, Performance Review

Steve

Staff member


It was around this time four years ago that The Witcher 2: Assassins of Kings impressed critics with everything from its demanding PC graphics and combat mechanics to its rich storytelling and environments. Selling nearly two million copies in its first year, the game was a great success for CD Projekt Red (CDPR), so it came as no surprise when a follow-up was announced.

After much anticipation and a few delays, The Witcher 3: Wild Hunt has launched this week to a similar degree of critical acclaim, though there is some controversy surrounding the game's reveal vs. launch graphics.

In 2013, CDPR teased a world that was bound to bring PC gaming graphics to a new level. That changed with the arrival of the Xbox One and PlayStation 4. To optimize Wild Hunt for consoles, it seems CDPR has downgraded some of the visuals that were previously expected to be in the PC build.

Read the complete review.

 
Was HairWorks enabled for the benchmarks on the first few pages?

From the first page...

"The medium setting disables Nvidia's HairWorks, high only applies it to Geralt and Ultra uses HairWorks on everything with hair."
 
Since AMD's tessellation override was found to work miracles in The Witcher 3, could you guys do a side note on that? I'd like to see proper scores for more AMD GPUs.

I mean, using an x16 or even x8 tessellation factor with Water on high/ultra and HairWorks on maintains 100% visual quality while running the game 15-20 fps faster, something Nvidia GPUs can't claim.
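Rough numbers on why capping the factor helps so much, assuming the commonly reported HairWorks default tessellation factor of x64 and hair geometry that scales roughly linearly with the factor (both assumptions on my part, not figures from the review):

\[
\frac{\text{hair geometry at the x64 default}}{\text{hair geometry with the x8 override}} \approx \frac{64}{8} = 8
\]

So the override asks the GPU to tessellate roughly an eighth of the hair geometry, for a visually near-identical result.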
 
The AMD 280X is on a par with the GTX 780 when lagworks is disabled lol.

280X is *faster* than the 780 when the tessellation override is used. Nvidia really screwed over Fermi and Kepler users compared to Maxwell.
 
Imo you should have done the entire benchmark without HairWorks and then done a small comparison on how HairWorks ruins performance for pretty much every GPU, instead of the other way around.

I'd really like to know if my 280X can handle this game on High settings without HairWorks.
 
WHERE IS THE GTX 750 TI???
Every time you post a benchmark, the 750 Ti is on the list but it's not in your charts!
It seems you copy-paste the first page of the article every time.

The GTX 750 Ti cannot play the game on any of the settings we used, so it missed the cutoff. Yes, it was a copy-and-paste error; believe it or not, I don't write those system specs out each time. Sorry for the error, it is fixed now.

Plus I just learnt what CTRL+C and CTRL+V do... SO LET ME USE IT!!!
 
"Do note, Maxwell GPUs are up to three times faster at tessellation than previous-generation GPUs, so you may experience lower HairWorks performance on other graphics cards due to the extensive use of tessellated hairs."

http://www.geforce.com/whats-new/gu...-hunt-graphics-performance-and-tweaking-guide
Ain't that just dandy. Let's make extensive use of tessellation on terrain and water on anything higher than the lowest settings, so the old tech will burn. I just hope the visuals are at least partly worth it.
 
Imo you should have done the entire benchmark without HairWorks and then done a small comparison on how HairWorks ruins performance for pretty much every GPU, instead of the other way around.

I'd really like to know if my 280X can handle this game on High settings without HairWorks.

Shadows on Low (they look 95% the same as on Ultra), Foliage on High, everything else on/maxed, 1080p, 280X, a constant 40+ fps everywhere.

Open Catalyst Control Center, enable the Tessellation Settings Override and set the tessellation level to x8.

You'll get a massive performance boost for zero IQ loss. You can also play with HairWorks set to All with no problem.
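As a rough frame-time sanity check on the figures quoted in this thread (a 40 fps baseline and a ~15 fps gain, both the thread's numbers rather than my own measurements):

\[
\frac{1000\ \text{ms}}{40\ \text{fps}} = 25\ \text{ms/frame}, \qquad \frac{1000\ \text{ms}}{55\ \text{fps}} \approx 18.2\ \text{ms/frame}
\]

So the x8 override only has to shave roughly 7 ms of tessellation work off each frame, which lines up with the geometry reduction estimated earlier.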
 
While I can understand the need for an attractive headline, make no mistake: The Witcher 3 is not the new Crysis.

Crysis is an 8-year-old game that puts modern games to shame even today in many aspects. In contrast, TW3 looks really nice today, but I highly doubt it is going to be as future-proof as Crysis. Crysis is both a GPU and, more importantly, a CPU hog. It's really hard to get 1080p 60 FPS all the time, even with high-end Intel CPUs.

As an FX-6300 user I am really happy with how optimized TW3 is: this should be the standard for modern open-world games. Rockstar, Ubisoft, Bethesda and Rocksteady should take note. Regarding IQ, Crysis scaled massively when changing the presets: Low looks worse than the OG Far Cry and Ultra is stunning. In contrast, you only find marginal IQ gains when going from Low to Ultra in TW3, so maxing out the game does not justify the GPU power it demands.

Have you seen that VRAM usage?? Granted, the textures are not top notch, but neither are those of Watch Dogs, Shadow of Mordor, Project Cars and all the post-2013 games that started demanding more VRAM out of nowhere.

What I am trying to convey is that the game is very well optimized and runs really smoothly. Patches and mods are only going to make it more attractive.
 
While I can understand the need for an attractive headline, make no mistake: The Witcher 3 is not the new Crysis.

Crysis is an 8-year-old game that puts modern games to shame even today in many aspects. In contrast, TW3 looks really nice today, but I highly doubt it is going to be as future-proof as Crysis. Crysis is both a GPU and, more importantly, a CPU hog. It's really hard to get 1080p 60 FPS all the time, even with high-end Intel CPUs.

The editor added that comment and I think you might be reading a little too much into it. I took it as more of a crack at how badly the game brings high-end hardware to its knees, which it obviously does.

As an FX-6300 user I am really happy with how optimized TW3 is: this should be the standard for modern open-world games. Rockstar, Ubisoft, Bethesda and Rocksteady should take note.

Well optimized for FX processors, or simply doesn't require much CPU processing power? If the game were a heavy CPU user you wouldn't find the FX processors near the top of the graph, no matter how well the game was optimized.

The game simply doesn't use much CPU processing power; the FX-8350 only saw 40% load, while the Core i7 processor that we tested with saw less than 10% load on average.
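For rough perspective, reading those averages as whole-CPU utilization across eight hardware threads on each chip (an assumption on my part, since per-core figures weren't given):

\[
0.40 \times 8 = 3.2\ \text{busy threads (FX-8350)}, \qquad 0.10 \times 8 = 0.8\ \text{busy threads (Core i7)}
\]

In other words, the i7 spends most of its time idle here, which is why cheaper CPUs land so close to it on the graph.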

Have you seen that VRAM usage?? Granted, the textures are not top notch, but neither are those of Watch Dogs, Shadow of Mordor, Project Cars and all the post-2013 games that started demanding more VRAM out of nowhere.

What I am trying to convey is that the game is very well optimized and runs really smoothly. Patches and mods are only going to make it more attractive.

Wait for a high-resolution texture pack, then you might see the game require more than 2GB of VRAM at 1080p.

So running this at Ultra 1440p @ 60 fps would require about three 780s?

Don't risk it mate, bring a fourth :)
 
Another open-world title ported to PC from consoles a month ago used the CPU extensively but hardly stressed top-tier GPUs. It's interesting to see a different approach to sandbox design on the same hardware, although the speed at which you travel through the map probably plays a major role in the coding here. Anyway, both new AAA PC titles force you to upgrade your rig to meet current and future HQ settings, yet allow those less fortunate to play on older machines without sacrificing next-gen console quality.
 
Great review @Steve, glad to see how The Witcher 3 stacks up. I found the HairWorks part the most interesting; I have not gotten the game yet but was curious how this would turn out. I cannot believe the performance difference with it turned off and on, and I am curious how much of a difference it will make to me visually.

I'll have to see when I finally pick it up.
 
God, I can't believe how much Nvidia screwed up with this whole HairWorks thing. Instead of providing a competitive option, they ended up putting a spoke in their own wheel. Serves them right - just when I had the idea of maybe finally opting to purchase an Nvidia card, this whole fuss made me not go that way and instead wait for the new R9 3** series cards. Shame on you, Nvidia. Shame.
 
I was wondering where the Titan X was; I was very intrigued to see how high in the graphs it would be :)
 
That is one gory scene/screen capture: mutilation.
(On this note, as an old man, I 'demand' that HBO's Game of Thrones show more skin, preferably from Lena Headey.)
 
I play on a 2560x1440 monitor with a 780 on High with just HairWorks turned off, and I average between 35 and 40 fps. Very impressed with that, but this game is showing that I might need to upgrade my GPU sometime next year.

Also, it looks like I won't need to upgrade my 3770K for at least 3 to 4 more years. It's still up there with the best of them!

Note: Please edit rather than double post.
 
Ugh. I thought I would be fine with my 760s for a few years, but it looks like they're becoming has-beens pretty quickly. Oh well, I still have my PS4 and Xbox One. :-(
 