Your list of GPUs used includes the Titan X, but no results for it appear in the review. Oversight?
Excellent overall article Steve.
That's a great anti-AMD rant, Mr. Troll. I just hope you'll be really happy if AMD goes bankrupt and leaves the GPU market, Nvidia is left as the only player in that market, and you get slapped in the face by absurd prices, because with no competition you'll have no option besides bending over. Isn't that what you guys want, to win the fabled GPU wars and acquire the right to pay absurd monopoly prices?

"Ever think the reason AMD is pro-open-source is because what they've come up with isn't that great and would go nowhere if it were exclusive to AMD? HairWorks vs. TressFX? That's like a gorilla fighting a penguin. Mantle? Inconsistent as [expletive]. MS and Khronos will take low-level APIs further than AMD ever could on their own."
Also, I really, really do not understand how anyone in their right mind could ever dream of calling Nvidia's atrocious anticompetitive measures a "good experience" and "what the customers want". I can tell you with certainty that any Nvidia user who isn't a petty loyalist does NOT want AMD GPUs to be deliberately crippled just so they can say "lol, I have a better card than you". Any Nvidia user with half a brain also understands that Nvidia taking control from the developers, adding closed-source code and optimizations that the developers themselves can't touch, and requiring them not to let any other company optimize the game for its hardware, is not only an incredibly bad move but also a slippery slope. What's next, will you be happy if we start seeing big releases that are contractually obligated to be exclusive to Nvidia GPUs? Would you be happy if AMD did the same and you got locked out of a game release?
If your argument is that Nvidia's anti-competitive practices are good and are what the customers want, I don't know what planet or plane of oblivion you live on. AMD users aren't complaining that Nvidia tries to offer a better experience to its customers; they are complaining that Nvidia tries to stop other companies from offering a good experience to theirs. And if you think AMD should start doing the same to Nvidia users, so that everyone gets screwed in half of their games, you're out of your mind.
The AMD 280X is on a par with the GTX 780 when lagworks is disabled, lol.
280X is *faster* than the 780 when the tessellation override is used. Nvidia really screwed over Fermi and Kepler users compared to Maxwell.
This is not surprising.
Over the last year, the 280X/7970 GHz has slowly been erasing the lead the 780 had over it at launch.
And it's either because GCN is simply aging better than Kepler, or because Nvidia is no longer optimizing its drivers for Kepler. Choose whichever reason suits you best.
Basically it's just different rendering tech, each using what their GPUs excel at. TressFX, AFAIK, is more compute-focused, which is where GCN does better. HairWorks is heavily tessellation-based, which Maxwell and its ilk excel at and AMD handles poorly (hence, IMO, the GameWorks argument here doesn't hold merit; it's just business). Nvidia doesn't have GCN to include in its cards, and AMD can't do a thing about the tessellation performance of its cards, so them working together is never gonna happen.

"What exactly is the difference between TressFX and HairWorks?! I think Nvidia and AMD should make a unified version of this... but hey, that ain't gonna happen."
I don't know what Nvidia is shooting for anymore with its "playable settings" for a given card. At 1440p, max settings (no AA) with HairWorks is well under 50 frames average, let alone in the woods, on my SLI 970s. For GTA V I have a friend who is obsessed with getting the minimum frame rate to 60 on his 970; he decided to try GFE and I sure heard about it... many of the areas are very open and will degrade performance regardless of the settings.

I just finished downloading the game (got it free with the 970 I purchased recently). I haven't tried it yet, but apparently GFE thinks I can run it with most of the settings cranked at 1200p; I'm betting it will stutter like crazy. GFE also completely maxes out GTA V if I let it "optimize" the settings, and I get drops from 60 fps down to 30 in a lot of areas, which takes away from the experience. I just started using GFE when I got this card (wanted to see if it actually worked as it should), so I'm new to Nvidia's optimization profiles. I actually thought GFE would be modest with the settings for the newest games, but Nvidia apparently really does like to "crank that s#!t up", lol.
Seeing basically everything fail at 4K makes me so much more impressed with my Titan X SLI setup running at 1400/8000. I'm running the game 100% completely maxed out at 4K with ~50 fps average depending on what's going on, and minimum framerates around 40. It's beautiful.
Yikes... so you can't even get 60 fps with two Titan Xs in SLI? Once again, proof that 4K isn't really viable yet... I wonder if it gets to 60 fps with three of them... or would you need a fourth?!
"Lmao, why don't you try an all-ultra benchmark with HairWorks OFF? Oh that's right, there goes your Nvidia cash, because now you get 60 fps on an AMD 290X and a GTX 770/780. Fking misleading piece of trash."

Was HairWorks enabled for the benchmarks on the first few pages?
From the first page...
"The medium setting disables Nvidia's HairWorks, high only applies it to Geralt and Ultra uses HairWorks on everything with hair."
While I can understand the need for an attractive headline, make no mistake: The Witcher 3 is not the new Crysis.
Crysis is an 8-year-old game that still puts modern games to shame in many respects. In contrast, TW3 looks really nice today, but I highly doubt it's going to be as future-proof as Crysis. Crysis is both a GPU hog and, more importantly, a CPU hog. It's really hard to hold 1080p at 60 FPS all the time, even with high-end Intel CPUs.
The editor added that comment and I think you might be reading a little too much into it. I took it as more of a crack at how badly the game brings high-end hardware to its knees which it obviously does do.
As an FX-6300 user, I am really happy with how well optimized TW3 is: this should be the standard for modern open-world games. Rockstar, Ubisoft, Bethesda and Rocksteady should take note.
Well optimized for FX processors, or simply not demanding much CPU processing power? If the game were a heavy CPU user, you wouldn't find the FX processors near the top of the graph no matter how well it was optimized.
The game simply doesn’t use much CPU processing power, the FX-8350 only saw 40% load while the Core i7 processor that we tested with saw less than 10% load on average.
Have you seen that VRAM usage?? Granted, the textures are not top-notch, but neither are those in Watch Dogs, Shadow of Mordor, Project CARS, and all the other post-2013 games that started demanding more VRAM out of nowhere.
What I'm trying to convey is that the game is very well optimized and runs really smoothly. Patches and mods are only going to make it more attractive.
Wait for a high-resolution texture pack; then you might see the game require more than 2GB of VRAM at 1080p.
So to run this at Ultra 1440p @ 60fps would require about 3 780s?
Don't risk it mate, bring a fourth
To be honest, I would happily pay double monopoly prices so I wouldn't have to listen to AMD fanboys go on and on about Mantle, FreeSync, etc. But I doubt they would charge more anyway; Intel has had a monopoly for years and their prices have stayed the same.
You should be banned from public exposure for posts like this.
This is a game that actually uses the CPU resources at its disposal PROPERLY, and CPU performance IS SCALED nicely. If what you say were true, the G3258 would perform the SAME as an i7/FX-83xx, but even an i3 chokes compared to a non-OC FX-6300.
Unfortunately, people like you just can't comprehend some things.
Intel was charged and found guilty and paid a nice fine because of their crappy business practices.
Nvidia has accumulated so much crap over the years, and even now AMD GPUs perform better for the same price unless Nvidia plays dirty.
I'm guessing you haven't checked out the latest AMD Fury reviews, then...

"AMD GPUs perform better for the same price unless Nvidia plays dirty."
Key word is "was", as in long-forgotten history.

"Intel was charged and found guilty and paid a nice fine because of their crappy business practices."
What exactly is your job title at AMD?

"Nvidia has accumulated so much crap over the years, and even now AMD GPUs perform better for the same price unless Nvidia plays dirty."