In response to Evernessinces now deleted comment:
"'So Nvidia cards benefit from DX12 more than AMD cards do now? I guess that makes sense, DX12 is mainstream now and Nvidia have more incentive in making it work for their customers.'
Two problems with this assumption.
1. AMD optimized this title, not Nvidia.
2. One game does not make a trend, nor does it wipe away Nvidia's performance in other DX12 titles.
If anything, what you could take from this is that maybe Nvidia should be asking AMD to optimize more games for them, as AMD seem to get a heck of a lot more performance out of DX12 on Nvidia cards than Nvidia has ever gotten in the past themselves."
This isn't the first example; the 1080 Ti appears to gain quite a lot from DX12, more so than Vega, and it seems to be becoming more and more commonplace. It makes sense to me: now that DX12 is mainstream, Nvidia's driver teams are optimising for it more. If you notice, it's usually the more popular titles that perform better on Nvidia cards. I don't think this is a coincidence; I think Nvidia target the software most users run and optimise for that. Sounds obvious really, but now that DX12 is mainstream I expect to see Nvidia cards benefiting more from it than AMD cards, because I really do think Nvidia wins most of its battles with its drivers, or rather AMD loses most of its battles through its drivers. I'm not saying Nvidia drivers are more stable than AMD drivers, but they seem to run games with better frame rates, and they come out more frequently and promptly.
I think it's a massive crying shame what's happened to Radeon over the last few years. I've traditionally preferred Radeon parts to GeForce, as I usually seem to end up opting to buy Radeon. But now there isn't a hope in hell I'm going Radeon once the new Nvidia parts are out, and I really need a new card. AMD are doing to the GPU market what they did to the CPU market a few years back - giving up and allowing the opposing company to charge what they like, and it's us gamers who end up paying, quite literally. I personally hope Intel come in and can be competitive a bit more consistently than AMD have been.
The standout was the 4GB Fury X at 4K. Very impressive for such an old card in a new title.
Also, the 7970 GHz demolishes the GTX 680 by about 50%, and that was the 4GB model. The GTX 780 does MUCH better, so it looks like bandwidth is the issue here and not buffer size.
Did I miss something? I can't find the 680 or the 7970, or even the 280X or the 770 (the same cards), on these graphs. And I looked, as I'm running CrossFire 280Xs!