Strange Brigade Benchmarked: 30+ GPUs tested

So Nvidia cards benefit from DX12 more than AMD cards do now? I guess that makes sense: DX12 is mainstream now, and Nvidia have more incentive to make it work for their customers.
 
So Nvidia cards benefit from DX12 more than AMD cards do now? I guess that makes sense: DX12 is mainstream now, and Nvidia have more incentive to make it work for their customers.
Pascal did receive a few good extra DX12 features compared to Maxwell, but I expect that a proper async compute implementation will only come with the new GPUs they are releasing now. It also depends a lot on the game and what type of optimisations they've done.
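To illustrate what a "proper" implementation means at the API level, here's a minimal D3D12 sketch (my own illustration, not anything from the article; error handling omitted): the application submits work to separate direct and compute queues, but whether the GPU actually overlaps the two workloads is entirely down to the hardware and driver.

```cpp
// Minimal sketch: async compute from the D3D12 API side.
// Build on Windows, link d3d12.lib; error handling omitted for brevity.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Default adapter; feature level 11_0 is the D3D12 minimum.
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // Graphics ("direct") queue: rendering work goes here.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Separate compute queue: the API allows this work to run alongside
    // graphics, but only hardware with real async compute executes the
    // two concurrently; otherwise the driver effectively serializes them.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
    return 0;
}
```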

On a side-note, I really like that even lower end cards can play this game. Very few games these days launch with such good optimisations.
 
"In fact, this is becoming a common trend, where every game AMD has a hand in developing turns out being extremely well optimized for all."

Good guy AMD. Glad to see the red team playing fair. Classy.
 
"In fact, this is becoming a common trend, where every game AMD has a hand in developing turns out being extremely well optimized for all."

Good guy AMD. Glad to see the red team playing fair. Classy.
Are we all just pretending to forget that less than 12 months ago AMD launched Vega with a rebate that was terminated moments after the review embargo lifted on day one, purposefully deceiving reviewers, and by extension consumers, with launch reviews stating the cards were better value than they actually were? Or the entire Vega Frontier Edition debacle that turned out to be early access to Vega tech at extravagant prices?

AMD are no better to consumers than Nvidia. Both companies only care about your money.
 
Both AMD and Nvidia did great with such a nice looking game.

The standouts were the 4GB Fury X at 4K. Very impressive for such an old card in a new title.
Also, the 7970 GHz Edition demolishes the GTX 680 by about 50%, and that was the 4GB model. The GTX 780 does MUCH better, so it looks like bandwidth is the issue here and not buffer size.


We see further evidence of this at the low end, with the 2GB GTX 1050 matching or beating the 3GB GTX 1050 despite having fewer shaders. We also see it at the high end, where the GTX 1080 is faster than usual compared to the GTX 1070 Ti. The same goes for AMD, where the RX 580 at higher resolutions begins to fall on its face compared to its older R9 390X and Fury X cousins.
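For what it's worth, the spec-sheet math backs this up. A quick sanity check of peak memory bandwidth (reference bus widths and effective memory clocks assumed; factory-OC boards will differ):

```cpp
#include <cstdio>

// Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps
double gbps(int busBits, double dataRate) {
    return busBits / 8.0 * dataRate;
}

int main() {
    printf("HD 7970 GHz:  %.0f GB/s\n", gbps(384, 6.0)); // 288: 50% more than the GTX 680
    printf("GTX 680:      %.0f GB/s\n", gbps(256, 6.0)); // 192
    printf("GTX 780:      %.0f GB/s\n", gbps(384, 6.0)); // 288: back to parity with the 7970
    printf("GTX 1050 2GB: %.0f GB/s\n", gbps(128, 7.0)); // 112
    printf("GTX 1050 3GB: %.0f GB/s\n", gbps(96, 7.0));  // 84: the 3GB card has the narrower bus
    return 0;
}
```

The buffer sizes go one way and the bandwidth numbers go the other, which is consistent with these results being bandwidth-bound.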
 
So Nvidia cards benefit from DX12 more than AMD cards do now? I guess that makes sense: DX12 is mainstream now, and Nvidia have more incentive to make it work for their customers.

Two problems with this assumption.

1. AMD optimized this title, not Nvidia.

2. One game does not make a trend, nor does it wipe away Nvidia's performance in other DX12 titles.

If anything you could take from this, it's that maybe Nvidia should be asking AMD to optimize more games for them, as they seem to get a heck of a lot more performance in DX12 on Nvidia cards than Nvidia has ever gotten in the past themselves.
 
If anything you could take from this, it's that maybe Nvidia should be asking AMD to optimize more games for them, as they seem to get a heck of a lot more performance in DX12 on Nvidia cards than Nvidia has ever gotten in the past themselves.
I laugh, but it's true :D
 
AMD are no better to consumers than Nvidia. Both companies only care about your money.

I'll be surprised if AMD are still making GPUs this time next year.
They are getting their ***s kicked all over the place; nobody cared about async compute (except fanboys using it as an argument), and every card runs DX12 well enough now. Pile on year after year of hot-running cards with no OC headroom unless you're on water, and you get the market share results you have today.

On the other side of the coin, people are excited about ray tracing... Nvidia has the entire GPU market on lockdown. While it will be 2-3 years before ray tracing is actually needed for PC gaming (the list of games is very small now, and the technology is still in its infancy), this is a technology people are truly looking forward to, not just biased fanboys making claims for Nvidia.
As for this game, the new Asura engine by Rebellion (first seen in Sniper Elite 4) is well optimized, and that shows in the results with both older and newer GPUs. It looks a lot like the Unreal 4 engine; I wonder if some of the code has been replicated in their engine?
 
In response to Evernessince's now-deleted comment:

"'So Nvidia cards benefit from DX12 more than AMD cards do now? I guess that makes sense: DX12 is mainstream now, and Nvidia have more incentive to make it work for their customers.'

Two problems with this assumption.

1. AMD optimized this title, not Nvidia.

2. One game does not make a trend, nor does it wipe away Nvidia's performance in other DX12 titles.

If anything you could take from this, it's that maybe Nvidia should be asking AMD to optimize more games for them, as they seem to get a heck of a lot more performance in DX12 on Nvidia cards than Nvidia has ever gotten in the past themselves."

This isn't the first example; the 1080 Ti appears to gain quite a lot from DX12, more so than Vega, and it seems to be becoming more and more commonplace. It makes sense to me: now that DX12 is mainstream, Nvidia's driver teams are optimising for it more. If you notice, it's usually the more popular titles that perform better on Nvidia cards. I don't think this is a coincidence; I think Nvidia targets the software most users run and optimises for that. Sounds obvious really, but now that DX12 is mainstream I expect to see Nvidia cards benefiting more from it than AMD cards, because I really do think Nvidia wins most of its battles with its drivers, or rather AMD loses most of its battles with its drivers. I'm not saying Nvidia drivers are more stable than AMD drivers, but they seem to run games with better frame rates, and they do come out more frequently and promptly.

I think it's a massive crying shame what's happened to Radeon over the last few years. I've traditionally preferred Radeon parts to GeForce, as I usually end up opting to buy Radeon. But now there isn't a hope in hell I'm going Radeon once the new Nvidia parts are out, and I really need a new card. AMD are doing to the GPU market what they did to the CPU market a few years back - giving up and allowing the opposing company to charge what they like, and it's us gamers who end up paying, quite literally. I personally hope Intel come in and can be competitive a bit more consistently than AMD have been.

The standouts were the 4GB Fury X at 4K. Very impressive for such an old card in a new title.
Also, the 7970 GHz Edition demolishes the GTX 680 by about 50%, and that was the 4GB model. The GTX 780 does MUCH better, so it looks like bandwidth is the issue here and not buffer size.

Did I miss something? I can't find the 680 or the 7970, or even the 280X or the 770 (same cards), on these graphs. And I looked, as I'm running CrossFire 280Xs!
 
My comment isn't deleted... See:

Two problems with this assumption.

1. AMD optimized this title, not Nvidia.

2. One game does not make a trend, nor does it wipe away Nvidia's performance in other DX12 titles.

If anything you could take from this, it's that maybe Nvidia should be asking AMD to optimize more games for them, as they seem to get a heck of a lot more performance in DX12 on Nvidia cards than Nvidia has ever gotten in the past themselves.


This isn't the first example; the 1080 Ti appears to gain quite a lot from DX12, more so than Vega, and it seems to be becoming more and more commonplace. It makes sense to me: now that DX12 is mainstream, Nvidia's driver teams are optimising for it more. If you notice, it's usually the more popular titles that perform better on Nvidia cards. I don't think this is a coincidence; I think Nvidia targets the software most users run and optimises for that. Sounds obvious really, but now that DX12 is mainstream I expect to see Nvidia cards benefiting more from it than AMD cards, because I really do think Nvidia wins most of its battles with its drivers, or rather AMD loses most of its battles with its drivers. I'm not saying Nvidia drivers are more stable than AMD drivers, but they seem to run games with better frame rates, and they do come out more frequently and promptly.

You are going to have to provide specific examples of that, because the last DX12/Vulkan game I saw was Wolfenstein, and Nvidia got their butts kicked in that one. It's no secret that Nvidia typically loses performance in DX12, hence the writer's comments on AMD optimizing well across the board.

I think it's a massive crying shame what's happened to Radeon over the last few years. I've traditionally preferred Radeon parts to GeForce, as I usually end up opting to buy Radeon. But now there isn't a hope in hell I'm going Radeon once the new Nvidia parts are out, and I really need a new card. AMD are doing to the GPU market what they did to the CPU market a few years back - giving up and allowing the opposing company to charge what they like, and it's us gamers who end up paying, quite literally. I personally hope Intel come in and can be competitive a bit more consistently than AMD have been.

7nm Vega ought to be a decent card; jumping from 14nm to 7nm should be a good bump all around. Just don't expect a 2080 Ti competitor. Otherwise there's nothing we can do competition-wise. AMD needed more money 2-3 years ago, and that shows in their current GPU products.
 
The 1080 Ti appears to gain quite a lot from DX12, more so than Vega, and it seems to be becoming more and more commonplace. It makes sense to me: now that DX12 is mainstream, Nvidia's driver teams are optimising for it more. If you notice, it's usually the more popular titles that perform better on Nvidia cards. I don't think this is a coincidence; I think Nvidia targets the software most users run and optimises for that.

If AMD's DX12 performance was so much better, why has no one bought them? If people cared about async compute, why has no one bought them?
Nvidia didn't even care about AMD's cards running some games in DX12 a little better than theirs; many of those games sucked anyway. It was just a little something for AMD fanboys to covet when DX12 released and was still in its infancy... even now it's still growing. What's more sad is that Nvidia GPUs weren't even optimized nearly as much for DX12 in some games, and they still won, matched, or competed with AMD's best effort.

I like AMD, and I respect the AMD fanboys using every crumb of data to convince themselves their projected reality is what everyone else thinks; unfortunately, you can tell how people think by how they use their wallets.
 
If AMD's DX12 performance was so much better, why has no one bought them? If people cared about async compute, why has no one bought them?
Nvidia didn't even care about AMD's cards running some games in DX12 a little better than theirs; many of those games sucked anyway. It was just a little something for AMD fanboys to covet when DX12 released and was still in its infancy... even now it's still growing. What's more sad is that Nvidia GPUs weren't even optimized nearly as much for DX12 in some games, and they still won, matched, or competed with AMD's best effort.

I like AMD, and I respect the AMD fanboys using every crumb of data to convince themselves their projected reality is what everyone else thinks; unfortunately, you can tell how people think by how they use their wallets.

Yeah, DX12 alone won't fix the issues AMD has with its cards. Nvidia is slowly adding more DX12 and async compute features to its hardware, as it should be. I do wonder with the 20xx series whether Nvidia will be leaning more towards DX12; I'm not 100% sure if their RT libraries are written in DX11 or DX12. At this point Turing should be a very competent DX12 card anyway, so either way it shouldn't matter too much.
 
It would be foolish to write off AMD in the video card space just yet. After all, Ryzen showed that they can design and deliver a competitive product even after *6* years of not being competitive at the top end. They've been slowly regaining CPU market share and profitability, and it may be premature to think that they can't do it again with refinement plus 7nm Vega.

However, the 20x0 cards will be out for a few months or more, sating that market, before a new Vega arrives, so it had damn well better be at least as good as the 2080. They can make do without a 2080 Ti competitor, and AMD can still make money and market share out of competition at the 2080 level and below, but if it's only 2070 and below, then the mindshare will be lost, perhaps for the final time. All the customers willing to pay top $€£ will go somewhere else. The tech press and YouTube gamers will all have Nvidia, and being relegated to graphics pros and the bargain bin is not what AMD's GPU team is working towards.
 
The standouts were the 4GB Fury X at 4K. Very impressive for such an old card in a new title.
Also, the 7970 GHz Edition demolishes the GTX 680 by about 50%, and that was the 4GB model. The GTX 780 does MUCH better, so it looks like bandwidth is the issue here and not buffer size.

Did I miss something? I can't find the 680 or the 7970, or even the 280X or the 770 (same cards), on these graphs. And I looked, as I'm running CrossFire 280Xs!


Go to the Hardware Unboxed video. It contains more detail.
 
I see a gap between the GTX 970 and the GTX 1060 that didn't exist at launch; this is what makes me worry about buying "old tech" vs the latest. I wonder if an RTX 2070 will be better than a GTX 1080 Ti in the long run, even if the latter outperforms the former today. Only benchmarks and time will tell.
 
Something's wrong with your testing, Steve. You show the GTX 1060 3GB keeping pace with 4GB+ cards even at 4K. Impossible. Countless experts in numerous comment threads declared 3GB of VRAM dead on arrival and literally unplayable in 2016. Two years later you photoshop it into a graph and expect us not to notice? Nice try.

/s
 
Something's wrong with your testing, Steve. You show the GTX 1060 3GB keeping pace with 4GB+ cards even at 4K. Impossible. Countless experts in numerous comment threads declared 3GB of VRAM dead on arrival and literally unplayable in 2016. Two years later you photoshop it into a graph and expect us not to notice? Nice try.

/s

Yep, similar to people saying the Fury X's 4GB wasn't enough back in 2015. Three years later, the only places it comes up short are Wolfenstein 2 on Uber and RE7 with crazy shadows.

I have a feeling that Vulkan requires more frame buffer than DX12. Since this game does better in DX12, there really isn't much point. However, Vulkan doesn't seem to need as much bandwidth; in Wolf 2, the RX 580 beats the 390X and the 1070 Ti nearly matches the 1080.

For this game, it may be best for APU users and GTX 1050 3GB users to use Vulkan instead, as they are more bandwidth limited.
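The reference spec numbers are consistent with that reading, for what it's worth. Using the same peak-bandwidth formula as the sketch earlier in the thread (reference clocks assumed):

```cpp
#include <cstdio>

// Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps
double gbps(int busBits, double dataRate) { return busBits / 8.0 * dataRate; }

int main() {
    // If Wolf 2 under Vulkan were bandwidth-bound, the 390X should beat the
    // RX 580, and the 1080 should pull well clear of the 1070 Ti; in the
    // Wolf 2 results, they don't.
    printf("R9 390X:     %.0f GB/s\n", gbps(512, 6.0));  // 384
    printf("RX 580:      %.0f GB/s\n", gbps(256, 8.0));  // 256
    printf("GTX 1080:    %.0f GB/s\n", gbps(256, 10.0)); // 320 (GDDR5X)
    printf("GTX 1070 Ti: %.0f GB/s\n", gbps(256, 8.0));  // 256
    return 0;
}
```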
 
Really enjoying this game! Rebellion has done a great job with their engine and this looks and performs better than SE4 and NZT.
 
AMD caught Intel off guard, but I don't think that will work on "Tight T-shirt", aka the Nvidia CEO, who is known for wearing clothes too small and thinking he's ripped lol.
 