AMD Radeon RX 480 Review: Performance for the masses

Sorry but where are you getting that from? Tweaktown (via extremetech) reports that Rise is a "major Nvidia win."

That's the issue with speaking in absolutes - it takes one example to disprove your point.

It depends. DX12 helps on some very heavy situations on that game. Something you don't see on benchmarks.

And how do you know that? Because some AMD PR flack says so?
The vast majority of reviews use power draw at the wall for their power consumption figures.
EVERY site that measures power draw on a per-rail basis is reporting that their RX 480 samples are in violation of the PCI-E slot specification. The only reason it hasn't been more widely reported is that most sites don't/can't invest in the equipment for detailed testing.

As you can see, even the GTX 960 takes more than 75W from its 6-pin power connector, so Nvidia is no good either http://www.pcper.com/reviews/Graphi...s-Radeon-RX-480/Evaluating-ASUS-GTX-960-Strix

Bury your head in the sand and keep spouting the AMD company line, but consumer RX 480s are already frying motherboards.

Now THAT's a reliable source. Also, the card was overclocked. So move on.

https://www.reddit.com/r/nvidia/comments/3j5e9b/analysis_async_compute_is_it_true_nvidia_cant_do

Not sure how much of this is true but one thing is for sure, Nvidia will be just fine.
One way or another they will figure it out.

Nvidia still does not support async shaders in its drivers.
 
Your bias leads you to believe that's due to AMD's superiority. In fact that chart (which I cannot find referenced to anything other than fake reddit sites) is displaying AMD's poor driver optimization.

ETA: That chart refers to nothing more than a bunch of numbers. Give me 10 minutes and I can have another chart that shows the opposite; I just need to make it. How about the accompanying article, please?
Source for the chart, with the whole history of how the chart came to be... Links are from old to new;
http://www.neogaf.com/forum/showpost.php?p=165601331&postcount=183
http://www.neogaf.com/forum/showthread.php?t=1058295
http://www.neogaf.com/forum/showthread.php?t=1220928

Not that it matters. You will dismiss everything anyway. But those with a brain capable of critical thinking will understand.
 
As you can see, even the GTX 960 takes more than 75W from its 6-pin power connector, so Nvidia is no good either http://www.pcper.com/reviews/Graphi...s-Radeon-RX-480/Evaluating-ASUS-GTX-960-Strix
Hardly news. Asus's 960 Strix and various previous TOP-branded cards are known to be problematic. How does this excuse AMD's reference card?
Now THAT's a reliable source. Also, the card was overclocked. So move on.
And I'm sure you'll still ignore the evidence as it mounts. As for overclocked, the card was still using AMD software within AMD-specified power limits. You are a first-rate apologist.
 
Hardly news. Asus's 960 Strix and various previous TOP-branded cards are known to be problematic. How does this excuse AMD's reference card?

So custom cards are worse than reference? It's usually the other way around.

And I'm sure you'll still ignore the evidence as it mounts. As for overclocked, the card was still using AMD software within AMD-specified power limits. You are a first-rate apologist.

WTF? So breaking something when overclocking is OK if you are using the manufacturer's own tools? So overclocking from the BIOS and breaking the motherboard is OK because manufacturer-provided settings were used? Good logic indeed.
 
So you are essentially claiming following:

8GB memory will be useless for RX 480. No proof for this claim.

DX12 performance will be useless for RX 480. No proof for this claim.

GTX 1060 (specs and name currently unknown) will rock against RX 480. No proof for this one either.

These Nvidia fanboys seem too stupid to think about the future unless Nvidia may perhaps have some kind of advantage.

I claimed those things based on the information at hand; they were more than an educated guess. The RX 480 isn't faster than the R9 390X, so why would 8GB of VRAM be required? We are yet to see a case where the 290X 4GB is slower than the 390X 8GB under playable conditions when comparing the two clock for clock. Therefore you can safely assume the same will be true for the RX 480, at least in the vast majority of video games.

I never claimed DX12 performance will be useless for the RX 480, I don’t really even know what that means.

The GTX 1060 specs and name are currently unknown to you, pal! ;)

It's more like 30W+ extra, according to Anandtech's GDDR vs HBM deep dive; they estimated 38-50 W for the DRAM on a 290X (4 GB and slower speed).

We will have to see once we have 4GB cards. Realistically though it doesn’t matter because you shouldn’t be comparing the power efficiency of the RX 480 to the previous generation Nvidia architecture anyway. The GTX 1060 won’t be a Maxwell part ;)

I still have some remarks for you...

Or maybe it says something about you, if I confuse you with certain people... I have nothing against you personally, rather, it's the double standards regarding these brands that I despise.

No it says a lot about a select few people who do the same thing every time an AMD or Nvidia GPU review is published. The same immature people who aren’t interested in a proper discussion, skip that and go right to trolling and fanboyish behavior. Even on a positive review, it’s amazing.

I get that. But you also have to stop comparing it to just nVidia GPUs. The node jump offered AMD as much power efficiency as nVidia's. In other words, the jump in efficiency from the R9 390 to the RX 480 is about the same as the jump from the 980 Ti to the GTX 1070, if not slightly better. That is a good thing for AMD, comparatively. Obviously it still warrants comparing to nVidia, but it has to be compared in the right context. Worse than nVidia in power/performance? Yes, as of right now, if you take all the DX11 games. Has anyone even bothered to do a DX12 performance-per-watt comparison? Of course they haven't... The picture would be quite different then, and we all know it. And then you'd understand the difference between Pascal and GCN.

I don’t think there is much interest in doing a performance per watt evaluation on one game.

The margin between the GTX 1070 and GTX 980 Ti is greater than that of the RX 480 to the R9 390.

“The node jump offered AMD as much power efficiency as nVidia's. In other words, the jump in efficiency from the R9 390 to the RX 480 is about the same as the jump from the 980 Ti to the GTX 1070, if not slightly better”

The RX 480 and R9 390 deliver roughly the same performance and on average the RX 480 consumed 52 watts less.

The GTX 1070 is around 5-10% faster than the GTX 980 Ti and on average consumed 80 watts less. The GTX 1080 is more impressive again as it consumes around 30 watts more than the 1070 on our test system and is around 30% faster than the 980 Ti.

And what were the gains on the RX 480...?

I said in a previous post, 8% on average. So half that of the 1070 and 1080 :S
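For anyone following the arithmetic in this exchange, here is a rough sketch of how a performance-per-watt comparison like the one above works. The performance ratios and wattage deltas are the figures quoted in this thread, and the absolute baseline wattages are assumed placeholders for illustration, not measurements.

```python
# Rough perf/watt comparison using figures quoted in this thread.
# Absolute baseline wattages below are assumed placeholders.

def perf_per_watt_gain(perf_ratio, old_watts, new_watts):
    """Percent perf/watt improvement of the new card over the old.

    perf_ratio: new card's performance relative to the old (1.0 = equal).
    """
    old_ppw = 1.0 / old_watts          # old performance normalized to 1.0
    new_ppw = perf_ratio / new_watts   # new performance relative to old
    return (new_ppw / old_ppw - 1) * 100

# RX 480 vs R9 390: ~same performance, quoted as drawing 52 W less
rx480_vs_390 = perf_per_watt_gain(1.00, 212, 160)

# GTX 1070 vs 980 Ti: quoted as ~5-10% faster while drawing 80 W less
gtx1070_vs_980ti = perf_per_watt_gain(1.07, 305, 225)

print(f"RX 480 vs R9 390:   {rx480_vs_390:.0f}% better perf/W")
print(f"GTX 1070 vs 980 Ti: {gtx1070_vs_980ti:.0f}% better perf/W")
```

Note the efficiency gain depends heavily on which baseline wattage you divide by, which is part of why posters in this thread arrive at such different percentages from the same deltas.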
 
No it says a lot about a select few people who do the same thing every time an AMD or Nvidia GPU review is published. The same immature people who aren’t interested in a proper discussion, skip that and go right to trolling and fanboyish behavior. Even on a positive review, it’s amazing.
Because you're never wrong or at fault of course right? Just like nVidia...

I don’t think there is much interest in doing a performance per watt evaluation on one game.
And with remarks like these you're supposed to be taken seriously...? Here's the list for you, without mentioning the game you're referring to;
Rise of the Tomb Raider
Gears of War Ultimate Edition
Hitman
Quantum Break
Forza Motorsport 6: Apex
Total War: Warhammer

The margin between the GTX 1070 and GTX 980 Ti is greater than that of the RX 480 to the R9 390.

The RX 480 and R9 390 deliver roughly the same performance and on average the RX 480 consumed 52 watts less.

The GTX 1070 is around 5-10% faster than the GTX 980 Ti and on average consumed 80 watts less. The GTX 1080 is more impressive again as it consumes around 30 watts more than the 1070 on our test system and is around 30% faster than the 980 Ti.
52 watts less than an R9 390? You serious...? Come on, dude... The RX 480 uses around 160W of power. Are you seriously saying that an R9 390 only used 210W? The R9 390 easily uses over 260W of power, meaning the RX 480 uses 100 watts less. Here:
power_average.png


That's a 50% reduction in power use for same performance. Once again, you're downplaying AMD's achievements...

How do the 1070 & 1080 stack up in terms of percentage...?

I said in a previous post, 8% on average. So half that of the 1070 and 1080 :S
It's power limited, sadly. AIB cards will likely fix that.
 
Because you're never wrong or at fault of course right? Just like nVidia...

And with remarks like these you're supposed to be taken seriously...? Here's the list for you, without mentioning the game you're referring to;
Rise of the Tomb Raider
Gears of War Ultimate Edition
Hitman
Quantum Break
Forza Motorsport 6: Apex
Total War: Warhammer

52 watts less than an R9 390? You serious...? Come on, dude... The RX 480 uses around 160W of power. Are you seriously saying that an R9 390 only used 210W? The R9 390 easily uses over 260W of power, meaning the RX 480 uses 100 watts less. Here:
power_average.png


That's a 50% reduction in power use for same performance. How do the 1070 & 1080 stack up in terms of percentage...?

It's power limited, sadly. AIB cards will likely fix that.

I can only comment on the results I found. I sent my power results to AMD before publishing so that they could confirm them, and they did.

As for the DX12 games...

Rise of the Tomb Raider (DX12 is Beta)
Gears of War Ultimate Edition (I admit I need to check this title out)
Hitman (This is almost the ProjectCARS of the DX12 world. Even under DX11 it sucks on Nvidia hardware).
Quantum Break (Last time I checked this game was a bit of a mess)
Forza Motorsport 6: Apex (Beta?)
Total War: Warhammer (DX12 is Beta)
 
Because you're never wrong or at fault of course right? Just like nVidia...

And with remarks like these you're supposed to be taken seriously...? Here's the list for you, without mentioning the game you're referring to;
Rise of the Tomb Raider
Gears of War Ultimate Edition
Hitman
Quantum Break
Forza Motorsport 6: Apex
Total War: Warhammer

52 watts less than an R9 390? You serious...? Come on, dude... The RX 480 uses around 160W of power. Are you seriously saying that an R9 390 only used 210W? The R9 390 easily uses over 260W of power, meaning the RX 480 uses 100 watts less. Here:
power_average.png


That's a 50% reduction in power use for same performance. Once again, you're downplaying AMD's achievements...

How do the 1070 & 1080 stack up in terms of percentage...?

It's power limited, sadly. AIB cards will likely fix that.

I discussed this with Steve earlier and his reasoning was that he was basing his "didn't improve much" on the new Nvidia Pascal cards, not previous-gen AMD cards, which if true is incredibly confusing. When you say improve, you mean improvement over the last iteration; "improve" cannot be used to compare across vendors.

He should have said, "While the RX 480 has made large strides in the performance-per-watt category, it still isn't up to par with Nvidia's latest offerings. We would have liked to see a bit more from AMD to really have the RX 480 seal the deal at its current price point."

Of course this is to assume you want to compare cards in different price brackets. Right now AMD has the best price/performance and performance/watt in the mid range with the RX 480. That will likely change when the GTX 1060 releases. Frankly, though, I do not know how Nvidia is going to get the 1060 out so soon unless the rumors that it will also be using the GP100 die are true. The problem with that, as we've seen with the 1080 and 1070, is that yields are very low. Nvidia could very well release a better card, but can it do it in volume? I would be very surprised if they released a GTX 1060 with a smaller die; it usually takes companies at least 4 months to redesign the card, get samples out, etc. We haven't heard anything about that, so for now we just have to go with another GP100 binned card.
 
Source for the chart, with the whole history of how the chart came to be... Links are from old to new;
http://www.neogaf.com/forum/showpost.php?p=165601331&postcount=183
http://www.neogaf.com/forum/showthread.php?t=1058295
http://www.neogaf.com/forum/showthread.php?t=1220928

Not that it matters. You will dismiss everything anyway. But the ones that actually have a brain capable of critical thinking will understand.
Weirdly enough my post was deleted (I'm assuming for the referenced ad hominem attack though the bolded, quoted selection above is also one).

Here's a quick recap - the threads you've provided do not show any improvement in AMD cards over time. From this link comparing the 780 Ti to the 290X -

http://www.neogaf.com/forum/showthread.php?t=1058295

Their first image source is the TechPowerUp relative performance comparing the 780 Ti to the 290X. The second source, showing supposed improvement, is comparing a 980 Ti to the 780 Ti and 290X using relative performance. There has not been any improvement - only the basis for comparison has changed. The "closed gap" showing GTX cards "not aging well" does not exist from those posts that support the chart you provided.
 
I wish I could set the record straight and give you the information you are missing. Sadly I can't but the good news is you don't have to wait long at all.



I just re-tested the RX 480 and R9 390 power consumption when gaming at 1440p in the three games shown in the review. Measuring total system consumption from the wall, I received the exact same results, which saw the RX 480 configuration consume 50 watts less on average.

I have seen a number of reviews with similar testing throwing up the same results as well and as I said AMD confirmed my results for me...

Here are the Gamers Nexus results; he found the RX 480 consumed 60 watts less than the R9 390X...

http://media.gamersnexus.net/images/media/2016/gpu/rx-480/rx-480-power-draw.png

Yep, all this talk will likely be irrelevant when the 1060 comes out and likely beats the RX 480 in power efficiency.
 
Well, all this debate over efficiency probably needs to be tabled.
That problem is only on a few cards, so it's no major problem.
Official Statement from AMD on the PCI-Express Overcurrent Issue
AMD are preparing a driver to cap power draw and presumably limit PCI-E slot draw to specification. Seems like a very good reason to pass on the reference card in favour of an AIB custom.
 
I thought the combination of the 6 pin and PCI-Express limited it to 150W. Wouldn't TPB want to use 8 pin?

In practice, many 6-pin cards tend to draw somewhat more than 75W from the power connector. The 6-pin connector has no problem with a bigger load; the PCI Express slot may.

AMD used a 6-pin connector because an 8-pin connector would make the RX 480 look like a "225W card" and not a "150W card". This whole power efficiency and marketing thing has gone mad, yes.

And I expect that some (many?) custom cards will have a single 8-pin connector.
 
I thought the combination of the 6 pin and PCI-Express limited it to 150W. Wouldn't TPB want to use 8 pin?
I think custom cards will do just that. PCI-E plugs are rated at a nominal 75W but depending on the quality of the PSU and the wire gauge can pull considerably more.
What AMD will do is more effectively load balance the input power. Capping the board power at 150W effectively minimizes the risk (but will limit overclocking) - probably not much of an issue for those with a decent mobo, but could be cause for concern for OEMs using bargain basement 4-layer PCB motherboards from manufacturers like Pegatron.
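To see concretely what load-balancing the input power means, here is a minimal sketch of the budgeting involved. The 75 W figures are the nominal spec values discussed in this thread; the 50/50 split and the rebalance policy are illustrative assumptions, not AMD's actual driver behavior.

```python
# Minimal sketch of rebalancing board power between the PCI-E slot
# and the 6-pin connector. The 75 W limits are the nominal spec values
# discussed above; the fixed 50/50 split and the rebalance policy are
# illustrative assumptions, not AMD's actual implementation.

PCIE_SLOT_LIMIT_W = 75.0   # nominal PCI-E x16 slot power budget
SIX_PIN_NOMINAL_W = 75.0   # nominal 6-pin budget (cables often tolerate more)

def rebalance(board_power_w, slot_share=0.5):
    """Return (slot_w, connector_w), shifting slot overdraw onto the 6-pin."""
    slot_w = board_power_w * slot_share
    connector_w = board_power_w - slot_w
    if slot_w > PCIE_SLOT_LIMIT_W:
        # Push the excess onto the 6-pin, which tolerates overdraw better
        # than motherboard slot traces do.
        excess = slot_w - PCIE_SLOT_LIMIT_W
        slot_w -= excess
        connector_w += excess
    return slot_w, connector_w

# A card drawing 160 W split evenly would pull 80 W from the slot;
# after rebalancing, the slot stays within its 75 W budget.
slot, conn = rebalance(160.0)
print(f"slot: {slot:.0f} W, 6-pin: {conn:.0f} W")  # slot: 75 W, 6-pin: 85 W
```

This also shows why capping total board power at 150W is the blunt but safe fix: with a 50/50 split, 150W never pushes the slot past 75W in the first place.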
 
In practice, many 6-pin cards tend to draw somewhat more than 75W from the power connector. The 6-pin connector has no problem with a bigger load; the PCI Express slot may.

AMD used a 6-pin connector because an 8-pin connector would make the RX 480 look like a "225W card" and not a "150W card". This whole power efficiency and marketing thing has gone mad, yes.

And I expect that some (many?) custom cards will have a single 8-pin connector.
It took me a while, but this makes sense. Making the reference card 8-pin makes it look less efficient.

I agree with you and Zero below that the 8-pin will be coming especially in light of the current PCI-E overdraw issue.
 
I think some people count a 2% betterment over a 2-year-old ASIC on an older process node as a monumental win. Go figure.
If the claim of a substantial perf/watt increase over Tonga is the point, then I'm not sure why he's harping on about the GTX 970...sounds like the very definition of plucking low hanging fruit that just highlights Tonga's abysmal efficiency.
perfwatt_2560_1440.png

How about putting same figure on DX12 performance per watt? Nobody cares about DX11 performance any more.


Correction: no one cares about DX12 performance just yet, not to mention async compute.

Comments like that just get me. You would think we just got set up to play this couple of titles, which I don't own one of yet.
I'll be upgrading my GPUs to play my current games at max settings, which is why most upgrade anyway, not so they can play that one new title, which isn't that great anyway, lol.
 
So at least eight DX12 games coming 2016.

How about 2017 then? And 2018????

Do you actually think that in 2018 your RX 480 mid-range card is going to be able to play a new DX12 title at decent frames in high resolution? (Shakes head and facepalms.)
 
Do you actually think that in 2018 your RX 480 mid-range card is going to be able to play a new DX12 title at decent frames in high resolution? (Shakes head and facepalms.)

Yes, I think so. The Radeon R9 280 was a mid-range card 2 years ago (and it's a rebranded HD 7950, which is 4 years old) and still offers decent FPS at 1080p.
 
Yes, I think so. The Radeon R9 280 was a mid-range card 2 years ago (and it's a rebranded HD 7950, which is 4 years old) and still offers decent FPS at 1080p.
1080p isn't high resolution nor is "decent" a relevant statistic... If you are buying the 480 in order to play as-yet-unreleased titles in the future, you're an *****.

If you're buying this card, it's because you want a fairly cheap card capable of mid-high end performance NOW....

You want something that is "future proof" - it doesn't exist... But the 1070 is a far better bet...
 
1080p isn't high resolution nor is "decent" a relevant statistic... If you are buying the 480 in order to play as-yet-unreleased titles in the future, you're an *****.

We are talking about the RX 480 here, so high resolution is 1080p, as that's the suitable resolution for the RX 480. Decent = 60 FPS.

So everyone who buys a graphics card so that they can also play as-yet-unreleased titles in the future is an *****? "(y)"

If you're buying this card, it's because you want a fairly cheap card capable of mid-high end performance NOW....

You want something that is "future proof" - it doesn't exist... But the 1070 is a far better bet...

NOW? Not tomorrow? Not next year? Define your NOW more carefully.

The 1070 is not a better bet, as it has poor DX12 performance and is much more expensive. I still say that for 1080p @ 60 Hz the RX 480 is quite a future-proof card, as a cheap price is also a form of future-proofing. In case the card or monitor breaks down, there is more money available for a better monitor and/or graphics card.
 
Yes, I think so. The Radeon R9 280 was a mid-range card 2 years ago (and it's a rebranded HD 7950, which is 4 years old) and still offers decent FPS at 1080p.

1080p isn't high resolution anymore. 1600p is where I've run for a long time now, and I need the second GPU to get decent frames in any of my current titles: WoT HD, BF4, etc.

So I get it: this is a great 1080p card for the masses, but no good to me. And if a $200 card blows my $500 board, I'll lose it. I'll SNAP! I do hope that's just a freak occurrence.
This is bad for AMD; they get a lot of this, and a limiting driver is not the way to fix BROKEN hardware!
 
1080p isn't high resolution anymore. 1600p is where I've run for a long time now, and I need the second GPU to get decent frames in any of my current titles: WoT HD, BF4, etc.

So I get it: this is a great 1080p card for the masses, but no good to me. And if a $200 card blows my $500 board, I'll lose it. I'll SNAP! I do hope that's just a freak occurrence.
This is bad for AMD; they get a lot of this, and a limiting driver is not the way to fix BROKEN hardware!

If you want 1600p resolution, then you are in the wrong thread.

Nothing is broken; AMD only needs to adjust the card to take more power from the 6-pin connector and less from the PCI Express slot.

At least AMD fixes this; Nvidia never fixed the GTX 970 and is still selling broken hardware.
 