AMD Radeon RX 480 Review: Performance for the masses

AMD Vega will be that competition. And the GTX 1060 is not out yet.

AMD didn't bother to release a high-end card now because no wise buyer would buy it.


How do you know that? It's possible that Nvidia NEVER releases a GTX 1060. Much more is known about AMD Vega than about the GTX 1060. Also, the GTX 1060 will have no chance against the RX 480 except in some already old titles.

I think you're right - AMD didn't "bother" to release a high end card because no one would buy it... Because it will be priced the same as the 1080, won't be as good... and will consume far more power!

As to "how do I know the Nvidia 1060 will be released?" I think it's pretty obvious that it's coming... Whereas pas history has shown us that AMD simply can't compete with Nvidia in the high end market.

Hey, I'd love to be proven wrong - competition benefits the consumer - but if AMD could compete, they would.... They can't, so they aren't....
 
I think you're right - AMD didn't "bother" to release a high end card because no one would buy it... Because it will be priced the same as the 1080, won't be as good... and will consume far more power!

Because HBM2 cards are around the corner, only stupid Nvidia fanboys buy old GDDR5-based stuff. That's why AMD released a mid-range product now.

As to "how do I know the Nvidia 1060 will be released?" I think it's pretty obvious that it's coming... Whereas pas history has shown us that AMD simply can't compete with Nvidia in the high end market.

What past history? As I already stated, only *****s bought old-tech, so-called "high end" cards like the GTX 980 Ti. My friend was going to buy one; I recommended an R9 380 instead. He saved a few hundred by not buying a $700 card that loses half of its value in three months. Not bad, eh? Also, the GTX 980 Ti is pretty far from high end now, as AMD's mid-range card offers around 75% of the 980 Ti's performance.

So why should AMD have released a 28nm "high end" card when they knew a 14nm mid-range card would come close to its performance? AMD buyers are not as stupid as Nvidia buyers, and AMD knows that.

Hey, I'd love to be proven wrong - competition benefits the consumer - but if AMD could compete, they would.... They can't, so they aren't....

Technically AMD is years ahead of Nvidia, as DX12 is the future and DX11 is the past. Also, complaining about AMD's driver issues is funny, as Nvidia's async-shader-supporting drivers have yet to be released; they should have shipped about two years ago :D
 
Technically AMD is years ahead of Nvidia,
I know right!
If only Nvidia could get their hardware to run as hot and loud as AMD's, suck as much power, while having just as crappy software.

Ohh wait.
AMD is always behind the curve, the drivers suck compared to Nvidia's, their power consumption sucks compared to Nvidia's, their overclock ability sucks compared to Nvidia's, and their pricing is always much cheaper, and it's been like this for about 10 years.
Ahahahaha, AMD is king!
 
I know right!
If only Nvidia could get their hardware to run as hot and loud as AMD's, suck as much power, while having just as crappy software.

BS. Nvidia just cannot design an advanced chip like AMD can, and so power consumption is smaller. Advanced chips consume more power, and that's not surprising.

As for software, AMD has been supporting async shaders for years now. That says enough. Remember that Nvidia lied about that support.
 
From what I can see, that troll is still messing around and spewing lots of useless garbage about stuff he has no clue about (I've got him on ignore, which I would advise everyone to do, because talking sense into him is like digging through dry concrete with a wooden spoon).

http://videocardz.com/61753/nvidia-geforce-gtx-1060-specifications-leaked-faster-than-rx-480
This just takes care of the "how do I know the Nvidia 1060 will be released?" part.
TPU shows up with this article right here:
https://www.techpowerup.com/223802/nvidia-to-launch-geforce-gtx-1060-next-week
Ignoring the fact that it's a spec leak, the rest of the content in the article should be taken with a grain of salt.

And I bet his next comment will be "What about DX12" :D.

So mate if you're going to be a half-decent troll at least look stuff up before you talk :D.
 
From what I can see, that troll is still messing around and spewing lots of useless garbage about stuff he has no clue about (I've got him on ignore, which I would advise everyone to do, because talking sense into him is like digging through dry concrete with a wooden spoon).

http://videocardz.com/61753/nvidia-geforce-gtx-1060-specifications-leaked-faster-than-rx-480
This just takes care of the "how do I know the Nvidia 1060 will be released?" part.

TPU shows up with this article right here:
https://www.techpowerup.com/223802/nvidia-to-launch-geforce-gtx-1060-next-week
Ignoring the fact that it's a spec leak, the rest of the content in the article should be taken with a grain of salt.

And I bet his next comment will be "What about DX12" :D.

So mate if you're going to be a half-decent troll at least look stuff up before you talk :D.

How about putting some reliable sources next time?

Sources: BenLife.info, VideoCardz

Those sound like VERY reliable sources to me "(y)". Well, Nvidia fanboys tend to believe every lie Nvidia tells, so that's not surprising.

Even if that article holds true, it's a paper launch with very limited availability:

The card is expected to be officially launched on the 7th of July, 2016. Market availability is expected to follow a week later, on 14th July.
 
That's funny, you speak of limited availability while standing behind AMD.

According to "reliable" sources, RX 480 availability is 25 times better than GTX 1080's:

http://wccftech.com/amd-rx-480-shipments-hit-us-stores/

Interestingly, yesterday a Microcenter branch in Overland Park Kansas revealed to a customer that they’re receiving a shipment of 100 RX 480 cards on the 28th of June. That’s exactly 25 times as many GTX 1080 cards the same branch had at launch, which was only four. We called the branch & confirmed. We were also told that their stock of GTX 1080 cards is currently limited to one per customer due to limited supply.
 
I still have some remarks for you...

Given the comments made so far in this thread I would say the confusion isn’t limited to that one post. You are trying to discredit what is one of the more positive RX 480 reviews on the internet. Again we pretty much stuck to the facts and figures on this one with few of our own personal thoughts and feelings.
Or maybe it says something about you, if I confuse you with certain people... I have nothing against you personally, rather, it's the double standards regarding these brands that I despise.

At some point you have to stop comparing AMD GPUs with just AMD GPUs. When you do, you quickly see that the RX 480 consumes the same amount of power as the much faster GTX 1070 and that obviously isn’t a good thing. I don’t think it is wrong to tick this off as a negative attribute of the product.
I get that. But you also have to stop comparing it to just nVidia GPUs. The node jump gave AMD as much of a power efficiency gain as it gave nVidia. In other words, the jump in efficiency from the R9 390 to the RX 480 is about the same as the jump from the 980 Ti to the GTX 1070, if not slightly better. That is a good thing for AMD, comparatively. Obviously it still warrants comparing it to nVidia, but it has to be compared in the right context. Worse than nVidia in power/performance? Yes, as of right now, if you take all the DX11 games. Has anyone even bothered to do a DX12 performance/watt comparison? Of course they haven't... The picture would be quite different then, and we all know it. And then you'll understand the difference between Pascal and GCN.

GCN is not inefficient for no reason.
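Just so it's clear what comparison I mean, here it is as a formula. This is only the definition of the metric plus a restatement of my claim, not measured numbers:

\[
\text{perf/W} \;=\; \frac{\overline{\text{FPS}}}{\overline{P}_{\text{board}}},
\qquad
\frac{(\text{perf/W})_{\text{RX 480}}}{(\text{perf/W})_{\text{R9 390}}}
\;\gtrsim\;
\frac{(\text{perf/W})_{\text{GTX 1070}}}{(\text{perf/W})_{\text{GTX 980 Ti}}}
\]

computed per game, and separately for DX11 and DX12 titles.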

As for the overclocking angle. As I said we pushed the GTX 1080 14% above the base clock which really hiked up the GPU Boost 3.0 clocks, resulting in over 15% performance gains. Meanwhile the RX 480 struggled with a 7% overclock.
And what were the gains on the RX 480...?

The GTX 1080 was as easy as can be to overclock, it hit those frequencies without much tinkering. It took me over an hour to get the RX 480 stable at 1.35GHz and the evidence would suggest most gave up before making it even that far.
It's the reason I never buy reference cards. For the reference card itself, forgetting all past reviews, you were not unjust in this review, which is a good thing. But when we look at the past, negative points weigh 10 times as much when the brand is AMD, and I have a problem with that.

Do with this info as you wish.
 
I know right!
If only Nvidia could get their hardware to run as hot and loud as AMD's, suck as much power, while having just as crappy software.

Ohh wait.
AMD is always behind the curve, the drivers suck compared to Nvidia's, their power consumption sucks compared to Nvidia's, their overclock ability sucks compared to Nvidia's, and their pricing is always much cheaper, and it's been like this for about 10 years.
Ahahahaha, AMD is king!
It looks like that to the ignorant. But really, AMD is ahead in terms of technology right now... They are more power hungry for a reason, but I wouldn't expect you to understand. At least, not until someone does DX12 performance/watt calculations.

To put things into perspective using DX12 as a reference, we have the following:
GCN 1.0 supports FL11_1
GCN 1.1 supports FL12_0
GCN 1.2 supports FL12_0
Fermi supports FL11_0
Kepler supports FL11_0
Maxwell supports FL11_0
Maxwell 2 supports FL12_1 (sort of)

Note that GCN 1.1 and GCN 1.2 support FL12_0. This means that since 2013, AMD has had GPUs on the market supporting pretty much all DX12 features. On top of that, they support the majority of these features at the highest tiers, including the 'hidden' 11_2 feature level. They are only missing conservative rasterization and ROV. Compare that to nVidia. Maxwell, which was released in 2014, was still only capable of FL11_0!!! In fact, AMD's GCN 1.0 from 2011 has a higher rank in feature level support than 2014's Maxwell... Let that sink in for a moment...

......................

With Maxwell 2, they added the two features AMD was missing, conservative rasterization and ROV, making them able to advertise FL12_1, as DirectX 12.1, to pretend that their cards are superior, despite their technology actually being inferior. They also have a higher tier in tiled resources compared to AMD's GCN. They had to include these features in their cards (and fast) to be remotely competitive with the features that GCN was offering.
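None of this has to be taken on my word, by the way: the maximum feature level and the caps in question can be queried straight from D3D12 on your own card. A minimal sketch, assuming the Windows 10 SDK is installed; error handling and adapter selection are omitted:

// Prints the highest supported D3D12 feature level plus the optional caps
// discussed above (resource binding tier, conservative rasterization, ROVs).
// Minimal sketch: default adapter, no error handling.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create the device at the minimum level; the real maximum is queried below.
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1 };
    D3D12_FEATURE_DATA_FEATURE_LEVELS fl = {};
    fl.NumFeatureLevels = sizeof(levels) / sizeof(levels[0]);
    fl.pFeatureLevelsRequested = levels;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &fl, sizeof(fl));

    // The same call with D3D12_FEATURE_D3D12_OPTIONS returns the optional caps,
    // including resource binding tier, min-precision support and stencil ref export.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    std::printf("Max feature level:        0x%X\n", (unsigned)fl.MaxSupportedFeatureLevel);
    std::printf("Resource binding tier:    %d\n", (int)opts.ResourceBindingTier);
    std::printf("Conservative raster tier: %d\n", (int)opts.ConservativeRasterizationTier);
    std::printf("ROVs supported:           %s\n", opts.ROVsSupported ? "yes" : "no");
    return 0;
}

On a GCN 1.1/1.2 card this should come back as 0xC000 (FL12_0) with resource binding tier 3 and no conservative rasterization or ROVs; on Maxwell 2, 0xC100 (FL12_1) with tier 2 binding. That is exactly the trade-off I'm describing.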

However, despite all of Maxwell 2's additions:
- AMD is still a tier higher in resource binding, even GCN 1.0, which is from 2011.
- Stencil reference value from the pixel shader is still only supported by GCN cards, again starting from GCN 1.0.
- All GCN cards have the full heap available for UAV slots in all stages, while Maxwell 2 cards are limited to 64. This likely hasn't changed with Pascal.
- GCN 1.0 cards have two asynchronous compute engines with two queues per engine (four in total), which allow graphics and compute to be calculated concurrently. Maxwell 2 still can't do this, since it is limited by its required context switch, and the same is true for Pascal: they can do asynchronous compute on its own, but not concurrent graphics + compute. GCN doesn't have this limit since no context switch is required. GCN 1.1 increased the compute engines from two to eight, and the queues from 2 to 8 per engine (64 in total). And this isn't even being used yet; developers are only starting to experiment with it. It was dropped back down to 4 engines with the RX 480, which is still more than any nVidia card can handle. (A minimal sketch of how this looks from the API side follows after this list.)
- GCN 1.2 and newer are the only cards that have minimum float precision. Maxwell 2 doesn't have it at all, and neither does Pascal.
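To make the async compute point concrete from the application side: in D3D12 you submit graphics and compute work on separate command queues, and whether the GPU actually overlaps them is then up to the hardware scheduler (the ACEs on GCN, a context switch elsewhere). This is only a sketch of how the two queues are set up, not a benchmark; the helper name is just for illustration, and it assumes a device created as in the earlier snippet:

// Minimal sketch: one graphics (DIRECT) queue plus one COMPUTE queue.
// Command lists submitted to the two queues *may* execute concurrently;
// on GCN the asynchronous compute engines schedule the compute queue
// alongside graphics, while hardware without that ability serializes instead.
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// 'device' is assumed to be a valid ID3D12Device (see the earlier snippet).
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};

    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy work
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&graphicsQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy work only
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    // Cross-queue synchronization is done with an ID3D12Fence
    // (Signal on one queue, Wait on the other); omitted here for brevity.
}

Nothing in the API forces the overlap, and that's the point: the spec just exposes the queues, and it's the ACEs that let GCN actually fill idle shader cycles with the compute queue while graphics is running.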

People complaining about GCN being outdated and the architecture needing to be redesigned have no idea what they're talking about. In fact, from this perspective, Fermi, Kepler and Maxwell look more like rebrands than the GCN cards do. The GCN architecture is one of the best ever designed in terms of longevity and being future proof. It has supported, since 2011, so many features that are only now going to be used with DX12 and Vulkan. Why would AMD completely overhaul something that is just about to show its strength?

That's without touching the new stuff specifically added with Polaris...
 
That's without touching the new bloat specifically added with Polaris...
Fixed that for you. That's the same thing people say about Windows with all its bloat: technologically advanced. Tell me, at what point does it become a negative? And since this is an AMD thread, I will not bring up the negativity AMD fans have toward nVidia's advancements. I'm sure nVidia would be there if there were demand for any DX12 supply. At this point there is no demand for DX12, so why would nVidia release cards now that support it? Especially when they can use DX12 as a reason to upgrade later. The things you hold against nVidia are more than likely marketing decisions, not an indicator of their capabilities. And if you think about AMD's desperation, you would understand why they have so much out on the table.
 
It looks like that to the ignorant. But really, AMD is ahead in terms of technology right now...
No, they're not.
Talking about DX12 perks and some other specific strengths AMD has does nothing to bolster any argument. Overall it's not even close: Nvidia's tech is smoother, more efficient and more powerful.
DX12 is in its infancy; in a year both companies will have much better DX12 performance.

They are more power hungry for a reason, but I wouldn't expect you to understand..
Maybe because they are losing customers and company credit, and are fighting for their lives to stay afloat?
It's tough to sell a GPU that runs so hot from the factory it needs liquid and has little to no overclock headroom.
 
It's tough to sell a GPU that runs so hot from the factory it needs liquid and has little to no overclock headroom.

Can I ask how you interpret the Fury X as needing liquid cooling? It has the same TDP as the 390 and 390X, both of which were air-cooled by all the AIBs, and its power demand was usually only 10-20 watts greater than the 390X's (which could be the pump). I always saw it as more of a niche product: they were going to deliver a 980 Ti-class card that could fit inside a smaller case with relative ease. Keeping the PCB small wasn't hard thanks to the HBM, so why not offer that unique capability?
 
I'm sorry, but where are you getting the $240 for the 970? The cheapest is around $270 (most versions are above $300), and in Europe it's above 270 euros.

Yeah, that $240 quote he gave was pretty low. I see them on sale for around $260 at the least, and that's in limited quantities and with a mail-in rebate. It's not like Nvidia has infinite stock of those cards either. A year from now the RX 480 will still be sold at retail first-hand; the 970 will not.
 
The 1080 HAS no competition from AMD.... And probably never will have any... Kind of like the Titan X before it.... The fact that AMD has led with their midrange card instead of a high end one kind of confirms this.

AMD's only hope is to crank these out and hope that they can trick people into thinking they are getting a high end card for cheap - before Nvidia's competition comes out.

We KNOW the 1060 is coming - and pretty soon. We have no real idea what high end AMD card is coming - nor when.... But by looking at the power numbers of the 480, we know that it will be a bloodsucking beast....

I'd actually say that AMD's high end card will be pretty good with power. HBM1 had pretty low power usage, so I expect HBM2 to improve on that. I still expect it to consume more power than Nvidia's, though.
 
Fixed that for you. That's the same thing people say about Windows with all its bloat: technologically advanced. Tell me, at what point does it become a negative? And since this is an AMD thread, I will not bring up the negativity AMD fans have toward nVidia's advancements. I'm sure nVidia would be there if there were demand for any DX12 supply. At this point there is no demand for DX12, so why would nVidia release cards now that support it? Especially when they can use DX12 as a reason to upgrade later. The things you hold against nVidia are more than likely marketing decisions, not an indicator of their capabilities. And if you think about AMD's desperation, you would understand why they have so much out on the table.

Once again, it has taken Nvidia at least two years to make a DRIVER that supports async shaders. So at that speed, it will take at least 10 years for Nvidia to develop good DirectX 12 hardware. So your logic has been proven false.

As AMD has good DX12 support, AMD's cards are much more future proof. That also means AMD cards do not suffer huge value drops like the GTX 980 Ti :D

No, they're not.
Talking about DX12 perks and some other specific strengths AMD has does nothing to bolster any argument. Overall it's not even close: Nvidia's tech is smoother, more efficient and more powerful.
DX12 is in its infancy; in a year both companies will have much better DX12 performance.

Nvidia's tech is "efficient" just because it has no hardware support for these features. You were just given facts showing AMD is technologically much more advanced, but you still deny it. Makes me wonder why.

How do you know Nvidia will have better DX12 performance in a year? You don't know, so you just guess. And guess wrong.
 
Once again, it has taken Nvidia at least two years to make a DRIVER that supports async shaders.
OMG! Async shaders! Everybody run!
Lol.

As AMD has good DX12 support, AMD's cards are much more future proof. That also means AMD cards do not suffer huge value drops like the GTX 980 Ti :D
AMD cards' performance jumped all over the place, and it used to drop/spike like crazy.
Minimum and average frame rates on AMD GPUs were terrible for years; CrossFireX was worse.
They did fix that right?
And future proof? Lol.

Nvidia's tech is "efficient" just because it has no hardware support for these features.
Haaaa Ha hahaha
Yes, Nvidia GPUs have no features!
Hahahahahahaha

You were just given facts showing AMD is technologically much more advanced, but you still deny it.
The only good thing about AMD tech is that it's cheap and easy to afford.
It's second rate, second best, and it's priced accordingly.

What is the fastest, most efficient GPU on the market today? A GTX.
And your rebranding comment is funny, since AMD renamed the same GPUs for 3 generations. Are you kidding me?
 
OMG! Async shaders! Everybody run!
Lol.


AMD cards' performance jumped all over the place, and it used to drop/spike like crazy.
Minimum and average frame rates on AMD GPUs were terrible for years; CrossFireX was worse.
They did fix that right?
And future proof? Lol.


Haaaa Ha hahaha
Yes, Nvidia GPUs have no features!
Hahahahahahaha


The only good thing about AMD tech is that it's cheap and easy to afford.
It's second rate, second best, and it's priced accordingly.

Truth hurts, it seems.

The 980 Ti dropped in price because of the release of the 1080. That had nothing to do with AMD.

The GTX 980 Ti was not future proof, and so the price drop was huge.

Ohh the irony, telling someone their prediction is wrong.

Very easy to say that prediction is wrong.
 
The GTX 980 Ti was not future proof, and so the price drop was huge.
Sorry, that's just wrong. Nvidia released a new product - the GTX 1080 - that was 25-35% faster than the old product - the 980 Ti - at the same price point ($50 cheaper, to be honest) less than a year later.

It had nothing to do with future proofing or missing features. You're going to need to try harder than that.
 