AMD Radeon RX 480 Review: Performance for the masses

The hype around these cards was so great that preorders were posted in forums, with followers swearing the card would kill the GTX 1070 and push Nvidia back to the drawing board. The announced price was convincing too. Then this review crushed those dreams of AMD followers. Once the GTX 1070 stabilizes in the market and the GTX 1060 is released, the price-per-performance and performance-per-watt of Nvidia's cards will clearly undercut this generation of Radeons.

Clearly, AMD doesn't have what it takes to go head-to-head with Nvidia, and that's the truth.

I recommend saving up for a GTX 1070, or waiting for the GTX 1060 if you're not in a hurry. Skip McDonald's or Starbucks for a month.

If you want the best card for your PC, get an Nvidia! If you're on a budget and need a card right now, I'm sure the GTX 970 will drop in price to counter this, and the GTX 1060 will be released this year.
 
AMD released the RX 480 8 GB at the $239.99 price point for ONE reason.

Two RX 480s at $479.98 in CROSSFIRE mode outperform a GTX 1080 costing $659.99!!!

The point is SCALABLE GPU add-in boards. Buy the level of performance that you want, need, or can afford. Buy one now, and if you need more later, buy another.

CrossFire scaling with the second card is about 192% (two cards deliver roughly 1.92x the performance of one).
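That scaling claim reduces to simple arithmetic. A minimal sketch (the 60 fps single-card figure is an assumed example, not a benchmark result):

```python
# Back-of-the-envelope CrossFire scaling math. The 192% figure is the
# claim above; the single-card frame rate is an assumed example value.
def crossfire_total(single_fps, scaling=1.92):
    """Two-card frame rate, where `scaling` is the two-card total
    expressed as a multiple of one card (1.92 = 192%)."""
    return single_fps * scaling

def second_card_contribution(scaling=1.92):
    """Fraction of the second card's potential actually realized."""
    return scaling - 1.0

print(round(crossfire_total(60.0), 1))       # 115.2 fps from a 60 fps card
print(round(second_card_contribution(), 2))  # 0.92: the second card adds ~92%
```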

Why blow your money on an expensive Nvidia GTX 1080 when you can spend far less and get better performance?

And you get Async Compute and a GPU that isn't broken running DX12.

AND you get to BUY AMERICAN!!!
That's a myth started by AMD and their fanboys. They don't even beat a 1070 overall. Here's the sad truth:
https://www.techpowerup.com/reviews/AMD/RX_480_CrossFire/21.html

As for buying American, AMD is an American company, but their products are made in Asia.
 
If you have to be told "why", there is no point telling you.

Hmmm....

that's it? That's all you got?

I don't need to be told why. Clearly your education lacks the subtlety to understand a rhetorical question. So sad.

GTX 1080 is BROKEN running DX12.

GTX 1080 has NO Async Compute.

GTX 1080 costs $659.99 and is totally outclassed and outperformed by two, count them, TWO RX 480 4 GB cards in CROSSFIRE mode costing $399.98!!!

So sad.
 
Hmmm....

that's it? That's all you got?
You asked "why" as if there were no reason. The biggest known reason is owning a single card and not having to deal with SLI/CrossFire drivers. And yet you speak as if my knowledge were sad; you didn't even comprehend your own comment.
 
I'll say it again: 480 CrossFire does not beat a 1080. In fact, overall it doesn't even beat the 1070. And the 1080/1070 have 8 GB of VRAM, so you'd have to compare the 8 GB 480s in CrossFire to them, which would cost $480, not $400. I posted a link to this very legit review from a highly respected and popular tech site, but since you don't seem interested in facts, I'll just pull out the definitive quote from the review's conclusion:

"At just $398, about the same price as the cheapest GeForce GTX 1070 you can find, or $478 for a pair of 8 GB cards like those we have, Radeon RX 480 CrossFire is not a viable solution if you plan to buy two cards upfront. When averaged over all our games, it is consistently slower than a single GeForce GTX 1070 at all the resolutions that matter - 1080p, 1440p, and 4K. Instead of buying two cards upfront, you're much better off putting your monies into a single GTX 1070, not just for better performance but to dodge the spectre of application multi-GPU support, which continues to haunt both SLI and CrossFire."
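The price-per-performance argument both sides are making is just a division. A sketch using the prices quoted in this thread; the relative-performance values are placeholders chosen only to illustrate the calculation, not benchmark results:

```python
# Dollars per unit of relative performance (lower is better). Prices are
# the ones quoted in this thread; the perf numbers are PLACEHOLDERS for
# illustration, not measured results.
def price_per_perf(price_usd, relative_perf):
    return price_usd / relative_perf

cards = {
    "RX 480 8GB CrossFire": (479.98, 100.0),  # baseline (assumed)
    "GTX 1070":             (399.99, 105.0),  # assumed relative perf
    "GTX 1080":             (659.99, 130.0),  # assumed relative perf
}

for name, (price, perf) in cards.items():
    print(f"{name}: ${price_per_perf(price, perf):.2f} per perf unit")
```

Plugging in real averaged benchmark numbers from a review is what settles the argument; the formula itself is the easy part.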

As for DX12: who gives a ****? There are exactly 13 games that use it at the moment, with only 9 more coming in the next year (per Wikipedia). There are literally thousands of games out there.

https://www.techpowerup.com/reviews/AMD/RX_480_CrossFire/1.html
 

Yeah, sure, go for the budget build that pulls twice as much power as a 1070 while running worse. Spoiler alert: one of these $200 "bad boys" sucks ~88 W out of the PCIe slot; I wonder how people on a tight budget will manage fitting that on a $50 motherboard. When you invest in a $200 GPU you expect it to run for at least five years, don't you? DX12 might be the future, but it's not going mainstream this year or the next; by the time it's the reference, these cards will be as useful as a bag of wet hair. And let's not start the multi-GPU debate: I think everyone knows how CF/SLI scales, and the support for it isn't doing it any favors, unless you're planning to "game" on synthetic benches. From my point of view they should have just rebranded the 390 series and called it a day.
 
AMD released RX 480 8gb at the $239.99 price point for ONE reason.

2 RX 480 at $ 479.98 in CROSSFIRE mode outperforms GTX 1080 costing $659.99!!!
Except actual benchmark results don't bear that out
[chart: relative performance summary at 3840×2160, from the linked TechPowerUp review]

...even if you don't take into account:
1. Games/engines with no CrossFire support, and
2. The relative gains to be had from overclocking
And you get Asynch Compute and a GPU that isn't broken running DX12.
Number of non borked DX12 titles? One?
EDIT: Ninja'ed by ddferrari
AND you get to BUY AMERICAN!!!
You mean the U.S. company whose principal owner is an Abu Dhabi investment company?
Well done on the guerrilla marketing- I hope Raja sends you that complimentary HD 5830
 
He probably thinks that AMD stands for Advanced Murican Devices. He missed the part where America is made in China nowadays.
 
What nonsense are you spouting?! This card doesn't do better than any high-end cards at DX12; it just does better on DX12 than on DX11!

It's still a tad slower in benchmarks than the 980 using DX12. It MIGHT be better than the 1060 in DX12, but we won't know that until it's released.

There are so few DX12 benchmarks available that you cannot say that for sure. Currently Nvidia's async speed is 0; that's a fact.

I'm not tech-savvy, but AFAIK those ACEs have been on chip since the first GCN generation, sitting there and consuming power. GCN 1.0 is four years old now; how many DX12 games did the average GCN owner play in that time? I'm not saying GCN1 was a bad performer (I actually still think the HD 7900 series were beasts, and they're still doing OK now when performance and value are considered), BUT how about perf/watt? I'd like AMD to pull ahead in the competition, but let's not fool ourselves: if AMD's only competitive argument is DX12 performance, it's not a sufficiently strong argument, because there are still so few titles around.

Just looking at how good GCN2 cards are compared to Nvidia's Kepler-generation cards right now, it's safe to say GCN cards are a much better long-term investment. In case you haven't noticed, many other companies also add features that push up power consumption, but in the end that is mostly beneficial to end users. As in this case: GCN gen 2 is still good, while Nvidia's Kepler with the newest drivers is not.

Also, I don't remember anyone complaining about Intel adding "useless" AVX2 to Haswell, which noticeably increases maximum power consumption.

Wow, so AMD may perform better than Nvidia in DX12 (all 13 games that support it at the moment), all while sucking down loads of power and moonlighting as a space heater! PASS

Yes, AMD represents the future; Nvidia, the past.
 
I am looking to upgrade from a pair of 4 GB Asus DCU2 670s, but power consumption, heat, and the bullet hole in the foot have me STILL looking.

It is my belief that Nvidia was waiting for this before releasing the 1060, which is going to slaughter this thing on power consumption and OC headroom, not to mention it's going to run somewhat cooler and quieter.

And to think I was considering the jump back to AMD (I still call it ATI) for the 480. Now I have to wait for the 1060, and if Nvidia can get them on shelves quickly enough, only neveressence, hardreset, and some coin miners will still buy the 480, though there is probably already an ASIC card that outperforms this at a comparable price.

Sorry AMD, better luck next release.

I hear Nvidia isn't making much money on consoles these days. I hear AMD is not making much money on consoles these days either.

Do these Nvidia fanboys have extremely small cases with no fans or something? Nobody cared a bit about power consumption until after Maxwell; now it's the only thing that matters.

While AMD offers good features, they don't really sweat a few watts. Nvidia cripples features, so its power consumption is lower. The downside is that after two years Nvidia still cannot get async compute working at all. (n)

As for DX12, - who gives a ****? There are exactly 13 games that use it at the moment, with only 9 more in the coming year (Wikipedia). There are literally thousands of games out there.

https://www.techpowerup.com/reviews/AMD/RX_480_CrossFire/1.html

After two years, plenty of people will.
 

In this case it kind of matters, since you have the mainstream card pushed down to 14 nm with the same horrible power-hogging preset. I personally don't care as long as I have the wattage to support it, but I do have a problem when a card aimed at the masses, who are looking to not bleed their wallets dry, launches accompanied by things like this:
"AMD’s Radeon RX 480 draws an average of 164W, which exceeds the company’s target TDP. And it gets worse. The load distribution works out in a way that has the card draw 86W through the motherboard’s PCIe slot. Not only does this exceed the 75W ceiling we typically associate with a 16-lane slot, but that 75W limit covers several rails combined and not just this one interface."
Oh, and about the two-years thing: do you really believe you'll get to see DX12 mature on the current gen (for both AMD and Nvidia)? This sounds like a post I saw where someone asked for waterblocks for an RX 480, because it makes sense to cheap out on the GPU and splash out on extreme cooling for a $200 component, no? In case you haven't noticed, by AMD's own standards these cards are supposed to be aimed at the notebook segment as well. So let me ask you: what happens when you cut down a version of this GPU to meet the power requirements of a mobile device? Furthermore, if the power draw is somehow down to GloFo's manufacturing process, what happens to future products?
Let's see: the tier the RX 480 is replacing is the R9 380, right?
So we have the R9 380 at ~196 W and the RX 480 at ~164 W. That's without mentioning thermals, and I doubt the AIBs will be able to pull a rabbit out of the hat with this one.
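The wattage claims in this exchange reduce to simple arithmetic. A sketch using the figures quoted above (86 W slot draw against the 75 W slot allowance; R9 380 at ~196 W vs. RX 480 at ~164 W):

```python
# Sanity-check the power figures quoted in this exchange. 75 W is the
# commonly cited PCIe x16 slot allowance mentioned in the quote above.
PCIE_SLOT_LIMIT_W = 75.0

def slot_overdraw_w(slot_draw_w, limit_w=PCIE_SLOT_LIMIT_W):
    """Watts drawn through the slot beyond the allowance (0 if in spec)."""
    return max(0.0, slot_draw_w - limit_w)

def gen_power_drop_pct(old_w, new_w):
    """Generation-over-generation power reduction, in percent."""
    return (old_w - new_w) / old_w * 100.0

print(slot_overdraw_w(86.0))                       # 11.0 W over the limit
print(round(gen_power_drop_pct(196.0, 164.0), 1))  # ~16% drop, 380 -> 480
```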
 
In this case it kinda matters since you have the mainstream level card pushed down to 14nm with the same horrible power-hogging preset. Although I personally don't care as long as I have the wattage to support it but I do have a slight problem when a card is aimed at the masses who are looking into not bleeding their wallets dry and the release is accompanied by things like this:
"AMD’s Radeon RX 480 draws an average of 164W, which exceeds the company’s target TDP. And it gets worse. The load distribution works out in a way that has the card draw 86W through the motherboard’s PCIe slot. Not only does this exceed the 75W ceiling we typically associate with a 16-lane slot, but that 75W limit covers several rails combined and not just this one interface."

And? That's just a single RX 480. It's also possible that Tom's Hardware screwed something up. So don't claim that one card from a not-very-trusted source equals every RX 480 card. Don't believe everything you read.

Oh and about the 2 years thing do you really believe that you would get to see dx 12 mature on the current gen (for both AMD and nVidia)? This sounds like a post I saw where someone asked for waterblocks for a RX 480 because it makes sense to cheap out on the GPU and cash out on extreme cooling on a $200 component no? In case you haven't noticed these cards by AMD's standards are supposed to be aimed at the notebook segment as well. So let me ask you what's going to happen if you cut down a version of this GPU in order to meet the power requirements for a mobile device? Furthermore if the result of the power-draws is somehow GloFlo's manufacturing process what will happen to future products ?

Notebooks are useless for gaming anyway, so nobody cares. Also, for notebook gaming, external GPUs are a much better choice. Cut-down versions probably aren't as power hungry.

Let's see the card in the tier that RX 480 is replacing is the R9 380 right? So we have the R9 380 @ ~ 196 Watts and the RX 480 @ ~ 164 Watts . Not mentioning thermals and I doubt the AIB's would be able to pull a rabbit out of the hat with this one.

That's a lot less power and much more speed. Yes, the stock cooler sucks, but that's just to leave enough of a gap between the custom cards and the reference design.
 
Power consumption is very relevant in this market; it is all about how much performance you can fit within a given power/heat limit. It simply defines who is the BIG DADDY. Now, I need to make one point clear: the RX 480 is a very good product for what it is, and I'm fairly certain it will sell a lot (especially if supply is OK and Nvidia doesn't release its mid-tier soon). The harsh part is that it will get uglier at the high end; it seems it will be worse than the Fury X vs. GTX 980 Ti "competition". I tell you, looking at the current situation, GP100 will probably trash Vega badly. Let's wait and see; I'd love to be wrong on this one. If these power consumption figures are any indication, the situation could be even more embarrassing for AMD. I mean, you can't release a (for example) 275 W behemoth to compete with a 180 W card and then say "look, we do better in DX12 games". That's what I mean.
 
Power consumption is very relevant in this market, it is all about how much performance you can fit within a given power/heat limit. It simply defines who is the BIG DADDY.

No, it's not. Like I already said, before Maxwell nobody really cared about power consumption. An ATX case plus 140 mm fans makes power consumption mostly irrelevant.

Now, I need to clear out one point, RX 480 is a very good product for what it is and I'm fairly certain it will sell a lot (especially if supply is ok and if nvidia doesn't release their mid-tier soon). The harsh part is, it will be uglier in the high end, it seems, it will be worse than Fury-X vs GTX 980 Ti "competition"...I tell you, looking at the current situation, GP100 will probably trash Vega so bad, let's wait and see, I'd love to be wrong on this one. If these power consumption figures are an indication, the situation could be even more embarrassing for AMD. I mean you can't release a (for example) 275w behemoth to compete with a 180w card and then say "look, we do better in DX12 games" this is what I mean

What do you actually mean? AMD could easily cut power consumption by sacrificing DX12 performance. Nvidia's so-called "superior power-saving architecture" is mostly crippling hardware and doing many things in software. AMD offers a much more complete package, so power consumption is higher. AMD also delivers what they promise, unlike Nvidia, who sold a 3.5 GB card as a 4 GB card and has yet to deliver the async-shader drivers that were supposed to be out two years ago. It seems Nvidia fans accept any BS from Nvidia without criticism.
 
Notebooks are useless for gaming anyway so nobody cares. Also for notebook gaming external GPU's are much better choice. Cut down versions probably are not so power hungry.
That is simply untrue. I bought myself an Alienware 17 R3, as my job now requires me to move around the country a lot and I spend a lot of time in hotels. The GTX 980M is masterful, pretty much maxing out most game settings.

I had no choice but to choose the 980M, since AMD simply does not make an equivalent.

I assume you were trolling though since you said:
Also for notebook gaming external GPU's are much better choice.
They are notebooks, designed to be portable; adding another large box with an external GPU pretty much defeats the purpose. I have the Alienware "Graphics Amplifier" at home for when I'm not in a hotel, but the entire reason I moved away from a desktop and to a notebook was portability.
 
You don't understand why power consumption is important. It may be tolerable at the lower end, but at the high end it limits performance big time. The heat that can be dissipated over a given die size is limited, not unlimited, and it causes problems like throttling and limited overclockability (check the Fury X). Most importantly, as I stated before, power consumption defines and limits how much performance can fit within a given bracket. I know Nvidia's hardware is not as feature-rich as AMD's, but many people check a card's raw performance when making a purchase decision. And NO, AMD can't "easily" cut power consumption like that; they would need to go back to the drawing board to do it. It's how the hardware was designed from bottom to top; that would need a complete redesign.
 
That is simply untrue, I bought myself an Alienware 17r3 as my job now requires me to move around the country a lot and I spend a lot of time in hotels. The GTX 980m is masterful, pretty much maxing out most game settings.

I had no choice but to choose the 980m since AMD simply do not make an equivalent.

I assume you were trolling though since you said:

Gaming on a 17-inch display :D The last time I played on a display that small was something like 12 years ago.

They are notebooks, designed to be portable, adding another large box with an external GPU in pretty much defeats it's purpose. I have the Alienware "Graphics Amplifier" at home for when I'm not in a hotel, but the entire reason I moved away from a Desktop and to a Notebook was portability.

I wouldn't even consider serious gaming on a display under 21 inches. Smaller screens are for casual gamers, like those who use tablets.


It's easier to dissipate heat from a larger core than from a smaller one. More features in the core also mean a bigger die, which makes heat dissipation easier.

The Fury X also had HBM memory, which may have played a big part in its limited overclocking.

AMD chips have much more raw performance. AMD just concentrates on DX12 performance, as DX11 is already obsolete. Between GCN 2 and GCN 4, AMD had about three and a half years, so AMD could easily have made a less feature-rich "GCN 5" with lower power consumption; they just didn't want to. AMD's GCN2 cards are still a good option where Nvidia's Keplers are not (Project Cars is an Nvidia-optimized game): https://www.techspot.com/review/1000-project-cars-benchmarks/page6.html

Everything has its price. Those who want a long-lasting video card should avoid Nvidia at all costs.
 
And? That's just Single RX 480. Also it's passible that Toms Hardware has screwed up something. So don't say one card from not very trusted source = every RX 480 card. Don't believe everything you read.

Sadly enough, it's not just Tom's Hardware saying it, although I'm pretty sure AMD is going to patch this with a BIOS update. The card itself puts out decent performance, I'll give them that, but it comes at a cost.


Notebooks are useless for gaming anyway so nobody cares. Also for notebook gaming external GPU's are much better choice. Cut down versions probably are not so power hungry.

AMD seems to care, since their chunk of the mobile market has been on a collision course with rock bottom lately. Cutting down isn't an option when the competition can stick in a full-fledged desktop GPU thanks to better power design and better thermals. As for external GPUs, they are kind of expensive, and I don't think anybody on a budget will throw out $300 to stick a $200 GPU in one, on top of the cost of a decent enough notebook. Again, we're talking about the mainstream market. As for the desktop part, if the AIBs manage to cool that thing down, then it's a steal at that price. But if a mainstream GPU, which by default shouldn't be that beefy spec-wise, manages to put out those thermals and that power consumption, it will make it difficult for AMD to grab a seat at the higher-tier table. I personally wouldn't want a space heater in my case just because it can "async" and bumps up the frame rates by 5-10%. I'd rather have something that lets me stick to 60 FPS and runs cool enough that, if needed, I can crank up the clocks in the future to make up for new game engines and whatnot.
 
Why is this card getting docked points for cooling when the GTX 1080 is thermally throttled and got a perfect score? This makes a bit more noise than the GTX 1080 but keeps the card cooler, yet you are only docking AMD points. Same thing for overclocking: Nvidia's reference cards couldn't even overclock due to heat issues.

Steve, the author, stated that the reason he gave the GTX 1080 a perfect score was that it was the best price/performance at the time. What about the RX 480? It is the best price/performance right now.

This place is Nvidia-biased. You see it in their tone regarding new driver releases as well. They always have to degrade AMD in one way or another.
 
Sadly enough it's not just tom's hardware that says it. Although I'm pretty sure AMD's gonna patch this with a bios update. The card itself puts out decent performance I'll give them that but it comes at a cost.

AMD states officially that the problem is not present, or affects only a few cards.

AMD seem to care since their chunk of the mobile market is on a collision course with rock-bottom lately. Cutting down isn't an option when the competition can stick in a full-fledged desktop GPU due to better power design and better thermals.

Discrete graphics chips are becoming a rarity on the mobile market anyway, except in those boxes.

As for external GPU's they are kind of expensive and I don't think anybody on a budget will throw out $300 to stick a $200 GPU in it plus the value of a decent enough notebook. Again we're talking about the mainstream market. As for the desktop part if the AIB's manage to cool that thing down then it's a steal for that price. But if a mainstream GPU which by default shouldn't be that beefy spec wise manages to put out those thermals and that power consumption it would make things difficult for AMD to grab a seat at the higher tier table. I personally wouldn't want a space heater in my case just because it can "async" and bumps up the frame rates by 5%-10%. I rather have something that lets me stick to 60 FPS and runs cool enough so if needed in the future I can crank up the clocks to make up for the new game engines and whatnot.

Value notebooks are equipped with Intel's "extreme" graphics anyway. Also, on the mobile side, a GPU integrated with the CPU is the future; discrete mobile GPUs will vanish quite soon, as soon as memory bandwidth problems are solved. HBM2?

Async has been shown to bump up framerates by 40% in the best case. Also, even current DX11 games with DX12 patches show so much improvement that DX11 speed can be taken out of consideration. Remember that the games tested in this article are quite old, so most gamers have already played them. High-end graphics card sales are also quite a small portion of all sales.
 
Dude, I don't object to much of what you keep saying, but no, there's no such thing as a "long-lasting video card" unless you go for the highest-end card. And more features on board mean more cost too, you know this. And no, DX11 is not obsolete. Come on, man, last I checked the majority of games are still DX11-based; in fact, there's only a handful of DX12 games around so far, and only a couple of them are at all popular (Hitman, Tomb Raider, etc.). The most popular in benchmarks is probably Ashes of the Singularity, which honestly isn't that appealing to the majority of people, though it works as a benchmark tool. It seems to me AMD rushed into the DX12 thing too early, OR maybe they could have designed the new arch with more efficiency in mind.
 

Most games right now are DX11, but if I were buying a new graphics card, I would look to the future and DX12. Generally there are two kinds of buyers: those who think about the future and those who don't. Nvidia fanboys seem to fall into the latter category, like those dumbasses who bought a GTX 980 Ti for $700 two months ago when its value now is at most $400; that makes $150 a month, or $5 a day. It was no surprise that 14/16 nm cards are awesome against over-four-year-old 28 nm parts.
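The depreciation figures in that post check out arithmetically (assuming 30-day months):

```python
# The resale-value arithmetic from the post above: a $700 GTX 980 Ti
# worth ~$400 two months later, assuming 30-day months.
def depreciation(purchase_usd, resale_usd, months, days_per_month=30):
    """Return (loss per month, loss per day)."""
    loss = purchase_usd - resale_usd
    return loss / months, loss / (months * days_per_month)

per_month, per_day = depreciation(700, 400, 2)
print(per_month, per_day)  # 150.0 per month, 5.0 per day
```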
 
Good performance for the price. Like others, though, I am really disappointed with the power consumption. Hopefully this will be fixed with driver optimizations/BIOS updates in the future.

It's not surprising that DirectX 12 features in hardware (like ACEs) mean higher power consumption. Still a better choice than Nvidia's software async shaders. They promised drivers last August; still waiting :D


You are still waiting? For drivers? For async shaders? On an NVIDIA card? BUT WHY? Are you then going to rush out and purchase an Nvidia card?
Especially if the driver-level version outperforms the hardware version: software is way easier to update than hardware. I would like to see the driver version now as well, to see if it outperforms the hardware version. Lol. I hear bubbles popping...
 
The people who bought a 980 Ti still get a solid gaming experience. I bought an R9 380X eight months ago and paid $260 (in my country), but now there's the RX 480 for 240 bucks; you see, we're all screwed :) By the way, Nvidia's new-gen cards aren't bad in DX12 titles either; they seem to use their raw strength to make up for their lack of hardware async. That's what I mean: who gives a fck if a card without hardware schedulers can perform on par with one that has them on board?
 