Radeon R9 390X, R9 390 & R9 380 Review: One last rebadge before unleashing the Fury

Steve

Staff member


AMD made great strides in January 2012 when it released its first GCN-based GPU. Codenamed 'Tahiti XT', the Radeon HD 7970 squeezed 4.3 billion transistors onto a 352 mm² die. In the time since, the company has delivered GCN 1.1 and GCN 1.2 upgrades but save for the Radeon R9 290 and R9 290X, none of its releases have been particularly exciting.

We've been wondering what AMD's next move would be. Surprisingly -- or perhaps unsurprisingly for the cynics among us -- that next move is yet another round of rebadged Radeons, at least until the R9 Fury X lands next week.

Since these first Radeon 300 series GPUs are rebranded, AMD is ripping the band-aid off quickly by releasing them all together versus trickling them out over the next few months. Today's launch brings the Radeon R9 390X, R9 390, R9 380, R7 370 and R7 360. We had lingering hopes for truly updated GPUs, but what's old is new again at AMD, so say hello to a familiar family of Radeons.

Read the complete review.

 
I kinda hope the Fury X destroys the 980 Ti, bringing Nvidia's prices down to something I can afford :)
That, and AMD needs to stay in the competition. They did show Lara Croft running at 45fps @ 5K res with all the graphics options turned to the max (not sure about anti-aliasing though).

I hope it's enough!
 
I figured the initial 300 series cards were going to be a bust, but at least the 390 actually offers some value to Radeon users. I'm honestly much more curious about the performance of the Fury Nano, even more so than the Fury and Fury X. I mean, my next build will include a ship-of-the-line card and not the Nano, but I'm always in the mood for designing and/or building SFF computers.
 
I bet many will be disappointed because there was so much hype about the new AMD cards. Of course, we haven't seen the Fury (X) yet.
 
I bet many will be disappointed because there was so much hype about the new AMD cards. Of course, we haven't seen the Fury (X) yet.

AMD always makes room for a bit of extra disappointment ;) There is more to come ;)

It could be Furiously disappointing this time :)
 
It still seems like Nvidia is the best bang for your dollar; they run cooler, consume less electricity and overclock better... AMD really needs to pick up their game.

Only 2 cards worth getting for the normal consumer:
$200 range: GTX 960
$300 range: GTX 970
 
It still seems like Nvidia is the best bang for your dollar; they run cooler, consume less electricity and overclock better... AMD really needs to pick up their game.
Only 2 cards worth getting for the normal consumer:
$200 range: GTX 960
$300 range: GTX 970
I beg to differ, in both cases.
First, in the $200 range the R9 380 is a better option in the long run due to the GTX 960's disappointing memory bandwidth. In the future, as VRAM performance requirements increase, the GTX 960 will start to fall behind. And yes, Nvidia's card is more energy efficient, but as you can see in the review the R9 380 system doesn't even break 250W consumption, meaning the energy savings will be insignificant and neither GPU will have any sort of issue with PSU wattage, so it's not really that relevant.
As for the $330 slot, AMD offers you slightly higher performance (and the lead grows at higher resolutions) and over twice as much VRAM. Especially after the "3.5 GB" story, the R9 390 is easily the best option, particularly for playing at 2560x1440 and higher. Again, Nvidia does have a slight edge in power consumption, but 20W less won't give you any significant power savings (see the quick math below), and with the R9 390 system consuming around 300W it won't be any worry for PSUs either.
On the other hand, the 390X is a disappointment. They might as well have not released it and let the 390 be their 300-series flagship.
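To put that 20W in perspective, here's a minimal back-of-the-envelope sketch; the hours of gaming per day and the electricity rate are my own assumptions, not figures from the review, so adjust for your own usage:

    # Rough annual cost of a ~20 W draw difference between two GPUs.
    # Assumed usage and rate (not from the review): 4 h/day at $0.12/kWh.
    extra_watts = 20
    hours_per_day = 4
    price_per_kwh = 0.12

    extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    print(f"Extra energy: {extra_kwh_per_year:.1f} kWh/year")              # ~29.2 kWh
    print(f"Extra cost:   ${extra_kwh_per_year * price_per_kwh:.2f}/year") # ~$3.50

At a few dollars a year, the efficiency gap really is negligible for both the power bill and the PSU.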
 
It just reminds me of 2008, when I picked up an MSI 8800 GT Zilent for around 100 euros while the freshly released 9800 GT could be yours for a mere 150-200 euros. So nothing unprecedented here, but it still looks like Chapter 11 is breathing down AMD's neck.
 
Bit of a fail here. I guess my 290X alone or in Crossfire should be fine for now. Only the Fury X, Fury Nano and Fury to look at now, since single cards are better than multiple.
 
Quite a few mistakes and some misleading stuff in this article.

"The R9 380 averaged 48fps at 1080p, making it just 2% faster than the R9 285 and 4% faster than the GTX 960. That said, it was a whopping 21% slower than the R9 280X."

In this example the R9 280X pulled 51 FPS. To be 21% faster it would've needed ~58 FPS.

And you keep bringing up the 7950 when talking about the R9 380 aka R9 285. They're nothing alike. The 285/380 are based on Tonga, a core released for the first time last year. The 7950/280 are indeed a much older core.

Also, the Tonga core is not hindered by the 256-bit bus since it has compression, which the 384-bit 7950/R9 280 do not.
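A quick sanity check of that arithmetic, using only the fps figures from the quoted passage:

    # Verify the review's "21% slower" claim from the quoted numbers.
    r9_380 = 48    # fps at 1080p, per the quoted passage
    r9_280x = 51   # fps for the R9 280X in the same test

    print(f"280X lead over 380: {(r9_280x / r9_380 - 1) * 100:.1f}%")  # ~6.2%
    print(f"fps needed for a 21% lead: {r9_380 * 1.21:.1f}")           # ~58.1

So the real gap in that chart is about 6%, not 21%.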
 
I made an account to say that your power consumption charts are off the charts! Downright fantasy. Dude, are you serious? Where did you get those stupid numbers?

Total system power consumption with the R9 390X is 300 watts?? And you show it's identical to the 290X? Both Guru3D and TechPowerUp showed the 290X ALONE consumes 280 watts at peak!! Your charts tell us that 300W is system power consumption? The 390X actually has a lower TDP than the 290X, yet the wattage draw shown here is identical to the 290X.

But this is the fun part. Your gibberish system power consumption charts (Metro: Last Light) show that the R9 390 (TDP 275W) consumes less than a GTX 980, which is a 165W TDP card! LOL!! And in Tomb Raider your readings show the 390 consumes only 10 watts more than the 970, a 148W card?

Your readings are a joke! If I were Nvidia I would put you on a blacklist for inability to tell the difference between a 165W card and a 275W one. But thanks for the laugh!
You need to learn the difference between TDP, power consumption and peak power consumption.
All reviews have the GTX 980 consuming around 30-50W less than the 290X, depending on the game.
 
I made an account to say that your power consumption charts are off the charts! Downright fantasy. Dude, are you serious? Where did you get those stupid numbers?
Total system power consumption with the R9 390X is 300 watts?? And you show it's identical to the 290X? Both Guru3D and TechPowerUp showed the 290X ALONE consumes 280 watts at peak!! Your charts tell us that 300W is system power consumption? The 390X actually has a lower TDP than the 290X, yet the wattage draw shown here is identical to the 290X.
But this is the fun part. Your gibberish system power consumption charts (Metro: Last Light) show that the R9 390 (TDP 275W) consumes less than a GTX 980, which is a 165W TDP card! LOL!! And in Tomb Raider your readings show the 390 consumes only 10 watts more than the 970, a 148W card?
This comment is absolutely hilarious.
First of all, TDP is not the same thing as power consumption, like Puiu just said. TDP is the estimated amount of heat that must be dissipated by the cooler, which AMD/Nvidia can influence through chip leakage. It's possible to have a very power-efficient chip with a lot of leakage and therefore a very high TDP, the same way it's possible to have a not very efficient chip with low leakage, which would have a low TDP (but would probably burn itself to death, since it wouldn't transfer enough heat to the cooler and internal temperature would rise). Nvidia saying a GPU has a TDP of 150W DOES NOT MEAN the card consumes 150W under load. Similarly, an AMD card having a 250W+ TDP doesn't mean the card consumes 250W+. On top of that, AMD and Nvidia calculate TDP differently, so you can't even correlate them directly to begin with (see the numbers below).
Also, outlier cases like your "peak power consumption" are not a relevant way to compare power consumption. Every card can be prone to spikes so long as the power is available; if you look at peak power consumption you'll be overestimating the power consumption of every GPU, from both AMD and Nvidia, by a lot.
Finally, you didn't actually believe the advertised TDPs for the GTX 970 and the R9 290X somehow meant the 970 manages to be as fast as the 290X while using just half as much power, did you? I know some people like to drink Nvidia's kool-aid, but you'd be drowning in it and in desperate need of a lifeguard.
Your readings are a joke! If I were Nvidia I would put you on a blacklist for inability to tell the difference between a 165W card and a 275W one.
And where should the people who run this site put you for coming here to make a baseless complaint like that, while yourself being ignorant of what TDP and power consumption actually are?
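To see how poorly TDP tracks measured draw, here's a minimal sketch. The 165W figure is Nvidia's published TDP for the GTX 980; the 250W figure for the 290X is the commonly cited number rather than an official chart value, and the 30-50W measured range is the one quoted above:

    # Rated TDP gap vs. measured wall-draw gap, R9 290X vs. GTX 980.
    tdp_290x = 250          # W, commonly cited TDP for the 290X
    tdp_gtx980 = 165        # W, Nvidia's rated TDP
    measured_gap = (30, 50) # W, typical total-system difference in reviews

    print(f"Rated TDP gap: {tdp_290x - tdp_gtx980} W")  # 85 W
    print(f"Measured gap:  {measured_gap[0]}-{measured_gap[1]} W")
    # The rated gap is roughly double what wattmeters actually report,
    # which is why comparing TDPs across vendors tells you little
    # about real power consumption.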
 
It's so sad to see all the Nvidia fanboys' comments about the 300 and Fury series. Nobody expected the 300 series to blow away the competition, but what we did expect was better performance at a lower price point, and that's what we got. AMD delivered.

GTX 750 Ti < R7 370: $140 vs $150
GTX 960 < R9 380: $210 vs $199
GTX 970 < R9 390: $350 vs $329
GTX 980 < R9 390X: $530 vs $429
GTX 980 Ti < Fury and Fury X: $670 vs $549 and $649
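Working the price gaps out from that list (launch prices as quoted above; performance not factored in):

    # Price gap per tier, AMD vs. Nvidia (positive = AMD is cheaper).
    pairs = {
        "R7 370 vs GTX 750 Ti": (150, 140),
        "R9 380 vs GTX 960":    (199, 210),
        "R9 390 vs GTX 970":    (329, 350),
        "R9 390X vs GTX 980":   (429, 530),
    }
    for tier, (amd, nvidia) in pairs.items():
        saving = (nvidia - amd) / nvidia * 100
        print(f"{tier}: ${amd} vs ${nvidia} ({saving:+.0f}%)")
    # Biggest gap: the 390X undercuts the 980 by about 19%.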
 
I simply do not understand why so many people are complaining about the 390X being a respin of the 290X. Nvidia does the same thing all the time and no one ran around claiming the sky was falling when they did it. Granted, the cost of the 390X is too high... but that is almost always the case at launch; three months from now they will most likely be priced much more in line with the 290X.

Honestly, do people think AMD or even Nvidia has the money to replace every single card in their entire lineup at one time? This is ridiculous and equivalent to expecting Chevy to replace every car they sell at once... business just doesn't work that way.

The new Fiji-based card tech like HBM will filter down to the mainstream cards once production ramps and yields improve. People need to remember this isn't a simple memory swap to the next-best version of GDDR... this is a whole architecture change that will take time to filter down.

In the meantime, memory speed and size boosts and other small improvements on the current 290/280 platform make sense, since the manufacturing yields are solid and the cards have been successful in the marketplace.
 
It's not biased; it's probably disappointingly based on true benchmarks. It agrees mainly with the fact that the new 300 series benchmarks improve at higher resolutions due to more VRAM. However, it also asserts that the performance gain isn't huge at 1080p, at least not significant relative to the older 200 series. That isn't surprising because AMD admitted to using the same, older technology. I don't see how this can come as a surprise to you. AMD consumers wanted more performance for the money at higher resolutions; that's exactly what you got.

The frame-rate issues in the 960 vs 380/280X comparison at 1080p support the fact that contemporary games are optimized for Nvidia's technology, or ship with drivers that cater to Nvidia products. That's also true. The raw power of the 380 is probably similar to that of the 960, but AMD has always had difficulty keeping up simply because of game-to-GPU compatibility.
 
That being said @Kelorth, you should be impressed by the 300 series benchmarks. While some Nvidia users experience issues with far too little VRAM, AMD decided to take advantage of that weakness by providing its users with more VRAM at an affordable price.
 
The only good things about the new 300 series are that the R9 380 now has a 4 GB option (basically now battling the previously superior 4 GB 960) and that the R9 390 is the same price as the 290X, seems to perform better or on par, and also has 8 GB of VRAM.

So 2 GPUs out of the 300 series are winners in my book so far.

Not to mention a stock 390X (while a respin of a 290X) is still cheaper than a 980 and almost as fast. Certainly faster than a 970 in the majority of cases.
 
Thanks for another timely review, Steve.
I notice that ComputerBase, who also reviewed the 300-series cards, were unable to use the 300-series driver with the 200-series cards.
That being said @Kelorth, you should be impressed by the 300 series benchmarks. While some Nvidia users experience issues with far too little VRAM, AMD decided to take advantage of that weakness by providing its users with more VRAM at an affordable price.
8 GB 290Xs have been around for a while. A current comparison:
Sapphire R9 290X Tri-X 8GB.....$375....1020MHz core/5500MHz mem (eff.)
Sapphire R9 390X Tri-X 8GB.....$430....1055MHz core/6000MHz mem (eff.)
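Running the deltas on those two listings (numbers straight from the comparison above):

    # 390X premium over the 8 GB 290X, same Sapphire Tri-X cooler.
    r290x = {"price": 375, "core": 1020, "mem": 5500}
    r390x = {"price": 430, "core": 1055, "mem": 6000}

    for spec in ("price", "core", "mem"):
        print(f"{spec}: +{(r390x[spec] / r290x[spec] - 1) * 100:.1f}%")
    # price: +14.7%, core: +3.4%, mem: +9.1%
    # A ~15% premium for single-digit clock bumps.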
 
Quite a few mistakes and some misleading stuff in this article.

"The R9 380 averaged 48fps at 1080p, making it just 2% faster than the R9 285 and 4% faster than the GTX 960. That said, it was a whopping 21% slower than the R9 280X."

In this example the R9 280X pulled 51 FPS. To be 21% faster it would've needed ~58 FPS.

And you keep bringing up the 7950 when talking about the R9 380 aka R9 285. They're nothing alike. The 285/380 are based on Tonga, a core released for the first time last year. The 7950/280 are indeed a much older core.

Also, the Tonga core is not hindered by the 256-bit bus since it has compression, which the 384-bit 7950/R9 280 do not.

Thanks Hashtagz; for an article with "quite a few mistakes" you were kind to only bring up a single typo regarding a percentage figure.

Not sure it is misleading to say the R9 380 has its roots in the 7950. We understand what Tonga is, we reviewed it on launch day.

I'm curious as to why you kept HairWorks on for the tests.

The game looks far better with it turned on...

So, how much did Nvidia pay you this time? Lmao.

Nvidia can't have paid much for this article since we picked AMD for 4 of the possible 6 recommendations...

https://www.techspot.com/guides/912-best-graphics-cards-2014/page7.html

So biased it's pathetic tbh.

Trust me, that's not what is pathetic here.
 