
Radeon R9 390X, R9 390 & R9 380 Review: One last rebadge before unleashing the Fury

By Steve
Jun 18, 2015
  1. Steve

    AMD made great strides in January 2012 when it released its first GCN-based GPU. Codenamed 'Tahiti XT', the Radeon HD 7970 squeezed 4.3 billion transistors onto a 352 mm² die. In the time since, the company has delivered GCN 1.1 and GCN 1.2 upgrades, but save for the Radeon R9 290 and R9 290X, none of its releases have been particularly exciting.

    We've been wondering what AMD's next move would be. Surprisingly -- or perhaps unsurprisingly for the cynics among us -- that next move is yet another round of rebadged Radeons, at least until the R9 Fury X lands next week.

    Since these first Radeon 300 series GPUs are rebranded, AMD is ripping the band-aid off quickly by releasing them all together rather than trickling them out over the next few months. Today's launch brings the Radeon R9 390X, R9 390, R9 380, R7 370 and R7 360. We had lingering hopes for truly updated GPUs, but what's old is new again at AMD, so say hello to a familiar family of Radeons.

    Read the complete review.

     
  2. Burty117

    Burty117 TechSpot Chancellor Posts: 2,920   +687

    I kinda hope the Fury X destroys the 980 Ti, bringing Nvidia prices down to something I can afford :)
    That and AMD needs to stay in the competition. They did show Lara Croft running at 45fps @ 5K res with all the graphics options turned to the max (not sure about anti-aliasing though).

    I hope it's enough!
     
  3. madboyv1

    madboyv1 TechSpot Paladin Posts: 1,333   +267

    I figured the initial 300 series cards were going to be a bust, but at least the 390 actually offers some value to Radeon users. I'm honestly much more curious about the performance of the Fury Nano, even more so than the Fury and Fury X. I mean, my next build will include a ship-of-the-line card and not the Nano, but I'm always in the mood for designing and/or building SFF computers.
     
  4. Nobina

    Nobina TS Evangelist Posts: 861   +340

    I bet many will be disappointed because there was so much hype around the new AMD cards. Of course, we haven't seen the Fury (X) yet.
     
  5. VitalyT

    VitalyT Russ-Puss Posts: 3,156   +1,431

    AMD always makes room for a bit of extra disappointment ;) There is more to come ;)

    It could be Furiously disappointing this time :)
     
  6. It still seems like Nvidia is the best bang for your dollar: their cards run cooler, consume less electricity and overclock better... AMD really needs to pick up its game.

    Only 2 cards worth getting for the normal consumer:
    $200 range - GTX 960
    $300 range - GTX 970
     
  7. I beg to differ, in both cases.
    First, in the $200 range the R9 380 is a better option in the long run because the GTX 960 has disappointing memory bandwidth. In the future, as VRAM performance requirements increase, the GTX 960 will start to fall behind. And yes, Nvidia's card is more energy efficient, but as you can see in the review the R9 380 system doesn't even break 250W consumption, meaning energy savings will be insignificant and neither GPU will have any sort of issue with PSU wattage, so it's not really that relevant.
    As for the $330 slot, AMD offers you slightly higher performance (and the lead increases at higher resolutions) and over twice as much VRAM. Especially after the "3.5 GB" story, the R9 390 is easily the best option, particularly for playing at 2560x1440 and higher. Again, Nvidia does have a slight edge in power consumption, but 20W less won't give you any significant power savings, and with the R9 390 system consuming around 300W it won't be any worry for PSUs either.
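    For perspective, here's some rough arithmetic on what a ~20W gap actually costs. The 4 hours of gaming a day and $0.12/kWh rate are assumptions on my part, not figures from the review:

    # Rough annual cost of a 20W power-draw difference.
    # Assumed usage and electricity rate, not figures from the review.
    delta_watts = 20
    hours_per_day = 4
    rate_per_kwh = 0.12

    kwh_per_year = delta_watts * hours_per_day * 365 / 1000  # ~29.2 kWh
    cost_per_year = kwh_per_year * rate_per_kwh              # ~$3.50
    print(f"{kwh_per_year:.1f} kWh/year, about ${cost_per_year:.2f}/year")

    A few dollars a year, so the efficiency edge really is negligible for buyers at this tier.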
    On the other hand, the 390X is a disappointment. They might as well have not released it and let the 390 be their 300-series flagship.
     
    Puiu likes this.
  8. crabbos

    crabbos TS Rookie

    If the Fury isn't awesome I'll soon be trading in my 280X for an Nvidia alternative.
     
  9. noel24

    noel24 TS Maniac Posts: 304   +154

    It just reminds me of 2008, when I picked up an MSI 8800GT Zilent for around 100 euros while the freshly released 9800GT could be yours for a mere 150-200 euros. So it's not without precedent, but it still looks like Chapter 11 is breathing down AMD's neck.
     
  10. Bit of a fail here. I guess my 290X alone or in Crossfire should be fine for now. Only have the Fury X, Fury Nano and Fury to look at now, since single cards are better than multiple.
     
  11. Hashtagz

    Hashtagz TS Rookie

    Quite a few mistakes and some misleading stuff in this article.

    "The R9 380 averaged 48fps at 1080p, making it just 2% faster than the R9 285 and 4% faster than the GTX 960. That said, it was a whopping 21% slower than the R9 280X."

    In this example the R9 280X pulled 51 FPS. To be 21% faster it would've needed ~58 FPS.
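    Checking the arithmetic with the two figures quoted above (48 fps and 51 fps):

    # Sanity check: R9 380 at 48 fps vs R9 280X at 51 fps.
    r9_380, r9_280x = 48, 51

    faster = (r9_280x / r9_380 - 1) * 100  # 280X over 380: ~6.3%
    slower = (1 - r9_380 / r9_280x) * 100  # 380 under 280X: ~5.9%
    needed = r9_380 * 1.21                 # ~58 fps for a true 21% gap
    print(f"280X is {faster:.1f}% faster; a 21% lead would need {needed:.1f} fps")

    So the real gap between the two is about 6%, not 21%.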

    And you keep bringing up the 7950 when talking about the R9 380, aka the R9 285. They're nothing alike. The 285/380 are based on Tonga, a core released for the first time last year. The 7950/280 are indeed a much older core.

    Also, the Tonga core is not hindered by the 256-bit bus since it has color compression, which the 384-bit 7950/R9 280 do not.
     
    Last edited by a moderator: Jun 18, 2015
    Tibeardius likes this.
  12. Peter Farkas

    Peter Farkas TS Addict Posts: 214   +67

    Nice article, thanks TechSpot! Good to see I am still good with my GTX 970. :)
     
    hahahanoobs and Steve like this.
  13. cldmstrsn

    cldmstrsn TS Booster Posts: 102   +52

    I'm curious as to why you kept HairWorks on for the tests?
     
  14. Puiu

    Puiu TS Evangelist Posts: 1,915   +537

    I'm surprised at how much cooler they ran. Any news on how well they overclock?
     
  15. Puiu

    Puiu TS Evangelist Posts: 1,915   +537

    You need to learn the difference between TDP, power consumption and peak power consumption.
    All reviews have the GTX 980 consuming around 30-50W less than the 290X, depending on the game.
     
    Last edited: Jun 18, 2015
    SuperVeloce and Cryio like this.
  16. This comment is absolutely hilarious.
    First of all, TDP is not related to power consumption, like Puiu just said. TDP is the estimated amount of heat that must be dissipated by the cooler, which AMD/Nvidia can control through chip leakage. It's possible to have a very power-efficient chip with a lot of leakage and therefore a very high TDP, the same way it's possible to have a not-very-efficient chip with low leakage, which would have a low TDP (but probably burn itself to death, since it won't transfer enough heat to the cooler and internal temperatures would rise). Nvidia saying a GPU has a TDP of 150W DOES NOT MEAN the card consumes 150W under load. Similarly, an AMD card with a 250W+ TDP doesn't mean the card consumes 250W+. On top of that, AMD and Nvidia calculate TDP differently, so you can't even correlate them directly to begin with.
    Also, outlier cases like your "peak power consumption" are not a relevant way to compare power consumption. Every card can be prone to spikes so long as the power is available; if you look at peak power consumption you'll be overestimating the power consumption of every GPU, from both AMD and Nvidia, by a lot.
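    To illustrate the peak-vs-average point with some made-up per-second readings (hypothetical numbers, not measurements of any real card):

    # Hypothetical one-second power samples (W) for a card under load,
    # just to show how a single spike distorts the "peak" figure.
    samples = [212, 218, 224, 231, 219, 286, 221, 215, 228, 222]

    average = sum(samples) / len(samples)  # ~228 W
    peak = max(samples)                    # 286 W
    print(f"average: {average:.0f} W, peak: {peak} W")

    One brief spike puts the peak nearly 60W above what the card typically draws, which is why peak figures overstate consumption for every GPU.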
    Finally, you didn't actually believe the advertised TDPs for the GTX 970 and the R9 290X somehow meant the 970 manages to be as fast as the 290X while using just half as much power, did you? I know some people like to drink Nvidia's Kool-Aid, but you'd be drowning in it and in desperate need of a lifeguard.
    And where should the people who run this site put you for coming here to make a baseless complaint like that, while being ignorant yourself about what TDP and power consumption actually are?
     
    SuperVeloce likes this.
  17. drocdoc

    drocdoc TS Rookie

    It's so sad to see all the Nvidia fanboys' comments about the 300 and Fury series. Nobody expected the 300 series to blow away the competition, but what we did expect was better performance at a lower price point, and that's exactly what AMD delivered.

    GTX 750 Ti < R7 370 ($140 vs $150)
    GTX 960 < R9 380 ($210 vs $199)
    GTX 970 < R9 390 ($350 vs $329)
    GTX 980 < R9 390X ($530 vs $429)
    GTX 980 Ti < Fury and Fury X ($670 vs $549 and $649)
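    Taking those prices at face value (as quoted above, unverified), the gap at each tier works out to:

    # Price gaps per tier, using the unverified numbers quoted above.
    pairs = {
        "GTX 750 Ti vs R7 370": (140, 150),
        "GTX 960 vs R9 380": (210, 199),
        "GTX 970 vs R9 390": (350, 329),
        "GTX 980 vs R9 390X": (530, 429),
        "GTX 980 Ti vs Fury X": (670, 649),  # Fury itself listed at $549
    }
    for matchup, (nvidia, amd) in pairs.items():
        diff = nvidia - amd
        verdict = "cheaper" if diff > 0 else "pricier"
        print(f"{matchup}: AMD is ${abs(diff)} {verdict}")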
     
  18. I simply do not understand why so many people are complaining about the 390X being a respin of the 290X. Nvidia does the same thing all the time and no one ran around claiming the sky was falling when they did it. Granted, the cost of the 390X is too high... but that is almost always the case at launch; three months from now they will most likely be priced much more in line with the 290X.

    Honestly, do people think AMD or even Nvidia has the money to replace every single card in their entire lineup at one time? This is ridiculous and equivalent to expecting Chevy to replace every car they sell at once... business just doesn't work that way.

    The new Fiji-based card tech like HBM will filter down to the mainstream cards once production ramps and yields improve. People need to remember this isn't a simple memory swap to the next-best version of GDDR... this is a whole architecture change that will take time to filter down.

    In the meantime, memory speed and size boosts and other small improvements on the current 290/280 platform make sense, since the manufacturing yields are solid and the cards have been successful in the marketplace.
     
  19. Kelorth

    Kelorth TS Rookie

    So, how much did NVidia pay you this time? Lmao.
     
  20. Kelorth

    Kelorth TS Rookie

    So biased it's pathetic tbh.
     
  21. It's not biased; it's probably disappointingly based on true benchmarks. It agrees mainly with the fact that the new 300 series benchmarks improve at higher resolutions thanks to the extra VRAM. However, it also asserts that the performance gain isn't huge at 1080p, at least not significant relative to the older 200 series. That isn't surprising, because AMD admitted to using the same, older technology. I don't see how this can come as a surprise to you. AMD consumers wanted more performance per dollar at higher resolutions, and that's exactly what you got. The frame-rate issues in the 960 vs 380/280X comparison at 1080p support the fact that contemporary games often ship optimized for Nvidia's hardware. That's also true. The raw power of the 380 is probably similar to that of the 960, but AMD has always had difficulty keeping up simply because of game-to-GPU compatibility.
     
  22. That being said, @Kelorth, you should be impressed by the 300 series benchmarks. While some Nvidia users experience issues with far too little VRAM, AMD decided to take advantage of those shortcomings by providing its users with more VRAM at an affordable price.
     
  23. Cryio

    Cryio TS Booster Posts: 192   +58

    The only good things about the new 300 series are that the R9 380 now has a 4 GB option (basically now battling the previously superior 4 GB 960) and that the R9 390 is the same price as the 290X, seems to perform on par or better, and also has 8 GB of VRAM.

    So 2 GPUs out of the 300 series are winners in my book so far.

    Not to mention a stock 390X (while a respin of a 290X) is still cheaper than a 980 and almost as fast. Certainly faster than a 970 in the majority of cases.
     
  24. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Thanks for another timely review Steve.
    I notice that ComputerBase, who also did a review of the 300-series cards, were unable to use the 300-series driver with the 200-series cards.
    8GB 290Xs have been around for a while. A current comparison:
    Sapphire R9 290X Tri-X 8GB.....$375....1020MHz core/5500MHz mem (eff.)
    Sapphire R9 390X Tri-X 8GB.....$430....1055MHz core/6000MHz mem (eff.)
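    Working out the relative differences between those two Sapphire cards:

    # Relative deltas between the two cards listed above.
    price_290x, price_390x = 375, 430
    core_290x, core_390x = 1020, 1055
    mem_290x, mem_390x = 5500, 6000

    print(f"price: +{(price_390x / price_290x - 1) * 100:.1f}%")        # ~14.7%
    print(f"core clock: +{(core_390x / core_290x - 1) * 100:.1f}%")     # ~3.4%
    print(f"memory clock: +{(mem_390x / mem_290x - 1) * 100:.1f}%")     # ~9.1%

    Roughly 15% more money for a 3-9% clock bump over the existing 8GB 290X.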
     
    Steve likes this.
  25. Steve

    Steve TechSpot Editor Topic Starter Posts: 2,218   +1,244

    Thanks Hashtagz. For an article with “quite a few mistakes” you were kind to bring up only a single typo regarding a percentage figure.

    Not sure it is misleading to say the R9 380 has its roots in the 7950. We understand what Tonga is, we reviewed it on launch day.

    The game looks far better with it turned on...

    Nvidia can't have paid much for this article since we picked AMD for 4 of the possible 6 recommendations...

    http://www.techspot.com/guides/912-best-graphics-cards-2014/page7.html

    Trust me that’s not what is pathetic here.
     
    Peter Farkas and Puiu like this.
