Radeon R9 390X, R9 390 & R9 380 Review: One last rebadge before unleashing the Fury

Such a great deal of partiality in this review. Its introduction was refuted by its own findings, and its conclusion was equally vapid.

There are some observations that should have made it OBVIOUS that the 390/X GPUs are NOT the same GPUs as found on the 290/X. HardwareCanucks's review was quite balanced - possibly because they were told by AMD what was done to the GPU.

Notice how this review claims, over and over, that the 390/X *IS* the 290/X just overclocked and with more memory? Notice how the extra RAM and higher clocks didn't result in any extra power draw?

AMD made some minor changes to the Hawaii GPU, mostly for power efficiency (saving about 40W - which was then taken right back with more RAM and higher clock speeds...), but they also slightly improved performance per cycle (2-5%). This can be seen in the results, most notably for the 390, in this review.

Now the 390, at effectively R9 290X clocks, is about 8% faster than the R9 290X and draws less power, while having twice the RAM at 20% higher clocks! That's noteworthy, I'd say.

This is not to say that the 390/X are worlds better, but they ARE better than the 290/X. They are also the new SECOND TIER of performance - and it is common for the last generation's top tier to fall back to second tier. AMD just didn't change the names...

Also, the 285 became the 380, and it is faster than the 280X, contrary to the article's claims (and the article's own results, again, bore this out).
 
On the other hand, the 390X is a disappointment. They might as well have not released it and let the 390 be their 300-series flagship.

Agreed!

They could have released the 390X with lower clocks and just beaten out the old 290X - and even offered 4GB of 6GHz GDDR5 as an option (so we could see the power savings) - and they would have had a killer option.
 
I bet many will be disappointed because there was so much hype about the new AMD cards. Of course, we haven't seen the Fury (X) yet.

AMD always makes room for a bit of extra disappointment ;) There is more to come ;)

It could be Furiously disappointing this time :)

Would you stop bashing AMD on every article on the website? It's obvious you have an agenda but at least do it convincingly. The only disappointment I see here is that the cards aren't a bit faster. If they really were, Nvidia would look worse than it does now.
 

The R9 290X is not the same GPU as the R9 390X, you say? HardwareCanucks were told by AMD what was changed. Okay, so what exactly has been changed?

So AMD made power saving improvements, but it's the exact same GCN 1.1 architecture. Adding more memory that isn't being used won't necessarily increase power consumption. Moreover, a 50MHz boost in core clock speed isn't going to equate to massive changes in power consumption; we would also have to look more closely at the voltages Gigabyte and HIS were using.
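As a rough sketch of why a small clock bump barely moves power draw, here is the classic first-order CMOS dynamic-power approximation (P scales with f·V²); the clock and voltage figures below are hypothetical, not measured values for these cards:

```python
# First-order CMOS dynamic power approximation: P scales with f * V^2.
# All figures are hypothetical, not measurements of the 290X/390X.

def relative_power(f_old: float, f_new: float, v_old: float, v_new: float) -> float:
    """New dynamic power as a fraction of old, under P ~ f * V^2."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# A 50MHz bump (1000 -> 1050MHz) at unchanged voltage: only ~5% more power.
print(relative_power(1000, 1050, 1.20, 1.20))  # 1.05

# The same bump with a slightly lower dialled-in voltage is roughly a wash.
print(relative_power(1000, 1050, 1.20, 1.17))  # ~1.00
```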

Also, the 285 became the 380, and it is faster than the 280X, contrary to the article's claims (and the article's own results, again, bore this out).

Yet when you average out the results from the 8 games tested it isn’t, so I am not sure which results bore that out.
 
It's also important to note that AMD doesn't need to make changes to the chip itself to increase energy efficiency, they can do it simply by better binning. And it's entirely normal for the manufacturing process for a given chip to become more efficient over the time that chip is being produced. Nothing needs to be physically changed in Hawaii to see the benefits we're seeing in 390/390X reviews.
Also, as far as I know, the 390 and 390X do not have the same improvements Tonga had in tessellation performance, so it indeed was not updated to GCN 1.2.
Finally, fellow Guest, it's not possible to know whether the 390 runs at the same frequency as the 290X, or if there has been an improvement in per-cycle performance. Hawaii doesn't have a fixed base frequency; under load it starts at the highest possible frequency (the advertised one) and gradually lowers it as load and TDP/temperature demand. Given that these new Hawaii chips are naturally more efficient due to the usual improvements in the manufacturing process, it's more likely that the 390 is simply less aggressive in reducing its frequency under load, compared to the 290X, than that there are physical changes on the chip enabling higher performance at the same frequency.
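A toy model of that boost behavior, just to make the mechanism concrete (every number below is invented; real boost firmware is far more sophisticated):

```python
# Toy model of Hawaii-style boost: start at the advertised clock and back
# off only when the power/thermal budget is exceeded. All numbers invented.

ADVERTISED_MHZ = 1000
MIN_MHZ = 700
POWER_BUDGET_W = 250
STEP_MHZ = 13

def next_clock(current_mhz: int, measured_power_w: float) -> int:
    """Step the clock down under budget pressure, back up when there is room."""
    if measured_power_w > POWER_BUDGET_W:
        return max(MIN_MHZ, current_mhz - STEP_MHZ)
    return min(ADVERTISED_MHZ, current_mhz + STEP_MHZ)

# A better-binned chip draws less power at any given clock, so it trips the
# budget later and sustains a higher average clock under the same load --
# no physical changes to the silicon required.
print(next_clock(1000, 265))  # 987: over budget, throttle one step
print(next_clock(987, 248))   # 1000: back under budget, recover
```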
 

It is also important to note that we are not using reference cards. The Gigabyte R9 290X WindForce 3X card runs a very aggressive power profile whereas the new HIS R9 390X doesn’t. So this is likely why we see little to no difference in consumption.
 
The R9 290X is not the same GPU as the R9 390X you say? HardwareCanucks were told by AMD what was changed. Okay so what exactly has been changed?
Personally I'd say not a lot. One site normalized the clocks on the 390X and 290X. The results tend to speak for themselves.
[chart: clock-for-clock R9 390X vs. R9 290X benchmark results]


The other results look just as tight - well within margin-of-error territory. All that appears to have been done is that AMD validated the memory controllers for 1500MHz. Overclocking headroom is on par with the 290/290X (1200 core / 6600 effective memory seems to be the limit), and the newer card generally sucks more wattage.

So AMD made power saving improvements
That seems discretionary on the part of AIBs (the dialled-in VDDC you mentioned). Pity no one got a reference card to review. Add ~15W for the extra 4GB of GDDR5 (AMD's figures, not mine), plus some allowance for the clock bump, and it seems a wash judging by the reviews around the net, if you take into account that some of the games use neither the full vRAM nor the full GPU resources.
Hardware.info had the largest separation; Kyle's sample was up there also, as was Tom's - all reviewed the MSI Gaming OC version, whereas your HIS, CB's Sapphire, and HotHardware's PowerColor showed little variance.
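The back-of-the-envelope accounting being described, spelled out (the ~15W GDDR5 figure is AMD's; the efficiency savings and clock-bump allowance are placeholder assumptions, not measurements):

```python
# Back-of-the-envelope 390X-vs-290X power accounting, as described above.
# The +15W GDDR5 figure is AMD's; the other two deltas are assumptions.

process_and_binning_w = -40  # maturing process / efficiency tweaks (assumed)
extra_4gb_gddr5_w     = +15  # AMD's figure for the additional 4GB
clock_bump_w          = +20  # allowance for core and memory clock bumps (assumed)

net_w = process_and_binning_w + extra_4gb_gddr5_w + clock_bump_w
print(f"Net change vs. 290X: {net_w:+d}W")  # -5W: a wash, within test noise
```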
 

Thanks for the info DBZ, another great post. I was going to run some clock-for-clock 290X vs. 390X tests over the weekend but Hard|OCP have confirmed what I expected to find.

Also, you asked before about the Catalyst 15.15 driver working with the 200 series; honestly, I haven't had a chance to check yet. AMD only uploaded the driver for us less than 48 hours before the release date/time, so it was a mad scramble to get the review out on time.

Finally having read through all the press materials, tech docs and review guide I found no mention of changes to the core for the R9 390X. If AMD did make improvements here you can bet they would be making a lot of noise about it. Obviously the focus is on Fury and from everything I have seen so far that GPU looks amazing!
 
Finally having read through all the press materials, tech docs and review guide I found no mention of changes to the core for the R9 390X. If AMD did make improvements here you can bet they would be making a lot of noise about it.
I don't think there were any silicon changes. If there were, then minor additions that would have netted a great marketing haul, such as HDMI 2.0 and a hardware H.265 transcode engine, would have been included. R&D constraints were probably also behind the Trinidad (ex-Curacao, ex-Pitcairn) R7 370 still being expected to soldier on with the aforementioned lack of HDMI 2.0 and H.265, as well as no FreeSync or TrueAudio support. Strange choices for a card that might be viewed as an HTPC/light-gamer option in the market.
Obviously the focus is on Fury and from everything I have seen so far that GPU looks amazing!
Well, that sounds very encouraging! The 24th can't arrive soon enough. I'm hoping that the GPU is more than a doubled-up Tonga sans GDDR5 controllers and interface.
 
Would you stop bashing AMD on every article on the website? It's obvious you have an agenda but at least do it convincingly. The only disappointment I see here is that the cards aren't a bit faster. If they really were, Nvidia would look worse than it does now.

I don't really have any agenda with that. I used to like AMD; I had a couple of desktops on their platform, with the last CPU being an AMD FX-60, which ended up burning up, with quite a smell, while my last video card from them was a Radeon 5770. Having said that, the only other PC component that literally went up in flames on me was an nVidia 280 graphics card.

I'm a big proponent of going green on energy saving in every direction, and I don't like it when someone tries to fake new products simply by raising the power profile, which is what AMD has been doing for quite some time. For me personally that one reason is enough to look away from AMD altogether. My current graphics card is an nVidia GTX 780, which was a technical marvel for its time: it is pretty, fast, absolutely quiet, and not very power-hungry.

AMD has disappointed me over the years, not only with the lack of products, but also with the never-ending practice of re-branding the old products and then pitching them as new ones.

From everything I have found so far about this Fury X, it is another power hog, for one thing, and all the performance that AMD claims is, as always, very questionable.

So yes, I don't really like the company. I for one thought they were complete muppets when they bet everything on buying ATI instead of investing that much money into better CPUs. But hey, that's history now, right? They just don't seem to learn much from past mistakes.
 
5 IS 25% LARGER THAN 4, BUT 4 IS ONLY 20% SMALLER THAN 5. MATHEMATICS 102.
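Spelled out, since the asymmetry trips people up (the base of the percentage changes with the direction of comparison):

```python
# The percentage base changes with the direction of the comparison.
print((5 - 4) / 4 * 100)  # 25.0 -> 5 is 25% larger than 4
print((5 - 4) / 5 * 100)  # 20.0 -> 4 is 20% smaller than 5
```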
AMD must have begged manufacturers to bring out these cards, especially by saying you only have to redesign the heatsink shrouds, nothing else has changed. Are AMD the coyote who's gone over the cliff edge, but is still suspended in midair with their legs wibbling furiously?
I suspect these cards will be discounted within weeks/months, or bundles added, to keep them in the market. Just bought a 285, though.
 
I made an account to say that your power consumption charts are off the charts! Downright fantasy. Dude, are you serious? Where did you get those stupid numbers?
Total system power consumption with the R9 390X is 300 watts?? And you show it's identical to the 290X? Both Guru3D and TechPowerUp showed the 290X ALONE consumes 280 watts at peak!! Your chart tells us 300W is the whole system's consumption? The 390X actually has a lower TDP than the 290X, yet the wattage draw shown here is identical to the 290X.
But this is the fun part. Your gibberish system power consumption charts (Metro: Last Light) show that the R9 390 (a 275W TDP card) consumes less than a GTX 980, which is a 165W TDP card! LOL!! And in Tomb Raider your readings show that the 390 consumes only 10 watts more than the 970, a 148W card?
This comment is absolutely hilarious.
First of all, TDP is not related to power consumption, like Puiu just said. TDP is the estimated amount of heat that must be dissipated by the cooler, which AMD/Nvidia can control through chip leakage. It's possible to have a very power-efficient chip with a lot of leakage and therefore a very high TDP, the same way it's possible to have a not very efficient chip with low leakage, which would have a low TDP (but probably burn itself to death, since it won't transfer enough heat to the cooler and internal temperature will rise). Nvidia saying a GPU has a TDP of 150W DOES NOT MEAN the card consumes 150W under load. Similarly, an AMD card with a 250W+ TDP doesn't mean the card consumes 250W+. On top of that, AMD and Nvidia calculate TDP differently, so you can't even correlate them directly to begin with.
Also, outlier cases like your "peak power consumption" are not a relevant way to compare power consumption. Every card can be prone to spikes as long as the power is available; if you look at peak power consumption you'll be overestimating the power consumption of every GPU, from both AMD and Nvidia, by a lot.
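To make the peak-versus-sustained point concrete, a minimal sketch with invented sample values:

```python
# A brief spike dominates the peak reading but barely moves the average.
# Sample values are invented for illustration.

samples_w = [255, 262, 258, 340, 259, 261, 257]  # one transient spike

peak = max(samples_w)
average = sum(samples_w) / len(samples_w)

print(f"peak: {peak}W, average: {average:.0f}W")
# peak: 340W, average: 270W -- judging by the peak overstates draw by ~26%
```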
Finally, don't tell me you actually believed these advertised TDPs for the GTX 970 and the R9 290X somehow meant the 970 manages to be as fast as the 290X while using just half as much power? I know some people like to drink Nvidia's Kool-Aid, but you'd be drowning in it and in desperate need of a lifeguard.
Your readings are a joke! If I were Nvidia I would put you on a blacklist for being unable to tell the difference between a 165W card and a 275W one.
And where should the people who run this site put you for coming here to make a baseless complaint like that, while yourself being ignorant of what TDP and power consumption actually are?

So I can run an R9 380 and an FX-8350 with a 500W PSU? An EVGA 80 Plus one. Sorry, I'm kinda lost with this.
 

There's really nothing AMD can do about the rebrands. They're like butter spread over too much toast. Is rebranding wrong? Yes, but AMD doesn't have the money to refresh a whole lineup.

I'm much less inclined to forgive Nvidia for what it has done and continues to do: locking competition out of the market with Nvidia GameWorks, gouging when AMD doesn't have a product out, and lying about video card specs. Yeah, their cards are more power efficient. Does that really mean you are willing to hand them the market and allow them to continue to degrade it? I really don't think that's a worthy trade-off.
 
So I can run an R9 380 and an FX-8350 with a 500W PSU? An EVGA 80 Plus one. Sorry, I'm kinda lost with this.
Yes.
This review shows that a system with an i7-4770K and an R9 380 consumes about 250W under load. Even if we assume the FX-8350 will consume a whole extra 100W, just to be safe, you're still at only 350W, with plenty of headroom on your 500W PSU. More realistically, though, I'd guess you'd be drawing about 300W.
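The arithmetic behind that answer, as a quick sketch (the 250W figure is from this review; the +100W FX-8350 delta is the deliberately pessimistic assumption stated above):

```python
# Headroom check for the EVGA 500W question. 250W is from this review;
# the +100W FX-8350 margin is a deliberately pessimistic assumption.

measured_system_w = 250  # i7-4770K + R9 380 under load (this review)
cpu_swap_margin_w = 100  # worst-case extra draw from the FX-8350 (assumed)
psu_rating_w      = 500  # EVGA 500W 80 Plus

load_w = measured_system_w + cpu_swap_margin_w
print(f"estimated load: {load_w}W ({load_w / psu_rating_w:.0%} of the PSU)")
# estimated load: 350W (70% of the PSU) -- comfortable headroom
```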
 

It's not about the wattage headroom, it's about the voltage. A good 500W PSU should be able to handle a high-end card while a 600W should be able to handle ultra high-end.
 
It's not about the wattage headroom, it's about the voltage. A good 500W PSU should be able to handle a high-end card while a 600W should be able to handle ultra high-end.
That makes no sense whatsoever.
All PSUs provide rails at the exact same three voltages: 3.3V, 5V, and 12V. Similarly, all graphics cards (that require PCIe power connectors) rely on the 12V rail. You won't find any hardware that deviates from that.
If you meant the amperage provided on the 12V rail, any 500W PSU from any half-decent brand will have more than enough amps to power a mid-range GPU like the R9 380. And your EVGA one obviously will.
 

Congrats for saying I make no sense and then turning around and repeating what I said in different language. Voltage varies if inadequate amperage is provided.
 
Congrats for saying I make no sense and then turning around and repeating what I said in different language. Voltage varies if inadequate amperage is provided.
Irrelevant. Your PSU does not provide inadequate amperage. It provides up to 40A on the 12V rail, which is enough for literally any single-GPU card in existence today.
"Not having enough amps" is a problem that stopped existing a decade ago, except among the lowest-quality PSUs on the market, which you shouldn't be buying anyway due to other, bigger concerns (like catching fire, or failing and killing your entire computer).
 
It's not about the wattage headroom, it's about the voltage.
Voltage varies if inadequate amperage is provided.
In my opinion you are unintentionally contradicting yourself. You correctly stated that voltage will vary if inadequate amperage is provided. But in the first quote above you said it is the voltage that matters. Voltage is the constant; amperage is what it is about, as you yourself stated in the second quote. And since wattage is the product of voltage and current (amperage), it is about wattage headroom: wattage headroom is amperage headroom, because voltage is the constant.
 

Wattage headroom does not always equal stable voltages or higher amperage. That mostly depends on the components used. You see many low-quality power supplies that say 600W but don't offer anywhere near enough amps to provide stable voltage to a high-end video card.
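A sketch of that sanity check (the cheap unit's label and rail figures are invented to illustrate the mismatch; the 40A figure is the EVGA rating mentioned earlier):

```python
# Compare a PSU's label wattage against what its 12V rail can actually
# deliver. The cheap unit's figures are invented; 40A is the EVGA rating.

def rail_12v_watts(amps: float) -> float:
    return 12.0 * amps

cheap_label_w = 600
cheap_rail_w  = rail_12v_watts(22)    # only 22A on 12V (invented)

quality_label_w = 500
quality_rail_w  = rail_12v_watts(40)  # EVGA 500W's rated 40A

print(f"{cheap_rail_w:.0f}W usable on 12V despite the {cheap_label_w}W label")
print(f"{quality_rail_w:.0f}W usable on 12V from the honest {quality_label_w}W unit")
# 264W vs. 480W: the bigger number on the box is the weaker GPU supply.
```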
 
You see many low-quality power supplies that say 600W but don't offer anywhere near enough amps to provide stable voltage to a high-end video card.
I don't disagree with you. But a power supply being rated at a higher value than it can actually deliver is a completely different topic.
 
You see many low-quality power supplies that say 600W but don't offer anywhere near enough amps to provide stable voltage to a high-end video card.
You realize EVGA is a reputable company and that is not the case with your PSU, right?
Let's go with what Steve said. You asked if you can run an FX-8350 and an R9 380 on the EVGA 500W 80+. The answer is "yes".
 

I think you accidentally replied to the wrong person. I wasn't the one who asked for help, nor did I give recommendations.
 