AMD Radeon R9 295X2 Review: A Dual-GPU Beast

Thanks for another great review @Steve. Ignore the complainers...

I gotta say, AMD has made a beast of a card for $1500. Now that they've cured the temperature and noise issues, they need to focus on lowering the power consumption. Also, it would have been nice to see Crossfire 290Xs in there, but the review was probably hard enough already, so I can cut you some slack, Steve :p.
Some people will always find something to complain about, of course. I already choose to ignore them, @JC713.

Power consumption!?!? Who says we want lower power consumption? I want the power usage to be well over 800 watts, require four 8-pin connectors, and need a reactor to power it!!!!

You know, I would buy one of these if they had a waterblock version, and just sell one of my 290X cards. It's only a matter of time, but I'm not sure I would want one anyway, since I have yet to need any more power.
 
I would expect the cryptocurrency miners to go crazy over this. It will easily gain them back the $1500.
 
Too high a price premium, along with the fact it's got an AIO cooler, will probably scare most of them off. While it could do close to 2 MH/s of hashrate (judging from charts for the 290X), most would rather buy a trio of the reference cards and turn the fan speed up to 80%+.

When you're running something at 100% load 24/7, having something like this would be taking a more serious risk than just a basic blower fan.
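The trio-versus-295X2 math is easy to sketch. A minimal comparison, where every figure is an assumption for illustration (~0.9 MH/s scrypt per Hawaii-class GPU, ~$550 per reference card), not a measurement from this thread:

```python
# Rough cost-per-hashrate comparison behind the "buy a trio instead"
# argument. Per-GPU hashrate and card prices are assumed figures.
SCRYPT_KHS_PER_GPU = 900.0  # ~0.9 MH/s per Hawaii GPU, assumed

setups = {
    "295X2 (2 GPUs)": {"price": 1500.0, "gpus": 2},
    "3x reference cards": {"price": 3 * 550.0, "gpus": 3},
}

for name, s in setups.items():
    khs = s["gpus"] * SCRYPT_KHS_PER_GPU
    print(f"{name}: {khs / 1000:.1f} MH/s at ${s['price'] / khs:.2f} per kH/s")
```

On these assumed numbers the trio delivers roughly a third more hashrate per dollar, which is the point: the dual-GPU premium buys no extra hashrate.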
 
The 'ignore the complainers' comments are validated by the quality of the posters saying it. Lol b!tch please. :D
Lol.
 
The price of dual GPUs has always been ridiculous; now it's sublime. Personally I could never justify that kind of outlay just to play games.
As has been mentioned, reference duallies usually cost close to two single-GPU cards - which is as it should be, unless the boards are in short supply.
The GTX 690, the TUL HD 7970X2 designs, HD 6990, and HD 3870 X2 were all priced as such, while the GTX 590, HD 5970, HD 4870 X2, and GTX 295 were cheaper than two single cards.

The pricing of this board falls more in line with the Asus ROG Mars/Ares line - which befits its likely limited production status and feature set.
Well done to AMD and Asetek. Mission accomplished: back on top of the benchmark charts. From a marketing standpoint the card ranks 10/10. From a price/performance standpoint I'd give it a 6/10 (since a couple of vendor-OC'ed 290s run about $950, and two 290X Lightnings - whose base clock sits at the max OC for the 295X2 - can be had for $1400).

Just as an aside, ComputerBase measured power draw. It sounds like not only will the PSU need to be in the 1kW+ class (28A per 12V rail, or 50A dedicated for a single-rail unit), but it could also pay to check whether (and by how much) the cables heat up. It seems the 295X2 exceeds the AWG18 electrical specification.
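The connector-loading concern can be sanity-checked with rough numbers. A minimal sketch assuming AMD's stated 500 W board power for the 295X2, the standard 75 W PCIe-slot budget, and three 12 V conductors per 8-pin plug; none of these are measurements from this post:

```python
# Back-of-the-envelope check on how much current each 8-pin plug and
# each 12 V wire carries, under the assumed figures above.
BOARD_POWER_W = 500.0   # assumed total board power
SLOT_POWER_W = 75.0     # PCIe slot budget
RAIL_V = 12.0
PLUGS = 2               # the 295X2 has two 8-pin plugs
WIRES_PER_PLUG = 3      # 12 V conductors in an 8-pin PCIe plug

plug_w = (BOARD_POWER_W - SLOT_POWER_W) / PLUGS
plug_a = plug_w / RAIL_V
wire_a = plug_a / WIRES_PER_PLUG

print(f"~{plug_w:.0f} W per 8-pin plug ({plug_a:.1f} A)")
print(f"~{wire_a:.1f} A per 12 V conductor")
```

Since an 8-pin PCIe plug is specified for 150 W (12.5 A), roughly 17-18 A per plug lands well past the connector's rating, which is what the cable-heating worry above is about.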
 

Which is a major disappointment that in a way hurts this article. Not having R9 290/290Xs in Crossfire to compare against the 295X2 is shockingly ridiculous, considering you included everything else but the two MAJOR setups people will compare this with.

These graphs have a lot of blank space and take up too much room for what they show, IMO. It's OK, I don't mind zooming; others may not be so happy.

This is a very minor complaint - I wouldn't care so much using my 30" at home, but for 1280x1024, damn.

That's fantastic, but even with the same driver and tech, dual-GPU cards have performed differently in the past than their comparative Crossfire/SLI counterparts - sometimes slower, choppier, or faster, etc.

I love your site and I have a lot of respect for you Steve, but you're wrong on this one.

I am sorry that you find it shockingly ridiculous, but as I said, I didn't have a second R9 290 at the time. I failed to make one magically appear, but I did try.

Crossfire is Crossfire on the same GPUs. Two Radeon HD 7970s offer the exact same performance as the 7990 when clock speeds are the same; the same goes for two GTX 680s and the GTX 690. Never once in the past has a dual-GPU card not mimicked the performance of the two GPUs it was based on when used separately in Crossfire or SLI.

It’s not really a matter of being wrong, as I said I would have most certainly included the R9 290 results if I had a second card. But because we know with absolute certainty that the R9 290 Crossfire cards are 5% slower on average I don’t see it as being a big deal.

Finally, while I'm not using this as an excuse, it might help you understand our situation a bit better: as of Wednesday morning last week, I didn't even know with any certainty that the Radeon R9 295X2 existed.

By Wednesday afternoon I had signed the NDA and was told a card was on its way. Thursday night I had the spec sheets and knew what the card consisted of and what it would be competing with. Saturday the R9 295X2 arrived at mid-day and testing began. I needed the article completed in 48 hours so that editing and whatnot could take place before the release.

This didn’t give me enough time to test the R9 295X2 to the degree we would have liked let alone update our results with more Crossfire and SLI results. I only managed to get a second R9 290X card on launch day and I still haven’t had time to run all the necessary tests to update the results.


They should have included a 3rd 8-pin connector for sure. I was also concerned with how much current was going through each of those two connectors.

While the card may impress a lot, the inability to skip pages while reading does not.

Do you not see the floating index button on the right side of the page?
 
Lol, I was talking about you complaining about the ads, not the 290X complaint. Also, I was aiming that comment not just at you xD. But yeah, I was technically agreeing with you about the 290X; I should have put "Like Ams said"...
 
Eh, don't stress about it, Steve. No big deal. On the bright side, you made a heck of a review. Now go get some rest xD.
 
That's not even close to being true.
http://www.squidoo.com/gtx690-vs-gtx680-sli-review
There's a 4-7 FPS difference in many games, and a 13 FPS difference in Far Cry 2 @ 1440p.

Even in your own review of the 7990 (which included Crossfire 7970s), there was a 2-6 FPS difference in many games. It was damn close, but different. And what about power usage between the setups? Which is better there?
And how about game stability?
If you go back even further, between the 6970s and the 6990 there was a 2 to 17 FPS difference!
http://www.tested.com/tech/pcs/1985-radeon-6990-vs-radeon-6970-crossfire-vs-gtx-570-sli/

I don't care if the difference is 2 FPS or 18 FPS.
Different is different.
You're dead wrong on this one.

Get another 290X.
I'll donate towards it :D
 

When I said "when clock speeds are the same" I really did mean that. If you underclock each GTX 680 to the same speed as the 690, you get the same SLI results. That 4-7fps difference in performance isn't down to the dual-GPU design or SLI; it's down to the 91MHz clock speed difference on the GPU.

You are calling me dead wrong but don't understand that those cards had a subtle clock speed variation :D

In the case of the R9 290 and R9 290X you will see the 5% difference I spoke of, because clock speeds are much the same.
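The clock-speed explanation checks out on a napkin. A minimal sketch using the 91 MHz gap mentioned above (GTX 690 at a 915 MHz base clock versus 1006 MHz on the GTX 680) and assuming roughly linear GPU-bound scaling - a simplification for illustration, not a measured result:

```python
# Back-of-the-envelope: how much fps difference does the GTX 680's
# clock advantage over the GTX 690 predict on its own?
GTX690_MHZ = 915.0   # GTX 690 base clock
GTX680_MHZ = 1006.0  # GTX 680 base clock (91 MHz higher)

clock_gain = (GTX680_MHZ - GTX690_MHZ) / GTX690_MHZ  # ~9.9%

for fps in (40.0, 60.0, 80.0):
    print(f"{fps:.0f} fps on the 690 -> ~{fps * clock_gain:.1f} fps expected gap")
```

At 40-80 fps, a ~10% clock gap works out to roughly 4-8 fps, in line with the 4-7 fps differences cited from the GTX 690 versus GTX 680 SLI comparison.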
 
If you scroll down to a certain point, the Index button gets cut off. Maybe allow it to drop a scroll or two lower. Also, may I suggest that instead of just saying Index, it say Article Index or (Browse) Article Pages.

Great article Steve.
 
I really want to see you do a review... it is difficult as heck to get something of this quality out in 48 hours. You have no right to complain to Steve if you haven't done a review yourself, let alone in that amount of time. If you are unhappy with Steve's review or methodology, go to another site where you are satisfied.
 

Most other sites had a similarly tight time frame, though; the US sites speak of having the card up to a week prior to release, so that has to help. Still, I haven't come across a review that included Crossfire 290s. Most included Crossfire 290X cards, but as I said, those results are identical to the 295X2, and I did say that in our review.

You can't please everyone, and I don't try to, but I do like to try and explain why things were the way they were.
 
True in that case, but not always.
The 6990 and stock 6970 only have a 30MHz difference on the core, but the performance results are very different at times. They might be the same (or very similar) GPUs, but they are configured/utilized/engineered differently for their particular design; if you matched the clock speed of a 690 to 680s you would still have varying results, some rather noticeable.
Two GPUs on a single PCB is not the same as two single GPUs bridged together across two PCIe slots. The results have always varied, as I've just shown - near 20 FPS in a couple of examples.
 

You know things I just don't, then; we will have to agree to disagree.

Also, again, it pays to have the right facts. The 6970s are clocked 50MHz higher and also feature memory clocked 125MHz higher, resulting in 6% more bandwidth per GPU.

Time and time again it has been proven that sticking two GPUs on a single PCB does nothing to improve performance over two single cards if all other specifications are the same. If you think otherwise, then you are the one who doesn't know what they're talking about.
 
That still doesn't explain the difference in performance when it's 5-15% different, better or worse.
Edit:
(And that's not an excuse, especially from a tech site. Your average Joe builder who buys a PC may not know all the clock/chipset differences; he just hops on the net for charts and sees a difference. Not everyone studies all the facts and overclocks the bejezus out of their builds - they just wanna see the damn difference, regardless of what it is.)
I won't argue which one is faster, just that they are/could be different in performance even with similar clocks/whatever.
Plus... don'tcha just wanna know, out of pure curiosity (especially power draw)? :D
 

I said I have tested/am testing two 290Xs in Crossfire. I got a second 290X the day the review went live and I am still running tests. Power draw is virtually identical - slightly higher when the 290Xs are in normal mode, and lower when they are in Uber mode. Clock for clock they are identical in terms of performance; no surprises there.
 
Interesting... that's not always been the case, but these newer GPUs run so well that this isn't as big of a deal anymore. Still, it's nice to know.

What about at Eyefinity-type resolutions?
Can I be any more of an @$$hole in this thread?
 

Obviously I haven't had time to test Eyefinity, but I can't imagine any possible reason why things would change there between the R9 290X Crossfire cards and the R9 295X2 - not a single reason, certainly not one that makes any logical sense anyway.
 
I honestly didn't think there would be much of a performance difference, maybe a few FPS here and there (although I sure as hell wanted to see them side by side just to know), but the GPU temps, overclocking headroom, and power draw comparisons alone make it worthwhile, especially if I were a 290X owner or prospective buyer.

The GTX 590 was hot hot hot!
I sent that microwave back the same week.

I handled myself poorly in this thread, but I think you will agree it's nice to see how they stack up against one another in all tests - not just raw power but temps, overclocking headroom, and power draw - regardless of how small the difference may be.
 

I do agree and wish it could have been done in time for this review.
 