AMD Radeon R9 390X variant will come water-cooled with High Bandwidth Memory

I don't think AMD is aiming to compete with the 980. The 290X is only about 15% slower on average, and this is a new arch with new memory, so I'm expecting it to land in between the Titan X and the 980. If they pull a rabbit out of their hat it may even be faster than the Titan X, but I'm not expecting that.

With the imminent launch of the 980 Ti (6GB), there's a whole new dog in the fight. The 980 Ti will be clocked higher than the Titan X (using the same fully unlocked GM200 chip). Even if AMD delivers a product that's 25% faster than the 290X, it most likely won't be enough to beat Nvidia. The fact that AMD is messing around with water-coolers for their reference cards makes me think they don't have much confidence in their product, but I'm still hoping they'll pull a rabbit out of their strata chocolata and surprise everyone. Though, like you, I'm not counting on it.
 
Water cooling is an obviously better solution, and makes for much quieter, more reliable products. To think anything else would be pure folly. I water cool my CPU and GPUs, and it's a phenomenal solution!
 
Better cooling, sure. There are many advantages to that. I still wouldn't want a closed water-cooling system on my graphics card that only lasts three years, dries up and can't be refilled, and can't be replaced after it goes out of production. There's always 3rd party solutions, but... meh. On the other hand, if it's one of these "made ready for custom water-cooling systems," then that's a whole other bag.

AMD's brute force tactics for achieving competitive performance are well known. The fact that they have to resort to water-cooling solutions on reference cards indicates that air-cooling was insufficient... meaning we've got another hot potato coming. I hope that's not the case, but... :p
 
I don't understand this "water is bad" idea you are going with here.

Water is better than air for cooling; this assumption that they don't have confidence in the product because they are going with water is ridiculous.

The H55 on my 7970 GHz has a five-year warranty and I fully expect it to last the five years. Will I still be using the same GPU in that time? Probably not, I will have upgraded by then.

Since I have a closed-loop cooler on my GPU, my system is super quiet. You don't have to listen to the annoying sound of fans ramping up when putting load on the GPU while playing a game or encoding video, which I do.

Seems to me some of you are just reaching and looking for areas to find fault because you just don't like AMD's products. Sorry to say it, but most of the arguments you guys have been pushing here are weak. Replacing a CLC on a GPU is easy for a real enthusiast; might be time to turn in your man card.
 
I think he was referring to confidence in their product lasting while being cooled with air.

If a card produces large amounts of heat at stock settings, as if it were overclocked, I wouldn't want it in my case. Especially not if Nvidia has cooler cards that will do the same job.
 
This is not to continue the back and forth; this is just stating what is on my mind about the statement that you provided. It does "seem" as if you are (or were) trying to get people worked up over a simple statement. I simply stated that I was thinking of going to an Nvidia card over AMD, and if they do remove DVI connectors from their cards then it is just another reason for ME to switch. This does not mean that I am forever giving up on AMD and going to "Bash" them every chance I get. I am not a "Fanboy" of either side, I simply like to have the best hardware that I can afford at the time.
I do take a bit of offense (though nothing that I get angry or upset about) at the statement "The game bundle seems like it would entice a person 'like you' enough". You do not know me, or know much about me. Most of us on here enjoy a little bit of a debate back and forth, but when you start to speculate or make assumptions about a person directly, that's where the line should be drawn. I couldn't care less about games being bundled with hardware. I care about the performance of said hardware and whether I can get it at a price point I am willing to pay.
So please, everyone, let's play nice and stop the overall "dislike" (not going to say hatred), especially over a simple statement.

It's partly my fault too. I meant it to come off as satire but it just came off as harsh.
 
I think he was referring to confidence in their product lasting while being cooled with air.

That's exactly right.

I don't understand this "water is bad" idea you are going with here.

Seems to me some of you are just reaching and looking for areas to find fault because you just don't like AMD's products. Sorry to say it, but most of the arguments you guys have been pushing here are weak. Replacing a CLC on a GPU is easy for a real enthusiast; might be time to turn in your man card.

AMD hasn't been able to match Nvidia's performance per watt. Not even close. Their brute force tactics seem to have them in such a pinch now that the only way out is water-cooling. I'm not saying water-cooling is bad. I'm saying it's sad that AMD has to go there to keep up. If Nvidia did the same, they'd blow AMD out of the water by logical extension (pun intended).

Nvidia offers lower TDP (which saves on the electric bill) and better performance, and they're doing it with air-cooling. That's a straight up win, if you ask me. And a better product. If you don't understand that, princess... maybe you need to trade in your tutu for a leotard. You're talkin' at 20 years of system building experience here, so simmer down. ;)

PS: I love AMD. The competition makes my Nvidia cards cheaper.
 
That's exactly right.
AMD hasn't been able to match Nvidia's performance per watt. Not even close.
AMD is going through the learning curve that Nvidia did with the G80 and GT200 when they integrated GPGPU into what was up until then a gaming orientated technology. If AMD are guilty of anything it is an unwillingness to make compromises in design at the expense of power budget. Nvidia were guilty of the same ethos with the GF100/GF110, and while the GTX 580 (and 590, Quadro, and Tesla derivatives) are still favoured and very competitive in compute workloads, nobody would argue that they are frugal in power usage.
Their brute force tactics seem to have them in such a pinch now that the only way out is water-cooling. I'm not saying water-cooling is bad. I'm saying it's sad that AMD has to go there to keep up. If Nvidia did the same, they'd blow AMD out of the water by logical extension (pun intended).
From my viewpoint, the difference is magnified by the fact that the cooler design Nvidia adopted with the GTX 690 and adapted for every other top-tier card since is an excellent design, while AMD's recent blower-shroud designs seem little more than an afterthought by comparison. What started out as an excellent design for the Evergreen series and its sub-200 watt cards has basically, with little revision, now had to shoulder cooling duties for 300 watts or more.
The overall impression of AMD's heat/noise issues is further magnified by the fact that the company do not allow non-reference designs at launch (the initial product launch tends to set the tone for a product regardless of later revision). I'm pretty sure that the overall impression would be more favourable if AMD licensed a decent cooler from a vendor, or put at least some effort into designing something up to the task.
Nvidia offers lower TDP (which saves on the electric bill) and better performance, and they're doing it with air-cooling. That's a straight up win, if you ask me. And a better product.
It is all about prioritization in design. Nvidia claim lower power usage in general because compute is de-emphasized in Maxwell, especially double precision. Even fully enabled FP64 cards like the GTX Titan disable boost if the user opts for the full 1:3 double precision rate simply because some GPGPU workloads have a higher power usage than even torture testing (Furmark and the like). AMD simply decided on making fewer compromises in the functionality at the expense of power usage, but it is something that needed to be done if the company were serious about making some inroads into the professional (math co-processor, visualization) graphics markets dominated by Nvidia, and HSA in general.
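To put those ratios into perspective with some ballpark arithmetic (rough numbers, nothing official): a GK110 GTX Titan with roughly 4.5 TFLOPS of single precision works out to about 4.5 / 3 ≈ 1.5 TFLOPS of double precision at the full 1:3 rate, while a GTX 980 running FP64 at 1:32 manages only about 4.6 / 32 ≈ 0.14 TFLOPS from a similar single precision budget. That gap is precisely where the die space and power budget savings come from.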
PS: I love AMD. The competition makes my Nvidia cards cheaper.
You should love the tech...the name on the sticker should be a secondary consideration. IMO.
 
These articles always bring out the most interesting comments, and somehow logic still finds its way through all the fanboy nonsense.
 
You should love the tech...the name on the sticker should be a secondary consideration. IMO.

It's all about the tech, which is why my path of upgrades went like: 8800GT > 5870 > GTX 580 > 7970 > GTX 980. Pretty much every other generation of graphics card I've owned since ATI rejoined the fight for the throne has been ATI/AMD. I don't go for the stickers, I go for what I believe is the best product of the current generation (within the top single GPU scene).

You make a lot of valid points, so I'll give you props for that. I believe I passed on the GTX 680 in favor of the 7970 because of the same brute force strategy I've been pinning on AMD here. But yes, Nvidia has learned, and--as you point out--AMD is still on the learning curve. They may have additional reasons for this, as you also pointed out (the pro compute market). But haven't they already been there for a decade with the FirePro? Am I missing something here?

Truth is, none of that matters to me. I go for FPS, not compute. I use my computer for multimedia, work, web and gaming like most people. I buy 80 plus platinum PSUs to save on electricity in the long run. That's how conscientious I am about power consumption. I'd rather not have a graphics card that chows down on the electricity just because it can, for no particular reason (that is of use to me). That way I can buy more beer. :D

AMD really needs a new cooler, that's a point we can both agree on. I've been very skeptical towards closed water-cooling systems for a long time, but NZXT finally won me over with their 6 year warranty on the Kraken x61. I've got an order pending for two sets of NZXT S340 + Kraken x61 as I type. Either way, a water-cooler only addresses the temperature issue. The fact still remains that Nvidia delivers a superior product with better support right now.

That was an excellent reply, though. I get your point of view a little better now. :)
 
AMD hasn't been able to match Nvidia's performance per watt. Not even close. Their brute force tactics seem to have them in such a pinch now that the only way out is water-cooling. I'm not saying water-cooling is bad. I'm saying it's sad that AMD has to go there to keep up. If Nvidia did the same, they'd blow AMD out of the water by logical extension (pun intended).

Nvidia offers lower TDP (which saves on the electric bill) and better performance, and they're doing it with air-cooling. That's a straight up win, if you ask me. And a better product. If you don't understand that, princess... maybe you need to trade in your tutu for a leotard. You're talkin' at 20 years of system building experience here, so simmer down. ;)

PS: I love AMD. The competition makes my Nvidia cards cheaper.

Performance per watt is secondary to me; performance per dollar is more important.

I care about performance first. I own a house, and the amount of power one GPU uses compared to everything else in it is minuscule. The 10 bucks per year I would save is not enough of a dent in my power bill to care about.
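
Quick back-of-the-envelope, with made-up but plausible numbers: say the card pulls 100 W more under load and gets gamed on for a couple of hours a day. That's 100 W x 2 h x 365 days = 73 kWh a year, which at around $0.12/kWh comes to roughly $9. Hence the 10 bucks.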

What determines which product is better is going to be up to each individual and their setup.

As for experience, I've been building for 22 years, but I don't need to throw that in here to try to prove anything. Length of time doesn't always equal being good at what you do :)
 
Well, you were being uppity, so I had to throw it in there! And so did you, it seems. :D

I dunno about performance per dollar, though. That's how I used to think back when I was more of a mainstreamer and couldn't afford the high end cards. Though, I have to admit, I was this close to buying a GTX 970 instead of the 980. I haven't seen that kinda bang for the buck since the 8800GT. I bought two of those. My first and last SLI experience.

As for my "save electricity" compulsion, that just goes for everything. Norway is a dumb, dumb, dumb country. We have all this free, clean energy (water), and our government sells it all to neighboring countries, leaving us with the enormous bills. It's practically free during the summer, but then winter comes around. We hit a record low here a couple years ago. I'm talking minus 37 degrees one night. One of the toilets froze because I left the floor heating off that night. The subsequent electric bills made me reexamine my lifestyle. lol
 
Wow, this article is on fire. Now, where's my gas can of premium fuel... :)

Competition is good. We consumers are the winners.
 
They may have additional reasons for this, as you also pointed out (the pro compute market). But haven't they already been there for a decade with the FirePro? Am I missing something here?
[OT]
You ask a valid question. AMD's FirePro line has only had double precision since the Evergreen series (Q2 2010), and a rather anaemic 1:5 rate at that. AMD did field a GPGPU orientated FireStream series three years earlier based on the RV670 (HD3000 series), but its capabilities and IEEE compliance left a lot to be desired compared to the Nvidia G80 based Tesla SKUs.
Hawaii is the first AMD GPU that offers native 1:2 rate FP64, something Nvidia achieved in late 2009 (just as GPGPU supercomputing began challenging CPU-only based systems) with GF100 based Teslas.

Having said that, supercomputing is the minority revenue earner in professional/prosumer graphics. Workstation is the real money earner, but where AMD's cards are well suited for mostly single precision/mixed (FP32+FP64) workloads, their pro drivers amount to an OpenCL driver and Catalyst, whereas Nvidia have worked pretty hard to supply a complete CUDA-based ecosystem (OptiX, SceniX, CompleX, Iray and Mental Ray renderers, etc.) as well as the aforementioned OpenCL (now getting some kind of love from the driver team).
Truth is, none of that matters to me. I go for FPS, not compute. I use my computer for multimedia, work, web and gaming like most people. I buy 80 plus platinum PSUs to save on electricity in the long run. That's how conscientious I am about power consumption. I'd rather not have a graphics card that chows down on the electricity just because it can, for no particular reason (that is of use to me). That way I can buy more beer. :D
Different strokes for different folks. Can't say that power saving is a prime mover behind my own purchases (as SLI'ed GTX 780's would attest), but I know from first-hand experience that the reference Nvidia cooler, allied with its general quietness, has made it a firm favourite for those wanting a compact system for both gaming and HTPC duties. And for those freeing up hard drive space by encoding their DVD/Blu-ray libraries to x265, the GTX 960 is a bit of a no-brainer since the power savings over CPU encoding are substantial.
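For anyone curious, the GPU-assisted route (strictly speaking NVENC's HEVC encoder rather than x265 proper) amounts to a one-liner with an NVENC-enabled ffmpeg build. File names and bitrate below are just placeholders, and depending on the build the encoder may be exposed as nvenc_hevc instead:
ffmpeg -i input.mkv -c:v hevc_nvenc -preset slow -b:v 4M -c:a copy output.mkv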
[/OT]
 
I am really disappointed about AMD dropping the DVI connector. That means that I won't be able to overclock my Qnix 2710 1440p monitor to 110 Hz like I can with my good old R9 280X.
I hope non-reference versions will feature one dual-link DVI port, otherwise I might look away from this card.

Or you could just, you know, get a DP-DVI converter lol
 
PS: I love AMD. The competition makes my Nvidia cards cheaper.
See, people should never care what's on a sticker and should focus on making a machine that works and performs up to expectations. This is the big problem with a lot of people: they make comments about how they want competition just so the company they buy from gets cheaper. That only hurts the other side and makes it harder for them to compete... (Not sure if you were joking or not, btw, but I do see people say that and mean it.)

Performance per watt is secondary to me; performance per dollar is more important.

I care about performance first. I own a house, and the amount of power one GPU uses compared to everything else in it is minuscule. The 10 bucks per year I would save is not enough of a dent in my power bill to care about.

What determines which product is better is going to be up to each individual and their setup.

As for experience, I've been building for 22 years, but I don't need to throw that in here to try to prove anything. Length of time doesn't always equal being good at what you do :)
That's the way I have been: always focused on performance per dollar, though for a long time I was mostly sticking with Nvidia because of that exact argument. It was not until my dual GTX 580's felt a little sluggish for multi-monitor gaming that I re-evaluated my choice and changed them out for a pair of HD 6990's (didn't keep the 580's long honestly; one of the few buying mistakes I feel I have made on the GPU side).

Well, you were being uppity, so I had to throw it in there! And so did you, it seems. :D

I dunno about performance per dollar, though. That's how I used to think back when I was more of a mainstreamer and couldn't afford the high end cards. Though, I have to admit, I was this close to buying a GTX 970 instead of the 980. I haven't seen that kinda bang for the buck since the 8800GT. I bought two of those. My first and last SLI experience.

As for my "save electricity" compulsion, that just goes for everything. Norway is a dumb, dumb, dumb country. We have all this free, clean energy (water), and our government sells it all to neighboring countries, leaving us with the enormous bills. It's practically free during the summer, but then winter comes around. We hit a record low here a couple years ago. I'm talking minus 37 degrees one night. One of the toilets froze because I left the floor heating off that night. The subsequent electric bills made me reexamine my lifestyle. lol
Well, while I understand being energy conscious, you also have to note that computers, unless you run them under stress 24/7, are not going to raise your electricity bill except under extreme circumstances (though by your description I am sure energy is not cheap for you). Not to say I would not have bought GTX 980's like you just did if I were buying now (best performance, low power, 4GB, what's not to love?). But to have energy drive you is not the best approach, because it's not something the high end normally focuses on. On that same note, if energy is your primary concern it may be best to also consider what is needed for your ideal gaming experience. Say you were gaming at 1080p (I have no idea what you're gaming at, random example): why even look at a GTX 980 when you could buy something like a GTX 970 (heck, even a 960) and do just fine?

As for the CLC idea, I actually am with you hoping it can be adapted to custom cooling, as that would be a vote in my book for best reference cooler ever. Hopefully that is the case, as it would make a compelling argument even for me, who is tying my hands down from buying any cards this generation :p. But I do not agree that the CLC means they have a problem with their own product, as there is supposed to be an air-cooled version as well. To me it sounds like they are doing it strictly because of the extreme amount of complaints coming from the 290's. They have listened and decided the only way to move up is liquid cooling, as there are limits to what can be done with an air cooler without sacrifices.
 

Alright, time to set some things straight. When I said that stuff about the competition "making my Nvidia cards cheaper," I was just trolling... throwing a jab out there. Look at my upgrade path going NV, AMD, NV, AMD, NV, and it should be pretty clear how I actually feel about things. The sticker doesn't matter. The tech does.

As for my penchant for power saving, it's not quite as serious as I may have let on. Well, it is for everything else... but if there's no good choice, and both camps present me with a 500W TDP graphics card, I'm not gonna give a $#!7 about the electric bills. I'm gonna go for the best performing GPU. :)
 

That is why I said it the way I did; I figured you did not mean it, but I mentioned it because people have said that and meant it :)

I think if we hit 500 W TDP for single GPUs, we may have to start looking into submerging our computers in mineral oil as the norm lol.
 
I've been on the fence for the last couple of months about getting an Nvidia card over AMD for my new build. I'm waiting for the "new round" of video cards to ship before I make a decision. If AMD does drop the DVI connectors (because most that I have seen have two), I guess I will be going Nvidia. I can totally understand dropping one of the DVI connectors, but dropping both will cause some issues. I still know plenty of people that have DVI monitors, including myself. And yes, I could just "buy an adapter", but I would rather have a direct connection without the use of adapters if possible.

I have two DVI monitors on my desk... one is connected via DisplayPort with a passive cable (about $10 on Amazon), the other to a single DVI port. I am very pro-DisplayPort, as it is a smaller connector, meaning they can fit more connectors on a single card. Previously you'd have done well to get 2x DVI and maybe an HDMI, etc.; now we get 3x DisplayPort + HDMI. No complaints here!
 