AMD Radeon R9 390X variant will come water-cooled with High Bandwidth Memory

The whole "NV has gold-standard drivers and AMD doesn't" line is just old news.

And most of the people still spewing that FUD haven't used an AMD card in like 10 years.

Both vendors have driver issues and bugs, and I see just as many people in forums with GPU issues from NV as I do from AMD.

AMD hasn't had a WHQL driver since December!

Quick, call the driver police before AMD cards stall completely.

LOL

You have a lack of knowledge right there, you do
 
I've been on the fence for the last couple months about getting a Nvidia card over AMD for my new build. Waiting for "new round" of video cards to ship before I make a decision. If AMD does drop the DVI connectors (because most that I have seen have 2), I guess I will be going Nvidia. I can totally understand dropping 1 of the DVI connectors, but to drop both will cause some issues. I still know plenty of people that have DVI monitors, including myself. And yes, I could just "buy an adapter", but I would rather have a direct connect without the use of adapters if possible.
They will ship with adaptors.
AMD have a decision to make regarding what they have room for on the card.
As this card will be sooo powerful, they have to consider Eyefinity users, most of whom will have DP ports.
The adaptor will be in the box.
 
Having to resort to water cooling just to be on par with Nvidia, what a joke.
AMD with their hot-running, primitive brute-force GPUs.

Haha cry moar Nvidia Troll!

The 390X uses LESS energy than the 290X; it just uses liquid cooling because that is straight up better than air. (And like they said, there will be air-cooled versions. Read much?)
 
This card will only sell for that price if it's equal to the Titan X in speed, and I think it will sit between the 980 and the Titan X when it comes to performance, so I'm thinking closer to $699.
I think your estimate is closer to the truth of the matter. The 390X as an AIO water cooled card is likely to be compared to the 295X2 both in pricing and performance, so that could cap the upper limit of what AMD could charge - although if the card is at least initially in short supply they could charge a short term premium - but again, I am guessing NVidia wheels out a GTX 980 Ti at ~ $649 (as it did with the 780 Ti) which would probably limit high-priced sales of the 390X past the initial flourish. The 390X could well shade the Titan X, but I suspect most of the reviews in mid-June will be pitting the 390X against a batch of 980 Ti Classified/FTW/Windforce/Strix/HoF/Lightning-TF5 boards.

Yeah, well, if the 390X is as strong as the 295X2 (which by all means it looks like it will be), then it should be a full 20%+ stronger than the Titan X anyway. I expect $700-$800 for the 8GB WCE if that is the case.
 
I get the feeling AMD is aiming to compete with the 980... and if that's the case, then they've already lost (again). That's simply worrying. There's no point bashing AMD, because we all need them. I sincerely hope they get a win soon, or we'll be living on planet Nvidia/Intel... and nobody wants that.

I fully support AMD's efforts, but I won't buy a bad product (which is why I'm running with a 980 now). They need to get a grip on power consumption, heat generation and start making better drivers. I buy Nvidia for three simple reasons: Performance, component lifetime, and quality of life (drivers/support).

That said, I was very happy with my XFX Radeon HD 5870, and ASUS Radeon HD 7970 DCU II. My first triple screen experience was with the 5870. Good times! :)
 
Not to add fuel to the fire here but I thought the 970 still had 4GB of VRAM?
Just that the last 512mb of it was severely limited bandwidth wise? Like 80% slower or something along those lines.

If that is the case then why does everyone keep referring to it as 3.5GB? I know the last 512MB is considerably slower than the rest but the memory is there.
Mostly because the speed it runs at makes it not worth calling full-fledged video memory. It's not the end of the world, but when that segment is used it is not fast enough to be used effectively, which is why some games that utilized it showed stutter. It probably didn't help either that people who bought the "4GB card" for 4K, which is a memory hog (talking about 2-3 card setups), found its performance was not what they were expecting, nor would it last as long. But that is old news now, so it's not worth fretting over anymore.
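The memory-partition argument above is easy to sanity-check with a naive weighted average. A rough sketch in Python, assuming the approximate bandwidth figures published reviews reported for the GTX 970 (~196 GB/s for the 3.5 GB fast partition, ~28 GB/s for the 0.5 GB slow segment); the uniform-access model is a simplification, since the driver tries to steer allocations away from the slow segment:

```python
# Back-of-envelope: why the GTX 970's last 0.5 GB drags performance down.
# Bandwidth figures are approximations from third-party reviews, not official specs.

FAST_GB, SLOW_GB = 3.5, 0.5
FAST_BW, SLOW_BW = 196.0, 28.0  # GB/s, approximate

def effective_bandwidth(used_gb: float) -> float:
    """Naive weighted-average bandwidth if `used_gb` of VRAM is touched uniformly."""
    fast = min(used_gb, FAST_GB)
    slow = max(0.0, used_gb - FAST_GB)
    return (fast * FAST_BW + slow * SLOW_BW) / used_gb

print(f"3.5 GB in use: {effective_bandwidth(3.5):.0f} GB/s")   # 196 GB/s
print(f"4.0 GB in use: {effective_bandwidth(4.0):.0f} GB/s")   # ~175 GB/s average
print(f"slow segment: {(1 - SLOW_BW / FAST_BW) * 100:.0f}% slower than fast")
```

On these assumed numbers the slow segment works out to roughly 86% slower than the fast one, in the same ballpark as the "80% slower" figure mentioned above.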
AMD hasn't had a WHQL driver since December!
Because paying for the WHQL certification makes the driver perform better.

Who cares how the VRAM is accessed on the GTX 970, guys? The news is ancient now, and the focus of the article is on the R9 390X, which is where the comments should be.

Now then, as far as the outputs are concerned, I love the choice like I do on the GTX 9XX series. Having 3 DPs and an HDMI (2.0a, I might add) seems the most logical to me, as it really gives the most flexibility to users who are focusing on the high end as it is. I am personally fine with the removal of the DVIs, as you can easily get a DP-to-DVI adaptor pretty cheap (I am guessing they will include at least one in the package) for those who still want to use it. It also helps keep the card slimmer for those wanting to make the card single-slot in their own cooling setup.

To top this off, I personally am now more interested in the water cooler on this, because the rumor now puts it with no fan (meaning it's not a hybrid cooler). If that is the case, it would be even better if you were able to remove the AIO part and add it to your own setup. That is something I would be totally behind!
 
I get the feeling AMD is aiming to compete with the 980... and if that's the case, then they've already lost (again). That's simply worrying. There's no point bashing AMD, because we all need them. I sincerely hope they get a win soon, or we'll be living on planet Nvidia/Intel... and nobody wants that.

I fully support AMD's efforts, but I won't buy a bad product (which is why I'm running with a 980 now). They need to get a grip on power consumption, heat generation and start making better drivers. I buy Nvidia for three simple reasons: Performance, component lifetime, and quality of life (drivers/support).

That said, I was very happy with my XFX Radeon HD 5870, and ASUS Radeon HD 7970 DCU II. My first triple screen experience was with the 5870. Good times! :)

I don't think AMD is aiming to compete with the 980. The 290X is only like 15% slower on average, and this is a new arch with new memory, so I'm expecting it to be in between the Titan X and the 980. If they pull a rabbit out of their hat it may be faster than the Titan X, but I'm expecting it to land in between.
 
To top this off, I personally am now more interested in the water cooler on this, because the rumor now puts it with no fan (meaning it's not a hybrid cooler). If that is the case, it would be even better if you were able to remove the AIO part and add it to your own setup. That is something I would be totally behind!

From the leaked picture I've seen, that looks like a standard AIO cooler, which will need a fan to push air through the rad. Where are you seeing this as being fanless?

I can't even see that being possible with the TDP this high-end card will be pushing.

It looks like the same style cooler as on the 295X2!
 
I am really disappointed about AMD dropping the DVI connector. That means I won't be able to overclock my Qnix 2710 1440p monitor to 110Hz like I can with my good old R9 280X.
I hope non-reference versions will feature one dual-link DVI port; otherwise I might look away from this card.
 
I've been on the fence for the last couple months about getting a Nvidia card over AMD for my new build. Waiting for "new round" of video cards to ship before I make a decision. If AMD does drop the DVI connectors (because most that I have seen have 2), I guess I will be going Nvidia. I can totally understand dropping 1 of the DVI connectors, but to drop both will cause some issues. I still know plenty of people that have DVI monitors, including myself. And yes, I could just "buy an adapter", but I would rather have a direct connect without the use of adapters if possible.

Dropping a video card because of an adapter? Why don't you just go Nvidia now? The game bundle seems like it would entice a person like you enough. That is until you find out you have 3.5 GB of ram, I mean that's way less piddling than buying an adapter...
It's not only the adapter - but the ability to overclock monitor, some Korean monitors are really overclockable (60Hz to 120Hz), but they all mostly use the dual-link DVI port. With an adapter you lose the option to overclock monitor's frequency.
 
I've been on the fence for the last couple months about getting a Nvidia card over AMD for my new build. Waiting for "new round" of video cards to ship before I make a decision. If AMD does drop the DVI connectors (because most that I have seen have 2), I guess I will be going Nvidia. I can totally understand dropping 1 of the DVI connectors, but to drop both will cause some issues. I still know plenty of people that have DVI monitors, including myself. And yes, I could just "buy an adapter", but I would rather have a direct connect without the use of adapters if possible.

Dropping a video card because of an adapter? Why don't you just go Nvidia now? The game bundle seems like it would entice a person like you enough. That is until you find out you have 3.5 GB of ram, I mean that's way less piddling than buying an adapter...

This is not to continue the back and forth, this is just stating what is on my mind with the statement that you provided. It does "seem" as if you are (or were) trying to get people worked up over a simple statement. I simply stated that I was thinking of going to a Nvidia card over AMD, and if they do remove DVI connectors from their cards then it is just another reason for ME to switch. This does not mean that I am forever giving up on AMD and going to "Bash" them every chance I get. I am not a "Fanboy" of either side, I simply like to have the best hardware that I can afford at that time.
There is a bit of offense (but nothing that I get angry or upset about) taken at the statement -The game bundle seems like it would entice a person "like you" enough-. You do not know me, or much about me. Most of us on here enjoy a bit of debate back and forth, but when you start to speculate or make assumptions about a person directly, that's where the line should be drawn. I couldn't care less about games being bundled with hardware. I care about the performance of said hardware and whether I can get it at the price point I want to pay.
So please everyone, let's play nice and stop the overall "dislike" (not going to say hatred). Especially over a simple statement.
 
From the leaked picture I've seen, that looks like a standard AIO cooler, which will need a fan to push air through the rad. Where are you seeing this as being fanless?

I can't even see that being possible with the TDP this high-end card will be pushing.

It looks like the same style cooler as on the 295X2!
Well, the first leaks suggested the cooler had a standard blower with an AIO system similar to the R9 295X2's, but the recent pictures, which are not really confirmed, pointed to it being short and not showing a fan.
[attached image: 27a.jpg]

(FYI, I am almost certain this is just a render, but multiple sites have pointed to a similar design, so it's up for debate.)
The design didn't really show a fan on it, unless it's just under the shroud, only pulling from right to left. But again, it's all speculation, and at this point it could go either way. I do not believe the TDP is going to be high enough for this to be a problem.
 
Well, the first leaks suggested the cooler had a standard blower with an AIO system similar to the R9 295X2's, but the recent pictures, which are not really confirmed, pointed to it being short and not showing a fan.
[attached image: 27a.jpg]

(FYI, I am almost certain this is just a render, but multiple sites have pointed to a similar design, so it's up for debate.)
The design didn't really show a fan on it, unless it's just under the shroud, only pulling from right to left. But again, it's all speculation, and at this point it could go either way. I do not believe the TDP is going to be high enough for this to be a problem.

Thanks for the pics.

I've heard power consumption will drop to 7970 GHz Edition levels, and that card has a 250-watt TDP, which cannot be cooled passively.

If it has a rad, it will have a fan.

We'll see if I'm wrong, but I'm 90% sure it will.
 
You are not lying, but that isn't news to anyone who follows the GPU market. AMD said they were moving off monthly driver releases a long time ago.

I know it wasn't a lie. I wanted to let the fanboy know he didn't faze me.
I know what AMD said. I also know they said they would do them monthly, and it was clearly too challenging a task, proven by the fact that it's May and this year has been all beta and hotfix drivers.
 
Thanks for the pics.

I've heard power consumption will drop to 7970 GHz Edition levels, and that card has a 250-watt TDP, which cannot be cooled passively.

If it has a rad, it will have a fan.

We'll see if I'm wrong, but I'm 90% sure it will.
Same, I meant there would not be a fan on the card itself, only on the radiator. Yeah, I doubt it could be cooled passively even at that TDP. Honestly there's no telling, because we originally saw pics of a different cooler, but given how long the wait has dragged on, this new design is a possibility, especially if they have been putting a lot of work into this card.

What's going to matter in my book is how powerful this card is in relation to other cards on the market, as that will determine most of how this generation goes.
 
Same, I meant there would not be a fan on the card itself, only on the radiator. Yeah, I doubt it could be cooled passively even at that TDP. Honestly there's no telling, because we originally saw pics of a different cooler, but given how long the wait has dragged on, this new design is a possibility, especially if they have been putting a lot of work into this card.

What's going to matter in my book is how powerful this card is in relation to other cards on the market, as that will determine most of how this generation goes.

My bad, I thought you were trying to say there would be no fan, period.

I also know they said they would do them monthly, and it was clearly too challenging of a task proven by the fact it's May and this year has been all beta and hotfix drivers.

They said they would NOT be doing monthly driver releases!

http://www.anandtech.com/show/5880/...hly-driver-updates-releases-catalyst-126-beta

"AMD has announced that starting with Catalyst 12.6 AMD will be ceasing their monthly driver update schedule in favor of releasing drivers on a dynamic/as-needed basis, effectively taking up NVIDIA’s driver release schedule."
 
If AMD drops both DVI ports, they will lose me as a customer. I have been waiting for this card as an upgrade from my 290X, but I have an overclockable Korean 1440p monitor that is DVI only. I am not going to spend $600 on a 120Hz monitor with HDMI or DP. This would be a big mistake on AMD's part; there's just no point in it. I mean, who even has monitors with DisplayPort? Maybe 1/5th of gamers. It just hasn't reached critical mass yet, while 90% of people are still using DVI.
 
If AMD drops both DVI ports, they will lose me as a customer. I have been waiting for this card as an upgrade from my 290X, but I have an overclockable Korean 1440p monitor that is DVI only. I am not going to spend $600 on a 120Hz monitor with HDMI or DP. This would be a big mistake on AMD's part; there's just no point in it. I mean, who even has monitors with DisplayPort? Maybe 1/5th of gamers. It just hasn't reached critical mass yet, while 90% of people are still using DVI.

What's the problem with using the DP-to-DVI adapter that will come in the box with the graphics card?
 
I have mixed feelings about dropping support for DVI. Being someone who only has DVI monitors, I don't want to be forced into using adapters. On the other hand, if AMD or nVidia doesn't push the monitor manufacturers into supplying DP ports, DVI will never die.

At the moment I say this will only be a temporary problem. That is until monitor manufacturers finally decide to get on-board the DP train. Until then I think I can manage to use an adapter or two if needed. I have used a VGA/DVI adapter in the past for a short while. I didn't like it but that is the cost of moving forward and changing specifications.
 
I'm not sure how this whole back-and-forth quasi-bickering over the exclusion of DVI is even relevant in the grand scheme of things. OEM vendors tend to deviate fairly often from the reference card design and will likely continue to do so with the 300 series, so I don't expect the port to simply vanish from the cards.

What's the problem with using the DP-to-DVI adapter that will come in the box with the graphics card?
An included adapter would probably be a single-link adapter (most likely) or a passive dual-link adapter, neither of which generally supports his display at that resolution or refresh rate. Even active dual-link adapters do not always allow refresh rates greater than 60Hz.*


*I have not used cross-standard adapters at either extreme of DVI's capability in a long time, so I'm not sure if my information is still valid.
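The link-bandwidth point above is worth a quick back-of-envelope check. A rough sketch in Python, assuming CVT-reduced-blanking-style timing overheads (~160 extra pixels per line, ~40 extra lines per frame; exact figures vary per monitor) against DVI's rated TMDS clock limits (165 MHz single-link, 330 MHz dual-link):

```python
# Rough pixel-clock math for why adapters tend to break 1440p monitor overclocking.
# Blanking figures below are approximate CVT-RB-style values, not any monitor's
# actual timings.

SINGLE_LINK_MHZ = 165.0   # single-link DVI TMDS clock limit (per spec)
DUAL_LINK_MHZ = 330.0     # dual-link DVI, 2 x 165 MHz (per spec)

def pixel_clock_mhz(w, h, hz, h_blank=160, v_blank=40):
    """Approximate pixel clock = total pixels per frame x refresh rate."""
    return (w + h_blank) * (h + v_blank) * hz / 1e6

for hz in (60, 110):
    clk = pixel_clock_mhz(2560, 1440, hz)
    print(f"1440p @ {hz} Hz: ~{clk:.0f} MHz "
          f"(fits single-link: {clk <= SINGLE_LINK_MHZ}, "
          f"fits dual-link: {clk <= DUAL_LINK_MHZ})")
```

Even 60 Hz at 1440p lands around 240 MHz, past the single-link limit, which is why these panels need dual-link DVI at all. At 110 Hz the clock exceeds even the dual-link rated limit; overclocked Korean panels get there by tightening blanking and driving the link out of spec, which is exactly the kind of thing a generic adapter won't pass through.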
 
Yeah well if the 390X is as strong as the 295x2 (Which by all means it looks like it will be) then it should be a full 20%+ stronger than the Titan X anyways.
AMD haven't stated that the 390X "is as strong as the 295X2" anywhere during the FAD presentations. That inference comes from sensationalist sites like WTFtech for clickbait material to fuel the tiresome fanboy wars they love to propagate.
There are ample signs that Fiji is a scaled-up Tonga (which isn't appreciably stronger than Tahiti on a core-for-core, clock-for-clock basis even with DCC), yet some people - such as yourself - believe that a 45% increase in core count (4096 vs 2816 for Hawaii) will somehow lead to 100% more performance. Considering Hawaii-based cards haven't shown themselves to be particularly bandwidth-limited - the principal advantage of HBM - how do you expect this to happen?
I expect $700-$800 for the 8GB WCE if that is the case.
...and yet informed opinion says that the 390X will be a 4GB model exclusively, at least for a while.
I have shown those slides to a contact in a position to know what AMD is launching this quarter. They have confirmed that Fiji tops out at 4GB, not 8. - Joel Hruska, Extreme Tech (see comments under the FAD 2015 article)
The 390X uses LESS energy than the 290X
And you know this how? A subscription to the Daily Unicorn Times?
45% more cores, a likely sizeable increase in other high-power logic (cache), and a clock rate likely to be in the ballpark of Hawaii's will more than offset the power savings of 4GB of HBM over GDDR5.
it just uses liquid cooling because that is just straight up better than air
When has a company voluntarily increased its bill of materials just for the hell of it? AMD have spent the previous few years jamming the cheapest blower shroud on their high-end reference cards (to the detriment of review conclusions, and to general derision from readers and owners), yet they now decide to go "all in" with a more expensive AIO when their financial position is at its most precarious, just because they decided LC is "straight up better than air" - even though earlier cards would certainly have benefited from it (notably the HD 6990, 7970, and 7990), and the Asetek design being used has been around for almost ten years.
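The core-count arithmetic in the post above is simple enough to write down. A minimal sketch, using the core counts quoted in the thread (4096 for Fiji vs 2816 for Hawaii, both from leaks/rumors at this point) and a hypothetical efficiency knob, since GPUs rarely scale linearly with shader count:

```python
# Sanity check: what a 45% core-count increase can plausibly buy, assuming
# similar clocks and architecture. The efficiency factor is a made-up knob
# to model sub-linear scaling, not a measured figure.

HAWAII_CORES, FIJI_CORES = 2816, 4096

def naive_speedup(scaling_efficiency: float = 1.0) -> float:
    """Speedup over Hawaii from core count alone, damped by an efficiency factor."""
    return 1 + (FIJI_CORES / HAWAII_CORES - 1) * scaling_efficiency

print(f"perfect scaling: {naive_speedup(1.0):.2f}x over Hawaii")  # 1.45x
print(f"80% efficiency:  {naive_speedup(0.8):.2f}x over Hawaii")  # 1.36x
```

Even the perfect-scaling upper bound is 1.45x, which is the crux of the argument: core count alone doesn't get you anywhere near "100% more performance".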
 