Leaked cooler points to an upcoming AMD Radeon R9 390X

Scorpus


A leaked image has appeared of what could be a prototype cooler for the upcoming AMD Radeon R9 390X, designed by Asetek. Just three weeks prior, Asetek boasted about a major design win with an "undisclosed OEM" relating to graphics liquid cooling, with this new image practically confirming that said OEM is AMD.

Asetek is the company that designed the hybrid liquid-air cooler for the dual-GPU Radeon R9 295X2, so it makes perfect sense for their partnership with AMD to continue. The cooler they're developing for the R9 300 series card is similar in design to that of the R9 295X2, but with the fan moved towards the back to cool just one GPU. Along the top you can still see the liquid cooling loop connectors.

At this stage there's not much that can be gathered about AMD's upcoming flagship graphics cards, other than that they're looking to drastically improve the reference cooler. The R9 290X was a hot, loud card when cooled by the weak stock cooler, so moving to a hybrid cooler will allow it to push the performance boundaries while keeping cool.

It's rumored that the Radeon R9 390X, the high-end card that'll likely use this cooler, will launch in early 2015 equipped with a Volcanic Islands-series 'Fiji' GPU. Specifications at this stage are unknown, as AMD is clearly still in the development phase for the card.

Meanwhile, as has been the case with the past few generations of graphics cards, Nvidia will launch their new line of Maxwell-based graphics cards first. It's expected that the GeForce GTX 900 series will launch as early as next week.


 
I don't know what to expect from the GTX 900 series, but I know that at this point any new generation of video cards that still doesn't have DisplayPort 1.3 is a waste of money.

This new standard is now becoming seriously overdue. We are about to see 5K monitors from more than one manufacturer, and God knows how they are going to work without it.
 
I don't know what to expect from the GTX 900 series, but I know that at this point any new generation of video cards that still doesn't have DisplayPort 1.3 is a waste of money.

This new standard is now becoming seriously overdue. We are about to see 5K monitors from more than one manufacturer, and God knows how they are going to work without it.
I agree. But then again, the 5K monitor will cost $2500 xD.
 
The R9 290X was a hot, loud card when cooled by the weak stock cooler, so moving to a hybrid cooler will allow it to push the performance boundaries while keeping cool.
One could also speculate the new cards are too hot for the old cooler. And that AMD had no choice but to change designs, just to compete with nVidia.
 
5K monitor will cost $2500 xD.
For a few days, maybe; prices for TVs and monitors are sliding faster than anything else these days.

Apple and Dell are rumoured to be the first to introduce 5K screens, sometime in October. I think the rest will follow real quick.
 
I don't know what to expect from the GTX 900 series, but I know that at this point any new generation of video cards that still doesn't have DisplayPort 1.3 is a waste of money.

This new standard is now becoming seriously overdue. We are about to see 5K monitors from more than one manufacturer, and God knows how they are going to work without it.

Before you bring in any new tech, or any product for that matter, you have to ask yourself what the demand for it is. I may spend $100 on a mouse and $200 to get a custom 1/2" acrylic side panel delivered to Canada from Minnesota for my 600T, but I'm not even thinking about triple GPU setups to game at anything higher than 2560x1440. On top of that, current operating systems can't even properly scale on 4K, let alone 5K. You're defending the 1% and that's honorable, but that is not what the masses want... yet.
 
So, Radeon is already striking back before the GTX 980 has even been released? I hope it will be a nasty war... more is better. Let the war begin, let them strike back at each other; I'll grab popcorn and sit calmly until a pricing war happens, and then replace my GPU.
 
Before you bring in any new tech, or any product for that matter, you have to ask yourself what the demand for it is. I may spend $100 on a mouse and $200 to get a custom 1/2" acrylic side panel delivered to Canada from Minnesota for my 600T, but I'm not even thinking about triple GPU setups to game at anything higher than 2560x1440. On top of that, current operating systems can't even properly scale on 4K, let alone 5K. You're defending the 1% and that's honorable, but that is not what the masses want... yet.
Very true. I think MS's priority should be better 4K scaling. That is a huge issue that is usually overlooked.
 
Sadly, the liquid cooling probably means AMD didn't manage to reach the 22nm or 20nm wafer process, so the cards ultimately run too hot, are clocked too high, and aren't -really- all that much faster. So sad. I'd rather AMD and Nvidia not bother releasing new GPUs until they can get things arranged to do it right. :-(
 
One could also speculate the new cards are too hot for the old cooler. And that AMD had no choice but to change designs, just to compete with nVidia.
Probably. AMD don't have any real experience with large monolithic GPUs. Hawaii is AMD's largest GPU at 438mm², and the power demand/heat output is something AMD have been wrestling with as their stock coolers have become steadily more overwhelmed. Quite a learning curve as the company went from 144W (Cypress/HD 5870) to 185W (Cayman/HD 6970) to 201W (Tahiti/HD 7970) to 294W (Hawaii/R9 290X).
With the 390X's Fiji GPU rumoured to be 500+mm² and having to battle the big Maxwell chip (GM200), it's probably a fair assumption that AMD don't want thermals to be the limiting factor in the design, knowing full well that Nvidia have a solid background in bringing large GPUs to fruition. Every Nvidia GPU generation since late 2006 has featured a GPU larger than the largest AMD has ever designed (G80 @ 484mm², GT200/GT200b @ 576mm²/470mm², GF100/GF110 @ 529mm²/520mm², GK110 @ 551mm²).
So, Radeon is already striking back before the GTX 980 has even been released?
Not with this card/cooler it won't. The cooler, according to Asetek's press release, is slated for the "first half of 2015". AMD have the fully enabled Tonga (R9 285X) to go up against the GTX 770/GM206 (GTX 960?), but the R9 290 and 290X will be AMD's answer to the GTX 980/970.
I hope it will be a nasty war... more is better. Let the war begin, let them strike back at each other; I'll grab popcorn and sit calmly until a pricing war happens, and then replace my GPU.
Dream on. There hasn't been a discrete graphics price war since the market turned into a duopoly, aside from some individual SKUs that were largely short-lived salvage parts (HD 5830 vs GTX 460, for example). Both AMD and Nvidia play the game of dovetailing price/performance/features to maximize their financial return while separating consumers from their income, all the while appearing to be involved in some battle for the ages... yeah, right. The fact that this price war fantasy still exists in some people's minds, in the face of both companies admitting to long running price fixing, remains a testament to the power of guerrilla/viral marketing. BTW: Neither AMD nor Nvidia contested the verdict and both paid out their $962,000 judgements pretty quickly.
 
Before you bring in any new tech, or any product for that matter, you have to ask yourself what the demand for it is. I may spend $100 on a mouse and $200 to get a custom 1/2" acrylic side panel delivered to Canada from Minnesota for my 600T, but I'm not even thinking about triple GPU setups to game at anything higher than 2560x1440. On top of that, current operating systems can't even properly scale on 4K, let alone 5K. You're defending the 1% and that's honorable, but that is not what the masses want... yet.

Graphic designers are craving 5K monitors; they will be among the first users, followed by web developers. None of them care about the gaming performance of their PCs, but they do care a great deal about running at 60Hz as a minimum, and this is where we have our problem, because DisplayPort 1.2 is unable to deliver 60Hz on a 5K screen. I'm not defending anyone, this is the reality. It is not about high performance at all, any current graphics card can handle it; we just need DisplayPort 1.3 to be able to get 60Hz and above at 5K.
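To put some rough numbers on that, here's a quick back-of-the-envelope sketch in Python. It assumes 24-bit (8 bits per channel) colour and ignores blanking overhead, which only adds to the total, so the DP 1.2 case is actually even worse than shown:

```python
# Rough check: can a single DisplayPort 1.2 link carry 5120x2880 at 60 Hz?
# Assumes 24-bit colour and ignores blanking overhead (which only adds to the total).

def payload_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, before blanking or protocol overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Effective payload after 8b/10b encoding over 4 lanes:
DP12_HBR2 = 4 * 5.4 * 0.8   # 17.28 Gbit/s (DisplayPort 1.2)
DP13_HBR3 = 4 * 8.1 * 0.8   # 25.92 Gbit/s (DisplayPort 1.3)

need = payload_gbps(5120, 2880, 60)   # ~21.2 Gbit/s
print(f"5K @ 60 Hz needs ~{need:.1f} Gbit/s of pixel data")
print(f"DP 1.2: {DP12_HBR2:.2f} Gbit/s -> {'enough' if need <= DP12_HBR2 else 'not enough'}")
print(f"DP 1.3: {DP13_HBR3:.2f} Gbit/s -> {'enough' if need <= DP13_HBR3 else 'not enough'}")
```

Even before blanking, 5K at 60Hz needs roughly 21 Gbit/s of pixel data, while a DP 1.2 link tops out at 17.28 Gbit/s of payload; DP 1.3's HBR3 rate clears it comfortably.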
 
Graphic designers are craving 5K monitors; they will be among the first users, followed by web developers. None of them care about the gaming performance of their PCs, but they do care a great deal about running at 60Hz as a minimum, and this is where we have our problem, because DisplayPort 1.2 is unable to deliver 60Hz on a 5K screen. I'm not defending anyone, this is the reality. It is not about high performance at all, any current graphics card can handle it; we just need DisplayPort 1.3 to be able to get 60Hz and above at 5K.
At the moment it doesn't really matter, I don't think. The only 5K panels in development seem to use MST (basically two tiled 2560x2880 halves) and ganged DP 1.2. By the time you see a native single-panel 5K (and I'm guessing it might be Apple), DP 1.3 will have debuted on the next series of cards. Chances are the workstation FirePro (Fiji/Bermuda based) and Quadro (GM200 based) should both support the spec at the very least. Nvidia moved pretty fast in including HDMI 2.0 on the GTX 970/980, so it seems reasonable that the next round of releases will feature DP 1.3.
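For what it's worth, the same rough arithmetic (24-bit colour, blanking overhead ignored) shows why the tiled MST approach gets away with DP 1.2 today: each 2560x2880 half fits comfortably inside a single HBR2 stream.

```python
# Each half of a tiled (MST) 5K panel is a 2560x2880 stream at 60 Hz, 24-bit colour.
tile = 2560 * 2880 * 60 * 24 / 1e9     # ~10.6 Gbit/s per tile
dp12 = 4 * 5.4 * 0.8                   # 17.28 Gbit/s payload (HBR2, 8b/10b)
print(f"Per tile: ~{tile:.1f} Gbit/s vs DP 1.2 payload of {dp12:.2f} Gbit/s")
```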
 
One could also speculate the new cards are too hot for the old cooler. And that AMD had no choice but to change designs, just to compete with nVidia.
Or you could speculate that any major reference cooler change at this point means they are having problems with thermals... Or we could also assume that a reference model like this is just a new idea, a step towards improving the reference design with something different from the usual offerings. A nice refresh if I do say so myself.

Very true. I think MS's priority should be better 4K scaling. That is a huge issue that is usually overlooked.
I agree. 5K is a strange thing to arrive so fast, but then again people always demand the next big thing, and right now resolution is what everyone wants to show off. I still think 4K will be the focus, since more TVs and programming will target it next.
 
Graphic designers are craving 5K monitors; they will be among the first users, followed by web developers. None of them care about the gaming performance of their PCs, but they do care a great deal about running at 60Hz as a minimum, and this is where we have our problem, because DisplayPort 1.2 is unable to deliver 60Hz on a 5K screen. I'm not defending anyone, this is the reality. It is not about high performance at all, any current graphics card can handle it; we just need DisplayPort 1.3 to be able to get 60Hz and above at 5K.

Either way, DP 1.3 wasn't finalized in Q2 and we are still waiting. We will need it, but right now the people that would like to go 5K are a very small pool. Would it still help push the tech? Yes, but I don't see the rush. Professionals spend a lot more on hardware than the average gamer, and 4K is still niche right now. Also, the panels Dell are using aren't being used by anyone else, so it's not like the 5K market is going to be booming any time soon.
 
"current operating systems can't even properly scale on 4K"

FYI, Mac OS X works perfectly well with Retina screens of 2880x1440 resolution, which is 4.1K.
 