AMD Radeon R9 Fury specs allegedly confirmed ahead of launch

Scorpus


The AMD Radeon R9 Fury is expected to launch shortly, which might explain why we're suddenly seeing a whole bunch of leaks relating to the graphics card. The latest collection of information on the Fury, leaked by VideoCardz, includes both detailed specifications and images of a Sapphire Tri-X model.

The R9 Fury's Fiji-based GPU core will be cut down as expected, featuring 56 active compute units for 3,584 stream processors, compared to 64 CUs and 4,096 SPs on the fully-enabled Fury X. Core clock speeds appear to be slightly reduced as well, with the Fury reportedly having a standard maximum clock speed of 1,000 MHz.

Despite the reduction in clock speeds, there is a good chance that many Furies released to the market will feature overclocked cores. One Tri-X model features an overclocked core of 1,040 MHz, while previous rumors indicated the Fury might feature core clocks of 1,050 MHz to match the Fury X.

Like the Fury X, the Fury will feature 4 GB of HBM clocked at 500 MHz, providing 512 GB/s of memory bandwidth. Overclocking HBM is currently impossible, so don't expect any overclocked Fury cards to feature higher memory clock speeds.
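The quoted figures check out with simple arithmetic. A quick sketch, assuming the standard GCN/Fiji layout of 64 stream processors per compute unit and four 1024-bit HBM stacks:

```python
# Sanity-checking the leaked specs against GCN/Fiji arithmetic.
sp_per_cu = 64               # GCN: 64 stream processors per compute unit
fury_sps = 56 * sp_per_cu    # cut-down Fury: 56 active CUs
fury_x_sps = 64 * sp_per_cu  # fully enabled Fury X: 64 CUs

bus_width_bits = 4 * 1024    # four HBM stacks, 1024 bits each
clock_mhz = 500              # HBM clock
transfers_per_clock = 2      # double data rate

bandwidth_gb_s = bus_width_bits * clock_mhz * transfers_per_clock / 8 / 1000
print(fury_sps, fury_x_sps, bandwidth_gb_s)  # 3584 4096 512.0
```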

As for the Fury itself, images of the Sapphire Tri-X variant show a short PCB with an overhanging three-fan air cooler. This cooler isn't AMD's reference design, but the short PCB is a product of the memory modules being included right on the GPU's die.

AMD previously stated the $549 Radeon R9 Fury would launch mid-July, so it shouldn't be too long before these graphics cards start to hit the market (and our test bench).


 
"This cooler isn't AMD's reference design, but the short PCB is a product of the memory modules being included right on the GPU's die."

Don't you mean the Interposer?
 
I like AMD and still use their CPUs/GPUs from time to time when saving cash, but if they think they can price their GPUs against Nvidia based solely on performance, their footprint/stock will continue to dwindle. You still get more dips and dives with AMD GPUs under stress, plus the issues/bugs that come with AMD's drivers.
Most people are not going to buy a Fury X over a 980Ti unless it's $25-$50 cheaper, I don't care what cooler it comes with.
The 980Ti is around $650 right now, the Fury X should be around $619, and the Fury is priced right at $550, but they would make a killing if they priced it at $500.
 
I like the Sapphire cooler.
I can see that Sapphire thought of keeping the system ram cool as well.
 
I never thought about it until now, but assuming Nvidia does adopt HBM, how will that affect not only the price game between AMD and Nvidia, but also Nvidia's own high-end catalog? Especially against the Fury series, I've read people counting the ability to overclock an Nvidia card (or lack thereof with the Fury card) as a potential deal breaker. If Nvidia uses the same kind of HBM, they'll lose a lot of headroom for said overclocking. In addition, for those who were not afraid to overclock, the GTX 970 was a popular alternative to the 980. The Fury X has offered practically no headroom for that endeavor; will the same be true for future Nvidia cards? Only time will tell, I suppose.
 
You still get more dips and dives with AMD GPU's under stress and the issues/bugs that come with AMD's drivers.
While I can't comment on the price, I can safely say that your above statement is false. All AMD cards have more than adequate cooling and don't throttle down under high loads.
As for issues and driver problems, you have the same problems with Nvidia cards too, but if you want driver updates for your card then you should probably go with AMD, since Nvidia stops making performance updates for "older" cards relatively fast.
 
I never thought about it until now, but assuming Nvidia does adopt HBM, how will that affect not only the price game between AMD and Nvidia, but also within Nvidia's own high end catalog?
Would you expect a paradigm shift when both companies are using the same memory standard? This really hasn't been the case with any previous standard (EDO, VRAM, SDR, DDR.......DDR3, GDDR5).
Especially against the Fury series,
I suspect by the time Pascal and Arctic Islands appear, the Fury line will be relegated to mainstream performance (and hence pricing), making it nonviable from a manufacturing point of view.
HBM- and HBM2-equipped cards will be the province of the enthusiast tier only for at least the next generation or two. Production cost makes it uneconomic to extend the technology into the mainstream (although AMD might have to bite the bullet on costs for HBM with APUs to remain in the market with Intel).
I've read people counting the ability to overclock an Nvidia card (or lack thereof with the Fury card) as a potential deal breaker. If Nvidia uses the same kind of HBM, they'll lose a lot of headroom for said overclocking.
You're assuming that a more mature HBM2 suffers the same limitations as the initial HBM1 implementation. That may not be the case. In any event, the bandwidth offered by HBM2 may make the need to overclock it superfluous. GPU overclocking, on the other hand, is still at the whim of the company and the technological hurdles that currently exist (input power requirements, voltage limits for the silicon, heat output).
 
Generally speaking I've had a LOT more problems with AMD drivers than NVIDIA.

And yes, a Fury X really needs to be lower priced. If the 980Ti drops to $620 (and let's face it, NVIDIA is gouging atm), you'd be crazy to go anywhere near a Fury X.
 
@dividebyzero, yes? Maybe? I don't know, funny things sometimes happen when new tech comes out. *shrug*
You're assuming that a more mature HBM2 suffers the same limitations as the initial HBM1 implementation.
Actually I'm not, though it was poorly worded.
If Nvidia uses the same kind of HBM...
The rest is good to know, I had not looked much into the technology yet myself honestly, and the post was more food for thought than anything.
 
While I can't comment on the price, I can safely say that your above statement is false.
Hopefully they've addressed it.
My 6970 had lower minimum FPS than my 570 and dipped lower under stress.

cards have more than adequate cooling and don't throttle down under high loads.
When you have generation after generation of hot-running, power-hungry GPUs whose stock coolers make so much noise it bothers gamers, and a new flagship that needs a waterblock from the get-go, with no OC headroom and a noise issue of its own... you're not a company known for "adequate cooling". C'mon now, adequate is the nice way of putting it.
My 6970 was hot and loud with very little headroom. Man, how far they've come :D
It's not that bad, but let's be honest here.
As for issues and driver problems, you have the same problems with Nvidia cards too, but if you want driver updates for your card then you should probably go with AMD since nvidia stops making performance updates for "older" cards relatively fast.
Overall, AMD's drivers have usually been more problematic than Nvidia's; it's common knowledge/advice given by many professional reviewers and forums going on years now. Last I checked it's not that bad anymore, and my opinion doesn't really matter anyways, but I've experienced those results as well through the years and I'm part of the majority.
You get what you pay for.
 
When you have generation after generation of hot-running, power-hungry GPUs whose stock coolers make so much noise it bothers gamers, and a new flagship that needs a waterblock from the get-go, with no OC headroom and a noise issue of its own... you're not a company known for "adequate cooling". C'mon now, adequate is the nice way of putting it.
My 6970 was hot and loud with very little headroom. Man, how far they've come :D
It's not that bad, but let's be honest here.
According to ALL benchmarks, you will notice that power consumption is on average 20W more for AMD on the high-end cards (the Fury lowers this to around 10W). This means that you might pay a few bucks more per year.
As for temperatures, AMD actually has great OEM solutions that keep the cards nice and cool (let me repeat: AMD no longer has problems with GPUs throttling from heat). Some are actually much better than what Nvidia offers. If you want to mention very old video cards that are no longer being sold, then should I also mention Fermi, or the Nvidia mobile GPUs that broke my laptop because of overheating?
 
According to ALL benchmarks, you will notice that power consumption is on average 20W more for AMD on the high-end cards (the Fury lowers this to around 10W). This means that you might pay a few bucks more per year.
Yeah, like I said, it's not that bad anymore, but it was quite prevalent at one time, adding to my overall point. Their reference cards usually leave a lot to be desired.
PatoRodrigues said:
Do not buy the reference R9 290 with the stock cooler. They have a POS memory overheating problem that is probably causing black screens all over the place. Trust me, I bought two of these from XFX.


As for Temperatures, AMD actually has great OEM solutions that keep the cards nice and cool. (let me repeat: AMD no longer has problems with GPUs throttling from heat)
Just to mention, the issues I experienced with my 5XXX and 6XXX series cards dipping didn't occur due to temps. However, it doesn't negate my point about their longstanding issues with cards running warmer/louder than the competition.
Today's non-reference cooling solutions like the Windforce 3X have really helped eliminate many concerns.

Some are actually much better than what Nvidia offers. If you want to mention very old video cards that are no longer being sold, then should I also mention Fermi, or the Nvidia mobile GPUs that broke my laptop because of overheating?
Again, that is a handpicked viewpoint from the minority and a rare example. Nvidia's cards have been much more efficient for years/generations now. The laptops I've tossed due to overheating issues over the years have mostly had AMD GPUs. From stuttering, heat throttling, and game-launch driver bugs to FreeSync tech, they are usually a few steps behind. Nvidia's architecture is usually more efficient.

Biggieshady said:
They both work fine for most titles, but I'd say Nvidia drivers are more mature, in that you can use beta drivers and expect stability with all games; with AMD you often need to pick the WHQL driver version that has the least issues with the one or two problematic games in your collection.
zentex said:
AMD drivers have become better, but they are too far behind Nvidia when it comes to stable drivers.

Maybe this isn't true anymore and if so, I'll man up and eat my words. My apologies for coming across as an AMD hater! I am not trying to instigate an argument or pick a side; I am merely speaking from experience and what I've seen. After doing some recent checking, most people now say it's very close, so maybe I am in the wrong here.
 
How can Nvidia do anything with HBM when AMD owns the patents?!
LOL, who told you AMD holds the patents?
Guest account used for disseminating FUD, who da thunk it?

1. The technology was co-developed with AMD, but SK Hynix "owns" the patents. In broad terms, DRAM stacking has many patent holders, not least of which are Intel and Micron.
2. The technology is an open standard (JESD235) not proprietary. Licence the tech, and a company has the same technology as anyone else who licences it.
3. Nvidia's Pascal is already confirmed for use with HBM2 for enthusiast level consumer and professional graphics cards.
 