AMD Radeon HD 6990 specs leaked, rumored for March 8

Matthew DeCarlo


We first saw the upcoming Radeon HD 6990's specifications last November when an internal slide leaked online, but many details remained unknown -- until now. More slides have appeared on DonanimHaber, revealing nearly everything eager shoppers might want to know, except pricing.


Rumored to be released next Tuesday, the Radeon HD 6990 features two Cayman GPUs clocked at 830MHz, 3072 stream processors, 192 texture units, two 256-bit memory channels, and 4GB of GDDR5 running at 5000MHz. The card consumes a maximum of 375W (37W at idle).

By flipping a switch on top of the card, you can enable a built-in overclocking mode that boosts the clock frequency to 880MHz and increases the Radeon HD 6990's compute power from 5.10 TFLOPs to 5.40 TFLOPs. The max TDP is also elevated to 450W, but idle draw remains the same.
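Those TFLOPs figures line up with the usual back-of-the-envelope estimate of stream processors × 2 FLOPs per clock × core frequency; a minimal sketch, assuming that formula, is below.

```python
# Rough peak-compute estimate for the HD 6990's quoted figures, assuming the
# common "stream processors x 2 FLOPs per clock x core clock" formula.
def tflops(stream_processors: int, clock_mhz: float) -> float:
    return stream_processors * 2 * clock_mhz * 1e6 / 1e12

print(f"Stock mode (830MHz): {tflops(3072, 830):.2f} TFLOPs")  # ~5.10
print(f"OC mode    (880MHz): {tflops(3072, 880):.2f} TFLOPs")  # ~5.41 (quoted as 5.40)
```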


To keep temperatures in check, AMD is outfitting its latest flagship card with an updated dual-slot cooler featuring two vapor chambers and one center fan. The Radeon HD 6990's cooler offers 20% more airflow and 8% better thermal performance in the same form factor as the HD 5970.

The Radeon HD 6990 receives power via two 8-pin PCIe connectors and the rear I/O plate houses four Mini DisplayPort outputs alongside one DL-DVI port. Additionally, AMD will include three adapters with every card: Mini DisplayPort to SL-DVI Passive, SL-DVI Active, and HDMI Passive.


AMD's performance slides show the Radeon HD 6990 running in OC mode is, on average, 67% faster than the single-GPU GeForce GTX 580 when playing games at 2560x1600. The lead is especially dramatic in Crysis Warhead and the Dragon Age II demo, but again, these are AMD's numbers.

According to AMD's charts, you can expect a 60% to 100% performance increase in extremely high-resolution applications when two of the dual-GPU cards are configured in Quad CrossFire. As usual, you can expect us to provide a full review when the Radeon HD 6990 arrives.


 
Good to hear that TS are getting a review sample. red1776 and I were debating whether AMD would build enough cards to cater for everyone!
Any word on whether the AIBs will enforce AMD's edict that using the second BIOS switch setting and/or AMD Overdrive automatically voids any warranty claim?
"AMD warranty does not cover damages caused by overclocking, even when overclocking is enabled via AMD software and/or the Dual-BIOS Function on the AMD Radeon™ HD 6990".
- From the fine print in this slide.
 
If the stats are true, the price would at least equal the difference in performance. That's insane performance.
 
At stock clocks (830 core/shader, 5000 mem): better than CrossFired HD 6950s, worse than CrossFired HD 6970s.
At the "OC'ed" clocks (880 core/shader, 5000 mem), the card will be close to HD 6970 CrossFire, the only difference being the lower memory frequency (the HD 6970 runs at 5500MHz effective).
 
375W is a sick amount of TDP. Running two of these bad boys and an i7, you'd need a thousand-watt PSU. Wicked.
 
And then some.

(375 x 2) + 180 to 250W for the CPU -assuming you're OC'ing it, which is probably a fair assumption if you're going quadfire- gives you 930-1000W.
Bear in mind that most very good 1kW+ PSUs are in the 85-88% efficiency range, so you're most likely looking at 1200W, and that leaves very little margin for error.
Hit the 880MHz clock profile (415-450W) and you're looking for a 1500-watter that can handle a 187.5W power draw through each 150W-rated PCI-E 8-pin line, which probably goes some way to explaining why AMD will void the warranty if things go badly with this card when overclocked.
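To make that budget explicit, here's a minimal sketch of the arithmetic; the 180-250W CPU allowance is the assumption above, and the 450W per-card figure is the OC-mode ceiling from AMD's slides.

```python
# Rough quadfire system budget: two cards at their rated board power
# plus an overclocked CPU, before any PSU headroom is considered.
def system_draw_w(card_w: float, cpu_w=(180, 250)) -> tuple:
    return 2 * card_w + cpu_w[0], 2 * card_w + cpu_w[1]

print("Stock profile (375W/card):", system_draw_w(375))  # (930, 1000)
print("OC profile    (450W/card):", system_draw_w(450))  # (1080, 1150)
```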

Having said that, if Nvidia follow the same "performance at any cost" approach with the GTX 590, then rest assured it won't be just the quadfire owners checking out the 100-amp PSUs. Still, if you can afford US$1300+ (my estimate) of GPUs, then sourcing a big PSU, or a chassis with dual-PSU mounting ability and a couple of 1kW-class units, shouldn't represent too much of a hurdle.
 

hahaha, now my 1650W/132A doesn't seem so dumb does it?? ...or maybe it does:wave: I had my machine pegged for almost 1300W at full load. Since then, I have seen a couple of articles/charts that show an amazing economy of scale, as it were, for multiple (2+) cards. Not sure I'm buying it. These things are load-leveled and using AFR, so where exactly would the (in my case) supposed 25%+ economy come from?
 
hahaha, now my 1650W/132A doesn't seem so dumb does it?? ...or maybe it does:wave:
Who opined that? Obviously someone who shouldn't be labelled an "enthusiast".
No such thing as too much power...unless you're using such a small percentage of the output that the efficiency tanks, but I doubt people pick up a 1500W Strider to power an HTPC.

The big question is how are the reviews going to reflect the actual card?
Most sites tend to bench at stock, and measure thermals, noise and power consumption for the same. Most sites also throw in OC'ed results but tend to gloss over increased noise/power consumption/heat -mostly because, I suspect, overclocks are subjective (non-standard). What happens when you have a card that has essentially standardized two modes of operation -double benchmarks in all categories? Bearing in mind that the card could/can be overclocked further than the 880 setting...three sets of benchmarks?
 
Who opined that? Obviously someone who shouldn't be labelled an "enthusiast".
No such thing as too much power...unless you're using such a small percentage of the output that the efficiency tanks, but I doubt people pick up a 1500W Strider to power an HTPC.

I have just taken crap for "overkill" even though it is obviously not.

The big question is how are the reviews going to reflect the actual card?

It will be fun to see how the various sites handle the reviews. I would say that some would like to opt out of the 'second switch' set of reviews; however, they may not have a choice, as they will get scooped if they do.
Most sites tend to bench at stock, and measure thermals, noise and power consumption for the same.

Yes...but this is for the heavyweight title. I suspect that Guru's review may be about 50 pages! LOL
 
What I am interested in is the performance of these cards in Xfire on the new 1155 platform. I might be mistaken, but I believe these new motherboards only run SLI and Xfire at x8 instead of the x16 that X58 motherboards can handle.

From what I have read (link below), even the GTX 480 is too powerful to be used in SLI in an x8 setup without losing performance. Wouldn't these cards, being more powerful, lose even more performance in Xfire on an x8 setup?


http://www.techpowerup.com/reviews/NVIDIA/GTX_480_PCI-Express_Scaling/1.html
 
Hence my musing on the 187.5W draw on nominal 150W cables.
I think benching and using the card comes with some interesting caveats. I don't think I can remember a component being released that, when used at a manufacturer's setting, has the ability to take down the core elements of a system.
Theoretically, most good PSUs shouldn't be stressed to provide 187.5W/15.6A over a line built for 150W/12.5A, but that doesn't take into account multi-rail PSUs, where the total wattage might be acceptable but is split over 6-8 separate 12V rails and individual rail overhead is low. Nor does it take into account PSU ageing or the use of peripheral (Molex) to PCI-E adapters -the latter could be a real cause for concern if the card comes bundled with them.
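To put rough numbers on the per-connector and per-rail worry, here's a minimal sketch; the 75W slot allowance and 150W 8-pin rating are the standard PCI-E figures cited above, while the six-rail/20A-per-rail split is a purely illustrative assumption, not the spec of any particular PSU.

```python
# Per-connector draw at the card's 450W OC ceiling: 75W can come through
# the PCIe slot, the remainder is split across two 8-pin leads that are
# nominally rated for 150W (12.5A at 12V) each.
SLOT_W, RAIL_V = 75, 12
per_8pin_w = (450 - SLOT_W) / 2      # 187.5W per 8-pin connector
per_8pin_a = per_8pin_w / RAIL_V     # ~15.6A vs the nominal 12.5A

# On a multi-rail unit each 12V rail is individually current-limited;
# assume (illustratively) a PSU split into six 20A rails.
rail_limit_a = 20
print(f"{per_8pin_w:.1f}W -> {per_8pin_a:.1f}A per 8-pin lead; "
      f"headroom left on a {rail_limit_a}A rail: {rail_limit_a - per_8pin_a:.1f}A")
```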

I'm hopeful that AMD and its board partners make the BIOS switch and its implications crystal clear to users. Most buyers of the card should have enough of a knowledge base to work out for themselves what is involved, but there will be a segment of buyers who will buy simply because the bar graphs and framerate numbers look the best.
I have just taken crap for "overkill" even though it is obviously not.
You rollin', they hatin'

I'm guessing it's e-peen envy and/or a severe lack of technical knowledge...both of which, I'm guessing, will come as a HUGE shock to you :rolleyes:. The rest are simply either flaming for fun or have a completely different set of checkboxes ticked for their own personal needs and are too intransigent, or lack the imagination, to see past their own situation.
My advice: have fun with 'em until it gets boring, then hit the ignore setting when/if it gets repetitious.
 
If the BIOS switch is there, then somebody's likely to use it regardless of all the disclaimers and legalities in the fine print. When that happens, somebody's PSU could get blown up or his PC badly damaged, and AMD could be in a real load of trouble. And for what? Some nice benchmark numbers to post against Nvidia? By the way, they call it "fine print" because it's usually too small to read, so most people never bother to read it anyway.
 
Wow. That's a powerful graphics card. Does anyone know if a 4 way x-fire setup is possible with this?
 
Man that looks powerful...one day I will build myself a crazy system with something like this =D.
 
No.
CrossFire supports a maximum of four GPUs: either two dual-GPU cards, four single-GPU cards, or one dual and two single-GPU cards.

Having more than four GPUs is only applicable for a render/folding farm, not gaming.
 
Can anyone tell me how many 8800GT OCs it would take to equal one HD 6990, just for the sake of comparison? I bought my 8800GT back in 2009 and I am able to play all games at high res. Don't want to upgrade for a year.
 
If you don't want to upgrade for a year, then don't bother worrying about the 6990. By the time you're ready to upgrade, this card will be on eBay for relative peanuts.

Want to do some ghetto comparative benchmarking? See how CrossFired HD 6970s do against a GTS 450 (a slightly higher-performing card than the 8800GT).
 