Leaked: ATI Radeon HD 6990 specifications

Emil


The AMD-Nvidia war is about to go dual GPU. Following the leak of AMD Radeon HD 6970 benchmark numbers, a slide from what appears to be a presentation on the AMD Radeon HD 6990 (codenamed Antilles) has leaked, courtesy of user Gast on the 3DCenter forums.

If the slide is legitimate, and it appears to be, the graphics card will integrate 3840 stream processors and 4GB of GDDR5 memory clocked at an effective 4.80GHz. It will be equipped with two DVI-I and three Mini DisplayPort (mDP) connectors. Power consumption is rated at 300W under load and around 30W at idle. The next-generation dual-chip flagship from AMD will carry two codename Cayman GPUs with 1920 stream processors per chip. The card will deliver 6.0 trillion floating point operations per second (TFLOPS) of single-precision performance, or up to 1.5 TFLOPS double-precision.
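For anyone who wants to check the slide's math: assuming AMD's usual two floating-point operations per stream processor per clock, the quoted 6.0 TFLOPS implies a core clock of roughly 781MHz. Note the clock is inferred here from the other figures, not taken from the slide:

```python
# Back-of-the-envelope check of the leaked figures.
stream_processors = 3840
flops_per_sp_per_clock = 2  # AMD's usual figure for its VLIW shaders

# Infer the core clock from the quoted 6.0 TFLOPS (an assumption, not from the slide)
core_clock_hz = 6.0e12 / (stream_processors * flops_per_sp_per_clock)
print(f"implied core clock: {core_clock_hz / 1e6:.0f} MHz")  # ~781 MHz

single_precision = stream_processors * flops_per_sp_per_clock * core_clock_hz
print(f"single precision: {single_precision / 1e12:.1f} TFLOPS")       # 6.0
print(f"double precision: {single_precision / 4 / 1e12:.1f} TFLOPS")   # 1.5 (1/4 rate)
```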

If it were released today, Antilles would become the top-performing graphics card in the world. Nvidia is rumored to be delaying its dual-GPU GTX 590 in order to implement further improvements so that it can once again beat AMD's offering (Nvidia currently holds the crown for single-GPU performance with its GeForce GTX 580). Antilles is slated to ship in the first quarter of next year, though pricing has yet to be announced.

Permalink to story.

 
I'm not the only one that is excited that Nvidia is back in the game and that the GPU war is off to a fresh start, am I?
 
I don't think Nvidia was ever out of the game... the HD 5000 series was good, but it only matched Nvidia's level. For the most part, Nvidia has been winning the GPU war.
 
madboyv1 said:
I'm not the only one that is excited that Nvidia is back in the game and that the GPU war is off to a fresh start, am I?
Nope. I'm pumped about it too! I can't afford the high-end cards, but I'm ready for the trickle-down effect to kick in.
 
Seriously, I don't really care about all the technical numbers here; all I care about is whether the card will be worth the money spent on it... and here's to wishing AMD takes the crown for best GPU performance.
 
The bandwidth would imply that the card is using the older 5Gbps GDDR5 chips:
HD 6990 = 307.2 GB/s
HD 5870 = 153.6 GB/s x 2 GPUs = 307.2 GB/s

And it's definitely using a 256-bit memory bus per GPU.
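A quick sanity check on those numbers, taking the leaked 4.80GHz effective memory speed (4.8Gbps per pin) and a 256-bit bus per GPU:

```python
# Memory bandwidth sanity check for the leaked specs.
bus_width_bits = 256       # memory bus width per GPU
data_rate_gbps = 4.8       # effective GDDR5 data rate per pin ("4.80GHz" on the slide)

per_gpu = bus_width_bits / 8 * data_rate_gbps     # bytes per second, in GB/s
print(f"per GPU:    {per_gpu:.1f} GB/s")          # 153.6 GB/s, same as an HD 5870
print(f"card total: {per_gpu * 2:.1f} GB/s")      # 307.2 GB/s
```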
 
Both the GTX 590 and HD 6990 specs are impressive, to say the least... as impressive as their future retail prices will be... But it's good to see both companies going for the top spot again! My good wishes to AMD :)
 
bioflex said:
Seriously, I don't really care about all the technical numbers here; all I care about is whether the card will be worth the money spent on it... and here's to wishing AMD takes the crown for best GPU performance.

They will, for about two weeks, until Nvidia releases its dual-GPU card. Then we all wait for the next gen. In terms of multi-GPU, the only time ATI ever came out on top was with the HD 5970.
 
Now am I the only one looking at this and thinking Nvidia is going to have a hard time matching or beating this card without something that consumes 400+ watts of power? By the looks of it, ATI is pairing up two 6970s to make this beast. The 6970 is going to be ATI's flagship single-GPU card with power consumption in the realm of 200 watts, so I imagine the chips are being downclocked to hit the 300-watt number they posted. The 6970 is rumored to be 10% faster than a GTX 480, making it 10-20% slower than the GTX 580. If by some amazing feat of technological ingenuity Nvidia can put two of its 250-watt chips onto a single PCB without lowering their specs, then they'll have it, but I honestly don't see that happening. With heat and power consumption in mind it's going to be very close, and I wouldn't be surprised if the two cards end up within 5% of each other.
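For what it's worth, here is the rough shape of that downclocking argument as a purely illustrative calculation. It assumes the textbook dynamic-power model (P proportional to f x V^2) with voltage scaling alongside frequency, so power scales roughly with f cubed; the 30W board overhead is an invented placeholder and real silicon is far messier:

```python
# Purely illustrative: how far two ~250W GPUs might have to be downclocked
# to fit a 300W board budget. Assumes P ∝ f * V^2 with V scaling alongside
# f (so P ∝ f^3); the 30W board overhead is a made-up placeholder.
single_gpu_watts = 250.0
board_budget_watts = 300.0
board_overhead_watts = 30.0   # assumed VRM/memory/cooling share (hypothetical)

per_gpu_budget = (board_budget_watts - board_overhead_watts) / 2   # 135W each
clock_ratio = (per_gpu_budget / single_gpu_watts) ** (1 / 3)
print(f"each GPU at roughly {clock_ratio:.0%} of its full clock")  # ~81%
```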
 
NVIDIA and AMD are going at it hard now. I've only been involved with computer building for about a year and a half (I haven't built one yet), but this is the hardest they've fought each other since I became interested, back when the GTX 200 and HD 4xxx series were the flagships. They are fighting for the top name and the top performance, and we, the consumers, get to sit back and watch the sparks fly. It'll only make prices drop and performance soar. A toast to rivalry!
 
@Adhmuz
Bear in mind that the HD 6990 need not actually adhere to the 300W PCI spec in its entirety. The card is supposed to have an integrated power limiter that monitors power draw with every GPU clock cycle and dynamically adjusts power usage according to which shader blocks are active at any given time. Since only a "power virus" such as OCCT or FurMark actually causes the whole GPU+VRAM to be active at once, the card's theoretical maximum draw can effectively exceed 300 watts while actual consumption never breaches that figure - hope I explained that well enough to be understood.
The GTX 580 can call on a similar feature, although AMD's hardware solution seems a much better alternative to nvidia's driver-based one.
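To make the idea concrete, here's a toy sketch of that kind of closed-loop limiter. Everything in it is invented for illustration (the function, the clock numbers, the polling model); the real mechanism lives in hardware and reacts every clock cycle to shader-block activity rather than to a wattage reading:

```python
# Toy sketch of a closed-loop power limiter. Entirely hypothetical: the
# real implementation is in hardware and tracks shader-block activity per
# clock cycle rather than polling an estimated wattage like this.
POWER_CAP_WATTS = 300.0

def next_clock_mhz(current_mhz: float, estimated_draw_watts: float,
                   min_mhz: float = 500.0, rated_mhz: float = 830.0,
                   step_mhz: float = 10.0) -> float:
    """Back the clock off when the power estimate exceeds the cap,
    then creep back toward the rated clock once there's headroom."""
    if estimated_draw_watts > POWER_CAP_WATTS:
        return max(min_mhz, current_mhz - step_mhz)
    return min(rated_mhz, current_mhz + step_mhz)
```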

@Evanspec
This is pretty minor stuff compared with the Nvidia G80 (8800 GTX/Ultra) vs. ATI R600 (HD 2900 XT) battle of 2007. The next process node (28nm: nvidia's Kepler vs. AMD's Southern Islands), due in the latter half of next year, is shaping up to be a doozy though.
 
Time to bring out the heavyweights in the GPU wars. Here's a challenge to Nvidia and AMD. Let's see who can offer gamers the lowest price on a next-gen dual-GPU card. :)
 
I think everyone is forgetting something. It isn't a single GPU, it's a single PCB.
 
This sounds like fanboy logic or an Nvidia employee statement, lol. Nvidia lost a lot of ground/customers to the 4xxx and 5xxx series. The 4xxx ATI/AMD cards never actually beat them as far as benchmarks go, but they were a way better value (cost vs. performance), and as for the 5xxx series, they destroyed Nvidia on value while the benchmarks were 50/50 (Nvidia only benched higher in Nvidia-backed games). Now as far as the 580, the 580x2 and the 6990 go: the 580 costs more than a 5970, and the 5970 is $50 less and 30-40% faster (yes, I know it's two GPUs, but the fact is the old 5970 is a WAY better value). The 6990 will cost about $650 when it first comes out and the 580x2 will be around $900; for that extra $250 it needs to be a lot better, and I don't think it will be. The point is, if Nvidia doesn't get its price point down and its value up, they are going to die a slow death. I admit I'm a value bandwagoner; I'll jump aboard whatever is doing the best at the moment within my budget, and most people are like this. Anyhow, you said they were never out... I don't think they're even back in yet; the 580's value just isn't there. As of right now, I plan on getting 2x 6970 right after Christmas (might hold out for the 6990), because it's time for me to upgrade and that will be the best VALUE (still running 3x 4850s; P22,000 in 3DMark Vantage).
 
There's a lot of fighting going on.
One thing I'd like to know is quad-CrossFire scaling with AMD's latest drivers.
 
princeton said:

They will, for about two weeks, until Nvidia releases its dual-GPU card. Then we all wait for the next gen. In terms of multi-GPU, the only time ATI ever came out on top was with the HD 5970.

That would only be the case if the people at AMD are foolish or confident enough to release the card before Nvidia does, because after that Nvidia would keep DELAYING and DELAYING theirs until it beats the HD 6990. AMD should think a little bit more about strategy, because Nvidia is really showing that they are much better at it.
 
Yeah, they should delay like they did with the GTX 400; that was great for them.

:S

I don't know if it's better to buy one of those monster$ that will stay the best for five years, or a half-price card that will stay the best for, I don't know, three years? But I don't see any future game that will need more than a GTX 480, except maybe Crysis 2.
 
Yeah, they should delay like they did with the GTX 400; that was great for them.
:S
Well played!
I don't know if it's better to buy one of those monster$ that will stay the best for five years, or a half-price card that will stay the best for, I don't know, three years? But I don't see any future game that will need more than a GTX 480, except maybe Crysis 2.
Crysis 2 won't be the be-all-and-end-all game that some seem to think it will be (imo). Once game devs start making better use of DX11 features (S.T.A.L.K.E.R. 2 on CryEngine 3 comes to mind), realistic effects, ambient occlusion, more pervasive physics (destructible/interactive environments, etc.) and more widespread use of tessellation, along with higher driver/game levels of MLAA/SSAA/TrSSAA, should keep the graphics market ticking over for the foreseeable future... and once DX11 has run its race, there's always DirectX 12 and ray tracing.
 
People also seem to forget that nVIDIA's CUDA cores are actually used by professionals, end users, scientists, servers, and video and picture editors. Also PhysX, great SLI scaling, etc.

AMD has Eyefinity (a fad), and Crossfire. Havok physics is software-driven (and owned by Intel), and time after time, new Catalyst driver releases offer very, very little in added performance. AMD's GPUs and CPUs are cheaper because they have no choice. Why do you think the lesser-known brands at your supermarket are cheaper than the big brands? Why would it be any different with silicon?

Jensen said it himself BEFORE Fermi was out... "There is no safety net at nVIDIA."

AKA, go big or go home. AMD has yet to get off the couch if you ask me.
 
People also seem to forget that nVIDIA's CUDA cores are actually used by professionals, end users, scientists, servers, and video and picture editors.
Not to mention ~99% of hospitals and health facilities for radiography (X-ray/CT/MRI tomography), audiology and numerous other branches of medicine... but how does that relate to the HD 6990? Or are you just trolling?
AMD has Eyefinity (a fad),
Highly unlikely......
and time after time, new Catalyst driver releases offer very, very little in added performance.
Patently untrue. The only negatives I think you can lay at AMD's graphics drivers are the lack of legacy support, the (until recently) prehistoric profile system and a smaller team of code writers. Crossfire and general gaming performance are for the most part very good. (I run both SLI and CFX.)
AMD's GPUs and CPUs are cheaper because they have no choice.
You obviously never tried to buy an HD 5870 or 5890 some time between October 2009 and November 2010. CPUs, on the other hand, are more likely priced that way because 1. the process they use (45nm) is ancient, so the tooling and R&D costs were amortized some time ago, and 2. they need to maintain market share (see the recent drops in GTX 460 pricing for a comparison).
Jensen said it himself BEFORE Fermi was out... "There is no safety net at nVIDIA.".
And just how much market share, mindshare and revenue did that ethos cost nvidia when the G212 failed to materialize and GF100 (Fermi) was hurriedly pressed into action as a desktop card, which was never nvidia's original intention?
AKA, go big or go home.
nvidia ended up doing both... I'm definitely thinking tr...
AMD has yet to get off the couch if you ask me.
...oll
 