Leaked: ATI Radeon HD 6990 specifications

By Emil
Nov 24, 2010
  1. The AMD-Nvidia war is about to go dual GPU. Following the leak of AMD Radeon HD 6970 benchmark numbers, a slide from what appears to be a presentation on the AMD Radeon HD 6990 (codenamed Antilles) has leaked, courtesy of user Gast on the 3DCenter forums.

  2. madboyv1

    madboyv1 TechSpot Paladin Posts: 941   +42

    I'm not the only one that is excited that Nvidia is back in the game and that the GPU war is off to a fresh start, am I?
  3. Legendle2007

    Legendle2007 Newcomer, in training

    I don't think Nvidia was ever out of the game... the HD 5000 series was good, but it only matched Nvidia's level. For the most part, Nvidia has been winning the GPU war.
  4. Wagan8r

    Wagan8r TechSpot Guru Posts: 584   +45

    Nope. I'm pumped about it too! I can't afford the high-end cards, but I'm ready for the trickle-down effect to kick in.
  5. bioflex

    bioflex Newcomer, in training Posts: 70

    Seriously, I don't really care about all the technical numbers here; all I care about is whether the card would be worth the money spent on it... and here's to wishing AMD takes the crown for best GPU performance.
  6. Well, everyone is entitled to their view of history...
  7. myrmidonks

    myrmidonks Newcomer, in training Posts: 65

  8. dividebyzero

    dividebyzero trainee n00b Posts: 4,783   +639

    The bandwidth would imply that the card is using the older 5 Gbps GDDR5 chips.
    HD 6990 = 307.2 GB/sec
    HD 5870 = 153.6 GB/sec x 2 GPUs = 307.2 GB/sec

    And it's definitely using a 256-bit memory bus.
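    A quick sanity check on that arithmetic (a sketch in Python; the 4.8 Gbps effective data rate is the HD 5870's published memory speed, the rest is just the standard bandwidth formula):

        # bandwidth (GB/s) = bus width in bytes * effective data rate (Gbps)
        def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
            return bus_width_bits / 8 * data_rate_gbps

        hd5870 = bandwidth_gb_s(256, 4.8)   # 153.6 GB/s per GPU
        hd6990 = 2 * hd5870                 # two GPUs -> 307.2 GB/s total
        print(hd5870, hd6990)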
  9. Cueto_99

    Cueto_99 TechSpot Enthusiast Posts: 240   +11

    Both the GTX 590 and AMD 6990 specs are impressive, to say the least... as impressive as their future retail prices will be... But it's good to see both companies going for the top spot again! My good wishes to AMD :)
  10. princeton

    princeton TechSpot Addict Posts: 1,716

    They will. For about 2 weeks, until Nvidia releases their dual-GPU card. Then we all wait for the next gen. In terms of multi-GPU, the only time ATI ever came out on top was the HD 5970.
  11. Adhmuz

    Adhmuz TechSpot Paladin Posts: 893   +98

    Now am I the only one looking at this and thinking Nvidia is going to have a hard time matching or beating this card without something that consumes 400+ watts of power? By the looks of it, ATI is pairing up two 6970s to make this beast. The 6970 is going to be ATI's flagship single-GPU card, with power consumption in the realm of 200 watts, and I imagine it's being downclocked to hit the 300-watt number they posted. The 6970 is rumored to be 10% faster than a GTX 480, making it 10-20% slower than the GTX 580. If by some amazing feat of technological ingenuity Nvidia can put two of those 250-watt chips onto a single PCB without lowering their specs, then they'll have it, but I honestly don't see that happening. With heat and power consumption in mind it's going to be very close, and I wouldn't be surprised if the two cards end up within 5% of each other.
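    A rough sketch of that downclocking arithmetic (illustrative Python only; it assumes dynamic power scales with frequency times voltage squared, with voltage dropping in step with clocks, so power goes as roughly the cube of frequency, and the 250 W and 300 W figures are the rumored numbers above):

        # Two ~250 W GPUs must fit under a ~300 W board limit.
        # Assume power ~ f^3 (frequency * voltage^2, voltage tracking frequency).
        two_gpu_power = 2 * 250.0                         # watts at full speed
        board_limit = 300.0                               # watts, PCIe ceiling
        scale = (board_limit / two_gpu_power) ** (1 / 3)  # clock multiplier
        print(f"clocks at ~{scale:.0%} of full speed")    # -> ~84%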
     
  12. Evanspec

    Evanspec Newcomer, in training

    NVIDIA and AMD are going at it hard now. I've only been involved with computer building for about a year and a half (I haven't built one yet), but this is the hardest they've fought each other since I got interested, back when the GTX 200 and HD 4xxx series were the flagships. They are fighting for the top name and the top performance, and we, the consumers, get to sit back and watch the sparks fly. It'll only make prices drop and performance soar. A toast to rivalry!
  13. dividebyzero

    dividebyzero trainee n00b Posts: 4,783   +639

    @Adhmuz
    Bear in mind that the HD 6990 need not actually adhere to the 300W PCIe spec in its entirety. The card is supposed to have an integrated power limiter that monitors power draw every GPU clock cycle and dynamically adjusts power usage according to which shader blocks are active at any given time. Since only a "power virus" such as OCCT or FurMark actually keeps the whole GPU+VRAM active at once, the card's theoretical peak draw can effectively exceed 300 watts while actual consumption never breaches that figure - hope I explained that well enough to be understood.
    The GTX 580 can call on a similar feature, although AMD's hardware solution seems a much better alternative to Nvidia's driver-based one.
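    A minimal sketch of how such a power limiter might behave (a hypothetical Python model; the cap, step size, and per-block estimates are made-up parameters, not AMD's actual implementation):

        # Toy power limiter: estimate draw from the active shader blocks,
        # throttle the clock when over the cap, recover headroom when under.
        CAP_WATTS = 300.0

        def regulate(active_block_watts, clock_mhz, max_mhz, step_mhz=10):
            estimated = sum(active_block_watts) * (clock_mhz / max_mhz)
            if estimated > CAP_WATTS:
                return max(clock_mhz - step_mhz, 0)        # throttle down
            return min(clock_mhz + step_mhz, max_mhz)      # creep back up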

    @Evanspec
    This is pretty minor stuff compared with the Nvidia G80 (8800 GTX/Ultra) vs. ATI R600 (HD 2900 XT) battle of 2007. The next process node (28nm: Nvidia's Kepler vs. AMD's Southern Islands), due in the latter half of next year, is shaping up to be a doozy though.
  14. fpsgamerJR62

    fpsgamerJR62 Newcomer, in training Posts: 489

    Time to bring out the heavyweights in the GPU wars. Here's a challenge to Nvidia and AMD. Let's see who can offer gamers the lowest price on a next-gen dual-GPU card. :)
  15. I think everyone is forgetting something. It isn't a single GPU, it's a single PCB.
  16. This sounds like fanboy logic or an Nvidia employee statement, lol. Nvidia lost a lot of ground/customers to the 4xxx and 5xxx series. The ATI/AMD 4xxx never actually beat them as far as benches go, but it was a way better value (cost vs. performance); as for the 5xxx, it destroyed Nvidia on value... and benches were 50/50 (Nvidia only benched higher in Nvidia-backed games). Now as far as the 580, 580x2 and the 6990 go: the 580 costs more than a 5970, while the 5970 costs $50 less and is 30-40% faster (yes, I know it's 2 GPUs, but the fact is the old 5970 is a WAY better value). The 6990 will cost about $650 when it first comes out and the 580x2 will be $900; for $250 more it needs to be a lot better, and I don't think it will be. The point is, if Nvidia doesn't get its price point/cost down and value up, it is going to die a slow death. I admit I am a value bandwagoner; I will jump aboard whatever is doing best at the moment within my budget, and most people are like this. Anyhow, you said they were never out... I don't think they're even back in yet... the 580's value just isn't there... As of right now, I plan on getting 2x 6970 right after Xmas (might hold out for the 6990), because it's time for me to upgrade and this will be the best VALUE. (Still running 3x 4850s (P22,000 3DMark Vantage).)
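    Taking those rumored prices at face value, the value math is simple (a sketch; both prices are the unconfirmed figures from the comment above):

        # At $650 vs $900, how much faster must the dearer card be just to
        # match the cheaper one on performance per dollar?
        hd6990_price, gtx580x2_price = 650.0, 900.0
        required = gtx580x2_price / hd6990_price - 1
        print(f"~{required:.0%} faster needed to break even on value")  # ~38%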
  17. dividebyzero

    dividebyzero trainee n00b Posts: 4,783   +639

    The money might be better spent on an education.
  18. There's a lot of fighting going on.
    One thing I'd like to know is quad-CrossFire scaling with AMD's latest drivers.
     
  19. indiangamer

    indiangamer Newcomer, in training Posts: 26

    Yeah!! Now I am ready for a $1000 upgrade...
  20. edison5do

    edison5do Newcomer, in training Posts: 239

    That would be if the people at AMD are stupid or confident enough to release the card before Nvidia does, because after that Nvidia would keep DELAYING and DELAYING theirs until it beats the HD 6990. AMD should think a little bit more about strategy, because Nvidia is really showing they are much better at that.
  21. Yeah, they should delay like they did with the GTX 400; that worked out great for them.

    :S

    I don't know if it's better to buy one of those monster$ that will be the best for 5 years, or a half-price card that will be the best for, I don't know, 3 years. But I don't see any future game that will need more than a GTX 480, except maybe Crysis 2.
  22. Johny47

    Johny47 Newcomer, in training Posts: 157

    Looks great; now all AMD has to do is make drivers that these new cards deserve, for once =/
  23. dividebyzero

    dividebyzero trainee n00b Posts: 4,783   +639

    Well played!
    Crysis 2 won't be the be-all-and-end-all game that some seem to think it will be (imo). Once game devs start making better use of DX11 features - S.T.A.L.K.E.R. 2 (CryEngine 3) comes to mind - realistic effects, ambient occlusion, more pervasive physics (destructible/interactive environments etc.), and more widespread use of tessellation, along with higher driver/game levels of MLAA/SSAA/TrSSAA (for example), should keep the graphics market ticking over for the foreseeable future... and once DX11 has run its race, there's always DirectX 12 and ray tracing.
  24. Jurassic4096

    Jurassic4096 Banned Posts: 158

    People also seem to forget that nVIDIA's CUDA cores are actually used by professionals, end users, scientists, servers, and video and picture editors. Also PhysX, great SLI scaling, etc.

    AMD has Eyefinity (a fad) and Crossfire. Havok physics is software-driven (owned by Intel), and time after time, new Catalyst driver releases add very, very little performance. AMD's GPUs and CPUs are cheaper because they have no choice. Why do you think the lesser-known brands at your supermarket are cheaper than the big brands? Why would it be any different with silicon?

    Jensen said it himself BEFORE Fermi was out: "There is no safety net at nVIDIA."

    AKA, go big or go home. AMD has yet to get off the couch if you ask me.
  25. dividebyzero

    dividebyzero trainee n00b Posts: 4,783   +639

    Not to mention ~99% of hospitals and health facilities for radiography (X-ray/CT/MRI tomography), audiology and numerous other branches of medical discipline... but how does that relate to the HD 6990? ...or are you just trolling?
    Highly unlikely...
    Patently untrue. The only negatives I think you can lay at AMD's graphics drivers are lack of legacy support, the (up until recently) prehistoric profile settings, and a smaller team of code writers. Crossfire and general gaming applications are for the most part very good. (I run both SLI and CFX.)
    You obviously never tried to buy an HD 5870 or 5890 some time between October 2009 and November 2010. CPUs, on the other hand, are more likely priced that way because 1. the process they use (45nm) is ancient - the tooling and R&D costs were amortized some time ago - and 2. they need to maintain marketshare (see recent drops in GTX 460 pricing for a comparison).
    And just how much marketshare, mindshare and revenue did that ethos cost Nvidia when the GT212 failed to materialize and GF100 (Fermi) was hurriedly pressed into action as a desktop card - which was never Nvidia's original intention?
    Nvidia ended up doing both... I'm definitely thinking tr...
    ...oll

