TechSpot

ATI's R600 appears to be a complete monster...

By wolfram
Nov 15, 2006
  1. MetalX

    MetalX TechSpot Chancellor Posts: 1,909

    140.1GB/s... ... ... ... **dies** ... ... ... ...
     
  2. F1N3ST

    F1N3ST TS Rookie Posts: 1,088

    Remember that merger with AMD, the king of memory controllers? That's now coming into play, lol.
     
  3. wolfram

    wolfram TechSpot Paladin Topic Starter Posts: 2,605   +9

    That 512bit memory bus is awesome :eek:

    Everything looks phenomenal, except power consumption. Imagine how much power it will require for Crossfire operation... Even a PCP&C 1KW PSU wouldn't be enough for them.

    I just wish that GPUs were more power-efficient, just like Conroe.

    Maybe someday, GPUs and CPUs will be cooled only by small heatsinks, without any ****ing fan. Imagine a completely silent PC, consuming less than 200W @ 100% load, and staying very cool.....

    But it appears that AMD (ATI) and Nvidia don't care about heat and power consumption issues.....
     
  4. Rage_3K_Moiz

    Rage_3K_Moiz Sith Lord Posts: 7,291   +25

    OMG 2GB of VRAM?! *dies*
     
  5. wolfram

    wolfram TechSpot Paladin Topic Starter Posts: 2,605   +9

    I think that will be only for FireGL cards. Maybe "normal" R600-based cards will have "only" 1GB of RAM, or 1.5GB.
     
  6. agi_shi

    agi_shi TS Rookie Posts: 507

    Wow, if I get that card I'd literally have more VRAM than system RAM... But the prices will probably be something like $2000 ;).
     
  7. mailpup

    mailpup TS Special Forces Posts: 8,506   +237

    When those ATI cards come out, I'd be interested in getting one. I'd be leapfrogging over the current Nvidia top of the heap card.

    I don't know if this has ever been discussed, but with cards becoming larger and more powerful, I wonder if they might evolve into having replaceable/upgradeable memory chips and GPUs. Sort of like becoming mini-motherboards, as they seem to be these days. Thermaltake even markets a separate dedicated GPU power supply. So to upgrade these "mini-motherboards" you would only have to replace the appropriate parts, maybe update a separate GPU BIOS and driver software, and you're good to go, at least for a while until, like a motherboard, you have to upgrade the base card.
     
  8. nickslick74

    nickslick74 TS Rookie Posts: 883

    I have to wonder, with new, faster cards coming out much faster than they used to, when is PCIe x16 going to jump up in speed? Won't these new cards eventually get bottlenecked by the bandwidth limits of PCIe x16?
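A quick back-of-the-envelope sketch of why the slot is rarely the limiter (the PCIe 1.x figures are from the spec; the comparison point is the ~140 GB/s of VRAM bandwidth discussed in this thread):

```python
# PCIe 1.x signals at 2.5 GT/s per lane and uses 8b/10b encoding,
# so only 8 of every 10 bits are payload: 2.0 Gbit/s per lane, per direction.
lanes = 16
payload_gbit_per_lane = 2.5 * 8 / 10                # 2.0 Gbit/s
pcie_x16_gb_s = lanes * payload_gbit_per_lane / 8   # bits -> bytes
print(pcie_x16_gb_s)   # 4.0 (GB/s per direction)
```

That 4 GB/s only has to carry uploads (textures, geometry, commands); once the data is resident in VRAM, the shading workload runs against the card's local ~140 GB/s, so the slot tends to matter far less than it looks on paper.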
     
  9. Sharkfood

    Sharkfood TS Guru Posts: 1,199

    There has been some rather discouraging information about the G80's extremely poor dynamic branch performance in shaders. The previous 6/7-series cards also suffered from this, and it appears the G80 doesn't tackle this big shortcoming either.

    I've been seeing benchmarks comparing the G80's dynamic branching performance, which in specific conditions comes in at around 4% of an X1950's. The raised ceilings afforded by DX10 will just be a waste if shader branching is in any way crippled on one architecture.

    As it stands right now, posted examples of dynamic branching/shaders aren't complex enough to exhibit the problem. It's anyone's guess whether developers will code for ATI/AMD's dynamic branching or NV's... if it's the latter, this will continue to be a non-issue... but it's discouraging that this facet of the architecture, while improved over the previous generation, may be a problem.

    It's still being debated/ironed out whether this is some compiler shader optimization, a driver/compiler issue with the G80, or a real, honest problem. Unfortunately, much like with the FX series, any debates or discussions concerning this are being muddled, and the messengers are being harassed by "cronies"... I'm hoping those working on exploring this issue will be able to work past this rather nasty trend.
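For anyone wondering why dynamic branching is such a big deal on these parts: GPUs shade pixels in lockstep batches, and if any pixel in a batch takes a branch, the whole batch pays for that path. A toy model of this (the batch sizes and per-path costs below are illustrative, not vendor specs):

```python
# Toy model of SIMD branch divergence: pixels are shaded in lockstep
# batches, and a whole batch pays the expensive path if any one of its
# pixels needs it. Smaller batches waste less work on divergent branches.
def shading_cost(expensive_mask, batch_size, cheap=1, expensive=10):
    total = 0
    for i in range(0, len(expensive_mask), batch_size):
        batch = expensive_mask[i:i + batch_size]
        cost_per_pixel = expensive if any(batch) else cheap
        total += len(batch) * cost_per_pixel
    return total

# 1024 pixels; only the first 64 actually need the expensive path.
mask = [i < 64 for i in range(1024)]
print(shading_cost(mask, batch_size=16))    # 1600 -- fine granularity, near ideal
print(shading_cost(mask, batch_size=256))   # 3328 -- coarse batches drag everyone along
```

The gap widens as the expensive pixels get more scattered across the screen, which is presumably where benchmark figures like the 4% number above come from: an architecture with coarser branch granularity ends up running both sides of the branch for almost every batch.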
     
    1 person likes this.
  10. wolfram

    wolfram TechSpot Paladin Topic Starter Posts: 2,605   +9

    That's true, Sharkfood. Some people say that ATI's R600 will be very powerful in shader-rich games, just like the X1000 series.

    BTW, I don't think that PCI-E will be the bottleneck. It is very fast. But maybe we could see PCI-E x32... who knows :)
     
     
  11. wolfram

    wolfram TechSpot Paladin Topic Starter Posts: 2,605   +9

    Now the R600 becomes a X2800XTX :eek:

    750 MHz core clock and 2.2 GHz GDDR4 RAM!?!?!

    That is amazing :) It should destroy the 8800GTX, but I may be wrong...
     
  12. agi_shi

    agi_shi TS Rookie Posts: 507

    Heh, "should"? More like "will". And pricing at $600 puts it in 8800GTX territory... Hmm, $600 for an extreme card or $600 for a card that destroys the extreme card ;)
     
  13. MetalX

    MetalX TechSpot Chancellor Posts: 1,909

    It's even better than the 8900GTX. The 8900GTX is 700/2200... and still only 128 unified shaders.
     
    1 person likes this.
  14. agi_shi

    agi_shi TS Rookie Posts: 507

    Haha, NICE... If I'm getting this right, ATI's 64 unified shaders will do 4 calculations each, thus acting like 256 unified shaders... twice as many as the GeForce's. Now imagine the X2950XTX... Probably what I'll have once the X3950XTX comes out... My X850XT would seem like a... I have no idea. Haha.

    Wow... This stuff is really cool... Just imagine, X9950XTX...
     
  15. MetalX

    MetalX TechSpot Chancellor Posts: 1,909

    Just imagine Xfire, now that it's all internal... dual Crossfire bridges for 2x the bandwidth of SLI. Prob get like 45000 in 3DMark06 :D

    My guess on the X2950XTX is like maybe... 1000MHz/3000MHz, 128 unified shaders x4 ops = 512, a 512-bit bus... 192GB/s of bandwidth :) Just my wishful thinking :D

    But since ATi is no longer Canadian, I'm starting to find I prefer Nvidia. But I will concur that the X2800XTX will pwn the 8800/8900GTX/GTS.
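All of the bandwidth figures being thrown around in this thread come from the same formula: bus width in bytes times the effective data rate. A quick sanity check (the card specs here are this thread's rumors plus the shipping 8800GTX for scale, not confirmed numbers):

```python
def mem_bandwidth_gb_s(bus_bits, effective_ghz):
    """Peak memory bandwidth = (bus width in bytes) * (effective GT/s)."""
    return bus_bits / 8 * effective_ghz

print(f"{mem_bandwidth_gb_s(512, 2.2):.1f}")   # 140.8 -- rumored X2800XTX (512-bit GDDR4)
print(f"{mem_bandwidth_gb_s(512, 3.0):.1f}")   # 192.0 -- MetalX's X2950XTX guess
print(f"{mem_bandwidth_gb_s(384, 1.8):.1f}")   # 86.4  -- 8800GTX (384-bit GDDR3)
```

So the 192 GB/s guess is exactly what a 512-bit bus at 3 GT/s works out to, and the ~140 GB/s quoted for the R600 matches a 512-bit bus at 2.2 GT/s effective.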
     
  16. swker98

    swker98 TechSpot Paladin Posts: 1,348

    These cards look like monsters. I heard they're going to be very long too. Just forget about the price......

    On another note, does anyone know the prices/ETA of the midrange ATI cards that are comparable to the 8600 Ultra?

    Thanks
     
  17. agi_shi

    agi_shi TS Rookie Posts: 507

    If I remember correctly the X2800GTO should be $300, destroying any 8600 card. As for length, I believe they were at first 13 inches. Yes, a foot and an inch. However, I think ATI revised them so that they aren't longer than the 8800GTX. They shouldn't have, though...

    "Hey guys, check out my super-big, super-awesome 8800GTX... it's like, 11 inches!"
    "Meh, that's nice. Nothing really compared to my 13-inch X2800XTX that will totally destroy your card though..."

    @MetalX:
    [sometime in the future]
    - Hello, I'm agi_shi and I have an X2800XTX card. It's really struggling in today's games and I like to game. Any suggestions on what to upgrade?
    - WOW, man, that's from like... the middle ages, man. Get yourself an X9600XT, it'll only cost you ~$120, and it'll destroy your X2800XTX. You'll also need to upgrade your PSU to a good, quality one of about ~3500W, though I recommend 4000W. You'll also need to upgrade your motherboard to one with a PCI-Express x512 slot.

    Haha.
     
  18. nickslick74

    nickslick74 TS Rookie Posts: 883

    Wow, the price will go up for those in the future? Maybe because of its antique status? :giddy:
     
  19. MetalX

    MetalX TechSpot Chancellor Posts: 1,909

    LOL if I'm still around Techspot by then, I'll try to remember to quote that exact post when you come running for help :D
     
  20. agi_shi

    agi_shi TS Rookie Posts: 507

    Not the 9600XT, the X9600XT. Big difference =). I'll be sure to sell my X850XT by then though (given that it still works ;) ). Probably will be worth a lot in 10 generations of cards.
     
  21. MetalX

    MetalX TechSpot Chancellor Posts: 1,909

    Lol, doubt it. An ATi Rage 128 Pro is NOT worth much ATM, and that's like 6 gens ago. I mean, if "worth a lot" means $0.50, then yes, I do see your point :D
     
  22. agi_shi

    agi_shi TS Rookie Posts: 507

    ... well, it'd be fun to tell someone you still have a working card 10 generations old, even if it IS worthless...
     
  23. MetalX

    MetalX TechSpot Chancellor Posts: 1,909

    Indeed, it IS fun... although mine is not 10 gens old YET.
     
  24. Mictlantecuhtli

    Mictlantecuhtli TS Evangelist Posts: 4,919   +9

    I'm sure everyone will see the difference between 150 fps with the "old" card and 200 fps with the X2800XTX. :blackeye:

    How many of you are still playing at 1024x768? At that resolution, this kind of card will be a waste of money (well, it will be no matter what, at this point).
     
Topic Status:
Not open for further replies.

