ATI's R600 appears to be a complete monster...

Status
Not open for further replies.
Remember that merger with AMD, the king of memory controllers? That's now coming into play, lol.
 
That 512bit memory bus is awesome :eek:

Everything looks phenomenal, except power consumption. Imagine how much power it will require for Crossfire operation... Even a PCP&C 1kW PSU wouldn't be enough for them.

I just wish that GPUs were more power-efficient, like Conroe.

Maybe someday, GPUs and CPUs will be cooled by small heatsinks alone, without any ****ing fan. Imagine a completely silent PC, consuming less than 200W at 100% load and staying very cool...

But it appears that AMD (ATI) and Nvidia don't care about heat and power consumption issues.....
 
I think that will only be for FireGL cards. Maybe "normal" R600-based cards will have "only" 1GB of RAM, or 1.5GB.
 
Rage_3K_Moiz said:
OMG 2GB of VRAM?! *dies*
Wow, if I got that card I'd literally have more VRAM than system RAM... But the price will probably be something like $2000 ;).
 
When those ATI cards come out, I'd be interested in getting one. I'd be leapfrogging over the current Nvidia top of the heap card.

I don't know if this has ever been discussed, but with cards becoming larger and more powerful, I wonder if they might evolve to have replaceable/upgradable memory chips and GPUs, sort of like the mini-motherboards they already seem to be these days. Thermaltake even markets a separate dedicated GPU power supply. To upgrade these "mini-motherboards" you would only have to replace the appropriate parts, maybe update a separate GPU BIOS and the driver software, and you're good to go, at least for a while until, like a motherboard, you have to replace the base card.
 
I have to wonder, with new, faster cards coming out much faster than they used to, when is PCIe x16 going to jump up in speed? Won't these new cards eventually get bottlenecked by the bandwidth limits of PCIe x16?
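For a rough sense of the headroom, here's a back-of-the-envelope sketch, assuming first-generation PCIe (2.5 GT/s per lane with 8b/10b encoding, i.e. 250 MB/s per lane per direction):

```python
# Rough per-direction bandwidth of a first-generation PCIe link.
# 2.5 GT/s per lane, 8b/10b encoding -> 2.0 Gbit/s usable -> 250 MB/s per lane.
MB_PER_LANE = 250  # MB/s per lane, per direction (PCIe 1.x)

def pcie_bandwidth_gb(lanes: int) -> float:
    """Aggregate one-way bandwidth in GB/s for a PCIe 1.x link."""
    return lanes * MB_PER_LANE / 1000

print(pcie_bandwidth_gb(16))  # x16 -> 4.0 GB/s
print(pcie_bandwidth_gb(32))  # hypothetical x32 -> 8.0 GB/s
```

So an x32 slot would only double the link; whether 4 GB/s is actually the bottleneck depends on how much of a frame's data has to cross the bus rather than stay in VRAM.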
 
There has been some rather discouraging information about the G80 and extremely poor dynamic branching performance in shaders. Previous 6/7-series cards also suffered from this, and it appears the G80 does not tackle this big shortcoming either.

I've been seeing benchmarks from others comparing the G80's dynamic branching performance, in specific conditions, at around 4% of an X1950's. The raised ceilings afforded by DX10 will just be wasted if shader branching is crippled on one architecture anyway.

As it stands right now, posted examples of dynamic branching/shaders aren't complex enough to exhibit the problem. It's anyone's guess whether developers will code for ATI/AMD's dynamic branching or NV's... if so, this will continue to be a non-issue... but it's discouraging that this facet of the architecture, while improved over the previous generation, may be a problem.

It's still being debated/ironed out whether this is some shader compiler optimization, a driver/compiler issue with the G80, or a real, honest problem. Unfortunately, much like with the FX series, any debates or discussions concerning this are being muddled, and the messengers are being harassed by "cronies"... I'm hoping those working on exploring this issue will be able to work past this rather nasty trend.
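The cost being described comes from SIMD execution: when pixels in the same hardware batch take different sides of a branch, the GPU effectively pays for both sides for the whole batch. A toy Python model of that effect (the batch sizes and cycle costs below are made-up illustration values, not real G80 or X1950 numbers):

```python
def simd_branch_cost(batch, cost_if=10, cost_else=2):
    """Cycles a SIMD batch spends on an if/else: each side is
    executed if at least one element in the batch takes it."""
    cost = 0
    if any(batch):       # some pixels take the 'if' side
        cost += cost_if
    if not all(batch):   # some pixels take the 'else' side
        cost += cost_else
    return cost

coherent = [True] * 48                   # whole batch agrees: one side runs
divergent = [True] * 24 + [False] * 24   # mixed batch: both sides run

print(simd_branch_cost(coherent))   # 10 cycles
print(simd_branch_cost(divergent))  # 12 cycles
```

Smaller batches are less likely to diverge, which is why batch granularity matters so much for dynamic branching performance across architectures.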
 
That's true, Sharkfood. Some people say that ATI's R600 will be very powerful in shader-rich games, just like the X1000 series.

BTW, I don't think PCI-E will be the bottleneck; it's very fast. But maybe we'll see PCI-E x32 someday... who knows :)
 
Now the R600 becomes a X2800XTX :eek:

750 MHz core clock and 2.2 GHz GDDR4 RAM!?!?!

That is amazing :) It should destroy the 8800GTX, but I may be wrong...
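Those figures translate into memory bandwidth straightforwardly: bus width in bytes times the effective transfer rate. A quick sketch, taking the rumored "2.2 GHz GDDR4" as the effective data rate on a 512-bit bus, with the 8800GTX's published 384-bit / 1.8 GT/s for comparison:

```python
def mem_bandwidth_gbs(bus_bits: int, effective_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (transfers per ns)."""
    return bus_bits / 8 * effective_gtps

print(mem_bandwidth_gbs(512, 2.2))  # rumored R600: 140.8 GB/s
print(mem_bandwidth_gbs(384, 1.8))  # 8800GTX:      86.4 GB/s
```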
 
wolfram said:
Now the R600 becomes a X2800XTX :eek:

750 MHz core clock and 2.2 GHz GDDR4 RAM!?!?!

That is amazing :) It should destroy the 8800GTX, but I may be wrong...
Heh, "should"? More like "will". And pricing in at $600 puts it in 8800GTX territory... Hmm, $600 for an extreme card, or $600 for a card that destroys the extreme card ;)
 
MetalX said:
It's even better than the 8900GTX. The 8900GTX is 700/2200... and still only 128 unified shaders.
Haha, NICE... If I'm getting this right, ATI's 64 unified shaders will do 4 calculations each, thus acting like 256 unified shaders... twice as many as the GeForce's. Now imagine the X2950XTX... Probably what I'll have once the X3950XTX comes out... My X850XT would seem like a... I have no idea. Haha.

Wow... This stuff is really cool... Just imagine, X9950XTX...
 
Just imagine Crossfire, now that it's all internal... dual Crossfire bridges for 2x the bandwidth of SLI. You'd probably get like 45000 in 3DMark06 :D

My guess for the X2950XTX is maybe... 1000MHz core / 3000MHz memory, 128 unified shaders x 4 ops = 512, on a 512-bit bus... 192GB/s of bandwidth :) Just my wishful thinking :D

But since ATI is no longer Canadian, I'm starting to find I prefer Nvidia. But I will concur that the X2800XTX will pwn the 8800/8900GTX/GTS.
 
These cards look like monsters; I heard they're going to be very long, too.
Just forget about the price...

On another note, does anyone know prices / ETA of the midrange ATI cards that are comparable to the 8600 Ultra?

Thanks
 
swker98 said:
These cards look like monsters; I heard they're going to be very long, too.
Just forget about the price...

On another note, does anyone know prices / ETA of the midrange ATI cards that are comparable to the 8600 Ultra?

Thanks
If I remember correctly, the X2800GTO should be $300, destroying any 8600 card. As for length, I believe they were at first 13 inches. Yes, a foot and an inch. However, I think ATI revised them so that they aren't longer than the 8800GTX. They shouldn't have, though...

"Hey guys, check out my super-big, super-awesome 8800GTX... it's like, 11 inches!"
"Meh, that's nice. Nothing really compared to my 13-inch X2800XTX that will totally destroy your card though..."

@MetalX:
[sometime in the future]
- Hello, I'm agi_shi and I have an X2800XTX card. It's really struggling in today's games and I like to game. Any suggestions on what to upgrade?
- WOW, man, that's from like... the middle ages, man. Get yourself an X9600XT, it'll only cost you ~$120, and it'll destroy your X2800XTX. You'll also need to upgrade your PSU to a good, quality one of about ~3500W, though I recommend 4000W. You'll also need to upgrade your motherboard to one with a PCI-Express x512 slot.

Haha.
 
agi_shi said:
@MetalX:
[sometime in the future]
- Hello, I'm agi_shi and I have an X2800XTX card. It's really struggling in today's games and I like to game. Any suggestions on what to upgrade?
- WOW, man, that's from like... the middle ages, man. Get yourself an X9600XT, it'll only cost you ~$120, and it'll destroy your X2800XTX. You'll also need to upgrade your PSU to a good, quality one of about ~3500W, though I recommend 4000W. You'll also need to upgrade your motherboard to one with a PCI-Express x512 slot.

Haha.

Wow, the price will go up for those in the future? Maybe because of its antique status? :giddy:
 
LOL if I'm still around Techspot by then, I'll try to remember to quote that exact post when you come running for help :D
 
nickslick74 said:
Wow, the price will go up for those in the future? Maybe because of its antique status? :giddy:
Not the 9600XT, the X9600XT. Big difference =). I'll be sure to sell my X850XT by then, though (given that it still works ;) ). It'll probably be worth a lot in 10 generations of cards.
 
Lol, I doubt it. An ATi Rage 128 Pro is NOT worth much ATM, and that's like 6 gens ago. I mean, if "worth a lot" means $0.50, then yes, I do see your point :D
 
MetalX said:
Lol, I doubt it. An ATi Rage 128 Pro is NOT worth much ATM, and that's like 6 gens ago. I mean, if "worth a lot" means $0.50, then yes, I do see your point :D
... well, it'd be fun to tell someone you still have a working card 10 generations old, even if it IS worthless...
 
I'm sure everyone will see the difference between 150 fps with the "old" card and 200 fps with the X2800XTX. :blackeye:

How many of you are still playing at 1024x768? At that resolution, cards like these will be a waste of money (well, they will be no matter what, at this point).
 