Details on nVidia G80 GPU

By Justin Mann on February 1, 2006, 7:31 PM
According to some rumors, though mostly word of mouth, nVidia's newer line of GPUs in the G80 series is set to arrive around this April. These are particularly exciting because they will be the first true dual-core GPUs, as opposed to multiple cores per PCB or SLI. These new beasts also carry DirectX 10 support and Shader Model 4.0, will run at core speeds of 800MHz with room for growth, and will be based on the newer "half-node" 80nm process, resulting in a much appreciated heat reduction. These might also be the first cards on the market to utilize GDDR4. If any of this pans out, the start of 2Q 2006 will look very good for the video card market. Read more details here.




User Comments: 32

JMMD said:
Wow! Those are some nice specs. Can't wait to see what Nvidia has in store for the new cards.
cyrax said:
Omgwtf!... I have seen what DX10 can do, the CRY2 engine. It is beyond console graphics.
exscind said:
I don't want to get ahead of myself, but that card looks so disgustingly crazy. 800MHz of GPU and 1.8GHz of GDDR4 is just beyond any comprehension. Shader Model 4.0 and DirectX 10 are also good things to look forward to. Video cards are perhaps the fastest growing technology in the computer industry these days. There doesn't seem to be any "pause" period where things slow down and take a break; ATI and nVIDIA continually crank out newer and badder cards one after another. It's good because progress is good; it's bad because anything you have becomes outdated a mere year later :).
djleyo said:
dual-core GPUs, DirectX10 support, Shader Model 4.0, GDDR4, core speeds of 800MHz (and more)!!! Everyone start drooling and pick your jaws up off the floor!!! Do we really need more??? This news item made my day!! Hooray for nVidia! This also means the series 7 GPUs are going to get a nice price drop. Take that, next-gen consoles. Imagine two of these bad boys in 16x SLI. Bye bye, PlayStation 3.
MonkeyMan said:
Oh man, this new video card is going to be a monster, man. I'm betting that this video card will execute games instantly, because of its capacity to operate at monstrous speed levels. This is going to be one bad baby!!!!!!! The G80 is going to be the most amazing video card ever!!!!
jmag034 said:
It wouldn't matter if it supported DX 14.0 and SM 7.0... the image quality sucks. I get the same detail with a 7800GT that I could get with a 9800pro.
djleyo said:
jmag034, time to upgrade to a new monitor, man. You don't know what you're talking about.
otmakus said:
So this video card will start to support DirectX 10 long before DirectX 10 is even launched? And jmag034 is partly right: with a 17" monitor (or even 15"), even the most punishing games can be played at medium with a 9800 Pro. The graphics quality improves a bit at high settings, but that improvement isn't worth the price difference between a 9800 Pro and a 7800GT, except for the real enthusiasts. Then again, a real enthusiast who can afford a 7800GT must surely be able to afford at least a 19" or 21" monitor.
Need_a_Dell said:
HOLY SHMOLY! This sounds too good to be true! By the time I win the last 7800, it'll be obsolete! (I think I'm about to cry!) This card is every computer nerd's wet dream. DX10 support, shader model 4.0, dual core, heat reduction, GDDR4! These specs are what dreams are made of. I want one now. TechSpot, hook me up! (After I win the 7800 of course! :P )
Kaleid said:
Looks like nice specs, but of course there will be something better after that card. I remember being impressed hearing about T&L for the first time, and a buddy ended up paying a lot of money for a GeForce 256.
JMMD said:
[b]Originally posted by Kaleid:[/b][quote]Looks like nice specs, but of course there will be something better after that card. I remember being impressed hearing about T&L for the first time, and a buddy ended up paying a lot of money for a GeForce 256.[/quote]I always try to stay one release behind so I'm not paying top dollar for any new cards. Once the new card comes out, the 7800s should fall in price, and then maybe by summer they'll be great deals.
Race said:
[b]Originally posted by JMMD:[/b][quote]I always try to stay one release behind so I'm not paying top dollar for any new cards. Once the new card comes out, the 7800s should fall in price, and then maybe by summer they'll be great deals.[/quote]I agree... absolutely the way to go (unless money is no object). ATI may be a little disheartened ;) by this news because their best card (1900) will have to go up against it in benchmarks, which should be interesting. One benefit of the fast moving graphics card industry is being able to go back and replay some of your recent favorite games on 'ultra-high quality' at 16 x 12! Can't wait to read more about this...
nic said:
Wow, looks like some long time loyal ATI fans may be making a switch...wonder what ATI has lined up to counter this?
Rage_3K_Moiz said:
ATI has the R600 lined up for launch, but NVIDIA's cards are some serious competition. Wonder what games will be able to utilize such features. It'll be quite some time, I reckon, even after Vista is released. Sadly, this will make older games (such as BF2) obsolete, since DX10 will only allow DX9 gfx to run with a software layer. But it means better looking games, so who's complaining? Wondering about BF3.... :D
asphix said:
Sounds impressive.. I love where the whole industry has been going lately with the multi-core solutions. Will be interesting to see what we have 5 years from now.
Cartz said:
[b]Originally posted by nic:[/b][quote]Wow, looks like some long time loyal ATI fans may be making a switch...wonder what ATI has lined up to counter this?[/quote]I don't think any ATI fans are going to be jumping ship too quickly... By the time this card is on the market (June/July), ATI will already have its next flagship card announced, and it will best nVidia's offering. By the time ATI releases that card, nVidia will counter with its next flagship card's information, which will edge out ATI... It's been going on like this for years, and to think that this will be the 'decisive victory' card for nVidia is pretty foolish. That said, the SM4.0 and DirectX10 support is nice, but harkens back to the days of the GeForce256: quite possibly revolutionary, but I offer ten to one odds that the card is obsolete before either of these technologies is taken full advantage of.
Kreuger said:
I didn't know GPUs were becoming dual core. That's pretty cool. I don't expect to get one of these for 4 years or so, because the pricing for graphics cards is outrageous.
mentaljedi said:
I would never spend so much, but then again, I haven't got a job as I'm still a student...
nathanskywalker said:
Wow, I'm gonna be sick. Come on, the 7800 just came out. Ok, progress = good, but this is ridiculous!!! Ahhh!!! But a dual core... very nice. Not that anyone (most people anyway) could possibly need that for some time yet to come, but still, like was pointed out in previous posts, previous cards will go down in price, and that is good for just about everyone.
Nic said:
[b]Originally posted by Cartz:[/b][quote]...and to think that this will be the 'decisive victory' card for nvidia is pretty foolish...[/quote]Please mind the language, Cartz... I never mentioned anything about a 'decisive victory' in my post and I don't appreciate your remarks. Yes, you are quite correct that the balance of power swings between nVidia/ATI in cycles, but it has been quite some time since ATI had the lead. With nVidia due to release such a powerful card in April, my comment simply suggested that it would take something very special indeed to best nVidia's upcoming card. Further, consumers WILL flock to buy nVidia hardware if it delivers the best bang for your buck, and this WILL include existing ATI fans. That is, after all, how ATI got me...
barfarf said:
[b]Originally posted by JMMD:[/b][quote]I always try to stay one release behind so I'm not paying top dollar for any new cards.[/quote]Totally agree. Eating day-old donuts and buying day-old technology is frugal living. Plus, since DirectX 10 is not out yet, maybe this card will suck at it. I could not find much about DirectX 10 except that it will be shipping with Windows Vista: [url]http://en.wikipedia.org/wiki/DirectX[/url] This is the first I've heard of dual core video cards. I can't wait to see what happens in a year when software and hardware catch up with each other.
Cartz said:
[b]Originally posted by Nic:[/b][quote][b]Originally posted by Cartz:[/b][quote]...and to think that this will be the 'decisive victory' card for nvidia is pretty foolish...[/quote]Please mind the language, Cartz... I never mentioned anything about a 'decisive victory' in my post and I don't appreciate your remarks. Yes, you are quite correct that the balance of power swings between nVidia/ATI in cycles, but it has been quite some time since ATI had the lead. With nVidia due to release such a powerful card in April, my comment simply suggested that it would take something very special indeed to best nVidia's upcoming card. Further, consumers WILL flock to buy nVidia hardware if it delivers the best bang for your buck, and this WILL include existing ATI fans. That is, after all, how ATI got me...[/quote]Ah, sorry Nic, re-reading my original post, I guess it kinda sounded like I meant to call you foolish, which is not what I had intended. I just meant to say that it would be foolish to think that this will be the be-all end-all of video cards. No personal attack intended. Quite some time since ATI has had the lead, you say? What about right now? The X1900XT handily whoops the 7800GTX 512 in every benchmark I've seen, so now it is nVidia's turn to steal the speed crown back. Also, I find it extremely debatable whether nVidia has offered better bang for your buck over the last 6 months. The 7800GTX 512 was ridiculously overpriced, when you could get 75-80% of the performance out of a 7800GTX or X1800XT for 50-60% of the price. Even now, if you can still find one, the 7800GTX 512s are still priced through the roof, even more so than the X1900s. If nVidia tries to pull this kind of extortion again with their next flagship card, I think you're absolutely right: people will go for the most bang for their buck, which will either be ATI's R600 or the reduced-price X1900XTX.
Vaulden said:
Wow... even if the specs are exaggerated they look very nice! I can't wait to not only see what these cards can do... but what about the generation after it? I know it's really early, but if I'm not mistaken this is a much larger jump than has happened in a while. What can we expect for the future of video cards? Combine this with the expected improvements in the CPU world and we have a match made in heaven!
gamingmage said:
Wow, this is a huge step and I can't wait for it. The main things I will look for are price, performance, and compatibility. They can't make these too expensive or nobody will buy them. Hopefully the performance jump is huge, or at least significant. These cards will probably all be PCI-E x16.
PUTALE said:
Wow, dual core is coming to GPUs as well. This is nice. Looking at the specs, it looks like nVidia will come up as the king once again.
thomasxstewart said:
Here are a few more details on the G80 card.

FUTURE: We heard that G80 will be in time for a launch in June during Computex, and the process technology is likely to be 80nm at TSMC. In a recent statement, NVIDIA said they will be backing TSMC's 80nm "half-node" process, which allows a reduction in die size of 19%. We have previously mentioned that G80 is likely to take the Unified Shader approach and support Shader Model 4.0. G80 is likely to be paired up with Samsung GDDR4 memories reaching a speed of 2.5Gbps. As for ATI, the next generation R600 is slated for launch at the end of this year according to the roadmap we have seen, and the process technology is 65nm. It seems the leaked R600 specs that surfaced in June last year are pretty likely. According to Xpentor, NVIDIA's G80 will make ATI stumble in April. Quad SLI itself can be implemented on a single card with a two-chip solution, because it will carry the first dual core GPU ever, with support for DirectX10 and Shader Model 4.0. Development of G80 is also said to have been running very intensively since NVIDIA's acquisition of ULi. As for the upcoming G71, there will be 32 pipes, an increase in ROPs, and a little speed bump over the core clock. The date moved up from June to April '06 in one breath. That's high speed marketing. (Carbon nanotubes may be the eventual dynamic change; present engineering must get as much as possible today, before the whole system changes again.)

SOME MORE SPECS:
65nm
64 shader pipelines (Vec4 + Scalar)
32 TMUs
32 ROPs
128 shader operations per cycle
800MHz core
102.4 billion shader ops/sec
512 GFLOPs for the shaders
2 billion triangles/sec
25.6 Gpixels/Gtexels/sec
256-bit, 512MB, 1.8GHz GDDR4 memory
57.6 GB/sec bandwidth (at 1.8GHz)
WGF2.0 Unified Shader

More disruptive will be carbon nanotubes next year; many specs could go up to 10X if nanotubes pan out. Many ask, why such fast change? Well, game cards have been way overpriced since SLI arrived a year ago. SLI in itself was meant to straighten out Intel's PCI Express 16x flop and develop a "true" 16x slot. Think of Ms. Pacman, 8 KB. Then think of a shark eating tropical fish, same game (mouth action, back & forth), only 8 MB. 3DNA was among the first next steps toward 250 MB games and 3DNow. Yet progress has slowed, and equipment is stalled in development limbo. The demands of HDTV large-screen interactive gaming may be the next step en masse. What the world needs is a more powerful card, not wacko expensive cards. A 32X slot will be a great help, and once done it should remain for quite a while. Yet today most games sold are like Tetris, Ms. Pacman, or the same shooter over & over. Power will free us, the sooner the better. As well as my last thoughts inserted above about nano processors & GDDR4 nano memory.

Signed: PHYSICIAN THOMAS STEWART VON DRASHEK M.D.
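For those wondering where those headline numbers come from, most of them fall straight out of the quoted clocks and bus widths. Below is a minimal back-of-envelope sketch, assuming one shader operation per listed slot per clock, one pixel per ROP and one texel per TMU per clock, and a 256-bit bus running at the effective 1.8GHz GDDR4 data rate; these per-unit rates are illustrative assumptions, not confirmed G80 details.

[code]
# Back-of-envelope check of the rumored G80 throughput figures (Python).
# Assumptions (not from the post): one shader op per listed slot per clock,
# one pixel per ROP and one texel per TMU per clock, and "1.8GHz" GDDR4
# meaning the effective data rate on a 256-bit bus.

core_clock_hz  = 800e6   # rumored 800MHz core clock
shader_ops_clk = 128     # rumored 128 shader operations per cycle
rops           = 32      # rumored 32 ROPs
tmus           = 32      # rumored 32 TMUs
bus_width_bits = 256     # rumored 256-bit memory interface
mem_rate_hz    = 1.8e9   # rumored 1.8GHz effective GDDR4 data rate

shader_ops_per_sec = shader_ops_clk * core_clock_hz            # ~102.4 billion/s
pixel_fill_rate    = rops * core_clock_hz                      # ~25.6 Gpixels/s
texel_fill_rate    = tmus * core_clock_hz                      # ~25.6 Gtexels/s
mem_bandwidth_gbs  = (bus_width_bits / 8) * mem_rate_hz / 1e9  # ~57.6 GB/s

print(f"shader ops/sec: {shader_ops_per_sec / 1e9:.1f} billion")
print(f"pixel fill    : {pixel_fill_rate / 1e9:.1f} Gpixels/s")
print(f"texel fill    : {texel_fill_rate / 1e9:.1f} Gtexels/s")
print(f"memory BW     : {mem_bandwidth_gbs:.1f} GB/s")
[/code]

Running it reproduces the 102.4 billion shader ops/sec, 25.6 Gpixels/Gtexels/sec and 57.6 GB/sec figures quoted above; the 512 GFLOPs and 2 billion triangles/sec numbers depend on per-pipe FLOP and triangle-setup rates the rumor doesn't spell out, so they are left out of the sketch.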
KillerPrince said:
Wow, that is amazing. DirectX10... now what are they gonna think of next?
canadian said:
Go nVidia, yea! I'm so going to buy this. Imagine this, but in SLI!!! nVidia is so much better than ATI. Although, ATI is Canadian... Should be interesting.
blackdragon1230 said:
[b]Originally posted by djleyo:[/b][quote]dual-core GPUs, DirectX10 support, Shader Model 4.0, GDDR4, core speeds of 800MHz (and more)!!! Everyone start drooling and pick your jaws up off the floor!!! Do we really need more??? This news item made my day!! Hooray for nVidia! This also means the series 7 GPUs are going to get a nice price drop. Take that, next-gen consoles. Imagine two of these bad boys in 16x SLI. Bye bye, PlayStation 3.[/quote]Well, I can't say for sure about the XBOX 360 GPU (not really a fan of the 360 or ATI), but as far as the nVidia RSX goes, it's still going to outdo the G80. If what I found is true, the RSX has a 7 series main core and 6 support cores. It seems that nVidia and IBM did some close work on the RSX and applied Cell technology to it, if the schematics I found are correct. But hurray for nVidia for taking the next step in video card evolution.[Edited by blackdragon1230 on 2006-03-05 02:26:15]
JavaJawa said:
While my 2x7800GTs are fine, having only just got a new PC containing them a few weeks ago, I'm a bit peeved to find out that the G80 is coming out so soon . . .
dicexzx said:
Forgive me for being so naive when it comes to computers, but I was just curious: will the G80 still be able to run on operating systems other than Vista? If so, will there be any cons to this, and will it still run all other games with ease, or do we have to wait for Vista to be released? Thank you. Edit: I was also curious when you guys think the G80 will be released, because some people are saying June, and some are saying we might even have to wait until 2007. Thank you.[Edited by dicexzx on 2006-04-02 16:52:02]
Cerberus666 said:
[b]Originally posted by jmag034:[/b][quote]It wouldn't matter if it supported DX 14.0 and SM 7.0... the image quality sucks. I get the same detail with a 7800GT that I could get with a 9800pro.[/quote]Yeh, probably. I used to run BF2 on medium detail with the 9700 Pro, BUT I now run the same settings with the X1300 at double the framerate (1024x768 in both cases), so it's not just the quality of each frame but the quantity as well... The DX10 and SM4.0 support are probably there to futureproof them, and the sick clock speeds will just help boost the framerate to 100+ even at 16x12 - not that any monitor can support that kind of refresh anyway... A futureproof GFX card, and in SLI the performance is unimaginable - only it's all going to come at an astronomical price. Just an after-thought: anyone think the G80 series is going to support PCI-E 32x? And in response to 'What's ATI got lined up to counter it?': "Probably DX11 and SM5 with 2GB of memory (and called the X2000XTXTXTX)" :P[Edited by Cerberus666 on 2006-08-11 08:02:56]