Nvidia 9800GX2, waste of money

Status
Not open for further replies.

sledgus

Again we see nVidia taking two of their flagship GPUs, shrinking them, and whacking them together on a single board at a nice heavy price. Is it really worth it?

At the moment, Crysis will not run smoothly on any hardware currently available. I know people will claim it does, but we all know that is a load of BS. EVERY single benchmark (real-world benchmarks too, not synthetic) shows that even with the very latest and best quad-core CPU and two (sometimes even three) 8800 Ultras, Crysis will bring the system to its knees, never averaging more than 50 or so frames a second. So will two 8800 GTXs on one board be any different?

What we need is a COMPLETELY NEW architecture. We don't need another repeat of the 7950GX2 from a few years ago, and we don't need two cards (which are now getting old) fused together. We need a NEW CARD.

I know nVidia is working on a new card, but releasing the 9800GX2 just annoys me, because anyone who buys it would just be wasting their money while nVidia makes big $. It WILL cost a fortune, and it won't even be a next-generation card.

Anyway, the point I want to get across to anyone out there thinking of buying one is: don't buy it. Crysis is the new benchmark in gaming, and as new games come out and the standard gets higher, you're going to be stuck with a card that cannot run them truly smoothly, and you'll be looking at your pocket wishing you didn't fork out the $1000.

Wait until nVidia releases their upcoming 'true next generation' 9000 series (not the mid-range 9600GT, and not the stupid 9800GX2) and see how they fare with Crysis, Call of Juarez, Lost Planet etc.

Let me know what you think :)
 
I thought that's where they were trying to go with PhysX cards.


Speaking of Crysis, have you all seen this guy stream it to his PS3? (2nd video down, on the right side)

http://www.streammygame.com/smg/modules.php?name=Tutorial


"Playing the latest PC game on the PS3 is now a reality at HD resolution and fast frame-rates," said CEO Richard Faria. "I have a PS3 in my living room and a PC in my office, and my two kids both have old PCs in their bedrooms," he said. "Now we can play games anywhere around the home. StreamMyGame's technology networks the power of a main PC so it can be used to play high-end games on other PCs, PS3s and Linux devices."

Currently, you can only use StreamMyGame over a LAN, but it is being expanded to broadband in March this year.
 
sledgus said:
What we need is a COMPLETELY NEW architecture. We don't need another repeat of the 7950GX2 from a few years ago, and we don't need two cards (which are now getting old) fused together. We need a NEW CARD.
I don't disagree with anything you said, except maybe the part that I quoted. Was the 8800 not a 'new' card? It completely blows away anything in the 7000 series, and that's pretty good for me.

Crysis is definitely hard on cards, but the 8800 handles other games fine. I'm not sure we should be upset with nVidia over Crysis not being able to run flat out.
 
Yeah, I agree with you completely. The 8800 was a new card, and it absolutely smashed everything in the 7 series. It was a true next-generation card for its time, and was nothing short of awesome. But that is exactly what we need now: a new card that smashes everything in the 8 series. All we are getting from Nvidia is really just another iteration of the 8 series (two 8800 GTXs screwed together).

We need a card that is truly next generation, one that will run Crysis at 1280x1024 with all settings set to High at 60 fps or greater. It's really not that hard, and Nvidia will no doubt do it; I'm just blasting them for making this stupid, waste-of-life 9800GX2 in the meantime. It's NOT what gamers/the gaming community needs.

I also don't understand why they have named the card '9800GX2'. Why did they put the model in the 9 series? Everyone will read the name of the card and think, 'Oh man, this is a next-generation card, it's the 9 series!' when really it's just two 8 series cards stuck together. They should have named it the 8850GX2, or 8900GX2 or something, so that it stays in the 8 series category where it belongs. Just like the 7950GX2: it was two 7 series cards stuck together, hence the '7' in the model name. Just weird.
 
SNGX is saying that we can't really complain about the 9800GX2 just because it won't run Crysis too well. For most games though, it will run them a good deal faster than 8800GTXs. It will be a powerful card, and you can't judge it just based on its performance in one game.
 
I'm sorry, but I don't see the point of your complaint. As a company, NVidia will do anything to make money, and that includes tricking ignorant people into buying graphics cards that aren't great. Will they be able to fool the core community? No. But at least they will make some $$ for not really doing anything. And as for a card that smashes the 8 series, well, the 8800GT is a wildly popular card. Why would they want to make it obsolete already when they can pocket more cash without researching better cards? The point of the 9600GT, on the other hand, is to compete with ATI's cheap 3850 cards, letting them dominate ATI at both the high end and the midrange.
 
Yeah, Nvidia is a corporation, and as such, it exists to make money, not to please customers. We customers do have to be happy with what we buy for them to make money, but that means they release whatever they think people will buy, not whatever gives the best performance per dollar. Chasing performance per dollar is not what they want to do, because they make less money that way.
 
MetalX said:
SNGX is saying that we can't really complain about the 9800GX2 just because it won't run Crysis too well. For most games though, it will run them a good deal faster than 8800GTXs. It will be a powerful card, and you can't judge it just based on its performance in one game.

I understood exactly what SNGX said; you don't understand what I said. I'm not bagging the card for not being able to run Crysis well, I'm having a go at nVidia for not being innovative and for releasing a card that is, simply put, not worth the money.

Nvidia is already annihilating the GPU market; they recently posted their first billion-dollar quarter (the FIRST in the company's HISTORY). Their 8800GTX has been the best card (undisputed, unchallenged) for over a year now. Never in history has ONE card by ONE manufacturer been so dominant for so long.

They are making awesome sales at the moment, especially with the 8800GT. It is common knowledge now that nVidia is the best, and hardly anyone is going ATI at this point in time. This 9800GX2 is not necessary, and many gamers are going to buy it only to be left behind, because they are really buying technology that is already old.

That is the main purpose of my post: to give people food for thought, in the hope that they consider these things before deciding to fork out a big sum of $ for the card. As I said before, it will cost a fortune, and it will run current games well, but not next-gen games.

It is 2008, and with Crysis we have already seen what next-generation games are going to look like (and what hardware they will require). Next-generation games will be released this year, so do you want to spend $1000 only to be stuck with old technology? See my point?
 
Uh, I'm pretty sure the 9800GX2 is not two GTXs but two GTs slapped together. Just to add a little more flame to the fire, they failed with this approach once before (the 7950GX2 was two 7900 GTs slapped together), so let's hope they aren't just trying to get rid of as many G80 refreshes as possible to make cash for the next series... uh oh.

The HD3870X2 (according to questionable Chinese benchmarks) already beats the Ultra by 30+%, which means in theory it's as fast as or faster than the new 9800GX2. I love paper releases. Also, everyone who is thinking SLI (I know I am) needs to remember that DX9 cannot use four GPUs to write images to the screen. It only allows three draw sources (GPUs) access to the screen at once, making a quad setup of any kind completely useless. If I'm not mistaken, DX10 allows for an unlimited number of draw sources. Maybe through emulation in DX9Ex (Vista) this issue may be fixed, but as far as I know it hasn't been, so welcome to Vista, everybody!!!
 
Yes that's right, thanks for the correction. Yeah the 7950GX2 sounded awesome on release, then was just an absolute flop thanks to driver problems which were never fixed.

ATI's X2 is very impressive, although from the benchmark results it absolutely sucked on Crysis. I think this will be addressed shortly though, and with new drivers the HD should top the Crysis performance chart too.

foxhound81 said:
Well, considering that the forecast price for the 9800GX2 is around 450 dollars, I think it's a great deal.

That's the first I've heard of that. Numerous sources on the internet are forecasting a price of anywhere between $600 and $800 USD, depending on availability.
 
I don't really see where you're going with this thread. You first tell us it's a waste of money, then you back down and say that it's just not innovative, and now you just seem like you're trying to convince everyone not to buy a 9800GX2. I'm lost.
 
I'm simply going to say that a dual-GPU system or video card is a waste of money for those games that see no performance boost from a second GPU, especially if that same game or application already gets playable frame rates with a single GPU. Games that are only playable with a dual-GPU setup are few and far between, and they are usually only unplayable at HD screen resolutions.
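To put some rough numbers on that argument, here's a quick back-of-the-envelope sketch. The figures and the helper function are made up purely for illustration (not real benchmark data), but it shows why a second GPU only pays off when a game both scales well and actually needs the extra headroom:

```python
# Illustrative only: made-up frame rates and scaling factors, not real benchmarks.

def dual_gpu_fps(single_gpu_fps: float, scaling: float) -> float:
    """Estimate dual-GPU frame rate given a scaling factor (0 = no benefit, 1 = perfect)."""
    return single_gpu_fps * (1 + scaling)

PLAYABLE = 30  # a common "playable" threshold, in frames per second

# Hypothetical cases: (description, single-GPU fps, multi-GPU scaling)
cases = [
    ("Game that scales well but already runs fine", 90, 0.8),
    ("Demanding game that scales well",             25, 0.8),
    ("Demanding game that barely scales",           25, 0.1),
]

for name, fps, scaling in cases:
    dual = dual_gpu_fps(fps, scaling)
    worth_it = fps < PLAYABLE <= dual   # second GPU lifts you from unplayable to playable
    print(f"{name}: {fps:.0f} fps -> {dual:.0f} fps with a second GPU "
          f"({'worth it' if worth_it else 'arguably not worth it'})")
```

Only the middle case, where a single GPU falls short and the game actually scales, really justifies the second GPU.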
 
MetalX said:
I don't really see where you're going with this thread. You first tell us it's a waste of money, then you back down and say that it's just not innovative, and now you just seem like you're trying to convince everyone not to buy a 9800GX2. I'm lost.


lol what the hell are you talking about? I'm saying all 3 of those things.

1. Its not innovative
2. It's a waste of money
3. If you buy it, you will be left behind

I never backed down, what are you on about?
 
I've seen actual benchmarks of the HD3870X2. It didn't beat the Ultra by 30%. In fact it lost in a few areas, but it won most rounds. It seemed to struggle a lot in DX10 and when AA was applied. AMD said it was a driver issue and that they will be bringing out a new set of drivers before the 28th of this month.

If the price is right and it sells at $450 USD, then that is a freaking fantastic bargain. It will dominate the 8800 Ultra in performance and smash it on price. It's currently the new king of GPUs until nVidia releases the 9800GX2 (the benchmarks already posted on the net are fake and were apparently done with two underclocked 8800 GTXs).

ATI's and nVidia's next-gen cards are going to be quite interesting. Last year nVidia said their next-gen card would be twice as fast as the 8800GTX; now it's only going to be 30% faster (and from what I hear it's much the same as the 8800GTX, only with higher clock speeds). ATI's next-gen cards are meant to be a good 30-50% faster. Perhaps ATI will take the crown next round.
 
Blind Dragon said:
"Playing the latest PC game on the PS3 is now a reality at HD resolution and fast frame-rates," said CEO Richard Faria. "I have a PS3 in my living room and a PC in my office, and my two kids both have old PCs in their bedrooms," he said. "Now we can play games anywhere around the home. StreamMyGame's technology networks the power of a main PC so it can be used to play high-end games on other PCs, PS3s and Linux devices."

Currently, you can only use StreamMyGame over a LAN, but it is being expanded to broadband in March this year.
If you take a closer look you will realise that the game was NOT playing smoothly. It looked like it was playing at 15-20 fps, AND it was being played at 800x600 resolution (so it can fit on the screen). If the PS3 can't cope with playing Crysis at a constant 30 fps at that resolution, it will have no hope when/if the game gets ported from PC to PS3. They will be forced to take a few visuals and features out to make it playable. Hell, even my PC with an HD3850 can play Crysis at 800x600 at highest details faster than what was shown on the PS3. Guess that goes to prove my PC outperforms the fastest consoles :grinthumb
 
Don't you understand what that device does? The game is not being played on the PS3. The game is running on the PC, and the input from the controller and the output from the graphics card are being streamed between the PC and the PS3. The PS3 is basically just projecting the image it receives from the PC onto the screen.
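Conceptually it's just a thin remote-display loop: render and encode on the PC, ship frames one way, ship controller input the other way. Here's a toy sketch of that shape (dummy frames and in-process queues stand in for the real capture/encode/network stack; this is nothing like StreamMyGame's actual code, purely an illustration):

```python
# Toy model of game streaming: the game runs on the PC, compressed frames travel
# one way, controller input travels the other way, and the "console" end is just
# a dumb display. Queues stand in for the network link.

import queue
import threading
import time
import zlib

frames_to_console = queue.Queue()   # PC -> PS3: encoded video frames
input_to_pc = queue.Queue()         # PS3 -> PC: controller input

def gaming_pc(num_frames: int = 5) -> None:
    """The PC runs the actual game: read remote input, render, encode, send."""
    for frame_no in range(num_frames):
        try:
            button = input_to_pc.get_nowait()      # apply any input that has arrived
        except queue.Empty:
            button = None
        rendered = f"frame {frame_no} (input={button})".encode()  # stand-in for GPU output
        frames_to_console.put(zlib.compress(rendered))            # stand-in for video encoding
        time.sleep(1 / 30)                                        # pace it at roughly 30 fps

def console(num_frames: int = 5) -> None:
    """The PS3 end just decodes and displays frames, and sends button presses back."""
    for _ in range(num_frames):
        frame = zlib.decompress(frames_to_console.get())
        print("console displays:", frame.decode())
        input_to_pc.put("X")                       # pretend the player pressed a button

pc = threading.Thread(target=gaming_pc)
ps3 = threading.Thread(target=console)
pc.start(); ps3.start(); pc.join(); ps3.join()
```

The point being: all the heavy lifting (rendering) stays on the PC, which is why the PS3's own GPU never enters into it.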

Your PC probably is faster than consoles, because they've all been out for over a year now.

Like I said, where is this thread going?

I seem to have lost any sense of anyone trying to make a point here.
 
sledgus said:
Yes that's right, thanks for the correction. Yeah the 7950GX2 sounded awesome on release, then was just an absolute flop thanks to driver problems which were never fixed.

ATI's X2 is very impressive, although from the benchmark results it absolutely sucked on Crysis. I think this will be addressed shortly though, and with new drivers the HD should top the Crysis performance chart too.

Dude, everything is going to suck at Crysis, and it's because the people developing it took the wrong route. Then they told everyone that they developed the game on a GTX, which was a complete lie. They developed something ultra pretty on high-end animation hardware and thought they could simply change a few lines of code and it would work for everyone. Three years later... the game is still poop, which is too bad seeing as the story writers were on point. Remember the movies of shooting palm tree leaves and dew falling off of them? Yeah... not even in Very High.
 
uhatemedoncha said:
Dude, everything is going to suck at Crysis, and it's because the people developing it took the wrong route. Then they told everyone that they developed the game on a GTX, which was a complete lie. They developed something ultra pretty on high-end animation hardware and thought they could simply change a few lines of code and it would work for everyone. Three years later... the game is still poop, which is too bad seeing as the story writers were on point. Remember the movies of shooting palm tree leaves and dew falling off of them? Yeah... not even in Very High.


uhatemedoncha, that was a nothing post. We have already established that everything sucks on Crysis, and that is not the issue being debated here at all.

I didn't know they stated they developed everything on a GTX, but yeah, I agree that is obviously a load of crap. Game developers don't develop software on 'consumer-level' hardware; they develop it on workstation/developer hardware like nVidia's Quadro FX platforms. Maybe they developed it with the goal of it being able to run on a GTX, but that obviously turned out pear-shaped as well, since it won't run well on three 8800 Ultras, let alone a single GTX, lol. That's quite funny.

On another note though, I must say that the 3870X2 is DAMN IMPRESSIVE! It wipes the floor with every single nVidia card, and smashes games at 2560 resolution! DAMN AWESOME....
 
Of course it does. It's a dual-GPU card and should be classified as a dual-card solution. That's why it beats every Nvidia card: because none of them are dual-GPU.
 
Of course, if ATI can do it, how hard will it be for nVIDIA to slap together two 8800GTs in the same way? And we all know how the 8800GT vs HD 3870 turned out...
But getting back to the point, would the 9800GX2 have any improvements like higher core clocks, etc., like ATI has done with the 3870 X2? Also, any definitive MSRP yet?
 
Ah, I thought it was two 8800 GTX cores in the 9800GX2. Makes sense to put G92s in there, although I have one question: isn't SLI natively supported by G92? What is the "SLI chip" in the 9800GX2 there for?
 
Two chips on two boards: Nvidia calls it an SLI connector chip.
Two chips on one board: ATI calls it a PCIe bridge chip.

Two 9800GX2s may indeed run in SLI, giving you "quad" SLI:
Two 8800GTs = one 9800GX2
Two 9800GX2s in SLI = four 8800GTs

That's what it's supposed to be able to do, anyway.
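Just to lay those combinations out side by side, here's a rough, purely illustrative model of the topologies (the names, counts and link labels are my own shorthand, not official specs):

```python
# Nominal GPU counts for the setups discussed above; real-world scaling is a
# different story (see the fps sketch earlier in the thread).

from dataclasses import dataclass

@dataclass
class Setup:
    name: str
    boards: int
    gpus_per_board: int
    link: str  # what ties the GPUs together

    @property
    def total_gpus(self) -> int:
        return self.boards * self.gpus_per_board

setups = [
    Setup("Classic SLI (e.g. two 8800GTs)", boards=2, gpus_per_board=1,
          link="SLI connector between boards"),
    Setup("Dual-GPU card (e.g. HD3870X2 / 9800GX2)", boards=1, gpus_per_board=2,
          link="on-board bridge chip"),
    Setup("'Quad' SLI (two dual-GPU cards)", boards=2, gpus_per_board=2,
          link="on-board bridge + SLI connector"),
]

for s in setups:
    print(f"{s.name}: {s.total_gpus} GPUs total via {s.link}")
```

That's the "on paper" math; whether four GPUs can actually be put to work is the DX9 draw-source question raised earlier.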
 