Nvidia's dual-GPU GeForce GTX 59x pictured

Matthew DeCarlo

Staff

A shred of photographic evidence of Nvidia's rumored dual-GPU GeForce GTX 59x has emerged online, courtesy of Chinese site eNet.com. Earlier reports referred to the card as a GTX 590, but the Chinese media is calling Nvidia's upcoming entry the GTX 595. No matter its name, the card should pack quite the punch, supposedly featuring two GF110 graphics processors -- the same architecture Nvidia used in its recently launched single-GPU flagship, the GeForce GTX 580. It remains to be seen if the card will pack twin GF110-based chips, and if it does, many suggest they won't be quite as fast as the GTX 580's.


Instead, the GTX 59x could be powered by two cut-down GPUs destined for the card(s) below Nvidia's GTX 580, such as the GTX 570. Those chips, whatever they may be, will supposedly be linked via the NF200 SLI bridge. There's also talk of 3GB of VRAM, or 1.5GB per GPU, but further specs are unclear. It's worth noting that most Nvidia cards can only drive two displays at once, but the GTX 59x is said to have three DVI outputs. Supporting tri-display setups out of the box could make it the go-to card for Nvidia's 3D Vision Surround technology, which currently requires two cards (barring the GTX 295 with projectors).


 
What is that small chip in the middle? Does it control the dual GPUs and make them appear as one?
 
wagan8r said:
blimp01 said:
What is that small chip in the middle? Does it control the dual GPUs and make them appear as one?
It's a Tegra 2 for a performance boost!

JK.

Joking aside, slapping a Tegra 2 onto the board for 2D and low-power 3D apps could prove quite power efficient compared with using the downclocked GPU(s). I wonder if the red or green team has tried this; obviously I can't really prove anything, but it could be quite effective! :p
 
I wonder how fast this is going to be compared with the GTX 580. Could it be as stellar as the GTX 295 was back then? I do own a GTX 295 and it's pretty much the fastest card I have. Anyway, I am still looking forward to AMD's dual-GPU offering before I make my next major purchase.
 
Can't imagine the power that a card like that will ask for. AMD seems to be going the same route... making cards that ask for an incredible amount of power. At this rate I can't imagine what cards will be asking for in a few years. Given that cards are finally able to push Crysis, isn't it time now to focus more on efficiency and give performance a rest?
 
The problem with your logic is that you are assuming gaming technology has become stagnant since Crysis. It has not. It only gets better and better, requiring more and more power. Thus, as soon as you start focusing on efficiency instead of performance, PCs will effectively become giant consoles as the progression of technology ceases.
 
You're wrong, Crysis is still the most demanding game around, and it is the only reason cards such as a GTX 580 are worth buying. I think you people look at the length of the bar rather than the actual figure in those performance charts. Though these cards are at the top, it doesn't mean that the midrange ones are obsolete. My 4870 still eats any game alive at 1680x1050, all except Crysis. I suspect also that Crysis 2 will not ask for much more than Crysis 1 since it's also coming to consoles. If you're not buying a card for Crysis, then a GTX 580 or 5970 is a ridiculous option. A 6870 or even a 6850 would suffice otherwise.

Requiring more and more power is bullsh*t... technology means efficiency. There is nothing special about offering me twice the performance while asking for twice the power... we already have that option with SLI and CrossFire. Giving me more for less is what you should define as an improvement in technology. Besides, an improvement in efficiency automatically means an improvement in performance; if the card can use less power, it allows room for improvement while still only asking for a reasonable amount of power.

I suppose, however, that giving performance a rest is asking too much; but I do think that there is no need for the next generation of cards to greatly surpass this generation in performance because, in my opinion, the current generation offers enough today, and should still be enough for a while as Crysis 2 will probably reign for another 3 years... and I believe we are ready for it even now. Instead I think the next batch of cards should offer better value and efficiency first, even if it means only a small bump in performance.
 
Excuse me while I play the pedant...
You're wrong, Crysis is still the most demanding game around
I think you'll find that Metro 2033 is the king of making graphics cards cry
and it is the only reason cards such as a GTX 580 are worth buying.
Ever heard of distributed computing?...No?...examples here, here and here. Sticking with a degree less altruism...3-D gaming? I'll leave it there for the moment...
I think you people look at the length of the bar rather than the actual figure in those performance charts.
The kind of people who base their graphics purchase on the "length of a bar" generally don't include those shelling out $400 for a card. The people who are impressed by pictures at the expense of substance generally fall into two categories: those who won't ever buy one, and those who won't ever buy one and envy the people who do.

Though these cards are at the top, it doesn't mean that the midrange ones are obsolete.
Of the fourteen comments posted, not a single one states any such thing. I think most people are well aware that a new release doesn't instantly invalidate the hardware they own.
My 4870 still eats any game alive at 1680x1050, all except Crysis.
Metro 2033 with tessellation, DoF and 4x MSAA enabled? How about any other DX11 title using ambient occlusion, water simulation etc.? Just because one person is willing to forego some of the game's IQ settings doesn't mean that everyone is of a like-minded disposition.
I suspect also that Crysis 2 will not ask for much more than Crysis 1 since it's also coming to consoles. If you're not buying a card for Crysis, then a GTX 580 or 5970 is a ridiculous option. A 6870 or even a 6850 would suffice otherwise.
Really. And just what kind of gameplay experience are you afforded with an HD 6850/6870 at 5760x1080 @ 120Hz?
Requiring more and more power is bullsh*t... technology means efficiency.
Then why reference the HD 6850 and 6870? Both cards deliver inferior performance-per-watt compared to their predecessors (the HD 5770/5850, depending on how you read the pricing/model structure).
There is nothing special about offering me twice the performance while asking for twice the power.....
Still a better metric than comparing Juniper/Cypress with Barts/Cayman
we already have that option with sli and crossfire
Which of course doesn't scale 100% when adding a second card (i.e. less than double the performance for twice the power).
Giving me more for less is what you should define as an improvement in technology.
That generally requires a smaller process node. If you hadn't realised, the HD 5000 and HD 6000 series (along with the HD 4770) all use 40nm, and that process is now at its physical limits -hence the transition to 28nm for both AMD and nvidia next year.
Feel free to name a GPU (or CPU) that offered significantly higher performance and lower power consumption than its predecessor on the same process node.
Besides, an improvement in efficiency automatically means an improvement in performance.
Wrong. The metric that is changing here is performance-per-watt. The highest performing graphics cards are most definitely NOT the most efficient. Don't believe me? Then check out the cards in the lower segment of these power consumption graphs.
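To make the performance-per-watt point concrete, here is a back-of-the-envelope sketch. The figures (60 fps at 250 W, ~80% scaling from a second card) are purely illustrative assumptions, not measurements of any real card:

```python
def perf_per_watt(fps, watts):
    """Average frames per second delivered per watt of board power."""
    return fps / watts

# Hypothetical single card: 60 fps at 250 W board power
single = perf_per_watt(60.0, 250.0)          # 0.24 fps/W

# A second identical card in SLI/CrossFire rarely scales 100%:
# assume ~80% extra performance for roughly double the power draw
dual = perf_per_watt(60.0 * 1.8, 250.0 * 2)  # 0.216 fps/W

print(f"single: {single:.3f} fps/W, dual: {dual:.3f} fps/W")
```

Under those assumptions the dual-card setup is faster in absolute terms but less efficient, which is exactly why "fastest" and "most efficient" are different leaderboards.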
if the card can use less power, it allows room for improvement while still only asking for a reasonable amount of power.
"a reasonable amount of power" ? personally, coming from a multi-gpu backround I find ~400-500w (max gaming load) quite "reasonable"- kind of depends on how you view these things I suppose.
Your argument holds water only if there are no alternatives to the enthusiast card. What stops you from purchasing an IGP solution or any other "cost-effective" graphics solution?
I suppose, however, that giving performance a rest is asking too much
Nope. That's why tablets, netbooks and e-readers exist. Just don't expect everyone else to have the same priorities that you ascribe to.
I do think that there is no need for the next generation of cards to greatly surpass this generation in performance because, in my opinion, the current generation offers enough today
As I noted, not everyone is suited by the same gaming experience. Better hardware allows for a more immersive gaming experience. It also allows technologies to be introduced that further feed the need for improved performance. Is Eyefinity not a valid gaming experience simply because lower-specced and older cards can't use it? Likewise 3D gaming, PhysX/physics, AO, DoF, tessellation, high levels of multi-sampled/super-sampled AA, higher minimum framerates, smooth gameplay, quicker game/map loading?
and should still be enough for a while as Crysis 2 will probably reign for another 3 years and I believe we are ready for it even now.
No and yes respectively... although you'll be getting lower framerates because of the DX10 path compared with DX11 gamers.
Instead I think the next batch of cards should offer better value and efficiency first, even if it means only a small bump in performance.
Want value and efficiency? Then look to the mainstream SKUs within a product line.
 
First of all, relax.

How many people use Eyefinity? I am referring to the people at the middle of that normal distribution, and even at the extremes, GTX 580 (SLI) and 6990 (CrossFire) performance levels will suffice for a long time to come. Secondly, Metro is a f*cking joke, stop kidding yourself. It is by no means demanding for the very high-end cards, not to mention it favours Nvidia graphics, nor is it as good a game as Crysis (I'll leave that one as my opinion). It is not the measuring stick; Crysis still is, and it will be replaced by Crysis 2.

Then why reference the 6870 over the 5770? There is still a line, genius. If I was to go that way I should be getting a 5450. Fact is I wasn't even that impressed with the 6870 because it was no different from the 5000 series in terms of efficiency, but it's still a better buy because of the additional features, improved tessellation, CrossFire performance and, of course, value. That is why I draw the line at the 6870/50 rather than the 5770. The 5770 is a great buy at lower resolutions and budgets, but I'm still relatively in the high-end range here... and the 6870 is my counter to cards like a 6990 or GTX 595, which I can argue is unnecessary for most people... outside of Crysis (keep in mind here, genius: normal distribution). You're clearly just trying to disagree with everything I say here.

"both the HD5000 and 6000 series (along with the HD 4770) both use 40nm"
That is exactly my point: efficiency will not change, but performance will, meaning more power; given that, CrossFiring old cards is an option. This is the reason I would say that this new batch of cards wasn't even necessary (the GTX 580 was, since the 480 was piss poor), but I still welcome them because they have more to offer besides an improvement in efficiency. Tablets and e-readers? Again, you need to relax; I'm still talking about gaming.

Let me restate my initial point here...
Crysis, as I said earlier, is the measuring stick, as it has been for the last 3 years, and it will be replaced by Crysis 2, which will probably reign for another 3 years. Our average gaming resolution is 1920x1080 (not 5760x1080) with 4xAA, give or take. A GTX 580 or 5970 can give us a framerate of at least 50fps at highest settings, and the 6990 may give us even more, and then we still have the option of SLI and CrossFire to go past 60. Crysis 2 may not ask for much more than Crysis. No other game is this demanding, and it will be a long time before the average game is as demanding as Crysis. So I'm saying that Crysis has been defeated, and it will be a long time before games require more performance than Crysis. But, although Crysis is now defeated, look at the cost... desktops using more than 600 watts of power, some nearing 1000... that is my concern, and that needs to come down. I think people got a bit too obsessed with more performance and forgot about power draw; I'm saying it has gotten out of hand.
 
First of all, relax.

Please take a leaf out of your own book and calm yourself down.

A healthy, spirited discussion is always welcome, but I feel it's going a bit too far. We all have opinions, but please cast them in a polite, friendly manner.
 
First of all, relax.
Troll bashing is my hobby. So all good on that score.
How many people use Eyefinity?
Many more than were using it when the HD 4870 was current spec. In fact, around 2-3% of gamers now enjoy multi-monitor gaming (up from 0% last year). See the Steam Hardware Survey. Link to relevant results later in the post (note: this is called adding a citation. It adds supplemental information, places that information into context, and distinguishes "fact" from "opinion").
I am referring to the people at the middle of that normal distribution.
The GTX 580 et al aren't aimed at people within one standard deviation of the average... that is what you were trying to say with the backwoods math, right?
The buyer of a GTX 595/HD 6990 or its equivalent in previous model lines might make up 1% of the buying market at most (hint: this lies outside two (2) standard deviations).
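The standard-deviation aside can be checked with a few lines of arithmetic. This sketch just evaluates the standard normal distribution; treating the graphics market as normally distributed is, of course, the thread's own simplifying assumption:

```python
import math

def normal_cdf(z):
    """Cumulative distribution function of the standard normal, via erf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Share of a normally distributed market within one SD of the mean
within_one_sd = normal_cdf(1.0) - normal_cdf(-1.0)   # ~0.683
# Upper tail beyond two SDs: roughly where halo cards would sit
beyond_two_sd = 1.0 - normal_cdf(2.0)                # ~0.023

print(f"within 1 SD: {within_one_sd:.1%}, beyond 2 SD: {beyond_two_sd:.1%}")
```

So "the middle of the distribution" is about two thirds of buyers, while the upper tail past two standard deviations is a couple of percent, consistent with the ~1% halo-card figure above.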
and even at the extremes, gtx 580 (sli) and 6990 (crossfire) performance levels will suffice for a long time to come.
kevin knows this since he's tapped into future game devs, DirectX 12 implementation, GPGPU and its parallel computing implications. Tell me, kevin, how do you see the future of ray tracing?
Secondly, Metro is a f*cking joke, stop kidding yourself. By no means demanding for the very high-end cards.
At 4x MSAA, tessellation and DoF enabled? (Or are you making up special kev settings?) Nice try. There is a reason why cards are never benchmarked using 4x MSAA in the game... this is it.
If you're having trouble comprehending the results, they show that the HD 5970, 5870, 5850, GTX 480 and 470 all fail to play this game at 2560x1600 -not an unreasonable resolution for people buying the highest performing SKUs. Playability even at 1920x1080 means dropping advanced DoF.
not to mention it favours Nvidia graphics
Duh, this thread is about an nvidia graphics card.
nor is it as good a game as Crysis (I'll leave that one as my opinion). It is not the measuring stick; Crysis still is, and it will be replaced by Crysis 2.
Crysis is a graphically superior corridor shooter with the plot of a Jean Claude van Damme movie. If that's what turns your crank -and it seems it does- who am I to deny you your gaming chubby?
Then why reference the 6870 over the 5770? There is still a line, genius...
The arbitrary line that Kev the First chooses, obviously. How about the HD 5850 that I also referenced?
If i was to go that way i should be getting a 5450.
Maybe you should...at least it's DX11 compliant.
Fact is i wasn't even that impressed with the 6870 because it was no different from the 5000 series in terms of efficiency
If you could read the TPU charts (which are representative of everyone else's findings) you'd see that the 6870 offers inferior efficiency.
If you're not impressed (AMD is likely wringing their hands at this thought), why use the 6850/6870 as a reference in the first place?
That is why I draw the line at the 6870/50 rather than the 5770.
Maybe we'll get the story on the front page! "Kev draws the line at 6870. AMD, Intel and nvidia to cease development of enthusiast, GPGPU and HPC graphics along with parallelised computing"
I'm guessing your knowledge of graphics card development is roughly equal to the number of Eyefinity capable HD 4870's.
...cards like a 6990 or GTX 595, which I can argue is unnecessary for most people...
Again... enthusiast graphics aren't aimed at people like you. They are aimed at people who want (not need) max performance.
outside of Crysis (keep in mind here, genius: normal distribution)
Pretty sad with the constant references to a three year-old game....or haven't you clocked it yet?
(Hint: the long-buried, recently discovered aliens did it).
You're clearly just trying to disagree with everything I say here.
Only the bs you're spewing forth... hey, guess what, we're both right on this point. Huzzah!
I'm still talking about gaming.
Very short-sighted there, kev.
What about the workstation market ?
What about GPGPU ?
What about HPC ?
You blather on about efficiency this-and-that and don't even realize that you're destroying your own argument by referencing a card (Fermi) that is improving performance/watt efficiency by leaps and bounds in workstation and high performance computing.
By contrast, the CPU-only Jaguar has a Linpack/peak efficiency of 75 percent. Even so, Tianhe-1A draws just 4 megawatts of power, while Jaguar uses nearly 7 megawatts and yields 30 percent less Linpack.
The same Tianhe-1A was originally configured with HD 4870X2 cards. Replaced by Fermi cards to increase efficiency.
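The efficiency gap follows directly from the figures quoted above. Normalising Tianhe-1A's Linpack throughput to 1.0 (the absolute numbers aren't given in this post, only the ratios):

```python
# Linpack throughput normalised to Tianhe-1A = 1.0; per the figures above,
# the CPU-only Jaguar yields ~30% less Linpack while drawing ~7 MW vs ~4 MW
tianhe_eff = 1.0 / 4.0    # Linpack units per megawatt
jaguar_eff = 0.7 / 7.0

print(f"Tianhe-1A delivers {tianhe_eff / jaguar_eff:.1f}x the perf/watt of Jaguar")
```

That works out to roughly 2.5x the performance-per-watt for the GPU-accelerated system, which is the whole point of swapping Fermi cards in.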
And of course workstation and HPC cards are available because they are driver and hardware optimized variants of enthusiast desktop graphics cards- you could say that Tesla, Quadro and FirePro exist because of the enthusiast desktop graphics market...mainly because it is true.
Let me restate my initial point here....
Do you have to?
Crysis, as i said earlier, is the measuring stick,
Crysis: Kev's Measuring Stick Edition....happy now?
and it will be replaced by Crysis 2, which will probably reign for another 3 years.
Nostradamus Kev!
I'm willing to bet Crysis 2 might also be a linear corridor shooter that has less appeal than the original because the gameplay and style are already known quantities.
Our average gaming resolution is 1920x1080 (not 5760x1080) with 4xAA, give or take.
Wrong. 1680x1050. Also note that 6 months ago multi-monitor gaming wasn't even included in the survey.
...desktops using more than 600 watts of power, some nearing 1000... that is my concern, and that needs to come down.
Why? Global warming? Fire risk? EM radiation causing higher rates of cancer?
If you don't want a high performance system, then maybe you... uh... shouldn't buy one. Personally I think that every narrow-minded person who believes that their personal viewpoint affords them the right to dictate to others should be nailed to a chair and have their beloved Crysis DVD hammered into the back of their skull until the brain stem is severed.... While I'm sure my views would gather traction in a public forum, I also realize that ill-informed people who make outlandish claims based on thirty seconds of fractured thought are also entitled to the personal freedom of voicing their view.
I think people got a bit too obsessed with more performance and forgot about power draw, i'm saying it has gotten out of hand.
Obsessed with performance on a technology site! Wow, how did that happen!
Might I suggest a site that is less performance orientated for you, since the ongoing striving for the Next-Best-Thing seems to be causing you some undue stress.
 
Wow, all that and not a single noteworthy point. You've stopped making valid points and are now just trying to bash whatever I say; 1680x1050 benefits my argument, btw. I will not argue with you, zero, as you are adept at repeating yourself and are clearly willing to go back and forth without any substance. This is no longer an interesting argument. Worth the read though; I even had a laugh.

One point I will respond to though... I do not disagree with the release of the GTX 580 or the upcoming 6900 series; I'm saying that I think their performance levels will be adequate for a while. Look at the last 3 years... in late 2007, two 8800 GTs in SLI was a great setup, and today it can still perform quite well. I'll stop there so you can find something else to do with your time, zero... on that note, you need to find a better hobby (one that you're good at lol). I don't think you are an ***** zero... we just have different views, and you haven't yet convinced me that I'm wrong.
 
1680x1050 benefits my argument btw.
Pointing out that your "guess", which you pass off as "fact", is in fact incorrect benefits you? On another level... a few months ago the same survey showed that 1280x1024 was the dominant gaming resolution, now it is 1680x1050, and in a few months it will likely be 1920x1080. You see where this is going? No? Then get your caregiver to explain.
I do not disagree with the release of the GTX 580 or the upcoming 6900 series,
Unlike your earlier post where you stated:
That is why I draw the line at the 6870/50.
So now you're drawing the line at the GTX 580 and HD 6990. Nice flip-flopping.
You do realise the HD 6990 is likely to be straining the 300-watt PCI spec to breaking point? No? Quelle surprise.
I'm saying that I think their performance levels will be adequate for a while...
Then what? We get out the magic beans and conjure up a GTX 680 or HD 7990 that miraculously uses the self-same TDP that flip-flop kev has deemed acceptable for the day?
...in late 2007, two 8800 GTs in SLI was a great setup, and today it can still perform quite well...
So let's get this straight... you're using an example with 327 watts of power consumption to prove how efficient it is? You do realise that a GTX 460 or HD 5850 would crush an 8800GT SLI setup while using less than 80% of the power? All while an 8800GT SLI setup barely manages gaming at 1920x1080.
Since you don't seem to know anything about GPGPU, HPC, graphics card evolution, SKU structure, how to read a trend, future gaming development or IQ implementation, I'll ask you another question or two (rhetorical -look it up):
What happens if the user doesn't have a second PCIe x16 slot to utilise? (Examples being, mATX board, incompatible chipset/mobo layout, triple slot coolers, adjacent slots required for hardware RAID/sound/TV tuner cards etc.)
What happens if one of the cards dies and is replaced by a newer current model under warranty? Have you tried SLI'ing an 8800GT and a GT 240?
I don't think you are an ***** zero...
If only the feeling were mutual...
 
Relax, guy, you're not even reading my comments and fully understanding them. I didn't change my mind and finally accept the GTX 580; you've got me all wrong. I will not try to explain myself either; you're not really grasping anything here. And do you really think that the modal resolution in gaming will go past 1920x1080? When I made the point with the 8800 GTs, I wasn't talking about efficiency, but about performance; I'm saying that the levels are still satisfactory today, trying to make the point that you can still survive on 3-year-old hardware... which is why I said that extreme today will probably still be good even two years from now. I'm not talking about GPGPUs because that's not what this argument is about, and you're 'rhetorical' questions are not at all relevant.

Zero, this is my last post here, and you got your wish: the feeling is now mutual. You're trying your best to disagree with everything I say before even trying to understand my points, and then creating arguments that have nothing to do with me. You're not the 'intelligent debate' type, just the 'can't shut up' type. Have a nice one dude, maybe we'll cross paths in another thread.
 
...you're not even reading my comments and fully understanding them. I didn't change my mind and finally accept the GTX 580; you've got me all wrong. I will not try to explain myself either; you're not really grasping anything here.
Nice use of Gibberish
And do you really think that the modal resolution in gaming will go past 1920x1080?
Why do you think QFHD is now a standard, and UHDTV probably soon will be?
When I made the point with the 8800 GTs, I wasn't talking about efficiency...
Behold! The return of the flip-flop! Excuse me while I revisit your earlier post...
technology means efficiency. There is nothing special about offering me twice the performance while asking for twice the power.
trying to make the point that you can still survive on 3 year old hardware....
Personally, kev, I'm an enthusiast. I don't really want to just "survive" with three-year-old hardware. I want the best I can get within my budget, and the fact that both AMD and nvidia continue to release these cards would indicate that there are plenty of other people with the same interest level. Neither company is going to release cards that people don't want to buy.
which is why I said that extreme today will probably still be good even two years from now...
Only you are arguing this point.
Two years hence, these same cards will likely be on their second or third owner, hopefully giving them good service.
I'm not talking about GPGPUs because that's not what this argument is about...
You can't separate GPGPU from gaming because they are intrinsically linked. Workstation, HPC and desktop cards are the same hardware. Or are you proposing, under the Edict of Kev, that hardware now has to pass some kind of end-user licensing to keep it out of gamers' hands while allowing GPGPU users their use? So what about people who game AND program using CUDA? What about those people who use their gaming rigs for Folding when they are not gaming... or is helping to find cures for diseases not relevant as far as you are concerned? Would it surprise you to learn that the best Folding/BOINC points-per-day also come from high-end enthusiast cards? Would it also surprise you that the top eight F@H teams on the planet are all based at enthusiast gaming/tech/overclockers' sites?
...and you're 'rhetorical' questions are not at all relevant... ("your", not "you're", kev)
No. Not to people with such little understanding, I'll give you that.
Zero, this is my last post here,
PRAISE THE LORD!!!!!!!!! Enthusiasts rejoice, and join us next time for another of kev's rants on the evils of a technology run amok.
 
@dividebyzero

Sweet Jesus, I've never seen such a dumb troll so completely and thoroughly stomped into the dirt by facts and logic who still came back for more. I'd say some props are due to kev for his tenacious will, if anything xD.
 
Ha! Maybe so.

The joys of having some spare time between shifts. It's either troll carving or doing the crossword to relax before service starts at the restaurant.
Just a shame that so many threads here at TS get hijacked by trolls and flame artists. It tends to turn away the regular contributors and leads to a very short average thread length where any decent discussion is smothered at birth by morons with nothing better to do than play the fool/cyber warrior to get their jollies in an online world that affords them some recognition/status that they lack in the "real" world.
 