Triple Monitor Gaming: GeForce GTX 590 vs. Radeon HD 6990

If you can get me two of the newer GTX 580s I will test whatever you want on them ;)

For the love of god don't do it! If he gets a pair of 580s to go with those three 30" Dells, he will be the nightmare from down under!
 
Guest said:
For the love of god don't do it! If he gets a pair of 580s to go with those three 30" Dells, he will be the nightmare from down under!

Want me to sort it dude.........Lighter is at the ready........... :haha:
 
I don't understand why game devs don't treat 3 screens as 3 individual "cameras" looking into the game world, instead of just expanding the FOV for the existing single "camera".

Sure, you'd have 3 vanishing points, so it'd look funny if you tried to publish all three as a single image (e.g. a screenshot), but it really wouldn't be a problem for the gamer turning his head, and having bezels between each screen.

This solution would truly allow the gamer to separate his viewing direction from his gun direction, without distortion, which would be nice....

Seriously, why don't they do this? Anyone?
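
For what it's worth, here's a rough sketch of what that would mean (Python/numpy, purely illustrative, not taken from any real engine): each monitor keeps an ordinary projection but gets its own view matrix, yawed by one screen's worth of horizontal FOV, so the three frusta tile edge to edge without stretching the outer images.

```python
# Purely illustrative sketch (not from any shipped engine): give each of
# the three monitors its own camera instead of one ultra-wide FOV.
import numpy as np

def perspective(fov_y_deg, aspect, near=0.1, far=1000.0):
    """Standard perspective projection for ONE screen."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ])

def yaw(deg):
    """Rotation about the vertical axis (sign convention depends on the engine)."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([
        [c, 0.0, s, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [-s, 0.0, c, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])

def triple_head_cameras(base_view, fov_y_deg=60.0, aspect=2560 / 1600):
    """Return one (projection, view) pair per monitor.

    The left and right cameras are yawed by exactly one screen's horizontal
    FOV, so the three view frusta tile with no gap, overlap or edge stretch.
    """
    fov_x = 2 * np.degrees(np.arctan(np.tan(np.radians(fov_y_deg) / 2) * aspect))
    proj = perspective(fov_y_deg, aspect)
    return [(proj, yaw(side * fov_x) @ base_view) for side in (-1, 0, +1)]

# Render the scene three times, once per pair, each into its own monitor.
cameras = triple_head_cameras(np.eye(4))
```

One obvious downside is that the scene gets submitted three times, so CPU draw-call overhead roughly triples, which may be part of why engines just widen the single camera instead.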
 
My guess would be that it is such a small niche market that providing better support to multi-monitor gamers is not a priority. After all, there are so many more important things that game developers are failing to support these days; multi-monitor support has to be a low priority if they cannot get the basics right.

That said, there are a few games that do support multiple monitors correctly and detect them as individual panels, it's just not widespread.

Other quality games such as StarCraft II do not support multiple monitors because there is no point: you are not allowed to see any more of the playing field, a restriction designed to avoid giving richer gamers an advantage. I personally think that sucks but it is what it is.
 
What really bothers me about this article AND the comments posted is that everyone has seemingly overlooked the glaring difference in the tested cards.

AMD card --> 4GB of VRAM in total = 2GB per GPU
Nvidia card --> 3GB of VRAM in total = 1.5GB per GPU

This is the main reason for the performance difference, particularly at the top resolution.

When a video card runs out of VRAM the frame rate drops dramatically, as you can see in a number of the 7680x1600 benchmarks, although this fact is not even noted.

Antialiasing also uses a large amount of VRAM, which is why bumping the AA to 4x, 8x etc affects the Radeon card less.

Video memory is mirrored across a multi-card setup, so basically the ATI card has 2GB to work with while the Nvidia card has 1.5GB, which is 25% less.

It's a shame this wasn't mentioned because it unfairly weights the results and is a definite inhibitor at extreme resolutions.
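
To put rough numbers on the memory pressure at 7680x1600 (a back-of-the-envelope sketch only; real usage depends heavily on the engine, the driver and any deferred-rendering buffers):

```python
# Rough estimate of the VRAM consumed just by the render targets at
# 7680x1600 with MSAA. Assumes 32-bit color and 32-bit depth/stencil;
# textures, geometry and driver overhead come on top of this.
def render_target_mb(width, height, msaa, bytes_per_pixel=4):
    pixels = width * height
    color = pixels * bytes_per_pixel * msaa    # multisampled color buffer
    depth = pixels * bytes_per_pixel * msaa    # multisampled depth/stencil
    resolve = pixels * bytes_per_pixel         # resolved back buffer
    return (color + depth + resolve) / 1024 ** 2

print(render_target_mb(7680, 1600, msaa=4))   # ~420 MB
print(render_target_mb(7680, 1600, msaa=8))   # ~800 MB
```

Stack textures and geometry on top of that and 1.5GB per GPU gets tight very quickly, while 2GB leaves noticeably more headroom.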
 
The 6990 and 590 are enough to make the concept of multi-display gaming cross my mind, but not more than that or enough to spend money on it.

40nm dates back to 2009. The next step, 32nm, was not delivered to AMD and Nvidia on time by TSMC, so they kept 40nm for 2010 and are planning to go directly to 28nm this year. Production has already started, so expect a huge leap in performance and efficiency when those parts arrive, and that is sooner than you might think. You will get 25% better performance from an HD 7950 CrossFire setup than from the 6990, and it will run fine on a 560W PSU.

I don't think a gamer should wait, though. Go right ahead and build an i5 2500K with an HD 68xx or GTX 560 Ti and a 1080p display, because you will get very close to optimum performance in the biggest games out now such as Portal 2, Black Ops and StarCraft II. Just don't blow your money on a high-end multi-display system now. Neither the game developers, the GPU manufacturers nor the display manufacturers agree on how to connect their products optimally, but that will change very soon.
 
Guest said:
What really bothers me about this article AND the comments posted is that everyone has seemingly overlooked the glaring difference in the tested cards... Video memory is mirrored across a multi-card setup, so basically the ATI card has 2GB to work with while the Nvidia card has 1.5GB, which is 25% less.

It's a shame this wasn't mentioned because it unfairly weights the results and is a definite inhibitor at extreme resolutions.

I am not sure why this bothers you; there is nothing we can do about the limited memory capacity of the GeForce GTX 590, it is what it is.

Furthermore, we did discuss the likelihood that the poor 7680x1600 performance was a video memory limitation.

From the Dirt 2 comments as an example…

“This margin was reduced to just 6% in favor of the GeForce GTX 590 at 5760x1200 and then at the resolution of 7680x1600 everything went wrong for the GTX 590. Probably a driver bug or a video memory limitation, the Radeon HD 6990 sailed along happily with 53fps but the GeForce couldn't deliver. We tried reducing the anti-aliasing level to 2x which did breathe quite a bit of life back into the GeForce GTX 590 as it was able to average 42fps at 7680x1600, which still made it 28% slower than the Radeon HD 6990.”

Enough said.
 
Guest said:
Video memory is mirrored across a multi-card setup, so basically the ATI card has 2GB to work with while the Nvidia card has 1.5GB, which is 25% less.

It's a shame this wasn't mentioned because it unfairly weights the results and is a definite inhibitor at extreme resolutions.

"That's what AMD said" lol
And because of the "Eyefinity" direction is why they put more VRAM. It's Nvidia's fault for thinking their vidcards had enough VRAM. Not the reviewers fault nor the people who are commenting. Nvidia should have been better prepared in that arena (although i was reading a recent review that showed a more powerful cpu..i.e. I7-2600K Oc'd past 4.2 will bring the "get-e-up" out of those Nvidia GPu's)

It's important reviews like this happen so we can ALL learn something..especially the big companies(if they ever read this stuff)
 
Playing a game with the wide field of view is absolutely awesome! Those who can't or won't do it are really missing out.
 
Anyone who's interested in multimonitor gaming should check out http://widescreengamingforum.com/. Been around a while now. I've been playing games in triplehead format for a long time - my first triple card was a Matrox Parhelia driving three 1280x1024 monitors. Performance wasn't great, but it worked. Matrox moved on from that card with a very clever idea - the Triplehead2go, http://www.matrox.com/graphics/en/products/gxm/th2go/. This little box looks like a single monitor to the video card, at up to 5760x1080 resolution. It has three ports on the other side for monitors, and splits the incoming video stream accordingly.
 
Looks like 3 wall-mounted "arms" with full swivel and tilt. Newegg sells Rosewill ones that fit 17" to 32" screens for around 50 bucks (depending on the sale day). That's my guess.
 
This whole thread has just made me angry. I purchased two of the latest green-team cards some time back and came home to find the cards still unopened but my 30" 305T dead. It feels a little strange to have to pay another $1,000 for a brand new one.
 
I would be interested in knowing if having the monitors in portrait mode would have made any difference. Resolutions of 4536x2048 and 3600x1920 would likely give the same performance results but might mitigate some of the playability issues.
 
I personally didn't like gaming at massive resolutions. My almost six-year-old Apple 30-inch displays were probably to blame. With a little over an inch of bezel on *each* display, it's just far too disruptive to gameplay, especially when you need to mess with so many settings and files to get certain titles running (this may have improved in the year and a half since I tried it out).

Also, the fact that my eyeballs are around two feet from my display doesn't help; scanning over 90 inches of display area just doesn't work out (and no, portrait was not an option for me/these particular displays). Even if I did rearrange my setup completely to "fix" that issue, I still don't see the point.

Bottom line, if you're keen on trying out multi-screen gaming, I can't stress enough how much it is worth getting displays with ultra-thin bezels (or if you don't mind a little DIY, take them off completely!). Much too gimmicky for me tho :/ Even setting aside the cost of being an early adopter in general, the lack of support and the GPU power needed to truly run modern games at such high resolutions just hurts too much.
 
Nice article. I have a setup with 3 24'' monitors and a single HD 5970. I actually use the monitors on portrait mode (better for writing code), so I get resolutions like 3240x1920 (or something like that).

The games run pretty smoothly, with some exceptions. Crysis 2, for instance, is one of them. Your benchmarks show that the Radeons don't have decent performance with this game. I think the game is optimized for Nvidia cards (it says so when it starts), so I need to use only one monitor to play it. And I still get a very annoying flickering in outdoor scenarios.

I had some problems with Civilization V as well. It's either too demanding on the graphics card or poorly coded. Lowering the resolution made it better to play. Something similar happens with Formula 1 2010. If I play it maximized it's not as smooth; if I reduce the window a little bit, to about 80 or 90% of the whole screen, it plays just fine. I don't get it.

The other games I played ran pretty smoothly, though. Medal of Honor, Assassin's Creed 2 and Call of Duty: Black Ops were great. I didn't get to measure FPS, but they played fine. Oh yeah, and my favorite game of all, PES 2011, only plays at 1920x1080; the game won't allow you to maximize the window, which sucks.

That's it. The bottom line is that I have been using a single high-end Radeon with three 1920x1080 monitors and I am pretty happy with it.
 
2x 6970: $700 (slightly better performance than the 6990)
3x HannsG 28" 1920x1200 monitors: $750 to $900 (I bought them on sale)
Total: $1450-$1600 < $5000
 
Guest said:
2x 6970: $700 (slightly better performance than the 6990)
3x HannsG 28" 1920x1200 monitors: $750 to $900 (I bought them on sale)
Total: $1450-$1600 << $5000

This was my initial reaction to the article!! $5,000 is extremely overboard, especially when an EXTREMELY close experience, close to the same size, can be had for A LOT less. (I also have 3 HannsG 28/27.5" monitors and a single overclocked Radeon 6870, about to be two.)
 
I currently have an Eyefinity setup with 3 Dell U2311s running at 5760x1080. I love it. I would never go back to single-screen gameplay if I didn't have to. I'm using two XFX 5870 XXX Edition cards, and Battlefield: Bad Company 2 runs above 60fps with almost every setting on high. I think I have the anti-aliasing turned down a bit. Some people say that 3 monitors is overkill... well, obviously they haven't gamed on 3, or haven't tried being a little more productive when not gaming, because having 3 monitors is so much nicer for doing work such as writing research papers, building websites or anything else that needs multiple windows simultaneously.

Btw... Great Review!!
 
I don't personally believe it's Nvidia's "fault" to only have 1.5GB per card. I think it was purposely done that way.

A) It suits pretty much any bezel-corrected resolution across 3 1920x???? monitors, which is more mainstream in the Eyefinity/Surround community.

B) 3x 30" monitors is a niche among niches. Just a guess here but I would say there are probably less than a thousand users running 3x 30" displays not including companies and the like.

Why would they spend more money when the majority of the niche surround community isn't going to see any benefit? So the few that run 3x 30" monitors have to buy ATi; wow, Nvidia just lost 1000 customers? Big deal.

If you look at drivers as an example, most games are patched in the drivers either before they come out or directly after. ATi and Nvidia both try to keep the mass of users happy by doing this. However, if you look at Eyefinity/Surround support both in the drivers and in games, there is a chasm. Game devs don't take the time to add support if the engine doesn't already support it because it's not cost effective and their user base isn't running it anyway, so why should they care? The same goes for driver support.

There just aren't that many people with the dosh to splash out on even 1 video card and a decent-sized single monitor, let alone 3x monitors and 2x video cards, though with prices dropping this is changing some.

What really bothers me personally as an Eyefinity/Surround owner is that this "feature" is something both ATi and Nvidia tout as 'the next great thing' and yet their own support for it is atrocious. Let's just hope it gets better and not worse.

All things being said though, I wouldn't change a thing. It was well worth the money in my opinion and adds incredible immersiveness to games, not to mention making desktop tasks much more enjoyable and efficient, even if you only use two of the screens for the latter.

As far as the article goes, I'm not ridiculing Techspot for comparing the two cards. They generally do an excellent job in the review department and usually better than most. Quite the contrary, I commend them for even attempting the subject.

Who ELSE has even broached the topic of how a video card can handle 7 megapixels at >30fps? Other than the WideScreenGamingForum, none that I know of. It should be a staple for all review sites to test Eyefinity/Surround resolutions when a card is released, along with the bevy of other resolutions. If they don't, they're just doing what the devs, ATi and Nvidia are doing... ignoring the customers that get the most from their products and catering to everyone else that doesn't. It seems very unintuitive from a customer standpoint.

I was merely trying to draw attention to this fact for the uneducated Eyefinity/Surround newcomers.
 
Steve,

To your specific point regarding StarCraft 2 not providing support: I think the reason was to prevent an arms race among the competitive crowd.

I have a triple monitor 22'' setup, and I respectfully disagree with your conclusion that 3x30'' is not worth it because you focus on the center screen. Black out your peripheral car windows and tell me whether you're a better driver without them; you won't be, right?

So it goes with any gaming situation. I find I am more comfortable in my environment with the added peripheral vision than without. I could never drive without a front-facing field of vision, of course, but the side field of vision does provide a great benefit, particularly if you are playing a UI-heavy game.

And by the way, a couple of extra 22'' monitors probably only cost about 500 bucks...

On the issue of "fish-eye" view, most games are built around a presumption that the screen is flat. If you curve your monitor array inward then it does look awkward, but flatten those screens out and the outside monitors appear just about right. I personally prefer to curve them only slightly; it mitigates the fish-eye view while still keeping the outer edges close enough to be useful.
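
You can sanity-check the flat-array case with a bit of trigonometry (rough numbers of my own, assuming three 30" 16:10 panels, each roughly 64cm wide): the horizontal FOV that makes a single flat projection look geometrically correct is just the angle the array subtends at your eye.

```python
# Hypothetical numbers: horizontal FOV at which a single flat projection
# matches what a flat three-screen array actually subtends at the eye.
import math

def correct_hfov_deg(screen_width_cm, num_screens, eye_distance_cm):
    half_width = screen_width_cm * num_screens / 2
    return math.degrees(2 * math.atan(half_width / eye_distance_cm))

print(correct_hfov_deg(64, 3, 70))   # ~108 degrees across three screens
print(correct_hfov_deg(64, 1, 70))   # ~49 degrees for a single screen
```

At that FOV and viewing distance the "stretched" outer monitors are just correct perspective onto a flat plane; the stretching only reads as distortion when the rendered FOV doesn't match the physical geometry, or when the side screens are angled in.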
 
Guest said:
I have a triple monitor 22'' setup, and I respectfully disagree with your conclusion that 3x30'' is not worth it because you focus on the center screen. Black out your peripheral car windows and tell me whether you're a better driver without them; you won't be, right?

I agree wholeheartedly with this. I have a 3 x 25" setup and driving games are incredible on Eyefinity. I also noticed in FPS games like Metro 2033 that, for example in the tunnels, when I previously thought the 'creatures' were running past me, they were actually running up, standing at my side... and eating me. I would like a little tweak on the side monitors so I can take my "objects may be closer than they appear" stickers off, but other than that I think it's great.
 
Of course you can spring for some cheap 27-28” screens for the 5760x1200 resolution but even then you need to spend about $1000 on all the monitors and then if you want to play today’s games and future games in all their glory you need a pair of Radeon HD 6990 or GeForce GTX 590 graphics cards at a cost of $1400.
I am finding that not to be true:
http://imageshack.us/photo/my-images/816/crysiswarhead5760.jpg/ all settings enthusiast 2/AA
http://img218.imageshack.us/img218/378/codmw25760x1920.jpg Highest settings 4/AA
http://img41.imageshack.us/img41/3483/f12010game2011041701090.jpg Ultra settings
https://www.techspot.com/gallery/data/500/medium/Metro2033veryhigh33.jpg Very high settings 4/AA
Dirt 2: 108fps, Ultra settings, 4/AA. All run on 3 x 5850s.

4x 5850 Asus EAH DirectCu's = $912.00
This is what I was talking about, Steve. I have not figured out what triggers the change in VRAM use, but I know it has changed from AFR to SuperTiling.
 