Triple Monitor Gaming on a Budget: SLI vs. Crossfire vs. Single High-end GPU

Julio Franco

Read the full article at:
[newwindow=https://www.techspot.com/review/639-triple-monitor-gaming/]https://www.techspot.com/review/639-triple-monitor-gaming/[/newwindow]

Please leave your feedback here.
 
Running Xfire or SLI is all very well if you can put up with the pitfalls, but I'm a great believer in the KISS principle: Keep It Simple, Stupid. I'm really not interested in double the heat, double the price, double the noise and double the complication for less than double the performance. I'll just stick to a higher-end single card, thank you.
 
This was an excellent article and it is helpful since I am gathering parts for a triple-monitor setup. It seems like the games I enjoy would require dual 670s or dual 680s to enjoy in all their glory (Far Cry 3, for example). I'm going to wait for a price drop and pick up another 680 after seeing the results for FC3 and Hitman.
 
You must have completely missed pcper.com's review of the Titan and its updated testing method. It appears that despite crossfire producing very high frames per second, in practice it produces mostly 'runt' frames, and the experience is no better than with a single card. Put simply, by relying on FPS alone you have a totally useless performance review. The whole industry is moving toward measuring real performance over time, and here we have average FPS over the entire game. *shakes head*

Nothing you have shown tells me whether these cards and resolutions are playable at all.
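For what it's worth, here is a minimal sketch of the runt-frame point, using made-up frame times rather than data from pcper or this review: two runs can report the same average FPS while one of them alternates runt frames with long frames and feels far less smooth.

[code]
# Hypothetical frame times in milliseconds, not measurements from the review.
def average_fps(frame_times_ms):
    """Average FPS over a run: total frames divided by total seconds."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

def percentile(values, pct):
    """Nearest-rank percentile, no external libraries needed."""
    ordered = sorted(values)
    index = min(len(ordered) - 1, round(pct / 100.0 * (len(ordered) - 1)))
    return ordered[index]

steady = [20.0] * 100      # every frame takes 20 ms
runty = [2.0, 38.0] * 50   # a 2 ms "runt" followed by a 38 ms frame, repeated

for name, run in (("steady", steady), ("runty", runty)):
    print(f"{name}: avg {average_fps(run):.0f} FPS, "
          f"99th percentile frame time {percentile(run, 99):.0f} ms")

# Both runs report 50 FPS on average, but the runty run spends about 95% of its
# time on 38 ms frames, which only frame-time (percentile) reporting exposes.
[/code]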
 
Okay, this is an interesting piece and I think it brings some great information, but the title, and hence the concept of the piece, does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down; using an Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep costs down in any form. In fact, if someone spent that much on the CPU, they likely wouldn't mind spending the same kind of cash on a GTX 690 or high-end dual cards.
 
Okay, this is an interesting piece and I think it brings some great information, but the title, and hence the concept of the piece, does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down; using an Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep costs down in any form. In fact, if someone spent that much on the CPU, they likely wouldn't mind spending the same kind of cash on a GTX 690 or high-end dual cards.

This is a GPU test; I'm guessing the i7 was used to eliminate any bottleneck issues.
 
Okay, this is an interesting piece and I think it brings some great information, but the title, and hence the concept of the piece, does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down; using an Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep costs down in any form. In fact, if someone spent that much on the CPU, they likely wouldn't mind spending the same kind of cash on a GTX 690 or high-end dual cards.

I think you're missing the point. In test scenarios like this, reviewers use a high-powered CPU in order to eliminate it as a bottleneck. I mean, what good would this review be if he used a Core i3 and the results were all the same, simply because the CPU was not fast enough to keep the GPUs fed with data?
 
Running Xfire or SLI is all very well if you can put up with the pitfalls, but I'm a great believer in the KISS principle: Keep It Simple, Stupid. I'm really not interested in double the heat, double the price, double the noise and double the complication for less than double the performance. I'll just stick to a higher-end single card, thank you.

A whole testing article and it seems you didn't read it at all. Nothing is being "doubled" here: SLI with mid-budget cards gives ~30% more performance while consuming ~30% more energy, and it costs less than a high-end card. And it has the benefit that you can buy one mid-budget card now and add an identical card later when it gets cheaper. The test is not comparing a single GTX 680 vs. dual GTX 680s in SLI.

To me, this article demystifies some ideas about SLI and Crossfire configurations, and you're just too narrow-minded to be talking in absolutes.
 
Great article.

I'm still quite happy with my 460 SCs in SLI.

I get very similar performance to my brother's 670 at two-thirds of the price. Also, I have had no issues with SLI. Very few games don't take advantage of it, and the ones that don't are usually either old or don't require that much horsepower anyway.

Nvidia is on the ball, and there is usually an SLI profile released before the game itself is.

Heat is not an issue for me either; I would recommend buying a motherboard with a decent gap between your cards if you are using stock cooling.

Noise is of course doubled, though the nine fans in my case drown out everything anyway, and I'm not really fussed if my PC is noisy.
 
Okay, this is an interesting piece and I think it brings some great information, but the title, and hence the concept of the piece, does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down; using an Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep costs down in any form. In fact, if someone spent that much on the CPU, they likely wouldn't mind spending the same kind of cash on a GTX 690 or high-end dual cards.

I think you're missing the point. In test scenarios like this, reviewers use a high-powered CPU in order to eliminate it as a bottleneck. I mean, what good would this review be if he used a Core i3 and the results were all the same, simply because the CPU was not fast enough to keep the GPUs fed with data?

Then what does "budget" mean? Or at least, what does it mean in the context of this review?
"Budget cards" = AMD 7850s or Nvidia GTX 660s?
 
I think you're missing the point. In test scenarios like this, reviewers use a high-powered CPU in order to eliminate it as a bottleneck. I mean, what good would this review be if he used a Core i3 and the results were all the same, simply because the CPU was not fast enough to keep the GPUs fed with data?

I see this excuse all the time, I don't buy it, and I'm amazed at how many people do. A test that pushes outside the bounds of a realistic build is meaningless beyond an academic pursuit. The bottleneck WILL exist in the real world, and measuring it is something that makes sense. Plus it is valuable information, because it can show you whether a card you want to use will be bottlenecked and thus help you make better selections.

My objection was the use of the term "budget" in the title when the piece was really just about video cards. Also, let's be real for a moment: wouldn't a bottleneck in this type of setup be a real issue, and shouldn't it therefore be shown?

Come on guys, giving reviewers a free pass for being lazy and taking the easy way is not something we should accept.
 
You have the 660 Ti in SLI competing with the 7850s (much cheaper cards).

You should have it compete with the equivalently priced 7950, and you'll find that AMD's offering gives you much more bang for the buck than Nvidia's.

You are biased towards NVIDIA with this article. You are pushing dual-card setups versus single cards, and yet the most balanced options (660 Ti and 7950) aren't even compared. Shame on you.

Remove this article and do a PROPER REVIEW.
 
This looks very much like the Legion Hardware review of the HIS 7850 4GB in Crossfire.

One thing I was hoping for was a comparison of 8x8 vs. 16x16 Crossfire runs. I think the difference isn't that big, but I would like the figures in front of me.

The reason I would like to know is that last Monday I bought a Gigabyte 7850 OC card and my WEI graphics score was 7.9, but as soon as I put in the second card on Saturday it dropped to 7.7, because my board only supports 8x8.

I know that score doesn't matter too much in games; that's why I would like to see what it actually does in games when comparing FPS.
 
You have the 660 Ti in SLI competing with the 7850s (much cheaper cards).

You should have it compete with the equivalently priced 7950, and you'll find that AMD's offering gives you much more bang for the buck than Nvidia's.

You are biased towards NVIDIA with this article. You are pushing dual-card setups versus single cards, and yet the most balanced options (660 Ti and 7950) aren't even compared. Shame on you.

Remove this article and do a PROPER REVIEW.

How is this not a proper review? You clearly skimmed over the actual review and likely bashed it based on the hardware alone. It's a higher-budget SLI setup against a lower-budget Crossfire setup, but both are in the same general vein. Also, how would a 7950 really stack up when a 7970 is flat-out better value? It's there to compete against the 680 in the review. Would you rather they used a 690 and totally defeated the point?

The 660 Tis are the best overall price/performance. 670s would seem pointless, and the 650 / 650 Ti are really cheap, pointless cards many mid-level users would ignore. Feel free to write your own review, get your own hardware and prove how it's biased. If anything it's the games that are biased; some games favor AMD hardware. Are you going to rag on Nvidia about how bad its cards are in those games?

So what I get is... you'd rather they use a 660 Ti and a 7950 as single cards as well? For what purpose? This is a triple-monitor setup; those cards wouldn't have enough power to even compare. So I can literally say your comment is utter crap and a waste of space, but whatever floats your boat. :p
 
I too think this review wasn't set up properly:
- unrealistic CPU
- the GPU models compared
- 4GB of RAM
 
I could never get into triple-monitor gaming. The bezel separations would be a major distraction. Besides, you're still going to be focusing 90% of your attention on the middle monitor, with the other 10% being just peripheral vision.

I have a 27" Dell UltraSharp 2560 x 1440 monitor and it works perfectly for my gaming needs.
 
ITT people are making a lot of hubbub about non-issues, though the new testing methodology on pcper is very interesting. As for me, I have no idea how I'll handle the GPU question for my next build. *rolleyes*
 
This was an excellent article and it is helpful since I am gathering parts for a triple-monitor setup. It seems like the games I enjoy would require dual 670s or dual 680s to enjoy in all their glory (Far Cry 3, for example). I'm going to wait for a price drop and pick up another 680 after seeing the results for FC3 and Hitman.
Actually, I was watching a benchmark you would be interested in:
https://www.youtube.com/watch?v=Qw24094REh0
As you can see, the 680 struggles by itself at that resolution, but with two it's a matter of the game and drivers maturing. Some games rock with two cards at that resolution, but at the same time some struggle because the game scales badly. Good luck with your decision, bud!
 
A whole testing article and it seems you didn't read it at all. Nothing is being "doubled" here: SLI with mid-budget cards gives ~30% more performance while consuming ~30% more energy, and it costs less than a high-end card. And it has the benefit that you can buy one mid-budget card now and add an identical card later when it gets cheaper. The test is not comparing a single GTX 680 vs. dual GTX 680s in SLI.

To me, this article demystifies some ideas about SLI and Crossfire configurations, and you're just too narrow-minded to be talking in absolutes.
Maybe you're too narrow-minded to see the sarcasm in my post. Read it again. The penny may drop, but in your case...
 
Can't believe a 680 and 7970 in stock form are only 5-15 FPS behind SLI GTX 660 Tis in several games. My WindForce 3X 670 handles a stock 7970 and 680. Factor in single-GPU smoothness and fewer driver hassles and you could make an argument either way.
The answer for me? SLI 670s :D
 
I too think this review wasn't set up properly:
- unrealistic CPU
- the GPU models compared
- 4GB of RAM

I will explain the CPU choice in a moment because you are not the only one who doesn't understand the reason it was used.

The GPUs are not being directly compared; they are two different options that represent two very different price points, and if you read the review we spoke about this.

4GB of RAM? Not sure you did the maths right here; the equation is 4 x 2GB =...

Okay, this is an interesting piece and I think it brings some great information, but the title, and hence the concept of the piece, does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down; using an Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep costs down in any form. In fact, if someone spent that much on the CPU, they likely wouldn't mind spending the same kind of cash on a GTX 690 or high-end dual cards.

What CPU do you propose we use? The point of the high-end Core i7, as others have already pointed out, is to eliminate any chance of a CPU bottleneck that could limit GPU performance. When it comes to gaming, the Core i7-3960X really isn't any faster than the Core i7-3770K, which isn't much faster than a similarly clocked Core i5 processor. So really, the results reflect what you would get with those processors.

If we tested with a budget Core i3 or AMD Phenom processor, for example, then some of the results would be CPU bound, and all that might tell us is that every GPU configuration tested delivers the same performance. Is that useful?

This has nothing to do with being lazy; that's just silly. All these tests were run from scratch. I have a range of different processors and platforms available, but for GPU testing we always use the fastest option available.
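To make that reasoning concrete, here is a toy sketch with hypothetical frame rates (not numbers from this review) showing how a CPU ceiling can flatten the differences between GPU configurations:

[code]
# Hypothetical GPU-limited frame rates for three configurations at 5760x1080.
gpu_limited_fps = {
    "single mid-range card": 35,
    "mid-range cards in SLI": 60,
    "high-end single card": 55,
}

def effective_fps(gpu_fps, cpu_ceiling_fps):
    """The system can't run faster than either the GPU or the CPU allows."""
    return min(gpu_fps, cpu_ceiling_fps)

for cpu, ceiling in (("fast Core i7", 120), ("slow budget CPU", 40)):
    print(f"With a {cpu} (~{ceiling} FPS ceiling):")
    for config, gpu_fps in gpu_limited_fps.items():
        print(f"  {config}: {effective_fps(gpu_fps, ceiling)} FPS")

# With the fast CPU the three configurations separate (35 / 60 / 55 FPS), which
# is what a GPU review needs to show. With the slow CPU two of them collapse to
# the same 40 FPS ceiling and the comparison tells you very little.
[/code]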

You have the 660 Ti in SLI competing with the 7850s (much cheaper cards).

You should have it compete with the equivalently priced 7950, and you'll find that AMD's offering gives you much more bang for the buck than Nvidia's.

You are biased towards NVIDIA with this article. You are pushing dual-card setups versus single cards, and yet the most balanced options (660 Ti and 7950) aren't even compared. Shame on you.

Remove this article and do a PROPER REVIEW.

Go Green Team!

This looks very much like the Legion Hardware review of the HIS 7850 4GB in Crossfire.

One thing I was hoping for was a comparison of 8x8 vs. 16x16 Crossfire runs. I think the difference isn't that big, but I would like the figures in front of me.

The reason I would like to know is that last Monday I bought a Gigabyte 7850 OC card and my WEI graphics score was 7.9, but as soon as I put in the second card on Saturday it dropped to 7.7, because my board only supports 8x8.

I know that score doesn't matter too much in games; that's why I would like to see what it actually does in games when comparing FPS.

I have done 8x8 vs. 16x16 testing in the past; with these graphics cards the short answer is that it makes no difference. Oh, and yeah, I write for Legion Hardware as well ;)
 
Okay, this is an interesting piece and I think it brings some great information, but the title, and hence the concept of the piece, does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down; using an Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep costs down in any form. In fact, if someone spent that much on the CPU, they likely wouldn't mind spending the same kind of cash on a GTX 690 or high-end dual cards.

I think you're missing the point. In test scenarios like this, reviewers use a high-powered CPU in order to eliminate it as a bottleneck. I mean, what good would this review be if he used a Core i3 and the results were all the same, simply because the CPU was not fast enough to keep the GPUs fed with data?

One could also argue that if someone on a budget is following this guide to build a triple-monitor setup, the (average FPS) results could be misleading. How are they to know how their (for example) four-thread i5 will compare to a 12-thread i7?

What CPU do you propose we use? The point of the high-end Core i7, as others have already pointed out, is to eliminate any chance of a CPU bottleneck that could limit GPU performance. When it comes to gaming, the Core i7-3960X really isn't any faster than the Core i7-3770K, which isn't much faster than a similarly clocked Core i5 processor. So really, the results reflect what you would get with those processors.

If we tested with a budget Core i3 or AMD Phenom processor, for example, then some of the results would be CPU bound, and all that might tell us is that every GPU configuration tested delivers the same performance. Is that useful?


Budget gamers are MORE likely to already have bottlenecks in their system, whether they're still running a Core 2 Duo/Quad with 2GB of DDR2 RAM at 1066MHz or an Athlon II X4 with a Radeon 7950.

A 3960X and a 3770K are the two CPUs a budget gamer is least likely to have. Budget gamers are MORE likely to buy AMD, and (high-end) AMD CPUs aren't great in single-monitor setups, let alone with three monitors.
 
Not sure how many gamers still running a Core 2 Duo are considering a triple-monitor setup with SLI GTX 660 Ti cards. But what you are really saying is that they should read our CPU articles first, learn which CPU to buy, and then invest in the GPUs, right?
 