Triple Monitor Gaming on a Budget: SLI vs. Crossfire vs. Single High-end GPU

February 25, 2013, 4:22 AM

Considering next-gen cards are still months away, we didn't expect to bring any more GPU reviews until the second quarter of 2013. However, we realized there was a gap in our current-gen coverage: triple-monitor gaming. In fact, it's been almost two years since we pitted the HD 6990 and GTX 590 against each other to see how they could cope with the stress of running games at resolutions of up to 7680x1600.

We're going to mix things up a little this time. Instead of using each camp's ultra-pricey dual-GPU card (or the new $999 Titan), we're going to see how more affordable Crossfire and SLI setups handle triple-monitor gaming compared to today's single-GPU flagships.

On AMD's side, we'll test a pair of HD 7850s (~$360) and an HD 7970 (~$430), while Nvidia's corner will feature two GTX 660 Tis (~$580) and the venerable GTX 680 (~$470).

Read the complete article.




User Comments: 64

1 person liked this | Skidmarksdeluxe Skidmarksdeluxe said:

Running Crossfire or SLI is all very well if you can put up with the pitfalls, but I'm a great believer in the KISS theory: Keep It Simple, Stupid. I'm really not interested in double the heat, double the price, double the noise and double the complication for less than double the performance. I'll just stick to a higher-end single card, thank you.

1 person liked this | j05hh j05hh said:

I agree, nice high end card with a 24-30" monitor trumps tri-screen every time.

ghasmanjr ghasmanjr said:

This was an excellent article and it is helpful since I am working on gathering items for a triple monitor setup. It seems like the games I enjoy would require dual 670s or dual 680s to enjoy in all of their glory (Far Cry 3 for example). I'm going to wait until a price drop and pick up another 680 after seeing the results for FC3 and Hitman.

Guest said:

You must have completely missed pcper.com's review of Titan and the update to their testing method. It appears that despite the frames per second in Crossfire being very high, in practice it produces mostly 'runt' frames, and the experience is no better than having a single card. Put simply, by only using FPS you have a totally useless performance review. The whole industry is changing to actually measuring real performance over time, and here we have average FPS over the entire game. *shakes head*

Nothing you have shown tells me if these cards and resolutions are playable at all.
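The runt-frame point is easy to illustrate with a quick sketch. All of the frame times below are invented numbers, not captured data; they just mimic the alternating normal/runt pattern frame-time capture tools report:

```python
# Invented frame times (ms) mimicking a dual-GPU run that alternates a
# normal frame with a "runt" frame displayed almost immediately after it.
frame_times = [30.0, 1.0] * 16  # 32 frames rendered in ~496 ms

# Average FPS counts every frame, runts included:
avg_fps = len(frame_times) / (sum(frame_times) / 1000.0)

# Perceived smoothness is closer to one visible update per normal/runt pair:
pair_times = [a + b for a, b in zip(frame_times[::2], frame_times[1::2])]
perceived_fps = len(pair_times) / (sum(pair_times) / 1000.0)

print(f"reported average FPS: {avg_fps:.0f}")     # ~65
print(f"perceived FPS:        {perceived_fps:.0f}")  # ~32
```

Same data, two very different numbers — which is why frame-time measurement matters more than a raw FPS average.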

Computer Ed Computer Ed said:

Okay, this is an interesting piece and I think it brings some great information, but the title, and hence the concept of the piece, does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down; using an Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep cost down in any form. In fact, if a person spent that much on the CPU, then they likely would not care about spending the same kind of cash on the GTX 690 or high-end dual cards.

fimbles fimbles said:

Okay this is an interesting piece and I think it brings some great information but the title and hence to concept of the piece does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down, using a Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep cost down in any form. In fact if the person spent that much on the CPU then they likely would not care to spend the same kind of cash on the GTX 690 or high end dual cards.

This is a GPU test; I'm guessing the i7 was used to eliminate any bottleneck issues.

ChangWizzle ChangWizzle said:

Okay this is an interesting piece and I think it brings some great information but the title and hence to concept of the piece does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down, using a Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep cost down in any form. In fact if the person spent that much on the CPU then they likely would not care to spend the same kind of cash on the GTX 690 or high end dual cards.

I think you're missing the point. In test scenarios like this, reviewers use a high-powered CPU in order to eliminate it as a bottleneck. I mean, what good would this review be if he used a Core i3 and the results were all the same, simply because the CPU was not fast enough to keep the GPUs fed with data?

EEatGDL said:

Running Xfire or SLI is all very well if you can put up with the pitfalls but I'm a great believer in the KISS theory. Keep It Simple Stupid. I'm really not interested in double the heat, double the price, double the noise & double the complication for less than double the performance. I'll just stick to a higher end single card thank you.

A whole testing article and it seems you didn't read it at all. It's not "doubling" anything here: SLI with mid-budget cards gives ~30% more performance while consuming ~30% more energy, and costs less than a high-end card. And it has the benefit that you can buy one mid-budget card and later add another identical card when it gets cheaper. The test is not comparing a single GTX 680 vs. dual GTX 680s in SLI.

To me this article is demystifying some ideas about SLI and Crossfire configurations, and you're just too narrow-minded to talk in absolutes.
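The value argument can be put in rough numbers using the street prices from the article itself; the relative performance figures below (single high-end card = 1.00, dual mid-range = 1.30) are illustrative assumptions based on the "~30% more" claim, not measured results:

```python
# Prices are the article's approximate street prices; relative performance
# (single high-end = 1.00) is an assumed ~30% dual-GPU gain, for illustration.
setups = {
    "2x HD 7850 (CrossFire)": (360, 1.30),
    "1x HD 7970":             (430, 1.00),
    "2x GTX 660 Ti (SLI)":    (580, 1.30),
    "1x GTX 680":             (470, 1.00),
}

for name, (price, perf) in setups.items():
    value = perf / price * 1000  # performance units per $1000 spent
    print(f"{name:24s} ${price:4d}  perf per $1000: {value:.2f}")
```

On these assumed numbers the Crossfire 7850s come out well ahead on value, while the 660 Ti SLI premium mostly buys raw performance rather than bang for the buck.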

fimbles fimbles said:

Great article,

I'm still quite happy with my 460 SCs in SLI.

I get very similar performance to my brother's 670 at two thirds of the price. Also, I have had no issues with SLI. Very few games don't take advantage of it, and the ones that don't are usually either old or don't require that much horsepower anyway.

Nvidia is on the ball, and there is usually an SLI profile released before the actual game is out.

Heat is not an issue for me either. I would recommend buying a motherboard with a decent gap between your cards if you are using stock cooling.

Noise is of course doubled, though the nine fans in my case drown out everything anyway, and I'm not really fussed if my PC is noisy.

misor misor said:

Okay this is an interesting piece and I think it brings some great information but the title and hence to concept of the piece does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down, using a Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep cost down in any form. In fact if the person spent that much on the CPU then they likely would not care to spend the same kind of cash on the GTX 690 or high end dual cards.

I think your missing the point. In test scenario's like this, reviewers us a high powered CPU in order to eliminate it as a bottleneck. I mean, what good would this review be if he used a core i3 and the results were all the same, simply because the CPU was not fast enough to keep the GPU's fed with data.

Then what does "budget" mean? Or at least, what does it mean in the context of this review?

"Budget cards" = AMD 7850s or Nvidia GTX 660s?

Computer Ed Computer Ed said:

I think your missing the point. In test scenario's like this, reviewers us a high powered CPU in order to eliminate it as a bottleneck. I mean, what good would this review be if he used a core i3 and the results were all the same, simply because the CPU was not fast enough to keep the GPU's fed with data.

I see this excuse all the time and do not buy it, and am amazed at how many people do. A test that pushes outside the bounds of a realistic build is meaningless beyond an academic pursuit. The bottleneck WILL exist in the real world, and a measurement of it is something that makes sense. Plus it is valuable information, because it can show you whether a card you want to use will be bottlenecked or not, and thus help you make better selections.

My objection was to the use of the term "budget" when, going by the title, it applied to just the video cards. Also, let's be real for a moment: wouldn't a bottleneck be a real issue when doing this type of setup, and shouldn't it then be shown?

Come on guys, giving reviewers a free pass for being lazy and taking the easy way is not what we should demand.

Hammayon Hammayon said:

You have the 660 Tis in SLI competing with the 7850s (much cheaper cards).

You should have them compete with the equivalently priced 7950, and you'll find that AMD's offering gives you much more bang for the buck than Nvidia's.

You are biased towards Nvidia with this article. You are pushing dual-card setups versus single, and yet the most balanced options (660 Ti and 7950) aren't even compared. Shame on you.

Remove this article and do a PROPER REVIEW.

BMfan BMfan said:

This looks very much like the Legion Hardware review of the HIS 7850 4GB in Crossfire.

One thing I was hoping for was a comparison of 8x8 vs 16x16 xfire runs.

I think the difference isn't that big but I would like the figures in front of me.

The reason I would like to know is that last Monday I bought a Gigabyte 7850 OC card and my WEI graphics score was 7.9, but as soon as I put in the second card on Saturday it dropped to 7.7, because my board only supports 8x8.

I know that score doesn't matter too much in games; that's why I would like to see what it actually does in games when comparing FPS.

BlueDrake said:

You have the 660 ti in SLI competing with the 7850 (much cheaper cards).

You should have it compete with the equivalently priced 7950 and you''ll find that AMD's offering gives you much more bang for the buck than the Nvidia.

You are biased towards NVIDIA with this article. You are pushing dual card setups versus single and yet the most balanced options (660ti and 7950) aren't even compared. Shame on you.

Remove this article and do a PROPER REVIEW

How is this not a proper review? You clearly skimmed over the actual review and likely bashed it based on the hardware alone. It's a higher-budget SLI setup against a lower-budget Crossfire one, but both are in the same general vein. Also, how would a 7950 really stack up, when a 7970 is flat-out better on pricing? It's there to compete against the 680 in the review. Would you rather they used a 690 and totally defeated the point?

The 660 Tis are the best overall price/performance. 670s would seem pointless, and the 650/650 Ti are really cheap, pointless cards that many mid-level users would ignore. Feel free to write your own review, get your own hardware and prove how it's biased. It's the games that are clearly biased; there are some games that favor AMD hardware. Are you going to rag on Nvidia about how badly its cards do in those games?

So what I get is... you'd rather they used a 660 Ti and a 7950 as single cards also? For what purpose? This is a triple-monitor setup; those cards wouldn't have enough to even compare. So I can literally say your comment is utter crap and a waste of space, but whatever floats your boat. :P

Ma_ga said:

I too think this review wasn't set up properly:

- unrealistic CPU

- GPU models compared

- 4GB of RAM

TomSEA TomSEA, TechSpot Chancellor, said:

I could never get into triple-monitor gaming. The bezel separations would be a major distraction. Besides the fact that you're still going to be focusing 90% of your attention on the middle monitor, the other 10% being just peripheral vision stuff.

I have a 27" Dell UltraSharp 2560 x 1440 monitor - works perfectly for my gaming needs.

madboyv1, TechSpot Paladin, said:

ITT people are making a lot of hubbub about non-issues, though the new testing methodology on pcper is very interesting. As for me, no idea how I'll handle the GPU issue for my next build. *rolleyes*

1 person liked this | JC713 JC713 said:

This was an excellent article and it is helpful since I am working on gathering items for a triple monitor setup. It seems like the games I enjoy would require dual 670s or dual 680s to enjoy in all of their glory (Far Cry 3 for example). I'm going to wait until a price drop and pick up another 680 after seeing the results for FC3 and Hitman.

Actually I was watching a benchmark you would be interested in:

https://www.youtube.com/watch?v=Qw24094REh0

As you can see, the 680 struggles by itself at that resolution, but with two it is a matter of the game and drivers maturing. Some games rock with two cards at that resolution, but at the same time some struggle because the game scales badly. Good luck on your decision, bud!

Skidmarksdeluxe Skidmarksdeluxe said:

A whole testing article and it seems you didn't read it at all. Is not "doubling" anything here: SLI with mid-budget cards is giving ~30% more performance while consuming ~30% more energy and costing less than a high-end card. And it has the benefit that you can buy one mid-budget card and later add another identical card when it gets cheaper. The test is not comparing a single GTX 680 vs dual GTX 680 in SLI.

To me this article is de-mythifying some ideas about SLI and Crossfire configurations and you're just too narrowed to talk in absolutes.

Maybe you're too narrow-minded to see the sarcasm in my post. Read it again. The penny may drop, but in your case...

amstech amstech, TechSpot Enthusiast, said:

Can't believe a 680 and 7970 in stock form are only 5-15 FPS behind SLI GTX 660 Tis in several games. My Windforce 3X 670 handles a stock 7970 and 680. Factor in single-GPU smoothness and fewer driver hassles, and you could make an argument either way.

Answer for me? SLI 670s.

3 people like this | Staff | Steve Steve said:

I Too think this review wasn't setup properly:

- unrealistic CPU

- GPUs models compared

- 4GBs of ram

I will explain the CPU choice in a moment because you are not the only one that doesn't understand the reason it was used.

The GPUs are not being directly compared; they are two different options that represent two very different prices, and if you read the review, we spoke about this.

4GB of RAM? Not sure you did the maths right here; the equation is 4 x 2GB =...

Okay this is an interesting piece and I think it brings some great information but the title and hence to concept of the piece does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down, using a Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep cost down in any form. In fact if the person spent that much on the CPU then they likely would not care to spend the same kind of cash on the GTX 690 or high end dual cards.

What CPU do you propose we use? The point of the high-end Core i7 as others have already pointed out is to eliminate any chance of a CPU bottleneck which could limit GPU performance. When it comes to gaming the Core i7-3960X really isn't any faster than the Core i7-3770K which isn't much faster than a similar clocked Core i5 processor. So really the results reflect what you would get with those processors.

If we tested with a budget Core i3 or AMD Phenom processor, for example, then some of the results would be CPU bound, and all that might tell us is that all GPU configurations tested deliver the same performance. Is that useful?
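A toy model makes the point; every figure here is invented purely for illustration, since delivered frame rate is capped by whichever component is slower:

```python
# Toy bottleneck model: delivered FPS is capped by the slower component.
def delivered_fps(cpu_cap_fps, gpu_cap_fps):
    return min(cpu_cap_fps, gpu_cap_fps)

# Invented per-component caps (frames per second each part could sustain alone).
gpu_configs = {"single mid-range": 45, "single high-end": 70, "dual-GPU": 120}

for cpu_name, cpu_cap in (("fast i7", 200), ("budget CPU", 60)):
    results = {gpu: delivered_fps(cpu_cap, cap) for gpu, cap in gpu_configs.items()}
    print(cpu_name, results)
```

With the budget CPU, the high-end and dual-GPU configurations both read the same capped number, so the GPU differences vanish — exactly the outcome a GPU review has to avoid.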

This has nothing to do with being lazy, that's just silly. All these tests were run from scratch. I have a range of different processors and platforms available but for GPU testing we always use the fastest available option.

You have the 660 ti in SLI competing with the 7850 (much cheaper cards).

You should have it compete with the equivalently priced 7950 and you''ll find that AMD's offering gives you much more bang for the buck than the Nvidia.

You are biased towards NVIDIA with this article. You are pushing dual card setups versus single and yet the most balanced options (660ti and 7950) aren't even compared. Shame on you.

Remove this article and do a PROPER REVIEW

Go Green Team!

This looks very much like the Legion Hardware review of the HIS 7850 4gb xfire review.

One thing I was hoping for was a comparison of 8x8 vs 16x16 xfire runs.

I think the difference isn't that big but I would like the figures in front of me.

The reason I would like to know is that last week Monday I bought a Gigabyte 7850OC card and my WEI score on Graphic's was 7,9 but as soon I put in the second card on Saturday it dropped to 7,7 because my board only supports 8x8.

I know that score doesn't matter too much in games,that's why I would like to see what it actually does in games when comparing fps.

I have done 8x8 vs. 16x16 testing in the past; with these graphics cards, the short answer is it makes no difference. Oh, and yeah, I write for Legion Hardware as well.

hahahanoobs hahahanoobs said:

Okay this is an interesting piece and I think it brings some great information but the title and hence to concept of the piece does not fit the actual piece. "On a Budget" implies an attempt to keep the cost down, using a Intel Core i7-3960X Extreme Edition is NOT being on a budget or trying to keep cost down in any form. In fact if the person spent that much on the CPU then they likely would not care to spend the same kind of cash on the GTX 690 or high end dual cards.

I think your missing the point. In test scenario's like this, reviewers us a high powered CPU in order to eliminate it as a bottleneck. I mean, what good would this review be if he used a core i3 and the results were all the same, simply because the CPU was not fast enough to keep the GPU's fed with data.

One could also argue, that if someone on a budget is following this guide to build a triple monitor setup, the (avg fps) results could be misleading. How are they to know how their (for example) 4 thread i5 will compare to a 12 thread i7?

What CPU do you propose we use? The point of the high-end Core i7 as others have already pointed out is to eliminate any chance of a CPU bottleneck which could limit GPU performance. When it comes to gaming the Core i7-3960X really isn't any faster than the Core i7-3770K which isn't much faster than a similar clocked Core i5 processor. So really the results reflect what you would get with those processors.

If we tested with a budget Core i3 or AMD Phenom processor for example then some of the results would be CPU bound and all that might tell us is that all GPU configurations tested will delivering the same performance, is that useful?

Budget gamers are MORE likely to already have bottlenecks in their system, whether they're still running a Core 2 Duo/Quad with 2GB of DDR2 RAM @ 1066MHz or an Athlon II X4 with a Radeon 7950.

A 3960X and a 3770K are two CPUs a budget gamer is least likely to have. Budget gamers are MORE likely to buy AMD, and (high-end) AMD CPUs aren't great in single-monitor setups, let alone with three monitors.

5 people like this | Staff | Steve Steve said:

Not sure how many gamers still running a Core 2 Duo are considering a triple-monitor setup with SLI GTX 660 Ti cards. But what you are really saying is they should read our CPU articles first, learn which CPU to buy and then invest in the GPUs, right?

JC713 JC713 said:

Can't believe a 680 and 7970 in stock form are only 5-15 FPS behind SLi GTX 660Ti's in several games. My Windforce 3X 670 handles a stock 7970 and 680. Factor in single GPU smoothness and less driver hassles and you could make an argument either way.

Answer for me? SLi 670's

Maybe buy the 4GB version

Not sure how many gamers running a Core 2 Duo still are considering a triple monitor setup with SLI GTX 660 Ti cards. But what you are really saying is they should read our CPU articles first, learn which CPU to buy and then invest in the GPUs, right?

This proves that a Core 2 Duo is the bottleneck with new cards:

cliffordcooley cliffordcooley, TechSpot Paladin, said:

I'm saying people should decide where they want to be in gaming and purchase their CPU and GPU accordingly.

SLI and Crossfire should not even be an option to consider if all they have is a budget CPU. I feel this review was about budget SLI/Crossfire configurations, not budget CPUs.

DKRON said:

I've been using a Crossfire configuration for five years now and would never go back. Having two cards for the game to use keeps both the stress and the temperatures lower, and therefore the cards will last longer. I started off with 5850s and now have 7870s; there is no game I cannot play at max settings, which is perfect.

baN893 baN893 said:

Go Green Team!

Rofl. It's good to see TechSpot hasn't lost their sense of humor. Great review btw, really appreciate the work put into this.

madboyv1, TechSpot Paladin, said:

One could also argue, that if someone on a budget is following this guide to build a triple monitor setup, the (avg fps) results could be misleading. How are they to know how their (for example) 4 thread i5 will compare to a 12 thread i7?

I don't know... maybe that most games have a hard enough time using four threads as it stands? Last time I checked, it was mostly RTSs and some simulators that really took advantage of more than 2-4 threads.

EEatGDL said:

Maybe you're too narrowed to see the sarcasm in my post. Read it again. The penny may drop but in your case...

Then tell j05hh too, because it seems you're bad at writing sarcasm: no quotes, no exaggeration, no ending that makes the sarcasm clear. I'm pretty sure I'm not the only one following your post who thought you were being serious.

Footlong Footlong said:

Not to mention microstuttering, which increases a lot in any dual-chip solution. I prefer a single card. It's cheaper on my electric bill.

baN893 baN893 said:

One could also argue, that if someone on a budget is following this guide to build a triple monitor setup, the (avg fps) results could be misleading. How are they to know how their (for example) 4 thread i5 will compare to a 12 thread i7?

I don't feel like this should be a place for hypothetical arguments such as the one you are making. Just because you are trying to build a budget setup shouldn't mean that you lack a basic understanding of CPU bottlenecks. If you are going to build a desktop and want to do it right (especially if you are going through reviews of triple-monitor setups), you should have come across CPU reviews where it is evident that the FPS difference between an 1155 i5 and a 2011 i7 is seriously marginal.

In case any newbie doesn't know about this, this is for you: [link]

Guest said:

Just thought I'd pitch in some useless info. I run a Core 2 Quad @ 3.2GHz with a single Gigabyte 7850 OC 2GB. I've just finished the Sleeping Dogs campaign at medium settings running 5760x1080, and while cleeeeeaarly not maxed out, my frame rate never dropped low enough for me to notice or for it to break my immersion.

If I could fit a second GPU on this old mATX mobo I would probably Crossfire my 7850, but that isn't going to happen. If anything, this review has helped me decide I'm better off selling my 7850 and buying a 7950 instead. Total cost will be about the same, but with less hassle driver-wise and no need to upgrade anything else.

Luay said:

No one would buy the $280 HIS 4GB HD 7850 when a Gigabyte, XFX or HIS HD 7950 is $20 away.

If I see someone try to do it like Gwen from the Ghost Machine, I would make like Jack.

http://www.youtube.com/watch?v=ZDkdD3jwAS4

I'm assuming you will conclude this way as I haven't even started to read the article yet, so thanks ahead for the lesson. On with the reading.

Guest said:

AMD TressFX open-standard innovation, must see:

(Tomb Raider is the first game to use revolutionary TressFX real-time hair simulation)

http://blogs.amd.com/play/tressfx/

[link]

Souvs Souvs said:

Showdown of 7950 Crossfire vs. 680 SLI at 5760x1080 (Eyefinity/Surround)

[link]

AMD is the god of performance, and also of price/performance, especially at super-high resolutions.

Amd tressfx open standard innovation,must see-

(tomb raider the first game to use revolutionary tressfx real time hair simulation)

http://blogs.amd.com/play/tressfx/

[link]

JC713 JC713 said:

Just thought I'd pitch in some useless info. I run a Core2 Quad @ 3.2ghz with a single Gigabyte 7850OC 2GB. I've just finished the sleeping dogs campaign at medium settings running 5760x1080 and while cleeeeeaarly not maxed out my frame rate never dropped low enough for me to notice or for it t break my immersion.

If I could fit a second gpu on this old matx mobo I would probably xfire my 7850 but that isn't going to happen. If anything this review has helped me to decide I'm better off selling my 7850 and buying a 7950 instead. Total cost will be about the same but with less hassle driver wise and no need to upgrade anything else.

This is hard to believe

Amd tressfx open standard innovation,must see-

(tomb raider the first game to use revolutionary tressfx real time hair simulation)

http://blogs.amd.com/play/tressfx/

[link]

This is huge in computer animated films also

Skidmarksdeluxe Skidmarksdeluxe said:

This is hard to believe

I had major issues with 2x GTX 460s installed. Ended up selling both and buying a GTX 570, and I never looked back. I've since upgraded, but I'll never ever consider dual setups again.

JC713 JC713 said:

I had major issues with 2 X GTX460's installed. Ended up selling both & buying a GTX 570. I never looked back. I've since upgraded but I'll never ever consider dual setup's again.

Is it too loud or hot?

Skidmarksdeluxe Skidmarksdeluxe said:

Is it too loud or hot?

Neither. I just had a lot of glitching, erratic frame-rate problems and crashing to desktop while playing. (No, it wasn't my PSU.)

JC713 JC713 said:

Neither. I just had a lot of glitching, erratic frame rate problems & crashing to desktop while playing. (No. It wasn't my PSU)

Interesting. I guess it's worth buying a single card rather than two.

1 person liked this | Skidmarksdeluxe Skidmarksdeluxe said:

Interesting, I guess it is worth buying a single card rather than 2 I guess

Your mileage may vary; don't let me put you off, but there are pitfalls inherent in multi-GPU setups. If you decide to go with it, I wish you the best of luck. As for me... I'll stick with single cards.

JC713 JC713 said:

Your mileage may vary, don't let me put you off but there are pitfalls that are inherent with multi gpu setups. If you decide to go with it I wish you the best of luck. As for me... I'll stick with single cards.

Plus, some games scale really badly, so it basically kills the point of the second card.

hahahanoobs hahahanoobs said:

You have the 660 ti in SLI competing with the 7850 (much cheaper cards).

You should have it compete with the equivalently priced 7950 and you''ll find that AMD's offering gives you much more bang for the buck than the Nvidia.

You are biased towards NVIDIA with this article. You are pushing dual card setups versus single and yet the most balanced options (660ti and 7950) aren't even compared. Shame on you.

Remove this article and do a PROPER REVIEW

Wondering why you skipped over the 7870. It also comes close to the 7950 at times and is cheaper than a 660Ti.

One could also argue, that if someone on a budget is following this guide to build a triple monitor setup, the (avg fps) results could be misleading. How are they to know how their (for example) 4 thread i5 will compare to a 12 thread i7?

I don't feel like this should be a place for hypothetical arguments such as the one you are making. Just because you are trying to build a budget setup shouldn't mean that you lack basic understanding of CPU bottlenecks. If you are going to build a desktop and want to do it right (especially if you are going through reviews of triple monitor setups) you should have come across CPU reviews where it is evident that the FPS difference between an 1155 i5 and a 2011 i7 are seriously marginal.

In case any newbie who doesn't know about this, this is for you: [link]

Ah, my bad. That 4T i5 should have read 4T i3 (or any AMD CPU).

Ah yeah, I don't need that link... mainly because it only shows 4 games out of 1000, and not even the best ones. If you think those 4 games tested are a full representation of multi-threading performance in games, then you stopped yourself short. Juss sayin.

I've built every computer I've ever owned, each costing at least $2000 (piece by precious piece). Click my name and take a look at my specs.

For the games I was playing at the time, and following all major CPU and GPU reviews, I knew I could get by with an i5 2500K @ 4.5GHz. But the majority of gamers on a budget buy i3s and AMD CPUs, and we all know about i3s and AMD CPUs, which makes it quite confusing why this multi-GPU review left out budget CPUs. I get the whole bottleneck thang, but budget gamers are MORE likely to already have bottlenecks, so showing, for example, two 660 Tis getting 60fps @ 5760x1080 is not a true representation of what a budget gamer with a Phenom II 980 or Athlon II X4 will experience... IMO, and by the looks of it, others here too.

I mean, when we (me anyway) look at single GPU reviews using a single monitor, and see a range of CPU's tested with ours included in them, we can honestly expect to get VERY close to that performance.

When you show half the story, you get half the facts.

Not sure how many gamers running a Core 2 Duo still are considering a triple monitor setup with SLI GTX 660 Ti cards. But what you are really saying is they should read our CPU articles first, learn which CPU to buy and then invest in the GPUs, right?

Maybe you need to visit your forums once in a while.

baN893 baN893 said:

Ah, my bad. That 4T i5 should of read 4T i3 (or any AMD CPU).

Ah yea, I don't need that link... Mainly because it only shows 4 games out of 1000, and not even the best ones. If you think those 4 games tested are a full representation of multi-threading performance in games, then you stopped yourself short. Juss sayin.

I've built every computer I've ever owned, each costing at least $2000 (piece by precious piece). Click my name and take a look at my specs.

For the games I was playing at the time, and following all major CPU and GPU reviews, I knew I could get by with an i5 2500K @ 4.5GHz, but the majority of gamers on a budget, buy i3's and AMD CPU's, and we all know about i3's and AMD CPU's, which makes this multi-GPU review quite confusing on why they left out budget CPU's? I get the whole bottleneck thang, but budget gamers are MORE likely to already have bottlenecks, so showing, for example, 2 660Ti's getting 60fps @ 5760x1080, is not a true representation of what a budget gamer with an Phenom II 980 or Athlon II X4 will experience... IMO, and by the looks of it, others here too.

I mean, when we (me anyway) look at single GPU reviews using a single monitor, and see a range of CPU's tested with ours included in them, we can honestly expect to get VERY close to that performance.

When you show half the story, you get half the facts.

Just out of curiosity, could you please link me the review, or multiple ones as well, where I see a full representation of multi-threading performance in games? I'm always willing to learn something new.

That link also wasn't intended for you. It is for people who are new to the whole thing. Also, you made your profile private so people like me aren't allowed to drool over your specs. :'(

I see what you are saying with not including low-budget CPU's, but this review isn't meant to review CPU performance in triple monitor setups. It's a standoff of

"SLI vs. Crossfire vs. Single High-end GPU"

I don't think this review is trying to say "this is a budget build that is perfect for triple monitor gaming", simply "this is how budget SLI vs. Crossfire vs. Single High-end GPU will compare against each other when CPU bottlenecks are eliminated". TechSpot could/should do a follow up review saying something along the lines of "based on our recent review about budget triple monitor graphic card configurations, we found Crossfire 2GB HD 7850's to be the best value solution. Now let's see how low we can go with processors that are cheaper than an 1155 i5 3570k/2500k without bottlenecking the GPU's performance."

I hope TechSpot would do that, as part of a "Budget gaming series" or something. I'm sure we would agree that that would be beneficial, informative, and educational to all of us.

jeetshek jeetshek said:

I was planning to buy an MSI Lightning 7970 for 5760x1080 resolution. The plan is, after purchasing this bad boy, I will purchase one more cheap 7970 or a 7950.

I was wondering a few things:

1. If I push the Lightning 7970's overclocking potential, do I need a second one?

2. Should I go for 7870 Crossfire instead of a 7970?

3. I have a 1366x768 monitor (please don't laugh). I have plans for upgrading the display, but... can I add two more monitors of the same resolution for playing games? Will it look good?

syntaxbreaker syntaxbreaker said:

Ooh, that is some pretty cool gaming gear, man, but a pretty steep budget too.

4 people like this | Staff | Steve Steve said:

I was planing to buy a MSI Lightning 7970 for 5760x1080 resolution. The plan is after purchasing this bad boy, I will purchase one more cheap 7970 or a 7950.

I was wondering few things

1. If I will push Lightning 7970's overclocking potential, do I need a second one ?

2. Should I go for 7870 crossfire instead of a 7970 ?

3. I have a 1366 X 768 montor (Please dont laugh) . Though I have plans for upgrading the display but....Can I add 2 more monitors of same resolutions for playing games ? will it look good.

1. For the resolution you play at, NO.

2. No, stick with one high-end card if you can.

3. You can add two more 1366x768 screens but I wouldn't bother. A single cheap 27" with a 1920x1080 resolution would look much better in my opinion.

1 person liked this | cliffordcooley cliffordcooley, TechSpot Paladin, said:

@Steve, we gotta work on getting your likes further up the charts. You and the other article contributors do so much more for TechSpot than I do; I feel ashamed to have the score I have.

Souvs Souvs said:

I was planing to buy a MSI Lightning 7970 for 5760x1080 resolution. The plan is after purchasing this bad boy, I will purchase one more cheap 7970 or a 7950.

I was wondering few things

1. If I will push Lightning 7970's overclocking potential, do I need a second one ?

2. Should I go for 7870 crossfire instead of a 7970 ?

3. I have a 1366 X 768 montor (Please dont laugh) . Though I have plans for upgrading the display but....Can I add 2 more monitors of same resolutions for playing games ? will it look good.

Jeet, a single 7970/7950 will suffice... but always use the latest driver from AMD (13.2 beta 6 at present), and happy gaming.

And as for 1080p, it's only noticeable if you are using a 40"+ monitor and sitting several inches away from it...

Don't fall for the marketing scam; 720p HD is still a great resolution for gaming, man.
