GTX 670 SLI

Alpha Gamer

I'm considering acquiring a second GTX 670 and running it in SLI with the one I currently own. A few questions:
1- They can be from different manufacturers, as long as they have the same amount of VRAM, right?
2- Would you suggest any particular manufacturer/model? The one I own is this.
3- An i5 2500K @ 4.2GHz won't bottleneck them, right?
4- Are there any issues I should be aware of?
 
In a sense, as long as they have the same amount of VRAM, you can SLI up to three of them together. However, if the core/boost clocks are different, both will default to the lower clock speeds. I have not mixed differently clocked cards in a while, so it may behave differently than it used to, but I would suggest getting matching clocks so you don't have to worry about it.

Hope that helps!
 
I would not SLI them unless you are running at a higher resolution than 1080p. I would instead wait for the 800 series to be released. Make sure your PSU can handle the second card. Finally, make sure your case has enough ventilation, since two of those open-shroud cards will produce a lot of heat in your case.
 
PSU and ventilation are not a problem. Regarding your position of not SLIing them at 1080p or less, is it because you think a single GTX 670 can handle pretty much anything at that resolution, or are you considering some technical drawback I'm unaware of, like my framerate not scaling that well at 1080p?
 
What are your full system specs? You are running at 1080p, right? Also, what games do you play?
 
Core i5 2500K @ 4.2GHz
Asus P8P67 PRO
Gigabyte GeForce GTX 670 OC
2x4GB Corsair
PSU - Corsair 850W
Windows 7 Ultimate - 64-bit

Right now, the only game I play is Dark Souls, but I'm looking forward to the next Batman, The Witcher 3, Cyberpunk 2077, Dragon Age 3 and, of course, Dark Souls 2.
Yeah, I game at 1080p
 
Download the latest driver from nVidia: http://www.geforce.com/drivers/results/62791. Choose the custom install option and check the box that says "Clean Install". That should eliminate any issues you are having with frame rates. A 670 would be enough for all those games @ 1080p for now. I think you should just wait for the 800 series, bud :). Lowering settings from Ultra to High won't change much, and you won't even notice. Turning down detail until you upgrade is the best choice. All those games you can probably play at 1080p on Ultra, as long as you update drivers after each release. The 670 may struggle with The Witcher 3, though.
 
Yeah, maybe waiting for the GTX 800 series is not that bad an idea. Let's just see if I can hold on to that after The Witcher 3 kicks my PC in the nuts...
 
Trust me man, you won't see a difference between max graphics and a level lower. It is just less intense AA (which basically smooths jagged edges). Use GeForce Experience to optimize your settings.
 
I'm considering acquiring a second GTX 670 and running it in SLI with the one I currently own. A few questions:
1- They can be from different manufacturers, as long as they have the same amount of VRAM, right?
They can be from different manufacturers. Different framebuffer sizes, even with the same memory frequency, can be extremely problematic (more so since CoolBits is no longer an option). The usable framebuffer is limited to the smaller of the two, because the buffer from the second card in the SLI setup flips to the first card: if one card is 4GB and the other is 2GB, only 2GB gets used whichever way round they sit. Considering the potential problems and the cost of a 4GB card over a 2GB one, I'd just stick with the latter.
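To put that into a trivial sketch (Python, purely my illustration, with example capacities):

# Illustration only: under SLI the usable framebuffer is effectively that of
# the smaller card, so mixing capacities buys you nothing.
def effective_sli_vram(card_a_gb, card_b_gb):
    return min(card_a_gb, card_b_gb)

print(effective_sli_vram(2, 4))  # a 2GB GTX 670 paired with a 4GB card -> 2GB usable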
2- Would you suggest any particular manufacturer/model? The one I own is this.
Another Gigabyte might look more aesthetically consistent, but I'd look at pricing (I presume you're looking at second hand) and warranty. EVGA might be a better bet since its 3-year warranty is linked to the card and not the registered owner; any warranty claims are handled for the subsequent owner under the Guest RMA procedure. EVGA cards usually stick to the reference memory frequency (6008MHz effective) that your Gigabyte card has. You can adjust the second card with EVGA Precision or MSI Afterburner to synchronize the core clocks (the boost clocks on Kepler cards are independent under SLI, so that area is immaterial).
3- An i5 2500K @ 4.2GHz won't bottleneck them, right?
No problem
4- Are there any issues I should be aware of?
Not really aside from the vagaries of multi-GPU scaling and profiles.
With SLI you'll see higher minimum frame rates, and you should be able to move up the game image quality substantially (enabling a higher level of AA, post processing and PhysX).
You can see the difference in "highest playable settings" between a single card and SLI. You can also check any side-by-side comparison between the GTX 680 and GTX 690 (your card is within a percentage point or two of a stock GTX 680) to see what kind of increases you can expect.
Unless you see a great buy, I'd hold off for a week, since the GTX 760 (and 760 Ti) is due to launch in the next week, which should drive down GTX 670 prices on the resell market.
 
I agree with JC; I don't think the trouble of SLI is really worth it in your situation. Apart from The Witcher 3, I don't think any of those games will stress your 670 at all, and they should be playable on high at 1080p. Even The Witcher 3 should be OK as long as you turn the AA down.
 
Thanks for all the thorough info
 
Yeah, at this point I would just tell you to wait. Since you're only gaming at 1080p, a 670 has plenty of power to back that up. Wait for a new series of cards and then upgrade when you need it.
 
No problem.
Since your question revolved around getting the second card, that's what I addressed, as opposed to answering the unasked question of whether you should buy a second card.

The GTX 670 is one of the best bang-for-buck cards for HD gaming, as you undoubtedly already know. Adding a second card is a solid option if the cash outlay isn't too great. If the other option is a single-card upgrade, then IMO you'll have a wait for Maxwell (6-9 months, since Apple has earmarked the bulk of TSMC's initial 20nm capacity). It will also likely be an expensive option since the GPU will be built on the 20nm process: raw production cost for the GPUs will be on the order of 50% higher than for 28nm ones, and that's assuming the yield from the get-go is on par with the high yields TSMC currently enjoys. If Nvidia integrates an ARM CPU into the high-performance GeForce line of boards, then that price goes higher still.

The other single-card option, and the one being exercised by quite a few people in your situation (myself included, incidentally), is the GTX 780. The card won't provide the same GPU power as two 670s, but it is a single-card option that allows for very high levels of game image quality (likely maximum quality @ 1080p in the vast majority of games). The EVGA SC w/ ACX cooler is $10 more than the reference card and allows for significant OC/boost headroom (more than the reference blower). The card will give ~45-50% more performance than your single OC'ed 670 (around the same as the Titan) in factory trim, and ~65% more with a 24/7 overclock (say 1100MHz boost / 6800-7200 effective memory), which stacks up relatively well against the 50-90% range you could expect from adding a second 670. At present you'll only find one air-cooled 780 clocked higher (the improbable Inno3D iChill HerculeZ), although the Gigabyte WF3 isn't that far behind, but is a little more expensive.
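If it helps to see those rough numbers side by side, here's a quick tally (Python, just restating the estimates above, not benchmark results):

# Relative performance, normalised to the overclocked GTX 670 = 1.0x
# (all figures are the rough estimates quoted above).
single_670_oc   = 1.00
gtx_780_stock   = (1.45, 1.50)   # ~45-50% faster in factory trim
gtx_780_24_7_oc = 1.65           # ~65% faster with a 24/7 overclock
sli_670_scaling = (1.50, 1.90)   # 50-90% from adding a second 670

print(f"GTX 780 stock : {gtx_780_stock[0]:.2f}x - {gtx_780_stock[1]:.2f}x")
print(f"GTX 780 OC'd  : {gtx_780_24_7_oc:.2f}x")
print(f"GTX 670 SLI   : {sli_670_scaling[0]:.2f}x - {sli_670_scaling[1]:.2f}x")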
 
@Divide
Thanks again for the additional thoughts on the GTX 780. I'll give it all some more thought myself. Even if I did buy a second GTX 670, I wouldn't do it before actually needing it (most surely not before The Witcher 3). For now, single GPU it is.
 
Why would someone pay $650 for performance Alpha Gamer can get with another GTX 670 for $150 less? Yes, a single-GPU offering is always the better choice, but at that price, the 670 makes much more sense.
 
Maybe you missed my earlier post:
The GTX 670 is one of the best bang-for-buck cards for HD gaming as you undoubtedly already know. Adding a second card is a solid option if the cash outlay isn't too great.
As for the option of a single card: that stemmed from the talk of waiting for Maxwell, but I'll use my own situation as an example.
The single GTX 670 I'm using at present is one of the higher-specced SKUs available, and it is not particularly common. As such, it holds its value out of proportion to what a used GTX 670 would normally sell for (as do other vendor specials, if they aren't beset with problems) among those seeking an identical card for SLI purposes.
There are two choices. The first is to buy another used GTX 670 for SLI (as I did with my previous 580s), which is a cheap performance multiplier. The downside is that the cards' value decreases markedly as the end of their warranty period approaches, and as the series marches from GTX 6xx to 7xx and then to 8xx.
The second choice is to sell the card whilst the market value is still high and a sizeable amount of warranty is extant, and move to a card that offers a tangible upgrade in performance. Bear in mind that the 3GB framebuffer would come in handy in my case, since I game at 2560x1440, although even 1080p can saturate a 2GB framebuffer given enough antialiasing and post-processing. The obvious downside is the decreased performance-per-dollar, but that is somewhat mitigated by the high price I'll get for my Jetstream 670 and the simplicity of a single GPU (at least for the time being).

So, from an economic standpoint: the original card (say $370-400) plus the second card ($280-300) gives you an investment of $650-700 in the 670 SLI setup, an investment that on face value is worth $560-600 today (2 x $280-300) and less tomorrow. This time next year those two cards will be worth maybe $400-450, and you'd need to add ~$150 to buy a Maxwell-based card* if price segments stay static. $800-850 net spend.

The alternative is that you sell the 670 now, taking a depreciation loss of $70-100, and add $350 to buy a GTX 780. Resell the 780 for ~$450-500 when Maxwell comes-a-callin' (assuming Maxwell offers enough to warrant the upgrade), and add ~$100 to get the same Maxwell card*. $750 net spend.

* assuming a GM104-type GPU (like-for-like in the product hierarchy)
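If you want to sanity-check that arithmetic yourself, here's a rough sketch (Python) using the midpoints of the figures above; the prices and the "keep the final card, resell the rest" accounting are just my assumptions for illustration:

# Net spend = everything paid out for cards, minus whatever is recouped by
# reselling along the way (the card you end up keeping is not resold).
# Prices are midpoints of the rough 2013 estimates quoted above.
def net_spend(purchases, resales):
    return sum(purchases) - sum(resales)

# 670 SLI route: original 670, a used second 670 now, then a Maxwell card next
# year funded by reselling the pair plus ~$150 of new cash.
sli_route = net_spend(
    purchases=[385, 290, 425 + 150],  # 670, second 670, Maxwell (pair's resale value + top-up)
    resales=[425],                    # the two 670s sold next year
)
print(sli_route)  # ~825, i.e. within the $800-850 range quoted above

The 780 route can be tallied the same way with its own figures; where each total lands depends on where in each price range you actually buy and sell.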

Most issues I've seen with the cost of upgrading usually stem from the user not maximizing their return by reselling their old kit at the optimum time. Graphics cards have a very distinct economic lifespan that tails off rapidly after two years. Check the prices of GTX 580s and HD 6970s on the resell market and work out the rate of depreciation as soon as a new series enters the public consciousness.

Of course, those are all empirical arguments to bolster the real reason. Some of us just love buying new s___.
In the three and a half years I've been a member here, my graphics cards (up until recently I had two rigs, one Nvidia and one AMD) have moved from HD 4890 >> HD 5850 BE (and CFX) >> HD 5850 Toxic 2GB (and CFX), back to XFX 5850 BEs, HD 5770 (and CFX); and GTX 280 (3-way SLI, watercooled), GTX 580 (and SLI) + GTX 280 (PhysX), and my present lonely GTX 670, which was bought almost solely as a placeholder until GK110 arrived. This is probably more up/sidegrading than most people would usually undertake, but even taking these swap-outs into account it doesn't actually cost me too much to upgrade, thanks to the resale value. The golden rule is never to buy the reference card unless you plan to put it under water; reference cards don't do particularly well at resell.
 
Interesting. As a side note, I think Maxwell and the HD 9000 series will carry more VRAM with the growth of 4k content development. Do you agree DBZ?
 
JC713
Each generation has brought more VRAM as we move toward higher-definition content. Based on expectations for the HD 8xxx series and the GTX 7xx series, my estimate is that the 9xxx and 8xx series cards will continue the trend, with 6GB probably standard on the upper cards and 4GB or 3GB on the lower models.
 
The rumor is that the 8970 will have 6GB of memory. Bandwidth has to keep up as well.
 
Oh yeah, I forgot, JC713: I was thinking they said 4GB was the rumor, but you're right, I just looked again, my bad.
Yeah, as time goes on, memory will get to crazy levels because of new TVs and different setups. Honestly, bandwidth will be the deciding factor, and in my opinion it will come down to who goes for higher bandwidth first more than who offers more memory (since we're already getting to levels of memory that are almost unnecessary except in very extreme scenarios).
 
Interesting. As a side note, I think Maxwell and the HD 9000 series will carry more VRAM with the growth of 4k content development. Do you agree DBZ?
Probably a given I'd say.
Where large onboard vRAM was more of a marketing ploy in the past, 4K, 3D, and multi-display gaming make it somewhat necessary now. The problem is that because memory ICs are limited to 2-gigabit density, you end up having to run the GDDR5 in clamshell mode, with half the chips stuck on the back of the PCB and a lot of chips and traces to account for (for example, a 6GB HD 7970 would need 24 individual 2-gbit memory chips: 24 x 2 gbit = 48 gbit, and 48 gbit / 8 bits per byte = 6GB). The future holds the promise of 4-gbit ICs (if the PS4 is any indication), which means the IC count is halved, as well as the likelihood of lower prices for existing 2-gbit ICs as 4-gigabit parts come on stream.
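For anyone who wants to plug in other capacities, the chip-count arithmetic above boils down to this (a throwaway Python sketch, assuming the 2-gbit and 4-gbit IC densities mentioned):

# Number of memory ICs needed for a given VRAM capacity at a given IC density.
BITS_PER_BYTE = 8

def chips_needed(total_gb, ic_density_gbit):
    return (total_gb * BITS_PER_BYTE) // ic_density_gbit

print(chips_needed(6, 2))  # 24 chips - the 6GB HD 7970 example with 2-gbit ICs
print(chips_needed(6, 4))  # 12 chips - the requirement halves with 4-gbit ICs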

What I haven't managed to find concrete evidence of so far is whether GDDR6 chips at 4-gbit density are on the same timescale as GDDR5. If they aren't, then you could have a mix of high-capacity GDDR5 cards launching alongside lower-capacity GDDR6 models - a similar situation to what has happened with insane amounts of DDR3 on entry-level cards selling alongside more traditional-capacity GDDR5 models.
 
Very interesting. When is the projected release of graphics cards with GDDR6?
 
Oh yeah, I forgot, JC713: I was thinking they said 4GB was the rumor, but you're right, I just looked again, my bad.
Yeah, as time goes on, memory will get to crazy levels because of new TVs and different setups. Honestly, bandwidth will be the deciding factor, and in my opinion it will come down to who goes for higher bandwidth first more than who offers more memory (since we're already getting to levels of memory that are almost unnecessary except in very extreme scenarios).

Yeah. I wonder how much memory we will see in GPUs when ray tracing gets adopted. Here is the latest rumor: http://wccftech.com/rumor-amd-radeon-hd-8970-pictured-features-curacao-xt-core-2304-sps/.
 
Very interesting. When is the projected release of graphics cards with GDDR6?
DDR4 is already sampling, so my guess is that GDDR6 could be ready to go in around nine months. The problem is that new cards would need GDDR6 memory controllers worked into the GPU design to take advantage of it, which means that either Volcanic Islands will utilize GDDR5 (if the late-2013 launch cheerleaders are right), or, more likely IMO, VI will be a staggered launch, with high-volume, urgently needed mobile/mainstream chips launching first with GDDR5 and the high-performance chips launching later (along with Maxwell) using GDDR6, probably late in Q1 2014.

EDIT: Your WCCF link shows a Curacao chip that offers only an incremental advance over Tahiti. The increased raster back end will help (Tahiti is somewhat constrained in this regard), but I'd expect clocks to be similar to those already seen, since the die is going to be slightly larger and it's still 28nm. So much for 20nm by year's end if this rumour is true - not that I gave it any credence anyway.

BTW: that supposed HD 8970 image? It's fake - a Photoshopped (foreshortened) HD 7990.

Anyhow, this thread is now seriously off topic.
 