
Testing GTX 1080 SLI Performance with Dual Palit GeForce Cards

By Steve · 25 replies
Jul 5, 2016
  1. Nvidia's new top-of-the-line GPU arrived with quite a bang. The GeForce GTX 1080 allows for no-compromise 4K gaming on a single GPU. In our full review we found that even with the eye candy turned all the way up, it was possible to achieve playable performance in all the latest titles.

    And yet in spite of this, that didn't mean it was able to hit 60 fps at all times. Hitting that lofty target requires more than any single card on the market can deliver.

    The solution? Two GTX 1080s, of course.

    In this article, we're taking a close look at the GTX 1080's dual-card SLI performance at 4K resolution to see exactly how much more graphics-crunching horsepower that second GPU brings to the table.

    Read the complete article.

  2. Theinsanegamer

    Theinsanegamer TS Evangelist Posts: 634   +606

    Did you guys test with two old bridges, or just the new bridge? Other sites have reported no difference between two old bridges and a single new HB bridge on dual 1080s, although none of them have tested frame latency with such a setup.

    Also, dual 1080s are amazing at 4K. At 2K, though, it seems like even one is a bit of overkill unless you are seeking 144 fps for a high refresh display. More reason to get a cheaper 1070 or two.

    Speaking of, any chance of a dual 1070 review in the future?
    Evernessince, Reehahs and wastedkill like this.
  3. amstech

    amstech IT Overlord Posts: 1,712   +874

    What's going on with the CrossFireX 390X? Yikes.
    Not picking on it, just honestly curious.

    This is why I stick with big single GPU cards.
  4. Burty117

    Burty117 TechSpot Chancellor Posts: 3,042   +793

    I was thinking the same thing; its minimum frame rate is lower in CrossFire than on its own in Ashes of the Singularity!
  5. CaptainTom

    CaptainTom TS Maniac Posts: 253   +103

    Pretty terrible scaling imo. If you had to use 2 cards the 1070 makes way more sense.

    I will also say that until we can game at 60 FPS in 4K it doesn't matter how strong these cards get. 1080p was saturated in 2011, and 1440p is the only case this might help...
  6. Evernessince

    Evernessince TS Evangelist Posts: 1,788   +1,007

    It's the game selection. I believe The Witcher 3 only gets something like 52% scaling with CrossFire. Similar story for Tomb Raider and The Division.

    I like the thick heatsinks on these cards but I don't like the shroud. Heatsinks like these usually cool very well.
  7. amstech

    amstech IT Overlord Posts: 1,712   +874

    Hmm interesting.
    One of my AMD buddies has a 390X and that card's a beast. I wonder if the new Crimson drivers would help?
  8. Evernessince

    Evernessince TS Evangelist Posts: 1,788   +1,007

    The Crimson drivers helped with frame pacing to reduce micro stutter, but the CrossFire improvements for a given game are usually released either when the game comes out or in a monthly update. It almost always helps to have the latest driver.
  9. JetFixxxer

    JetFixxxer TS Enthusiast Posts: 40   +6

    I can't get Doom to run in CrossFire with my 390Xs. It does run very well on a single card at Nightmare settings at 1080p.
  10. VitalyT

    VitalyT Russ-Puss Posts: 3,456   +1,735

    No Crysis? How come? It is the most demanding title; it should be the first one on the list!
  11. Steve

    Steve TechSpot Editor Topic Starter Posts: 2,358   +1,517

    How big is the Crysis 3 multiplayer scene? I am guessing few people are looking at playing the single-player campaign on their 1080s, given the game is now more than three years old. This was the thinking behind focusing on modern titles such as Far Cry Primal, Overwatch, DOOM and The Division, for example.
    hahahanoobs likes this.
  12. SirGCal

    SirGCal TS Maniac Posts: 365   +136

    As soon as the 1080 Ti comes to light, I'll pick up a pair. But I waited this long, I can wait a bit longer.
    hahahanoobs and atcapistrano like this.
  13. Boilerhog146

    Boilerhog146 TS Addict Posts: 209   +52

    I'll be an advocate for multi-GPU tech. My main rig has been multi-GPU since I moved from AGP to PCIe,
    and I'm extremely excited to see such awesome scaling in some titles. 90% scaling is unheard of; well done.
    This should tell some game devs that it must not be too difficult to code for multi-GPU. But my guess is that the games which aren't written with multi-GPU in mind are the ones being ported to console, where we will never see multi-GPUs. Ashes to ashes. lol :p
    Last edited: Jul 6, 2016
  14. Boilerhog146

    Boilerhog146 TS Addict Posts: 209   +52

    Your opinion is wrong, sorry. 98%, 93%, even 67% is NOT terrible scaling. If a title isn't fully coded for multi-GPU, of course the scaling isn't going to be there, but obviously some devs are doing their homework, and that's impressive.

    I do mind that some game devs are lazy and want to make as much cash for as little work as possible. COD, anyone? I gave up on Infinity Ward for that exact reason: same game engine, title after title after title.

    We the ignorant have learned; the stupid still worship the ground they walk on...
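    For anyone wondering how those scaling figures are derived: in SLI reviews, the percentage is usually the extra performance the second card adds on top of a single card. A quick sketch (the function name and the sample fps figures are illustrative, not taken from the article):

    ```python
    def sli_scaling_percent(single_fps, dual_fps):
        """Extra performance the second GPU adds, as a percent of one card.

        100% would mean the dual-card setup is exactly twice as fast.
        """
        return (dual_fps / single_fps - 1) * 100

    # e.g. 50 fps on one card, 95 fps on two: the second card added 90%
    print(round(sli_scaling_percent(50, 95), 1))  # 90.0
    ```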
  15. Boilerhog146

    Boilerhog146 TS Addict Posts: 209   +52

    @Steve, I find that single-player games built around big baddies that you have to run and gun (Far Cry, Doom, etc.) are shite in multiplayer, but games that just have high numbers of fighters (BF, COD, etc.) are IMO designed to be great multiplayer experiences.
    FEAR comes to mind. It didn't come with a multiplayer mode on release; it came as FEAR Combat later.
  16. Boilerhog146

    Boilerhog146 TS Addict Posts: 209   +52

    Personally I would like to see a multiplayer hybrid: you are in a battle, a bunch of guys versus a bunch of guys fighting each other, then a big baddy comes out of the woodwork and everyone has to join forces to get rid of it, then they get back to their own battle until another baddy shows up. Interesting concept, no?
    Last edited: Jul 6, 2016
  17. ddferrari

    ddferrari TS Maniac Posts: 282   +116

    Did you read the review? We CAN game at 60 fps in 4K. Every game they tested ran at 60+ fps at 4K ultra; a few ran over 100 fps.
  18. SirGCal

    SirGCal TS Maniac Posts: 365   +136

    Yup, still using a 690 here. So like I said, I've waited this long... But it plays everything just swell; I just want more of everything and to move up to 4K screens, etc. Pretty soon. As for coding, it is so rare that I have to turn off the second GPU for a game to work. Most often it's a mild upgrade instead of a giant one with the extra GPU enabled, but many major games play wonderfully. I'll continue to run two cards.
  19. robb213

    robb213 TS Addict Posts: 328   +98

    I think @Steve should include info about what the bridges were intended to improve upon. A lot of people think the "increased bandwidth" should mean better performance is possible, but it's not that cut and dried. Some Nvidia slides I saw a month or so ago explained the real purpose: increased bandwidth to properly support resolutions of 2560x1440 and above at higher refresh rates.

    Chances are, if you use a Maxwell or earlier card, you'd notice terrible microstutter (or worse, frame and latency problems) when playing at resolutions like 4K @ 120 Hz. Additionally, I don't know if two of the older bridges are a substitute for the new SLI HB bridge; for all I know it may still be treated as one bridge even with two hooked up. Apparently that LED bridge that came out some time ago was an actual improvement; I didn't know that. I use the old original bridges, so playing at 4K @ 120 Hz in nearly every modern game gives me problems.

    The following is the data sheet I saw from a presentation dump: http://core0.staticworld.net/images/article/2016/05/sli-configuration-100661339-large.png
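    As a rough sanity check on why bandwidth matters at high resolution and refresh rate, here is a back-of-envelope sketch of the data rate needed to pass one display's rendered frames between cards. The 4 bytes per pixel, the uncompressed-transfer model, and the ~1 GB/s figure often quoted for the legacy SLI bridge are my assumptions, not figures from the article:

    ```python
    def bridge_bandwidth_gbs(width, height, refresh_hz, bytes_per_pixel=4):
        """Approximate GB/s needed to move every rendered frame over the
        bridge, assuming uncompressed 32-bit color (an assumption)."""
        return width * height * bytes_per_pixel * refresh_hz / 1e9

    # 4K @ 120 Hz works out to roughly 4 GB/s, well beyond the ~1 GB/s
    # commonly cited for a legacy bridge (assumed figure)
    print(round(bridge_bandwidth_gbs(3840, 2160, 120), 2))  # 3.98
    ```

    This lines up with the slide's point: 1080p @ 60 Hz fits comfortably in a legacy bridge's budget, while 1440p and 4K at high refresh rates do not.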
  20. DAOWAce

    DAOWAce TS Booster Posts: 261   +39

    First time I ever went SLI was with the 460 Tis.

    I vowed to never go SLI again.

    After NVIDIA screwed me with the 780 Ti's release, I went SLI to extend the lifespan of my system until Pascal's release. I bought a used 780 for $200, on top of the $700 I spent for my REGULAR 780 many months after release. There was zero mention of a "Ti" card coming out, let alone one of their highest-end cards (sans Titan).

    After running SLI again, it made me never want to run SLI again... again. It reminded me of all the horrible issues with it, from poor scaling and microstutter to insane heat output (the top card is literally 15C hotter than the bottom).

    Not to mention multiple GAME ENGINES (UE4) have no support for SLI, and the games that actually work with SLI all have either poor scaling or multiple rendering issues. I can't recall the last thing released in the past few years with great SLI support that wasn't a benchmark utility.

    There's a lot more to a gaming experience than "FPS", and almost nobody (tech sites included) looks at the rest. It's an atrocity.

    The only objective benefit to a multi-GPU setup is being able to run applications on individual GPUs based on which monitor they're on. This makes multi-boxing a much better experience... unless you're using anything but Windows 10 with WDDM 2.0, which finally fixed the atrocious multi-monitor issues that have plagued Windows since Vista.

    Wish we could go back to the days of Windows XP..
  21. Boilerhog146

    Boilerhog146 TS Addict Posts: 209   +52

    Don't know what you were doing wrong, but I've never had issues with SLI, or quad SLI for that matter. I must have chosen the right games.
    In first-person shooters there's an old saying that still holds true today: frames is life.
  22. Jack007

    Jack007 TS Booster Posts: 179   +38

    It is like amazing grace playing 4K games on this. It should last a very long time with driver updates. Until Pascal hits with the new HBM memory and card slot.
  23. Theinsanegamer

    Theinsanegamer TS Evangelist Posts: 634   +606

    I've been running SLI since 2010, only issue was one game that had color corruption, and that was fixed.

    Higher temperatures are well known for SLI. If your case is not set up properly, you will have issues. My top 770 is a whopping 3C higher than the bottom one, but my case is designed to properly vent heat out.

    Perhaps the problem is your setup, not SLI?
  24. DAOWAce

    DAOWAce TS Booster Posts: 261   +39

    I'd be here for hours trying to list all the games that had issues due to SLI. Just browse the NVIDIA SLI forum and look at all the complaints in just the last half a year. Just because you don't play a game with issues, or don't notice or know about the issues, doesn't mean they don't have them.

    I use open-air coolers, not a blower design, so that's already going to vastly increase heat inside the case. It doesn't matter what case I use unless it's a test bench, which no regular consumer uses. It's unavoidable, and it actually made me think about using a blower-style cooler, but I still refuse to because of the amount of noise it would produce... and this is mostly moot anyway with a single card. And no, they're not stacked next to each other; there are two full PCI slot gaps between them, which is specifically why I selected my motherboard.

    Current case is a Phanteks Enthoo Primo, if you were wondering. I might go full water cooling in the future, but it's been too big an investment for me thus far.
  25. Your dad

    Your dad TS Rookie

    Awesome article, thx!
    I have one question; maybe you could shed some light. I have the exact same GPU and an Asus Z270 mobo with standard PCI spacing (it should be like yours), and I want to buy a second GPU for SLI. But it seems to me there will be no space between the GPUs due to their huge heatsinks. Was that the case on your mobo? Did it cause any trouble, and what were the temps?
