Radeon and GeForce GPUs work well together in DirectX 12 multi-adapter

By Scorpus · 10 replies
Oct 27, 2015
  1. DirectX 12 comes with a ton of new features that game developers can harness, and one of the coolest is multi-adapter support, which allows a game to use the power of multiple non-identical GPUs at the same time.

    This means that developers, if they choose to do so, can create a game that uses both Nvidia and AMD graphics cards at the same time in a multi-GPU setup. With DirectX 11 this simply was not possible, as games were restricted to typical SLI or CrossFire solutions, but with DirectX 12, many more possibilities are open to developers.
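    Conceptually, "unlinked" explicit multi-adapter means the application creates a fully independent device per GPU and schedules work across them itself, for example by alternating frames. The toy Python sketch below simulates only that scheduling idea; the class and function names are hypothetical, and real code would drive separate ID3D12Device objects through the D3D12 API.

```python
# Toy simulation of alternate-frame rendering across two independent
# "devices". In real DirectX 12, each would be a separate ID3D12Device;
# here they are plain objects so the scheduling idea stands on its own.

class FakeDevice:
    def __init__(self, name):
        self.name = name
        self.frames_rendered = []

    def render(self, frame_index):
        # Stand-in for submitting a frame's command lists to this GPU.
        self.frames_rendered.append(frame_index)

def alternate_frames(devices, frame_count):
    """Round-robin frames across devices, as in AFR-style multi-GPU."""
    for frame in range(frame_count):
        devices[frame % len(devices)].render(frame)

gpus = [FakeDevice("radeon"), FakeDevice("geforce")]
alternate_frames(gpus, 6)
print(gpus[0].frames_rendered)  # even-numbered frames
print(gpus[1].frames_rendered)  # odd-numbered frames
```

    Because the devices are unlinked, nothing requires them to be the same model or even the same vendor, which is exactly what Oxide's mode exploits.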

    Oxide Games, again at the forefront of DirectX 12 development, has already implemented Unlinked Explicit Multiadapter mode in an alpha version of their upcoming game Ashes of the Singularity. The guys at AnandTech have managed to get their hands on a preview version of the game with this mode enabled, and have tested it with a collection of non-identical cards from both AMD and Nvidia.

    The results of their tests are a little surprising, with Ryan Smith of AnandTech concluding that the game performs best when you mix AMD Radeon and Nvidia GeForce cards together in one system.

    When AnandTech tested with just a pair of Nvidia GPUs (a GTX 980 Ti paired with a GTX Titan X), they experienced 46% better frame rates than with a single GTX 980 Ti. But when the 980 Ti was paired with an AMD Radeon R9 Fury X, frame rates improved by 68% over a single GTX 980 Ti, and 75% over a single Fury X. Interesting to say the least.
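    To put those percentages in concrete terms, here is a quick back-of-the-envelope calculation in Python. The base frame rate is a hypothetical placeholder (AnandTech's actual numbers are in their article); only the scaling factors come from the results above.

```python
# Illustrative arithmetic only: the base frame rate is a made-up
# placeholder, while the scaling percentages come from the article.
gtx_980_ti_fps = 60.0  # hypothetical single-card frame rate

# Dual Nvidia (980 Ti + Titan X): +46% over a single 980 Ti
dual_nvidia_fps = gtx_980_ti_fps * 1.46

# Mixed pair (980 Ti + Fury X): +68% over a single 980 Ti
mixed_fps = gtx_980_ti_fps * 1.68

# The mixed pair was also +75% over a single Fury X, which implies
# a single Fury X would run at roughly:
fury_x_fps = mixed_fps / 1.75

print(f"Dual Nvidia:      {dual_nvidia_fps:.1f} fps")
print(f"Mixed AMD+Nvidia: {mixed_fps:.1f} fps")
print(f"Implied Fury X:   {fury_x_fps:.1f} fps")
```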

    Further testing performed by AnandTech revealed that using the Radeon GPU as the system's primary card delivered better results than if the GeForce GPU was the primary card. Scaling across resolutions was also good when high-end cards were used.

    The full results over on AnandTech are very interesting and well worth the read, although it should be noted that Unlinked Explicit Multiadapter mode in Ashes of the Singularity is still just a tech demo, and performance figures from the benchmark may not end up reflecting real-world in-game performance.

    And stay tuned, because we will be putting Ashes of the Singularity, with its DirectX 12 mode, through its paces in the near future.


  2. VitalyT


    Trying to understand how this works...

    I thought that CrossFire and SLI were supported only via a physical hardware bridge that's proprietary to each of the two platforms, like these ones:


    Are they no longer necessary? Then what was their real function to begin with - a decorated on/off switch? :) Wouldn't have thought so, given how many pins they both feature. :)
  3. letsgoiowa


    AMD hasn't used the Crossfire bridges since the 6000 series AFAIK. I don't know about Nvidia, but I think their recent cards don't either. Should be fine.
  4. wiyosaya


    Personally, this does not surprise me. I run some compute applications, and it has been known for some time (though I have no explicit references for this, except to say you might find some at http://gpgpu.org/) that in certain GPU compute programs, AMD far outperforms Nvidia. Take, for instance, bitcoin mining: AMD has long been considered the better manufacturer for that particular application.

    For me, I'll assume this means that both have their strengths and weaknesses, and that distributing the workload between the two at least gives the possibility of leveraging the strengths of both. Who knows what is going on in DX12, but if the developers are aware of, or able to dynamically determine, the strengths of the cards in the system, then it seems highly likely that they could distribute workloads to play to the strengths of whatever cards are present, whether or not those cards are physically connected by a bridge.
    So far, I have not run cards in SLI, but I am planning on building an SLI system in the near future. On an Nvidia site, I found that an SLI bridge is not strictly necessary for SLI; however, if used, SLI performs better with the bridge than without.

    From this page - http://www.geforce.com/whats-new/guides/introduction-to-sli-technology-guide#5
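    The workload-distribution idea above can be sketched as a simple split-frame scheme: divide each frame's scanlines between two GPUs in proportion to their measured throughput. Everything here is hypothetical (the function name, the rate numbers), and a real implementation would sit on top of the D3D12 multi-adapter API rather than plain Python:

```python
# Toy sketch of splitting a frame's scanlines between two dissimilar
# GPUs in proportion to their measured throughput. All numbers are
# hypothetical placeholders.

def split_scanlines(total_rows, gpu_rates):
    """Assign contiguous row ranges proportional to each GPU's rate."""
    total_rate = sum(gpu_rates.values())
    ranges, start = {}, 0
    items = list(gpu_rates.items())
    for i, (name, rate) in enumerate(items):
        if i == len(items) - 1:
            end = total_rows  # last GPU takes the remainder
        else:
            end = start + round(total_rows * rate / total_rate)
        ranges[name] = (start, end)
        start = end
    return ranges

# Hypothetical per-card throughputs on this workload (frames/sec)
rates = {"radeon_fury_x": 58.0, "geforce_980_ti": 52.0}
print(split_scanlines(1080, rates))
```

    The faster card simply gets a proportionally taller slice of each frame, so neither GPU sits idle waiting on the other.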
  5. I thought I heard somewhere that bridges aren't necessary anymore because the data goes through the motherboard now (on newer boards), but I could be wrong.
  6. sthet81


    Only Nvidia's GPUs still require a connector as shown in your pics. AMD's newer cards don't, and you can actually mix and match different cards from different generations, unlike with Nvidia. Please refer to their site for more info.
  7. sthet81


    In order to have your cards in SLI, you need the SLI bridge connector, and you need to enable SLI mode in your Nvidia control panel. If you don't have an SLI connector, then only one card will be used as the GPU while the other card turns into a PhysX card only. That's a complete waste of GPU power. Please note you can only SLI within the same GPU family, meaning Maxwell only, not Maxwell and Kepler.
  8. cliffordcooley


    Then how do you explain the article or the following write-up?

  9. sthet81


    I was replying to a question from someone who wanted to use their cards in SLI. I just wanted to clarify how it works now with DirectX 11, not how DirectX 12 can utilize a multi-card setup in the future.
  10. Adhmuz


    PCI-E 3.0 allows multi-GPU setups to be used without a bridge; take AMD, for example. Nvidia, however, likes to force their consumers to use a piece of their technology (SLI originally required a physical nForce chipset on the mobo, combined with the bridge). Now I believe they have done away with said chip, and it's just a software limitation implemented in the drivers based on the motherboard you have.

    This all applies to DX11, however, and DX12 looks like it's going to change this dramatically if they can get it to work right with any and all GPUs. It would only make sense to use both at this point in time, since games are being coded to take advantage of one brand of GPU or the other; by having one of each, you can experience the best of both worlds. The only concern I have is the whole primary/secondary card issue, as this may affect performance based on the game being played. It would be a shame to have to physically swap GPUs each time you wanted to use the stronger card for your application, or at minimum switch which card your display is connected to.

    Really interested to see where this goes. Now to find a way to get DX12 working in Windows 7.
  11. wiyosaya


    So the info about 2-way SLI at the Nvidia site is bogus? OK. ;)
