Previewing DirectX 12 Mixed GPU Gaming with Ashes of the Singularity

By Steve
Feb 24, 2016
  1. Ashes of the Singularity gave us an early peek at how AMD and Nvidia's current GPUs are shaping up in Microsoft's latest API when we checked out the upcoming DX12 real-time strategy title last November. Back then our focus was primarily on DX11 vs. DX12 performance with Nvidia's Maxwell and AMD's GCN 1.2 architectures.

    Although the game still doesn't have an official release date, we know Stardock has been hard at work as shown by the cool new features in its Ashes of the Singularity benchmark 2.0. The most notable of them is 'explicit multi-adapter' (EMA), DirectX 12's multi-GPU technology, which gives game developers precise control over the workloads of their engine and direct control over the resources offered by each GPU in a system.

    Among other things, this enables support for both AMD and Nvidia GPUs in the same system, meaning it is possible to pair a GeForce GTX 980 Ti with a Radeon R9 Fury X, for example. This mind-blowing feature is what we will be focusing on today.

    Read the complete review.
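As a loose illustration of what "explicit control" means here - not real DirectX code, just a toy Python sketch with made-up pass names - under EMA the engine, rather than the driver, decides which adapter receives which part of the frame, and nothing stops the two adapters coming from different vendors:

```python
# Toy model of DX12 explicit multi-adapter (EMA). The adapter names and the
# render-pass split below are illustrative, not real hardware queries.

class Adapter:
    def __init__(self, name, vram_gb):
        self.name = name
        self.vram_gb = vram_gb
        self.work = []            # render passes assigned this frame

    def submit(self, render_pass):
        """The developer explicitly routes a workload to this GPU."""
        self.work.append(render_pass)

# A mixed-vendor pair, as in the article's example.
gtx_980_ti = Adapter("GeForce GTX 980 Ti", 6)
fury_x = Adapter("Radeon R9 Fury X", 4)

# e.g. primary geometry on one card, post-processing on the other
gtx_980_ti.submit("geometry + shading")
fury_x.submit("post-processing")

for gpu in (gtx_980_ti, fury_x):
    print(f"{gpu.name}: {gpu.work}")
```

The point of the sketch is only the division of responsibility: under DX11 the driver made this decision behind the developer's back; under EMA the engine makes it per pass, per adapter.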

  2. RoosterFish2

    RoosterFish2 TS Enthusiast Posts: 37   +30

    Wow, that is amazing. Never thought I would see GPUs from different vendors being run on the same machine. Looks like review sites will have a lot of work to do when DX12 fully arrives. Now I don't have to worry about my 970 not lasting, because the "3.5" GB of VRAM will become 7 if I buy another one.

    Will I be able to run my GTX 480 alongside my 970? What gen does a GPU have to be to run in this mode? Now that I ask, I'll venture a guess and say it has to be DX12 compatible?

    Looking forward to the comments and seeing how this shakes up the industry.
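The "3.5 becomes 7" hope above is the key difference between the old and new multi-GPU modes, and the arithmetic is simple enough to sketch. This is a hedged toy model, not an API call: in classic mirrored AFR (SLI/CrossFire) every card holds a full copy of the resources, so effective VRAM is the smallest card's capacity, while EMA's pooled mode can in principle add capacities together:

```python
def effective_vram(cards_gb, pooled):
    """Effective VRAM in GB across a multi-GPU setup.

    Mirrored mode (classic SLI/CrossFire) duplicates every resource on each
    card, so capacity is limited by the smallest card. Pooled mode (as EMA
    allows, if the developer implements it) can add capacities together.
    """
    return sum(cards_gb) if pooled else min(cards_gb)

# Counting only the fast partition of each GTX 970, per the comment above.
two_970s = [3.5, 3.5]
print(effective_vram(two_970s, pooled=False))  # classic SLI: still 3.5
print(effective_vram(two_970s, pooled=True))   # EMA pooling: 7.0
```

Whether a given game actually pools memory this way is up to its developer; EMA only makes it possible.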
  3. Raoul Duke

    Raoul Duke TS Guru Posts: 860   +307

    This looks totally awesome, both the development and the game!
  4. Adhmuz

    Adhmuz TechSpot Paladin Posts: 1,617   +493

    Well, it's about damn time we get to mix and match GPUs, and on top of that VRAM adds together instead of being divided by the number of GPUs! This is easily the most anticipated feature set of DX12.

    The only question I have: when it's mentioned that 16GB of RAM is required for this to work properly, is that because 8GB was not enough? Could 12GB be sufficient? With 16GB, how much RAM was actually being used?

    I found this on TrustedReviews:

    "DirectX 12 will be supported by the vast majority of PC graphics cards already on the market. Any Nvidia card since the launch of the Fermi architecture (GTX 400 series), any Intel graphics since Haswell and any GCN-based AMD cards (HD 7000 series) support it, which makes for around 70% of the existing install base."

    Not sure if ALL features will be compatible however.
    RoosterFish2 likes this.
  5. Squid Surprise

    Squid Surprise TS Guru Posts: 810   +246

    Interested to see if triple cards will make a difference... I've got three Titan X cards that are just itching to try!
  6. darkzelda

    darkzelda TS Addict Posts: 251   +86

    And amazingly there are people complaining that some future games use DX12. I'm really worried about my GTX 970; Nvidia doesn't seem to support their cards for long, and these results seem to show just that.
  7. veLa

    veLa TS Evangelist Posts: 699   +164

    I'm not surprised at all that Radeon cards are consistently "better leaders."
  8. psycros

    psycros TS Evangelist Posts: 1,295   +663

    Yeah, this is really concerning. There's nothing said about whether or not this HUGE barrier to DX12 acceptance will be addressed. What good is a standard whose biggest selling point is a major improvement in multi-GPU support when 99% of us won't have the hardware for it?
  9. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 8,430   +2,822

    This is still just one game and may not be a true indication of things to come. I will however say that if it is, it is also a strong indication of how sorry AMD drivers have been.
  10. Lionvibez

    Lionvibez TS Evangelist Posts: 1,079   +331

    Never heard of hybrid PhysX?

    I've had a Radeon HD 7970 GHz + superclocked GTX 650 for a few years now.

    Not quite the same, but still AMD + NV in one machine :)
    Last edited: Feb 24, 2016
  11. RoosterFish2

    RoosterFish2 TS Enthusiast Posts: 37   +30

    I knew I could use an old Nvidia card just for PhysX; didn't know you could pair it with a Radeon that ran the graphics side. Thanks for the info.
  12. Adhmuz

    Adhmuz TechSpot Paladin Posts: 1,617   +493

    Easier said than done. I tried this on several occasions: different cards, different versions of Windows, driver hacks, workarounds. It never worked in-game, only in PhysX simulations... Even when I had two Nvidia cards of different generations they never played nice; it would always switch back to my main GPU for PhysX, leaving the second card unused. PhysX needs to die and an open standard needs to fully replace it already; Nvidia hasn't done anything with the standard for years.
    darkzelda likes this.
  13. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Lucidlogix's Hydra attempted to do the same thing - briefly. Some motherboards even shipped with the chip. Two years later and it was given the bullet to the back of the head.

    As for mixed-GPU in games, as many commentators and reviewers have noted, the onus for making this work rests firmly with game developers, who presently can't even manage to port a console game to PC without a lengthy teething process and using the customer base as beta testers. The prospect of them adding the extra work and financial input to make mixed-GPU work consistently seems remote, since I doubt Nvidia and AMD will be footing the bill, and mixed-GPU support very likely isn't a big enough selling point on its own to make it a game developer's priority.
    Nice tech, and it would be nice to see it implemented properly - i.e. load balancing for performance-mismatched cards, and better memory resource allocation (e.g. one card with a large framebuffer holding the textures paired with a smaller-capacity card responsible for global illumination).
    I'd say AotS is a best case (AMD)/worst case (Nvidia) scenario. Oxide developed the Nitrous engine in tandem with AMD as a demonstrator (Star Swarm) for GCN and Mantle. Safe to say that by the time DX12 games start shipping there will be a variety of implementations (most centered on Unreal Engine 4 at present - and that has close dev ties with Nvidia); some will undoubtedly highlight architectural strengths and weaknesses depending on the engine and the degree of IHV involvement. Add in Microsoft's involvement, the evolution of Vulkan, and the upcoming new graphics architectures, and the future seems no more clear cut than the past has been.
    Last edited: Feb 24, 2016
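The load balancing for mismatched cards mentioned above is worth making concrete. A hedged toy sketch (made-up performance weights, not a real scheduler): in a split-frame scheme, each GPU could be handed a share of the frame's scanlines proportional to its relative speed, so a fast card paired with a slow one isn't held to the slow card's pace:

```python
def split_rows(total_rows, weights):
    """Split a frame's scanlines across GPUs proportionally to a relative
    performance weight per card (weights here are made-up benchmark ratios)."""
    total_w = sum(weights)
    rows = [total_rows * w // total_w for w in weights]
    rows[0] += total_rows - sum(rows)  # hand any rounding remainder to card 0
    return rows

# e.g. a card roughly twice the speed of an older one, at 1080 vertical lines
print(split_rows(1080, [2, 1]))  # → [720, 360]
```

A real implementation would adapt the weights frame to frame from measured GPU timings, but the proportional split is the core idea.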
  14. hahahanoobs

    hahahanoobs TS Evangelist Posts: 1,599   +411

    Thanks for this Steve.

    Now we wait and see if Nvidia can do better with a future driver.
    As for mixing GPUs, my OCD won't allow it. That, and if one card does better than the other, why would you mix them?
    Steve and cliffordcooley like this.
  15. Technician

    Technician TS Addict Posts: 677   +113

    DX12 looks less promising to me than Vulkan.
  16. Lionvibez

    Lionvibez TS Evangelist Posts: 1,079   +331

    For the games that I own that support it I've had no problems.

    Borderlands 2, Batman AC

    I can't speak to the issues you had with dual NV cards, but I do agree an open standard is what's needed going forward.
  17. Fernando GC

    Fernando GC TS Member Posts: 18

    Is this a dream? Mix and Match video cards sounds amazing!
  18. Fabio Turati

    Fabio Turati TS Rookie

    I think there's an error on the second graph on page 7. The description is identical to the first one, but I think for the second one you have run the test at 1600p.
  19. misor

    misor TS Evangelist Posts: 1,157   +195

    The mix-and-match mode will render the old joke obsolete.
  20. mikedvideo

    mikedvideo TS Rookie

    Will this apply to Adobe users for rendering and editing?
  21. robb213

    robb213 TS Addict Posts: 309   +92

    Sadly, like dividebyzero already said, I doubt this will come to much fruition. It reminds me of DX11 back in 2010 (6 years for wide support). I'm sure the biggest games like Battlefield, GTA 6, and other marvels will support it, but I don't see it bringing much worth mentioning unless you have a previous-generation AMD or Nvidia card sitting in your closet that you can start using again. What really excites me is the additive VRAM pool feature--but that'll be rare too, I bet.

    Yeah, Linus over on his YouTube channel did something like that. I think it was a GTX 780 and an R9 290X? If I remember right, it worked well for what it was--he was able to get PhysX running fine with the AMD card acting as a "master," so to speak. But my memory is fuzzy as hell.

    In your case you may be better off not using the 480 at all. I have mine in my closet and wondered the same thing. Considering its power draw and fewer CUDA cores, it may actually perform worse than running just the 970 (educated guess).
    LucidLogix has to be the biggest fail I can recall. When I took a whack at it, the microstutter was so bad it made it seem as if the frame rate was low. Then again, I recall hearing about how it never worked to begin with and simply made your system pretend it was getting more performance when it wasn't.
  22. HaMsTeYr

    HaMsTeYr TS Maniac Posts: 376

    The GTX 480 isn't DX12 compatible, so I really doubt you can run them together :(
  23. Badvok

    Badvok TS Booster Posts: 115   +46

    Would anyone run a dual 390X? It may be cold outside at the moment but a dual 390X setup would solve all your heating problems ;) You might actually need to turn on the air-con.
    Peter Farkas likes this.
  24. Badvok

    Badvok TS Booster Posts: 115   +46

    Yeah, I know what you mean: the last update for the 9800 GTX I have running in an old PC was November last year! I mean, come on Nvidia, get your act together; you should be rolling out new drivers for this 8-year-old card every month!
    cliffordcooley and Peter Farkas like this.
  25. failquail

    failquail TS Rookie

    It would work if it weren't for the fact that Nvidia actively disables GPU PhysX if an AMD card is present.
    I have heard of driver hacks to get around that, but you have to fight with the Nvidia drivers to make it happen.
