The Rise and Fall of Multi-GPU Graphics Cards

Unless you are working on deep learning or some other specific workstation project, I see no reason at all for multi-GPU designs for gaming.

Not unless someone builds one specifically for maximum gaming/streaming performance. That would be really interesting.

 
Unless you are working on deep learning or some other specific workstation project, I see no reason at all for multi-GPU designs for gaming.
Meaning if you are working on deep learning, you have a reason for multi-GPU gaming? :)

Seriously, multi-GPU designs make sense on a theoretical basis. Practically, though, their price-performance was cannibalized at the low end by increasingly powerful single chips, and at the high end by multi-card setups. Squeezed on both ends, they eventually vanished.
 
"For a time, they served a niche market very well, but the excessive power demands and jaw dropping prices are something that nobody wants to see again."

Maybe YOU don't want to see it again, but some of us still pine for the days of multi-GPU excess, when you could scrape together enough cash for a pair of 550 Tis and perform nearly at the level of a stock 580.

Even as chips get more powerful, some would still like to lash several GPUs together for things like VR or higher-resolution gaming.

I still think MCM GPUs are the future; as silicon gets pushed to the edge of its capability, the use of multiple smaller chips in a rendering cluster, as opposed to a single monumental chip, will make more sense.
 
I don't really get the hate for SLI or why it's being killed off.

I had an Asus Strix Nvidia 970 SLI setup from 2014-2018 and I had nothing but a great experience with it.

I was able to play stuff like The Witcher 3 at max settings at 4K (albeit at a locked 30 fps), Project Cars at 4K 60 fps, the new Tomb Raider at the same, and many others.

Can't say I ever really noticed the problems with it that so many others complained about. I suspect many of them had never actually tried it and were just jumping on the bandwagon of internet hate for the sake of it, TBH.

 
I still think MCM GPUs are the future; as silicon gets pushed to the edge of its capability, the use of multiple smaller chips in a rendering cluster, as opposed to a single monumental chip, will make more sense.

The problem is and always has been latency. Yes, you drive up frame rates, but you also run into problems keeping a consistent frame rate that matches your display's refresh window. That's why microstutter was always a thing.

You also need to consider that nowadays it's on the developer to implement multi-GPU support, and given the small user base, no one seriously wants to bother with it anymore.

Also consider that when you have $700 cards that can handle 4K60, there really isn't that much need for additional performance.
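
To put a number on the microstutter point: with alternate-frame rendering, two GPUs can double the average frame rate while the frame-to-frame gaps become very uneven. Here's a toy C++ sketch of that, using hypothetical 30 ms-per-frame GPUs and made-up present intervals rather than any real measurements:

```cpp
// Toy model of AFR frame pacing -- illustrative numbers only, not a benchmark.
// Assume each GPU needs 30 ms per frame. Two GPUs in alternate-frame rendering
// can average a frame every 15 ms, but if presents aren't paced the gaps
// alternate short/long: same average fps, visibly worse experience.
#include <cmath>
#include <cstdio>
#include <vector>

static void report(const char* label, const std::vector<double>& gaps_ms) {
    double sum = 0.0, sum_sq = 0.0;
    for (double g : gaps_ms) { sum += g; sum_sq += g * g; }
    const double mean = sum / gaps_ms.size();
    const double stddev = std::sqrt(sum_sq / gaps_ms.size() - mean * mean);
    std::printf("%-8s avg gap %.1f ms (~%.0f fps), gap stddev %.1f ms\n",
                label, mean, 1000.0 / mean, stddev);
}

int main() {
    // Hypothetical present-to-present intervals in milliseconds.
    const std::vector<double> paced   = {15, 15, 15, 15, 15, 15, 15, 15};
    const std::vector<double> unpaced = { 5, 25,  5, 25,  5, 25,  5, 25};
    report("paced",   paced);    // ~67 fps, stddev 0  -> feels smooth
    report("unpaced", unpaced);  // ~67 fps, stddev 10 -> feels like microstutter
    return 0;
}
```

Same average FPS in both cases; the second one is what an FPS counter hides and your eyes don't.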
 
The problem is and always has been latency. Yes, you drive up frame rates, but you also run into problems keeping a consistent frame rate that matches your display's refresh window. That's why microstutter was always a thing.

You also need to consider that nowadays it's on the developer to implement multi-GPU support, and given the small user base, no one seriously wants to bother with it anymore.

Also consider that when you have $700 cards that can handle 4K60, there really isn't that much need for additional performance.
I don't really get the hate for SLI or why it's being killed off.

I had an Asus Strix Nvidia 970 SLI setup from 2014-2018 and I had nothing but a great experience with it.

I was able to play stuff like The Witcher 3 at max settings at 4K (albeit at a locked 30 fps), Project Cars at 4K 60 fps, the new Tomb Raider at the same, and many others.

Can't say I ever really noticed the problems with it that so many others complained about. I suspect many of them had never actually tried it and were just jumping on the bandwagon of internet hate for the sake of it, TBH.
A long time back I used CrossFire with two cards (I can't recall the exact GPUs, but I think they were AMD 5870s) and had issues like microstutter. Some game engines also didn't support a multi-GPU setup (RAGE, I think, would only use a single GPU)... I just recall it being a frustrating experience, and I've gone with a single-GPU setup ever since.
 
I recently built a system from spare parts. I have two GTX 980 Ti cards with a custom SLI profile and can run Borderlands 3 at 75 FPS at 4K. If implemented properly, multi-GPU setups could still perform exceptionally well. I blame developers, whether that's laziness, lack of time, or the expectation that everyone only runs a single GPU.
 
A long time back I used CrossFire with two cards (I can't recall the exact GPUs, but I think they were AMD 5870s) and had issues like microstutter. Some game engines also didn't support a multi-GPU setup (RAGE, I think, would only use a single GPU)... I just recall it being a frustrating experience, and I've gone with a single-GPU setup ever since.

I had two GTX 470s in SLI back in the day with an i7 920 @ 4 GHz. I had water blocks on both and TBH they worked pretty well, but they were hugely driver dependent. I remember playing Far Cry 3 on them, and it took both game updates and driver updates before SLI worked well. IIRC it was a good month or so before the game ran well under SLI, but once it did it was pretty smooth.

SLI/CF were both highly dependent on a fast CPU. I went from a Q9550 to the i7 with those GTX 470s and it was a huge difference: from a laggy mess to great performance. I had a great experience with those cards. Lack of VRAM was the only reason I stopped using them when I did; 1.25 GB was not enough, and that really became a problem later in their life. SLI and a lack of VRAM don't play nice, since AFR mirrors assets on both cards rather than pooling the memory. However, this was close to the end of the era for SLI/CF. Newer game engines and DX12 pretty much put an end to them.

Before I got my GTX 1080, I popped my buddy's 290 in with mine. It was fine in some games, and either did not work or didn't work very well in others. CF/SLI is dead. RIP.

Maybe in the future they will bring it back.
 
I'm one of the few people that still run Crossfire (I have a Fury X and Fury in CF) and for the games that support it, it's absolutely incredible. For example, in BF4 with Mantle my frame rate goes so far past my 144 Hz monitor that I HAVE to supersample at 140% resolution to keep it under my refresh rate. Prey at ridiculous frame rates and max settings is a breathtaking experience too. These 2015 GPUs perform roughly like a 1080 Ti/2080 when they work together, and you could have had this experience five years ago. For my purposes and the games I play (mostly older stuff), this will last me a while until RDNA 2.

I think it's a tragedy that mGPU is disappearing because back in the day, you could get 2 $200 GPUs that annihilated a $600 GPU, or get dual top tier cards that would create a whole new class of performance. One majorly overlooked feature is that the existence of mGPU setups led to a sanity check in price/performance: who would spend $1200 on a GPU when dual $200 GPUs could tie or beat it? Part of the reason you have insane GPU pricing these days is because you have no other option: you have to pay a TON more for pretty pathetic performance increases in comparison to simply adding another GPU.
There is ALWAYS some game or application that will benefit from more graphics power, especially with ray tracing starting to come around. Eventually we'll see AMD/Nvidia solve the MCM problem and that's how it'll come about, but we will likely never be able to add in affordable cards to compete with flagships anymore. That's a big shame, as pricing/performance is going to continue to spiral out of control.
 
Maybe YOU don't want to see it again, but some of us still pine for the days of multi-GPU excess, when you could scrape together enough cash for a pair of 550 Tis and perform nearly at the level of a stock 580.
Ah, but that's multiple graphics cards. The comment in the article was purely about single cards sporting multiple GPUs.
 
While I agree that multi-GPU cards are no longer relevant, I don't really like the fact that Crossfire and SLI are no longer viable. My two reasons for this are as follows:

1.) FS2020
2.) Crysis Reloaded

I remember a long time ago when Arma III came out and you needed at least a GTX Titan to run it properly, but you could get by just fine with twin HD 7970s in Crossfire (for a LOT LESS money). Also, for me, the first upgrade I made to a GPU was to add a second one, so that I was still leveraging the power of the card I had already paid for instead of tossing it out.

I upgraded my HD 4870 by adding a second one, I bought two HD 7970s at the same time for Arma III, and I initially upgraded my R9 Fury with a second one. With the exception of the Furies, I had absolutely no issues whatsoever with Crossfire. When the Furies worked, they were absolutely breathtaking, but they didn't always work because games had stopped supporting it. Rather sadly, I realised that I probably wouldn't be doing this anymore.

Then a "game" like FS2020 comes out which practically BEGS for multi-GPU support but there isn't any. This is after DX12 made it possible to Crossfire/SLI ANY two GPUs regardless of make and model. Since this is Microsoft's baby, perhaps FS2020 will support it because if it does, that will make it far more accessible because many could just throw their previous card back in and get a nice performance boost for free (since FS2020 breaks the back of most cards).

The odd thing is that if you buy an X370/470/570 motherboard, it will natively support Crossfire and SLI. There has to be a reason for this.
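
For anyone curious what the DX12 "any two GPUs" point looks like in practice: explicit multi-adapter just means the application enumerates every adapter itself and creates a separate device on each one, then splits the work between them on its own. A minimal sketch of the setup side (standard DXGI/D3D12 calls on Windows; enumeration only, not a working multi-GPU renderer):

```cpp
// Minimal sketch of DX12 explicit multi-adapter setup (Windows; link dxgi.lib
// and d3d12.lib). This only enumerates GPUs and creates a device on each one.
// A real renderer would then distribute work across the devices itself --
// the driver no longer does AFR for you.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cwchar>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP

        // One ID3D12Device per physical adapter -- any mix of vendors is allowed.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::wprintf(L"adapter %u: %ls, %llu MB VRAM\n", i, desc.Description,
                         static_cast<unsigned long long>(desc.DedicatedVideoMemory >> 20));
        }
    }
    return 0;
}
```

The flip side is exactly what was said above: every bit of scheduling and copying between those devices is now the game developer's problem, which is why so few bother.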
 
A long time back I used CrossFire with two cards (I can't recall the exact GPUs, but I think they were AMD 5870s) and had issues like microstutter. Some game engines also didn't support a multi-GPU setup (RAGE, I think, would only use a single GPU)... I just recall it being a frustrating experience, and I've gone with a single-GPU setup ever since.
I had two 5870s in CF and had tons of issues until I flashed them with identical BIOSes, killed the lowest 2D clock profile, increased the clock speed, and played with the fan curve while I was at it. Once they had the same custom BIOS, games that supported them worked flawlessly, and in games that didn't, the unutilized card would stay clocked high enough not to cause major stuttering issues. Sadly, the single GB of VRAM couldn't keep them going for very long, and as newer cards came out from AMD the drivers got worse and worse for my aging 5870s. Looking back I have no regrets; it was fun to play around with, and when I picked up the second card for $200 it got me another couple of years out of my original. That was always the appeal of CF and SLI for me: buy the first card when it launches, then pick up a second at half the price or less two years later to boost performance and stretch it another generation. I almost made it all the way to the 970 I bought new, although I did borrow some second-hand cards from friends to help out the last year or two.

Ah, but that's multiple graphics cards. The comment in the article was purely about single cards sporting multiple GPUs.
Yes, but the same lack of driver support killed off multi-card setups, which I believe far more people had, as the price of admission was far more reasonable; as I mentioned, you could buy one card, then pick up a second at a sizeable discount a year or two later.
 
"For a time, they served a niche market very well, but the excessive power demands and jaw dropping prices are something that nobody wants to see again."

Maybe YOU don't want to see it again, but some of us still pine for the days of multi-GPU excess, when you could scrape together enough cash for a pair of 550 Tis and perform nearly at the level of a stock 580.

Even as chips get more powerful, some would still like to lash several GPUs together for things like VR or higher-resolution gaming.

I still think MCM GPUs are the future; as silicon gets pushed to the edge of its capability, the use of multiple smaller chips in a rendering cluster, as opposed to a single monumental chip, will make more sense.
Yeah, top-end SLI was just for that e-peen. But you could cobble together a couple of mid-tier cards to match a top-tier card for a little less, and possibly bypass a supply bottleneck of top-tier cards.

IMO, I doubt you'll see the SLI/CrossFire interconnects go anywhere anytime soon, but I also doubt you'll see multiple GPU chips on the same PCB ever again.
 
I have been a consistent user of multi-GPU configurations, starting with one card and then adding a second one, unless the card itself was an X2 version.

There is a typo in the article with the name of 5970 being written as 5790.

I would also like to see the inclusion of the more obscure X2 versions made by partner companies.

The GTX 460 2Win, the 4850 X2, and the ASUS MARS cards (760x2, 285x2, and 580x2) come to mind.
 
I don't really get the hate for SLI or why it's being killed off.

I had an Asus Strix Nvidia 970 SLI setup from 2014-2018 and I had nothing but a great experience with it.

I was able to play stuff like The Witcher 3 at max settings at 4K (albeit at a locked 30 fps), Project Cars at 4K 60 fps, the new Tomb Raider at the same, and many others.

Can't say I ever really noticed the problems with it that so many others complained about. I suspect many of them had never actually tried it and were just jumping on the bandwagon of internet hate for the sake of it, TBH.

Multi-frame rendering methods are the real killer of multi-GPU. TAA is inherently incompatible with AFR, since frame-to-frame dependencies require a peer-to-peer copy between GPUs for stale assets. You'd see that as nasty, unbearable stuttering (or full drops to 5-15 fps for 1-3 seconds) each time a GPU needs a copy.
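
A toy sketch of that dependency (illustrative only, not real engine code): TAA blends the current frame with the previous frame's history buffer, but under AFR the previous frame was rendered by the *other* GPU, so every frame starts with a cross-GPU transfer.

```cpp
// Toy model of why TAA fights alternate-frame rendering (AFR).
// TAA blends the current frame with the previous frame's "history" buffer.
// Under AFR, odd and even frames render on different GPUs, so the history
// a GPU needs was just written by the other GPU and must be copied over
// the bridge / PCIe before the TAA resolve can run.
#include <cstdio>

int main() {
    int history_owner = 0;  // which GPU currently holds the latest TAA history

    for (int frame = 0; frame < 6; ++frame) {
        const int renderer = frame % 2;  // AFR: alternate GPUs each frame
        if (renderer != history_owner) {
            // Peer-to-peer transfer of the history target -- happens every frame.
            std::printf("frame %d: copy history GPU%d -> GPU%d (stall)\n",
                        frame, history_owner, renderer);
        }
        std::printf("frame %d: GPU%d renders + TAA resolve\n", frame, renderer);
        history_owner = renderer;  // this frame's output becomes next history
    }
    return 0;
}
```

Run it and you'll see the copy fires on every frame after the first, which is the stutter pattern described above.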

Most games use deferred rendering these days, and while it's not strictly incompatible with multi-GPU (other than multi-frame post-processing effects), it's harder to implement support for than forward rendering.

The entire game industry has shifted away from multi-GPU, and you can see this in how game engines are optimized more and more for a single GPU.

I have dual Vega 64s, but the last games I used them together in were Far Cry 5, The Witcher 1-3, and Rage 1 (OpenGL), all at 4K/2160p.

The experience in FC5 was decent, but The Witcher 3 required a lot of game config tweaks to stop stuttering at 4K 60 fps (mostly related to the game engine, even when installed on a Samsung 960 Evo). Setting the game to 30 fps stopped the stuttering altogether (especially when running at full speed on Roach in open terrain), so it wasn't really multi-GPU related. The Witcher 1 hated Windows 10 fullscreen optimizations (disabling them stopped the frametime spikes when moving the mouse or pressing mouse buttons), and The Witcher 2 needed a few post-processing effects disabled, or else the unbearable stuttering I mentioned above happened. Rage 1 also needed some engine tweaks (r_useSMP 1). Civilization VI supports DX12 split-screen mGPU, and that's pretty cool: the workload on each GPU varies since each is rendering one side of the screen. The merge of the two sides is pretty seamless, but you can just make it out if a leader crosses over it.

Otherwise, it was a generally good experience that I had too.
 
I had GTX 570s in SLI and the frame rate variance was way too large. It was nice getting triple-digit frame rates at times, but when they dropped into the 30s it just made you angry. Ditched the 570s for a single 670 and never thought about SLI since.
 
Almost since multi-GPU cards first appeared, we have been promised the holy grail: a technology that makes multiple GPUs work as one.

Supposedly AMD and Nvidia have been working on such a technology for years, but still no results.

Now Intel is also working on multi-GPU for its Xe series cards, aiming to finally have multiple GPUs appear as one.

I wouldn't declare multi-GPU dead yet; it might even come back with a vengeance. But SLI/Crossfire as we know them are finito.
 
I don't really get the hate for SLI or why it's being killed off.

I had an Asus Strix Nvidia 970 SLI setup from 2014-2018 and I had nothing but a great experience with it.

I was able to play stuff like The Witcher 3 at max settings at 4K (albeit at a locked 30 fps), Project Cars at 4K 60 fps, the new Tomb Raider at the same, and many others.

Can't say I ever really noticed the problems with it that so many others complained about. I suspect many of them had never actually tried it and were just jumping on the bandwagon of internet hate for the sake of it, TBH.
I agree with you. The main downfalls of multi-GPU come down to a few points that nobody mentions: the lack of support from game developers, and often even from software in the pro segment.

But the absolute worst is electricity pricing, which in my country especially is pumped up to extreme levels. The actual energy price is as low as in any other country, but the taxes and so-called transport costs are insane: the real price is probably 0.2 euro cents, but after taxes it comes to over 34 euro cents per kWh. So it's pretty costly. Before anyone says their place is even more expensive, that may well be true, but there is no need for prices this high here: most of our power comes from a large nuclear plant, half the country is covered in solar panels and wind turbines, and there's plenty of organic power production (mostly from farmers). There are even plans to use the North Sea to produce even more.

So power was the major concern with multi-GPU setups. People saying they are used less by companies are partially correct, because companies will lower costs where they can, especially with prices this insane (even though large companies pay a lot less).

Anyway, I did like multi-GPU setups, including the ones I installed at work: monster number crunchers with up to eight graphics cards per system. They do complex calculations and are useless for home users, yet there are plenty of them and they can't be replaced by a single card, nor will that ever happen, unless the new CPU technology is really as powerful as it looks on paper. But currently those are only toys for the global giants like IBM and the likes.

On the other hand, current GPUs already perform at insane levels for gamers, and there is no real need for newer and faster ones anymore. That's also the funny thing: people bought very expensive video cards full of new toys that are barely used at all. They pay a fortune for a card that shows nothing new in my games, and honestly not one new game has made me think, "ooh, I want that."

Ray tracing is so overhyped and does little to nothing for me, so whatever people say about it, I don't care; for my games it's all useless. More than two years later hardly anything supports or uses it, and worse, it's hardly noticeable. People care so much about shadows... I turn them off or set them to the lowest setting I can. I really don't like super dark games at all, so Doom 3 went to the bin pretty quickly.

When I visited a friend, he was bragging about his 2,100-euro water-cooled 2080 Ti, which he is now crying about after the 3080 came out. When I asked him to show me why he was so excited, all he could show me were some minor reflections in a small puddle of water in a game, and I thought: is that what you made such a huge fuss about and paid so much money for? I won't say what else I thought, but it wasn't nice for him ;)

Now people are hyped about the new cards again and, even as I write this, have already placed orders for the newest hyped 30x0 series. Nvidia will get its full profit margins again this year, because I'm pretty sure newer models will follow soon; I bet the next lineup is already in the pipeline to get more money out of your pockets. And for what? There are no games at all that really make use of this new stuff, and for me it does nothing.

So I keep playing my old favourites and let it all pass. There is no need for me to buy a new PC, nor will I buy anything that's hyped but doesn't add anything useful or show real improvements; so far I've seen none. Maybe in five to ten years we can buy games that actually make use of it; time will tell. But by then Nvidia will probably have cooked up another five to ten new toys so you open your wallet again. Some new developments really are fantastic, but apart from a few minor examples, none of this new tech has made it into the gaming world for real, and I'm pretty sure it will take a long time before we actually see games that have it.

For example, I see a lot of people talking about games and posting YouTube videos of themselves playing in 4K, but on my 1440p monitor I see exactly the same thing they do; there is absolutely no difference, and it doesn't look better. If I buy a 4K monitor and play a game, I want to see some improvement, not the same graphics I get on a 1080p monitor. I was bored, saw the PC release of Horizon Zero Dawn, and bought the darn game; the sad part is that it looks a lot better on the PS4. Hell, even the whole story was cut on the PC, and the graphics are really only 1080p material: yes, you can run it at a higher resolution, but you still get 1080p assets, nothing more. I was really hoping for better graphics, but in most games there is no difference at all when you turn up the graphics settings. It looks the same at whatever setting I choose, with no clear visual improvement. Grass still looks like a green blur; yes, it moves, but it's still just a green blur.

So my friend was stunned to see that my screen showed exactly the same thing as his hi-res PC. There is no difference at all, so we tried many more games and found only a few that actually show a major improvement in graphics. That made me decide I don't have to change anything at all for years to come; I won't miss much with my current hardware. And when I really do need more graphics power, I'll just scoop up a cheap 5700 XT from the second-hand market and run two of them :D
 
I still think MCM GPUs are the future; as silicon gets pushed to the edge of its capability, the use of multiple smaller chips in a rendering cluster, as opposed to a single monumental chip, will make more sense.

Intel is currently working very hard on a properly designed and implemented multi-GPU setup for their Xe GPU family, and they are also working on chiplet-style Xe GPUs. I sincerely hope that Intel will pioneer a glorious comeback for multi-GPU setups (SLI/Crossfire) and dual-GPU graphics cards (GTX 295, GTX 590, GTX 690, HD 5970, HD 6990, HD 7990, R9 295X2). When that happens, AMD and NVIDIA will surely follow shortly after.
 
I think it's a tragedy that mGPU is disappearing because back in the day, you could get 2 $200 GPUs that annihilated a $600 GPU, or get dual top tier cards that would create a whole new class of performance. One majorly overlooked feature is that the existence of mGPU setups led to a sanity check in price/performance: who would spend $1200 on a GPU when dual $200 GPUs could tie or beat it? Part of the reason you have insane GPU pricing these days is because you have no other option: you have to pay a TON more for pretty pathetic performance increases in comparison to simply adding another GPU.
There is ALWAYS some game or application that will benefit from more graphics power, especially with ray tracing starting to come around. Eventually we'll see AMD/Nvidia solve the MCM problem and that's how it'll come about, but we will likely never be able to add in affordable cards to compete with flagships anymore. That's a big shame, as pricing/performance is going to continue to spiral out of control.
Then let's put our hopes on Intel's Xe GPU venture, since multi-GPU support will be an integral part of their GPU roadmap. They are even extensively utilizing chiplet-style designs for their GPUs. If Intel's GPU ambitions succeed, we can hope for significantly lower GPU prices.
 
The Nvidia Titan Z has a lot more than twice the 64-bit floating-point performance of the GTX 780 Ti, so in one way it is better than two of the latter.
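
Rough numbers behind that claim (back-of-the-envelope only, assuming the commonly cited base clocks and GK110's FP64 rates of 1/3 of FP32 on the Titan cards versus a 1/24 cap on the GeForce 780 Ti; treat the exact figures as approximate):

```cpp
// Back-of-the-envelope FP64 comparison -- approximate, assumed specs.
#include <cstdio>

int main() {
    // GFLOPS = shader cores * 2 ops per clock (FMA) * clock in GHz * FP64 rate
    const double titan_z  = 5760 * 2 * 0.705 / 3.0;   // dual GK110 @ ~705 MHz base, 1/3 rate
    const double gtx780ti = 2880 * 2 * 0.875 / 24.0;  // single GK110 @ ~875 MHz base, 1/24 rate
    std::printf("Titan Z    FP64: ~%.2f TFLOPS\n", titan_z / 1000.0);
    std::printf("GTX 780 Ti FP64: ~%.2f TFLOPS\n", gtx780ti / 1000.0);
    std::printf("ratio: ~%.0fx\n", titan_z / gtx780ti);  // roughly an order of magnitude
    return 0;
}
```

So in double precision the Titan Z was playing in a different league entirely, not just "two 780 Tis on one board".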
 
The Nvidia Titan Z has a lot more than twice the 64-bit floating-point performance of the GTX 780 Ti, so in one way it is better than two of the latter.
But still a fair bit less than two Titan Black cards, which together cost $1,998, a full $1k less than a Titan Z. The only thing going for it was that it's a single card; everything else was nuts.
 
Not even an honourable mention for the GTX 590, tut tut. That beast is the real reason I had to buy a 1200 W PSU, lol. Paired with my i7 980X and sitting in a Rampage III mobo, it was a monster in its day. It would heat my whole room up after 30 minutes of gaming. If I could have fed it aviation gas, I think it would have been happier!
 