Quad GeForce RTX 3090 tested on a single PC running workstation benchmarks

Ivan Franco

Posts: 251   +9
TechSpot Elite

Here at Puget Systems, our customers range from those needing relatively modest workstations, all the way up to those that need the most powerful system possible. These extreme workstations used to be dual or quad CPU systems, but recently, the shift has been to quad GPU setups. Having this much GPU power can provide incredible performance for a number of industries including rendering and scientific computing, and even has some use in video editing applications like DaVinci Resolve.

Editor’s Note:
Matt Bach is the head of Puget Labs, a division of Puget Systems that does research and benchmarking to test workstation-level software packages. Puget Systems is a specialized builder of gaming and workstation PCs. This article was originally published on the Puget blog. Republished with permission.

With Nvidia's latest GPUs, the GeForce RTX 3080 10GB and 3090 24GB, the amount of performance you can get from a single GPU has dramatically increased. Unfortunately, the amount of power these cards require (and consequently the amount of heat they generate) has also increased. In addition, almost all the GPU models available from Nvidia and third-party manufacturers are not designed for use in multi-GPU configurations, which in many cases limits you to only one or two cards.

However, Gigabyte has recently launched a blower-style RTX 3090 that should give us our best chance of using three or four RTX 3090s in a workstation: the GeForce RTX 3090 TURBO 24G.

This type of blower-style cooling system is much better for multi-GPU configurations as it exhausts the majority of the heat directly out the back of the chassis. And when we are dealing with four 350 watt video cards, that is 1,400 watts of heat that we certainly want out of the system as quickly as possible.

While the cooler design may be able to help with the heat output, we also have the problem of total power draw. We should be able to power 1,400 watts with a single 1,600 watt power supply, but that doesn't leave much room for any voltage spikes, not to mention having enough to power the CPU, motherboard, RAM, storage, and other devices inside the computer.

This begs the question: is having four RTX 3090s inside a desktop workstation actually feasible? Or is the heat and power draw too much to handle?

Test Setup

To see if quad RTX 3090 is something we even want to consider offering, we wanted to put a number of configurations to the test to look at performance, temperatures, and power draw. The main system we will be using has the following specs:

  Test Platform
CPU Intel Xeon W-2255 10 Core
CPU Cooler Noctua NH-U12DX i4
Motherboard Asus WS C422 SAGE/10G
RAM 8x DDR4-3200 16GB Reg ECC (128GB total)
Video Card 1-4x Gigabyte RTX 3090 TURBO 24G
Hard Drive Samsung 970 Pro 512GB
PSU 1-2x EVGA SuperNOVA 1600W P2
Software Windows 10 Pro 64-bit (Ver. 2004)

While our testing with up to three RTX 3090s is fairly straightforward, we have some serious concerns about power draw when we get up to four GPUs. We are going to attempt using just a single 1600W power supply with stock settings, but also try setting the power limits for each card to 300W (which should bring us comfortably below the 1600W max), as well as using a dual PSU setup to ensure none of the cards are starved for power.
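The article doesn't show the exact commands Puget used to cap each card at 300W, but the usual way to do this on Nvidia GPUs is `nvidia-smi` with the `-pl` (power limit) flag. The sketch below only prints the commands, since actually applying a power limit requires root and the cards installed; drop the `echo` and run with `sudo` to apply it for real.

```shell
# Print the power-limit command for each of four GPUs (indices 0-3).
# "nvidia-smi -i <index> -pl <watts>" sets a per-card power cap;
# 4 x 300 W keeps the GPUs comfortably under a 1600 W PSU's budget.
for i in 0 1 2 3; do
  echo "nvidia-smi -i $i -pl 300"
done
```

Note the limit is not persistent across reboots unless persistence mode is enabled, so a production workstation would typically reapply it from a startup script.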

Power draw is a bit of a concern with quad RTX 3090

For our testing, we will look at performance, power draw, and GPU temperature with OctaneBench, RedShift, V-Ray Next, and the GPU Effects portion of our PugetBench for DaVinci Resolve benchmark.

Benchmarks: OctaneBench

OctaneBench is often one of our go-to benchmarks for GPUs because it runs extremely well with multiple video cards. As you can see in the charts above, the scaling from one to four RTX 3090 cards is nearly perfect, with four cards scoring almost exactly four times higher than a single RTX 3090.

In fact, the biggest surprise here was that limiting each of the GPUs to 300W only dropped performance by a little more than 1%. While we are going to talk about power draw in more detail later, we will tease that limiting the GPUs reduced the overall system power by ~16%, which is a great return for such a tiny drop in performance.

Benchmarks: V-Ray Next

V-Ray Next is quickly becoming another staple for our GPU testing because it not only scales just as well as OctaneRender, but it actually causes slightly higher overall system power draw which makes it a great benchmark for stressing GPUs.

Scaling up to four RTX 3090 cards is perfect, and limiting the GPU power reduced the benchmark result by less than 1%. We also aren't seeing any increase in performance with dual power supplies, which means that so far, a single 1600W power supply appears to be doing OK.

Benchmarks: RedShift

RedShift is interesting because it does not scale as well as OctaneRender or V-Ray, but its recent acquisition by Maxon (makers of Cinema4D) means that we are likely to see more people using it in the near future.

One thing to note is that this benchmark returns the results in seconds, so a lower result is better (the opposite of our other tests).

In RedShift, we didn't see quite as good of scaling, but four RTX 3090 cards is still 3.6 times faster than a single card. Once again, power limiting the cards and using dual power supplies didn't affect the performance to a significant degree.

Benchmarks: DaVinci Resolve

To round out our testing, we wanted to look at something that wasn't rendering. We actually were going to include a few other tests such as NeatBench and a CUDA NBody simulation, but either the scaling wasn't very good with multiple GPUs, or we had issues running it due to how new the RTX 3090 is.

Our DaVinci Resolve benchmark, however, has support for these cards and the "GPU Effects" portion of the benchmark scales fairly well up to three GPUs.

We didn't see a significant increase in performance with four RTX 3090 cards, which may in part be due to our choice of CPU. This test is going to load the processor more than any of the others, and while that shouldn't explain the performance wall entirely, it may be a contributing factor.

Power Draw, Thermals, and Noise

Performance is great to look at, but one of the main reasons we wanted to do this testing was to discover if putting four RTX 3090 cards into a desktop workstation was even feasible. The higher power draw and heat output means that only cards like the Gigabyte RTX 3090 TURBO 24G with a blower-style cooler will even have a chance, but these cards are still rated for 350W each.

Power draw was one of our biggest concerns, so we decided to start there:

We measured the power draw from the wall during each benchmark, which showed us some very interesting results. First, the benchmark that pulled the most power was actually our DaVinci Resolve GPU Effects test. This is likely because it not only uses the GPUs, but puts a decent load on the CPU as well. That makes this test a more accurate representation of the kind of maximum load you might put on a system in an "everyday" situation.

Overall, what we found was that while quad RTX 3090s were able to run on a single 1600W power supply, it is cutting it extremely close. Remember that this is power draw from the wall; going by the rated 92% efficiency of the EVGA unit we are using, our peak draw of 1,717 watts translates to roughly 1,580 watts of power internally. That leaves a whole 20 watts to spare!
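The headroom math above is easy to check. Using the article's measured wall draw and the PSU's rated efficiency:

```shell
# Wall draw (W) through a ~92% efficient PSU gives the internal DC load.
# 1717 W measured at the wall * 0.92 ~= 1580 W delivered to the components,
# leaving about 20 W of margin on a 1600 W unit.
awk 'BEGIN { printf "internal load: %.0f W\n", 1717 * 0.92 }'
```

Real PSU efficiency varies with load and line voltage, so 92% is the rated figure, not a guarantee at this extreme draw.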

We didn't have the system shut down on us during our testing, but this is way too close for long-term use. Not to mention that if we used a more power-hungry CPU, or even just added a few more storage drives, we likely would have been pushed over the edge. So, while we technically succeeded with four RTX 3090s on a single 1600W power supply, that is definitely not something we would recommend.

There are larger power supplies you can get that go all the way up to 2400W, but at that point you are going to want to bring in an electrician to make sure your outlets can handle the power draw. The 1,717 watts we saw translates to 14.3 amps since we are using 120V circuits, and most household and office outlets are going to be wired to a 15 amp breaker. That leaves almost no room for your monitors, speakers, and other peripherals. If you do go the route of a 2400W PSU, you are going to need to ensure that you are using a 20 amp circuit.
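The amperage figure follows directly from current = power / voltage. A quick check against a standard 15 A North American circuit:

```shell
# Current drawn on a 120 V circuit at the measured 1,717 W wall draw,
# and the margin left on a typical 15 A household breaker.
awk 'BEGIN {
  amps = 1717 / 120
  printf "draw: %.1f A, headroom on 15 A: %.1f A\n", amps, 15 - amps
}'
```

On a 230/240 V circuit (common outside North America), the same load pulls only about 7.2-7.5 A, which is why high-wattage PSUs are far less of a wiring problem there.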

But power draw aside, how did the Gigabyte cards handle the heat they generate? 1,700 watts is more heat than most electric space heaters put out, which can be very hard to handle within a computer chassis.

In the two charts above, we are looking at the peak GPU temperature for each card, as well as the peak GPU fan speed for each configuration in OctaneBench. We recorded results for the other tests as well, but OctaneBench was the only test that ran long enough to truly get the system up to temperature.

Surprisingly, the temperatures were not bad at all. Even with all four GPUs running at full speed, the temperatures ranged from 73C on the bottom GPU to 80C on the top card. However, something to keep in mind is that the temperature of the GPU is only half the picture. GPU coolers - and CPU coolers for that matter - are tuned to increase the fan speed gradually, which means that the temperatures not only have to be acceptable, but there must be adequate fan speed headroom to account for higher ambient temperatures, longer load times, and the heat generated by other components that might be installed in the future.

In this case, quad RTX 3090s peaked at 88% of the maximum fan speed. To us, that is starting to cut it close, but technically it should be enough headroom - especially if you beefed up the chassis cooling with additional front or side fans.

The last thing we wanted to consider was noise. The temperature of these cards was actually fairly decent, but if the system sounds like a jet engine, that may not be acceptable for many users. Noise is very difficult to get across, but we decided the best way would be to record a short video of each configuration so you can at least hear the relative difference.

All things considered, the noise level of four RTX 3090 cards is not too bad. It certainly isn't what anyone would call quiet, but for the amount of compute power these cards are able to provide, most end users would likely deem it acceptable.

Quad RTX 3090: Feasible or Fantasy?

Interestingly enough, we actually had very few issues getting four RTX 3090 cards to work in a desktop workstation. Using a blower-style card like the RTX 3090 TURBO 24G from Gigabyte is certainly a must, but even under load the GPU temperatures stayed below 80C without going above 90% of the maximum GPU fan speed.

The only true problem we ran into was power draw. We measured a maximum power draw of 1717 watts from the wall, which not only exceeds what we would be comfortable with from a 1600W power supply, but also means that you should run your system from a 20 amp breaker if possible. Most house and office outlets will be on 15 amp circuits in the US, which may mean hiring an electrician to do some electrical work if you decide to use one of the very few 2400W power supplies that are available.

So, will we be offering quad RTX 3090 workstations? Outside of some very custom configurations, we likely are not going to offer this kind of setup to the general public due to the power draw concerns. On the other hand, triple RTX 3090 is something we are likely to pursue, although that has not quite passed our full qualification process yet. Even three RTX 3090 cards will give a very healthy performance boost over a quad RTX 2080 Ti setup, which is great news for users who need faster render times, or those working in AI/ML development.


 

Squid Surprise

Posts: 3,559   +2,451
Obviously this isn’t for gaming - but it would have been cool to see if it could run Crysis :)

I wonder if they considered custom water cooling... that would take care of the noise and heat - since we’re already talking north of 6K for the video cards, why not add a few hundred to cool them properly?
 

Endymio

Posts: 1,029   +875
I wonder if they considered custom water cooling... that would take care of the noise and heat - since we’re already talking north of 6K for the video cards, why not add a few hundred to cool them properly?
According to the article, the temperatures aren't bad. Also, it would cost quite a bit more than you think to water cool them: four AIO 360mm coolers, plus a custom case to hold the radiators, plus potentially an even larger PSU, since they're pushing this one to the limit already. I don't see that happening for less than an additional $1200-$1500 at least.
 

Squid Surprise

Posts: 3,559   +2,451
According to the article, the temperatures aren't bad. Also, it would cost quite a bit more than you think to water cool them: four AIO 360mm coolers, plus a custom case to hold the radiators, plus potentially an even larger PSU, since they're pushing this one to the limit already. I don't see that happening for less than an additional $1200-$1500 at least.
Agreed - but once you’re charging 10k for a system, what’s an extra 2k? The only people buying a system with 4 3090 cards would have very deep pockets already...

It's not the temps - although it's always nice to have lower ones (especially if you're thinking of OCing) - but the noise... water would alleviate a great deal of that.

I’d actually prefer to see custom water blocks for each card (like EKWB’s stuff for the old Titan cards) and a giant pump/reservoir which would cost even more - but again, money really isn’t an object when purchasing this kind of system.

Clearly power draw is an issue - but that was already a concern. I think it would have to have dual 1600 PSUs to even be considered.... and naturally, a disclaimer warning that your electrical needs to be on at least a 20 Amp breaker....

I'm eagerly awaiting an update to PC Building Simulator as that will be the only way I get to "have" one of these :)
 

neeyik

Posts: 1,453   +1,611
Staff member
I wonder if they considered custom water cooling... that would take care of the noise and heat - since we’re already talking north of 6K for the video cards, why not add a few hundred to cool them properly?
Puget don't seem to offer any custom water cooling systems - the nearest is their Deluge models, where the CPU is water cooled. Generally speaking, Puget's design choice is 'simple and functional', where heat, noise, etc are given less attention than raw capability.
 

Squid Surprise

Posts: 3,559   +2,451
Puget don't seem to offer any custom water cooling systems - the nearest is their Deluge models, where the CPU is water cooled. Generally speaking, Puget's design choice is 'simple and functional', where heat, noise, etc are given less attention than raw capability.
Yeah... well, if they want to offer a quad-3090 system, I suggest they get on that :)
 
Why....? Why am I even having to say this on this site?!?

Grab a damn Add2PSU ADAPTER and put another power supply in the mix...

I've used them for years in my personal builds and they work beautifully by daisy-chaining PSUs together without any effort!

Three 1500 watt PSUs knocking down that power requirement barrier...
 

Endymio

Posts: 1,029   +875
Agreed - but once you’re charging 10k for a system, what’s an extra 2k? ...
It's not the temps [but] the noise... water would alleviate a great deal of that.

I think it would have to have dual 1600 PSUs to even be considered.... and naturally, a disclaimer warning that your electrical needs to be on at least a 20 Amp breaker....
Certainly it would reduce what even Puget admits is a very high noise level. But you can't run two 1600W PSUs off a single 20A circuit. A 30A breaker would solve that, but much residential wiring and outlets aren't rated that high. So not only would you have the additional cost of the cooling and second PSU, but the system would probably need custom wiring at the customer's location, or a rather messy extension-cord solution.
 

Squid Surprise

Posts: 3,559   +2,451
Certainly it would reduce what even Puget admits is a very high noise level. But you can't run two 1600W PSUs off a single 20A circuit. A 30A breaker would solve that, but much residential wiring and outlets aren't rated that high. So not only would you have the additional cost of the cooling and second PSU, but the system would probably need custom wiring at the customer's location, or a rather messy extension-cord solution.
True... but only if the dual PSUs run at max - which they wouldn't be doing...but it could run a 2400W PSU... which should be sufficient for this as well.

Again, we're talking EXTREME niche product here - price is generally not really a concern...
 
It would be nice for gaming benchmarks.

Even if sli is gimped how could you NOT at least try benchmarking 4k games where sli still kinda works.. I could only imagine a modded skyrim...... Or 8k

Sad that Nvidia / developers are not making /optimizing sli for it to be feasible anymore.

Quad titans were so much fun.
 

amghwk

Posts: 863   +686
Shame that this mega-expensive, power-hungry, furnace-hot setup can't push anything above what a single card can already do in games.

But yeah, how many of the people actually use such setup in their "work"?
 

Thanthan

Posts: 53   +100
240V strikes again. Maybe some day you guys should just switch xD. Might even save money in the long run, even if it'll be stupid expensive for now.
 

Endymio

Posts: 1,029   +875
240V strikes again. Maybe some day you guys should just switch xD. Might even save money in the long run
We were planning it, actually: convert every generation plant and distribution station in the country, along with rewiring every house and office, just so people could run their 3000 watt personal computers off a single outlet. Congress had the law ready to go -- but then Covid struck and the plan was cancelled.
 

Mister_K

Posts: 1,979   +632
Also heard from the previous gen that 2-3 GPUs is the sweet spot for Resolve. If only I had the money. That said, I only work with up to 6K footage, mostly 4K.
 

LuckyMenace

Posts: 28   +14
Agreed - but once you’re charging 10k for a system, what’s an extra 2k? The only people buying a system with 4 3090 cards would have very deep pockets already...
Couple of issues with water cooling.

1. This is a workstation, which means it needs to be super reliable. We are talking about 99.99% reliability here, which rules out water cooling with its many failure points, regular maintenance, and catastrophic damage.
2. Cost is a factor for company budgets. Increasing the budget by 20% is a huge number, which means they’d have to go with a weaker system.
3. Water cooling will further increase power draw, which is already a concern.
 

neeyik

Posts: 1,453   +1,611
Staff member
I don’t think any game would benefit from more than 2 GPUs
Depends on the game and setup. For example, one can see in these quad SLI GTX 980 benchmarks that in some older titles there is a clear benefit with using more than two, but other titles far less so:


Newer games are far more twitchy about multi GPU, thanks to the amount of frame-to-frame data that gets used.

Isn't that Xeon a bottleneck?
No, because for such a setup, the intended workloads will primarily be GPU-limited, rather than being CPU or system limited. Besides, it's still a 10 core, 20 thread, 4 memory channel, 3.7 to 4.5 GHz CPU - it's hardly slow.
 
Shame that this mega-expensive, power-hungry, furnace-hot setup can't push anything above what a single card can already do in games.

But yeah, how many of the people actually use such setup in their "work"?
I do animation and visual effects for work and my quad GPU setup is regularly running at full bore, especially for 3D rendering with Redshift and Octane. More GPU power = raising the bar for what's possible when the job schedule is limited. It doesn't take many jobs before the hardware is paid for.
 
