Nvidia is reportedly making 600W reference boards for the RTX 4090

mongeese

Posts: 590   +119
Staff member
Rumor mill: According to industry sources, Nvidia has shared its reference boards for the flagship RTX 4000-series GPUs, which might be called the RTX 4080, 4080 Ti, 4090, or 4090 Ti, with its OEM partners. It hasn't delivered the GPUs themselves yet, but the boards are designed to handle 600W of power.

Several leakers with good track records have recently said that they expect the RTX 4080 or 4090 to have a 600W TBP (total board power). A higher-end model, either the RTX 4090 Ti or some special edition card, could consume over 800W. But they've also attached several disclaimers to those numbers, noting that it's too early for exact details to have been finalized.

Igor's Lab reports that Nvidia has told OEMs to test the reference boards, which help them design their PCBs and coolers, at the full 600W. Not every 4080 or 4090 will necessarily consume 600W out of the box, but at least some will turbo or overclock into that power bracket.

Igor's Lab received a picture of a reference board from a Chinese OEM and confirmed its authenticity with two other sources (above). One standout feature was the twelve slots for memory modules, which indicate that the GPU will have either 12 GB or 24 GB of GDDR6X memory, most likely the latter.

It also had an impressive 24 voltage converters, up from 20 on the Founders Edition RTX 3090 PCB (shared between the GPU and memory). It used uPI Semi's UP9512 controllers, each capable of driving eight phases, so three controllers cover the 24 phases.
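As a quick sanity check on the controller count (the 24 converters and the eight-phase UP9512 figures are from the report; the arithmetic itself is just illustrative):

```python
# Figures from the report: 24 voltage converters on the board,
# and each UP9512 controller can drive up to 8 phases.
total_phases = 24
phases_per_controller = 8

# Ceiling division gives the minimum number of controllers needed.
controllers = -(-total_phases // phases_per_controller)
print(controllers)  # 3 controllers drive all 24 phases
```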

All those converters and modules are fed by a single 12VHPWR connector, also known as the PCIe 5.0 power plug, which can deliver 600W on its own. Igor's Lab notes that OEMs are designing PCBs that use the full 600W, but, as mentioned, that's a ceiling, not an average. Each 12VHPWR connector will come with an adapter with four 6+2 pin Molex plugs, even though most power supplies don't come with that many cables.
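The four-plug adapter also lines up with the connector's rating, assuming each 6+2 pin plug is held to the usual 150W PCIe limit (our assumption, not something the report states):

```python
# Assumption: each 6+2 pin (8-pin) PCIe plug supplies at most 150 W.
plugs = 4
watts_per_plug = 150

total = plugs * watts_per_plug
print(total)  # 600 W, matching the 12VHPWR connector's rating
```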

The insiders expect Nvidia to reuse the 3090's three-slot air cooler for the 4080 and 4090. Meanwhile, manufacturers are reportedly preparing 3.5-slot air coolers and considering water-cooled solutions similar to what AMD used for the RX 6900 XT LC.

In a potential first, the 3090 Ti and 4080 are allegedly pin-compatible, meaning they can use the same PCB designs. Igor's Lab speculates that OEMs might be using the 3090 Ti to test their 4080 coolers and PCBs, so the two cards might end up sharing parts.

At the very least, the 3090 Ti is the trial run for the 12VHPWR connector. Rumors indicate that the 3090 Ti will consume 400-500W depending on the model, a range that looks crazy compared to the RTX 3000-series but tame compared to the rumored figures for the RTX 4080 and 4090.

Image Credit: Thomas Foster


 

yRaz

Posts: 4,400   +5,124
I have a decently large room for an office with my computer in it. A 1070 Ti and an 1800X heat the thing up noticeably during longer gaming sessions, and that setup isn't even close to the power draw of a system like this one.

The performance is impressive, no doubt, but this is just absurd. I remember when the GTX 280 came out and people thought it was a power hog. It's now getting to the point where a single graphics card can pull a quarter of what's available on a 20A breaker in a household.
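The breaker arithmetic roughly checks out, assuming a typical North American 120V, 20A circuit:

```python
# Assumed household circuit: 120 V at 20 A (North American 20A breaker).
volts = 120
amps = 20
circuit_watts = volts * amps  # 2400 W available on the circuit

gpu_watts = 600
print(gpu_watts / circuit_watts)  # 0.25 -> a quarter of the circuit
```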

I get that this is an absolute flagship where cost is no object, but flagship traits often trickle down in later generations. A 3060 is a "midrange" card with an official power draw of 175 watts, but I'm sure special editions have no problem exceeding the 200W barrier. Will we start to see 5060 and 6060 series cards push 300 watts? Nvidia designed a new connector, and it certainly wasn't because they expected to use less power.

 

Achaios

Posts: 378   +1,046
So, who's the target group for a GPU with 600 or 800W TDP?

Would you buy a GPU with 800W TDP to game on? No, you wouldn't.

So who's gonna buy those? That's right, Miners will.

We are looking at the Ada Lovelace Miner's Knock Yourself Out edition.
 

yRaz

Posts: 4,400   +5,124
So, who's the target group for a GPU with 600 or 800W TDP?

Would you buy a GPU with 800W TDP to game on? No, you wouldn't.

So who's gonna buy those? That's right, Miners will.

We are looking at the Ada Lovelace Miner's Knock Yourself Out edition.
Miners often do many things to reduce power consumption. I hate the miners for their contribution to the GPU shortage as much as anyone else, but we can't keep blindly blaming them for everything.

There is a very real chip shortage, and surprisingly, a significant part of it is caused by Texas Instruments. You can make all the GPUs and memory chips in the world, but it doesn't do you any good if you don't have the secondary components to make them work together.
 

psycros

Posts: 4,142   +5,770
Miners often do many things to reduce power consumption. I hate the miners for their contribution to the GPU shortage as much as anyone else, but we can't keep blindly blaming them for everything.

There is a very real chip shortage, and surprisingly, a significant part of it is caused by Texas Instruments. You can make all the GPUs and memory chips in the world, but it doesn't do you any good if you don't have the secondary components to make them work together.

It's possible that the high consumption of this new GPU family may end up discouraging miners from buying many. If there's even a modest dip in crypto prices for more than a couple of weeks, these power vampires will be *costing* them money, and that's not something the casual miner is willing to risk. Then again, it might also get more wealthy gamers to dive into mining; quite a few are now mining when they're not using their PCs for gaming.
 

Neatfeatguy

Posts: 832   +1,445
Miners often do many things to reduce power consumption. I hate the miners for their contribution to the GPU shortage as much as anyone else, but we can't keep blindly blaming them for everything.

There is a very real chip shortage, and surprisingly, a significant part of it is caused by Texas Instruments. You can make all the GPUs and memory chips in the world, but it doesn't do you any good if you don't have the secondary components to make them work together.

There isn't a shortage. There's a massive inventory management problem that was brought to light by governments shutting everything down for just three months.

No one had inventory on hand to fall back on. They all relied on the inflow of material coming to them, with nothing on the shelves for backup. Governments put everything on halt, and that's when it became abundantly clear that neither these companies nor governments knew how to manage their inventory.

A lot of things were put on halt, but not everything, and production was still going out the door for some services, while the materials to replenish their products weren't coming in. These companies didn't have anything to fall back on, so backorders built up. Once everything started moving again, companies were in a panic to get material to fill their backorders... We've been trying to fill backorders ever since, with very little catch-up actually happening.

There isn't a shortage. There's just a bunch of companies looking to cook the books and keep inventory as slim as possible to make the best margins. The company I work for has a lot of customers like this: they don't keep any inventory on the shelves to handle workloads, they expect their distributor to keep their machines and people working from inbound inventory. Look what happens when something on my company's end causes a holdup and a shipment isn't made that week... those customers actually shut down their production lines and send workers home for the week. They don't want to keep any overhead on hand because it cuts into the bottom line, but on the flip side, without any overhead on hand they can't always keep their business running. A lot of companies are like this, and this was the main issue when things were locked down.

Now that all these places claim there is a "shortage," they've been able to recoup lost sales by jacking up prices, and this is where we've been stuck and probably will never get out of... you know, not without another Great Depression-type event.
 

PEnnn

Posts: 770   +874
Just get a 1000W PSU and call it a day.

You are missing the point by a New York mile.

The issue is not whether we should buy a 1000W PSU due to Nvidia's pathetic inability to produce something that's power efficient, doesn't cost a kidney to buy, and doesn't require a humongous PSU.

Unlike you, many of us peasants don't have unlimited funds, and don't have a mansion with a nuclear reactor for a power supply and no electric bills to worry about.

We do have to pay electric bills each month, and we don't care for a GPU whose power draw is a quarter of our total house consumption!
 

hahahanoobs

Posts: 4,430   +2,409
You are missing the point by a New York mile.

The issue is not whether we should buy a 1000W PSU due to Nvidia's pathetic inability to produce something that's power efficient, doesn't cost a kidney to buy, and doesn't require a humongous PSU.

Unlike you, many of us peasants don't have unlimited funds, and don't have a mansion with a nuclear reactor for a power supply and no electric bills to worry about.

We do have to pay electric bills each month, and we don't care for a GPU whose power draw is a quarter of our total house consumption!
These would be enthusiast cards where power efficiency isn't a top priority. Performance is.
This is not news.
 

emmzo

Posts: 609   +766
Well, I don't get the outrage. It's not like everybody's gonna buy 4090s. And not too long ago people were using SLI, remember that? You needed a 1000W PSU to do the job. So, the high end is always going to cost you and drain a lot of power; nothing new. Now, I'd be concerned if a 4070 needs something like 500W or more. That would really be sh!tty.
 

Neatfeatguy

Posts: 832   +1,445
Well, I don't get the outrage. It's not like everybody's gonna buy 4090s. And not too long ago people were using SLI, remember that? You needed a 1000W PSU to do the job. So, the high end is always going to cost you and drain a lot of power; nothing new. Now, I'd be concerned if a 4070 needs something like 500W or more. That would really be sh!tty.

You didn't need a 1000W PSU for SLI.

I ran 980 Ti SLI on an 850W PSU.
Ran GTX 280 SLI on a 750W PSU.
Ran GTX 570 SLI on a 750W PSU.

Even when I was pushing 1.5125V through my Phenom II X4 940 to get the best OC I could, and running the 570s in SLI with the voltage up all the way and the OC as high as they'd go using MSI Afterburner, my total system power draw while gaming was around 620W.
 

yRaz

Posts: 4,400   +5,124
There isn't a shortage. There's a massive inventory management problem that was brought to light by governments shutting everything down for just three months.

No one had inventory on hand to fall back on. They all relied on the inflow of material coming to them, with nothing on the shelves for backup. Governments put everything on halt, and that's when it became abundantly clear that neither these companies nor governments knew how to manage their inventory.

A lot of things were put on halt, but not everything, and production was still going out the door for some services, while the materials to replenish their products weren't coming in. These companies didn't have anything to fall back on, so backorders built up. Once everything started moving again, companies were in a panic to get material to fill their backorders... We've been trying to fill backorders ever since, with very little catch-up actually happening.

There isn't a shortage. There's just a bunch of companies looking to cook the books and keep inventory as slim as possible to make the best margins. The company I work for has a lot of customers like this: they don't keep any inventory on the shelves to handle workloads, they expect their distributor to keep their machines and people working from inbound inventory. Look what happens when something on my company's end causes a holdup and a shipment isn't made that week... those customers actually shut down their production lines and send workers home for the week. They don't want to keep any overhead on hand because it cuts into the bottom line, but on the flip side, without any overhead on hand they can't always keep their business running. A lot of companies are like this, and this was the main issue when things were locked down.

Now that all these places claim there is a "shortage," they've been able to recoup lost sales by jacking up prices, and this is where we've been stuck and probably will never get out of... you know, not without another Great Depression-type event.
You say there both is and is not a shortage, so which is it? Who is buying material if the production lines are shut down? Surely the inflated prices, with increased demand and limited supply, would make any investor foam at the mouth.

The only company I've dealt with that I actually haven't been able to get chips from has been Texas Instruments, and it seems like they're only running at about 20% capacity while reporting record profits. We know that TSMC, Intel, GlobalFoundries, and Samsung are all running their fabs as fast as they can, but TI makes some very unique products that can't just be filled by the other massive fabs. Everyone wants to talk about 7nm, 5nm, etc. TI is still one of the few fabs that produces cheap chips on 120nm+ processes. Look at the chips on an Arduino: simple microcontrollers that perform basic functions. You could have all the 3090 dies in the world and you couldn't do a thing with them if you don't have fan controllers, voltage regulators, and RGB controllers.

These are what I call the supporting chips, the ones that make the system as a whole operate. The chips in the window motors of cars? They use tiny controllers like this, and TI is the largest producer of them in the world. I've used some cheap Chinese alternatives, but you're lucky if half of them work, and if they do, they're far out of spec.
 

Geralt

Posts: 1,125   +1,738
Time to move to Stadia and similar services. Buying and owning a GPU has become an absurd process. There's no way I'm having a 600W card inside my case. Just stupid and insane.
 

kiwigraeme

Posts: 1,028   +768
These would be enthusiast cards where power efficiency isn't a top priority. Performance is.
This is not news.

Glad you speak for all of us. I have an RTX 3080 and don't want a power hog.
Yes, I will accept your point for a 4090 and its Ti variant, but an RTX 4080 is meant to be a reasonable purchase; kids don't want parents screaming about power bills, etc. This is just an arms race for bragging rights. Surely with a better fab (5nm or whatever) and design, we should be seeing parity.
TBF, the 600W may be a very rare peak for a 4080, and it will probably be around 500W.
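For the power-bill worry, a rough monthly-cost sketch (the 500W figure is from the post; the hours per day and the $0.15/kWh rate are assumptions):

```python
# Assumptions: 3 hours of gaming per day, $0.15 per kWh.
gpu_watts = 500
hours_per_day = 3
rate_per_kwh = 0.15

kwh_per_month = gpu_watts / 1000 * hours_per_day * 30  # 45 kWh
monthly_cost = kwh_per_month * rate_per_kwh
print(round(monthly_cost, 2))  # 6.75 dollars per month for the GPU alone
```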
 

yRaz

Posts: 4,400   +5,124
Glad you speak for all of us. I have an RTX 3080 and don't want a power hog.
Yes, I will accept your point for a 4090 and its Ti variant, but an RTX 4080 is meant to be a reasonable purchase; kids don't want parents screaming about power bills, etc. This is just an arms race for bragging rights. Surely with a better fab (5nm or whatever) and design, we should be seeing parity.
TBF, the 600W may be a very rare peak for a 4080, and it will probably be around 500W.
Maybe the reference card, but let's see what the board partners do. With the increased power consumption and price, it doesn't really feel like we're getting a new generation of cards. Performance per watt is supposed to increase, and the old "dollars per FPS" metric should go down. We are getting neither of those. It seems like lazy development. We're getting absurdly power-hungry cards, and they aren't even going to be priced at the enthusiast level. If the rumors are true, we're going to have $1,500 4080s.

These prices are stupid and these power consumption numbers are stupid. Let's hope AMD has something reasonable. Oddly enough, it might be Intel that saves us. I see many people complaining that Intel is only making "mid-range cards." Compared to what, cards that even enthusiasts have a hard time affording? It doesn't matter who has the best card if you don't own it.

I like to think that I'm decently successful and can certainly afford one, but I look at the prices as outrageous and very poor value. The only way I'd buy a new card in this market is if my 1070 Ti dies and I have no other choice.
 

kiwigraeme

Posts: 1,028   +768
I just checked the idle wattage for an RTX 3080 and it seems to be around 30-40 watts; apparently it can be quite variable.
If this carries on, you will need a small GPU to handle low-demand use, like my web browsing now.

As for solar panels, I was thinking about them the other day. The costs are not that much, though probably a lot more for a battery system; maybe a small system, some solar panels and an inverter installed, costs the opening market price of an RTX 4090 (this is ACTUALLY true).
Well, when I eventually get an electric car it will make more sense; it depends on how much power you use during the day. My wife likes it warm, so I'm currently on a mission to improve insulation, glazing, etc. Plus repainting; not sure if I'll do the gables or get pros in for them.
Plus doomsday preppers and all that: being able to continue if the grid goes down. My gas-bottle water heating actually needs a small amount of power to work (which kind of defeats non-reliance on electricity).
Plus I think battery tech will continue to improve and get cheaper; some people already reuse old Nissan Leaf batteries for that lakeside cabin off the grid.

Even in my travels in developing countries in the 80s/90s, a lot of tribal huts ran off 12 or preferably 24 volt car batteries; generators are probably more common now. Ha, that reminds me of those RVs in national parks, or desert campsites with no power (USA), and how it annoyed other campers when they ran their generators in nature (especially at night: sounds of the whippoorwill and coyotes and all that). I didn't get it; most of them had huge battery packs and some solar panels along their roofs. Maybe they needed them for their stupidly big fridges and Nvidia GPUs.
 

AIC1Drew

Posts: 57   +47
Time to move to Stadia and similar services. Buying and owning a GPU has become an absurd process. There's no way I'm having a 600W card in my case. Just stupid and insane.
That's actually not too far-fetched... maybe this is by design, and what they're pushing for is driving people to Stadia and other flavors of cloud gaming.