Nvidia is reportedly making 600W reference boards for the RTX 4090

Actually it's Nvidia's ego at play. Nvidia simply can't have the inferior GPU, hence the record-high, still-rumored 800 watt TDP (reference design 😳). Questions?
So there's one thing I'd like to bring up that rarely ever gets mentioned: these higher-end cards are great for people doing AI research on a budget. I'm not saying they're common, but I know a few people in the industry who dabble with AI at home, and their 2080s and 3090s get more use as compute cards than gaming cards. For what I'm going to call "home professionals", the 3090 can be a great option if you can't spend $7,000 on a Quadro card.
 
Well, I don't get the outrage. It's not like everybody's gonna buy 4090s. And not too long ago people were using SLI, remember that? You needed a 1000 W PSU to do the job. So, high end is always going to cost you and drain a lot of power; nothing new. Now, I'd be concerned if a 4070 needs something like 500 W or more. That would really be sh!tty.

The issue is that this will come down the line to the cards people can actually afford too: the TDP of the 3070 is 220 watts, up almost 50% from the 1070's 150 watts in just two generations, and all of these cards are far more power hungry across the line.

This is also extremely unnecessary: we know that if you constrained the 3070 to just 150 watts, it would still perform about as well as a 2070. That's not a *huge* jump, sure, but it is very significant. Basically, the 2070 and the rest of the 2000 series just didn't need to exist at all: these were products that didn't answer market forces or customer demand; this was Nvidia inducing demand by pumping a bunch of extra power into what was essentially just the 1070 and the rest of the 1000 series, killing most of the efficiency in the process.

What is any of this accomplishing? Basically nothing: Nvidia could have kept selling the 1050, 1060 and 1070 for that entire generation, and no new game in all of that time would have left gamers wanting more, because without those cards no game would have introduced ray tracing or the terrible DLSS 1.0 implementation.

By the time that tech had matured, the 3070 in its power-limited form, as you've seen in laptops, would have offered a bit of an improvement while also enabling ray tracing on the tensor cores with DLSS 2.0, and that would have been a perfectly fine experience, just as it is on laptops today.

Instead we have this mentality of "just put 2 or 3 pounds of copper on the thing and keep overclocking it past any kind of efficiency to keep selling more cards". That's not really helping gaming; it's just inducing demand, making GPUs more expensive, and mostly making Eth miners happy right now and nobody else.
 
As a small home miner, I can assure you no miners are happy with high-wattage cards. It really hinders profits, and managing the heat drives electricity usage up even more. These things are more like anti-miner cards. As for the power usage of these cards, it's minuscule: my baseboard heater uses 2,000 W in my gaming room and cycles every 15 minutes due to poor insulation, so 600 W for a GPU isn't anything. Besides, if 400 W gives X performance and 600 W gives 1.5X performance, then that's just fine. Granted, if it delivers any less than 50% more performance, then flame away, because it's officially trash.
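A quick sketch of that break-even point in performance-per-watt terms (the numbers here are purely hypothetical, just illustrating the trade-off described above):

    # Hypothetical cards: a 400 W baseline vs a 600 W part claiming +50% performance.
    baseline = {"power_w": 400, "perf": 1.00}
    next_gen = {"power_w": 600, "perf": 1.50}  # assumed figure, not a measurement

    def perf_per_watt(card):
        return card["perf"] / card["power_w"]

    ratio = perf_per_watt(next_gen) / perf_per_watt(baseline)
    print(f"perf/W vs baseline: {ratio:.2f}x")  # 1.00x -- efficiency merely held flat

    # Anything under +50% raw performance at 600 W means perf/W actually regressed,
    # which is the "officially trash" threshold above.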
 
I just checked idle wattage for the RTX 3080 and it seems to be around 30-40 watts; apparently it can be quite variable.
If this carries on, you will need a small GPU to handle low-demand use, like my web browsing right now.

As for solar panels, I was thinking about them the other day. The costs are not that much, though probably a lot more for a battery system; a small setup, some panels and an inverter installed, might cost about the opening market price of an RTX 4090 (this is ACTUALLY true).
Well, when I eventually get an electric car it will make more sense; it depends on how much power you use during the day. My wife likes it warm, so I'm currently on a mission to improve insulation, glazing and so on. Plus repainting; I'm not sure if I'll do the gables myself or get pros in for them.
Plus there's the doomsday-prepper angle of being able to carry on if the grid goes down; my gas-bottle water heating actually needs a small amount of power to work (which kind of defeats non-reliance on electricity).
Plus I think battery tech will continue to improve and get cheaper; some people already reuse old Nissan Leaf batteries for that off-grid lakeside cabin.

Even in my travels in developing countries in the 80s/90s, a lot of tribal huts and the like ran off 12 or preferably 24 volt car batteries; generators are probably more common now. Ha, that reminds me of those RVs in US national parks or desert campsites with no power, and how running their generators in nature annoyed other campers (especially at night, with the whippoorwills and coyotes and all that). I didn't get it; most of them had huge battery packs and some solar panels along their roofs. Maybe they needed the generators for stupidly big fridges and Nvidia GPUs.
I'm guessing you have multiple monitors like me; the more displays, the more power. The fans on my GPU aren't even running and it's using 35 watts. 2070 here.
 
It looks like power efficiency was already dead after Pascal; Pascal was the peak of power efficiency. I remember how amazed we were that the 1080 came with a single 8-pin connector, and the 1080 Ti with 8+6-pin. Even my 1080 Ti Trio only needs 2x 8-pin.
 
I agree with @godrilla that it's just the top model that is very power hungry, because Nvidia always wants to be the GPU king, and that is actually a fun thing in the scene. Even so, power consumption is likely rising for every model, with the RTX 4060 closing in on the 200 W line.
If the performance-per-watt gain is even close to the jump from Turing to Ampere, I'm sure the RTX 4060 will satisfy those who own something like an RTX 2070 Super, RX 5700 XT, RTX 3060 or lesser. At this point I expect that model to be the most interesting, since I assume the RTX 4070 is not going to cost under 500 € and its consumption will be over 250 W, though who knows. What is quite certain is that the RTX 4080 (Ti) will be much less power hungry than the likely top model, the RTX 4090 (Ti).

The RTX 3060 was a bit weaker than many likely wanted; it should have been closer to the Ti model. But when you think about it, the whole 3000 series is a little strange, with all the VRAM capacities and Ti models. Most of the models were tuned on the fly in response to the mining boom.
 
This doesn't bother me at all. At least this makes your massive tower PC justifiable. It always hurts me a little when I see a massive tower PC with low-power components in it, making it feel like a big empty box of wasted space.

If you're looking for lower power consumption, then don't expect top-tier performance.
 
Every RTX 4090 owner will have their own personal sun. Hopefully this only applies to the top-end model and not to, let's say, the RTX 4070 and below.

This doesn't bother me at all. At least this makes your massive tower PC justifiable. It always hurts me a little when I see a massive tower PC with low-power components in it, making it feel like a big empty box of wasted space.

If you're looking for lower power consumption, then don't expect top-tier performance.
How does hardware using more power fill up more space inside a case? What is it you're saying? Though I agree that top-tier performance usually comes with high power consumption.
 
The direct problem of the higher power requirement can easily be handled by moving to a higher-wattage PSU. However, the byproduct of higher power consumption, heat, is something most people don't consider. I live in a hot and generally humid country, and with the air conditioning switched on while gaming on a system running an RTX 3080, I've observed the room temperature creeping upwards. I'm not gaming in a big room, and the new air conditioning system is actually quite cold by itself. If just a 320 to 340 W power draw from a GPU can slowly push ambient temps up, I wonder how hot the room will get with a 600 W GPU under load for an extended period. It doesn't matter whether you are using liquid or air cooling: the heat output from the GPU is the same, so the heat dumped into the room is the same.

I feel that at this point we are seeing near-stagnation in performance per watt, where you have to increase power substantially to achieve higher performance. Pair this with an Intel Raptor Lake CPU (which I suspect will also see increased power consumption) and you'll have adequate heating for winter.
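To put a rough number on the room-heating side (a simple watts-to-BTU/h conversion; the wattages are just the figures mentioned above):

    # Every watt a GPU draws ends up as roughly 3.412 BTU/h of heat in the room,
    # regardless of whether the card is air- or liquid-cooled.
    def btu_per_hour(watts):
        return watts * 3.412

    for gpu_watts in (340, 600):  # ~3080-class draw vs the rumoured figure
        print(f"{gpu_watts} W GPU ~= {btu_per_hour(gpu_watts):.0f} BTU/h into the room")

    # ~1160 BTU/h today vs ~2050 BTU/h -- the latter is small-space-heater territory,
    # and it is all extra load the air conditioner has to remove.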
 
How does hardware using more power fill up more space inside a case? What is it you're saying? Though I agree that top-tier performance usually comes with high power consumption.
It’s quite straightforward really. PC components that use more power typically produce more heat and require more cooling. Stronger cooling takes up more space. So for example you may see a larger heatsink or even a watercooling arrangement.
 
600w will be.....impossible to cool. The 3090 struggles and that's a 400w part. Exactly how big is the cooler on this thing gonna be?
You are missing the point by a New York mile.

The issue is not whether we should have to buy a 1000 W PSU thanks to Nvidia's pathetic inability to produce something that's power efficient, doesn't cost a kidney to buy and doesn't require a humongous PSU.

Unlike you, many of us peasants don't have unlimited funds, don't have a mansion with a nuclear reactor for a power supply, and don't get to ignore the electric bill.

We do have to pay electric bills each month and we don't care for a GPU whose power draw is 1/4 of our total house consumption!!
Dude, electricity is dirt cheap. If you have to worry about the difference between a 300 W GPU and a 600 W GPU, you probably can't afford a 600 W GPU. True story.

Same with those who whine about electric bills for the i9-12900K. It's like a $5 difference per year, on an $800 part.
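For what it's worth, that order of magnitude checks out under light-use assumptions; a rough sketch (every input here is an assumption, so plug in your own rate and hours):

    # Back-of-the-envelope yearly cost of a higher-draw part.
    extra_watts   = 50      # assumed average extra draw vs a more frugal CPU
    hours_per_day = 3       # assumed daily gaming/compute hours
    price_per_kwh = 0.10    # assumed electricity rate in USD

    extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    print(f"~${extra_kwh_per_year * price_per_kwh:.2f} per year")  # ~$5.48 with these inputs

    # The same math with a 300 W gap (300 W vs 600 W GPU) lands closer to $33 a year.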
 
600w will be.....impossible to cool. The 3090 struggles and that's a 400w part. Exactly how big is the cooler on this thing gonna be?

Dude, electricity is dirt cheap. If you have to worry about the difference between a 300 W GPU and a 600 W GPU, you probably can't afford a 600 W GPU. True story.

Same with those who whine about electric bills for the i9-12900K. It's like a $5 difference per year, on an $800 part.
I don't like the idea of using loads of electricity for anything, let alone playing video games. Probably just me, but it just feels dirty. I think of modern hardware as efficient, while inefficient components that use brute power to get the job done remind me of something like... old diesel SUVs. :joy:
 
A 250 W CPU + 800 W GPU + 50-100 W (motherboard, RAM, SSD, etc.) and we should be mounting the PC case in-wall or in-window. Remember the old AC units, half inside and half outside? Or maybe PC cases will come with their own AC unit; just imagine having to put your PC next to a drain for the condensate.
All PSU calculators say to use a PSU that can handle more than you draw, so a power draw of around 1,200 W will require at least a 1,600 W PSU. That's roughly 133 amps at 12 V.
Next-generation GPUs will come with liquid cooling and a radiator equal to or larger than the one in your car.
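A quick sanity check of that sizing (the ~33% headroom is a common calculator default rather than a spec, and the rail current is computed at the PSU's full rating):

    # Rough PSU sizing for the hypothetical build above.
    system_draw_w = 1200          # 250 W CPU + 800 W GPU + the rest, per the post
    headroom      = 1.33          # assumed headroom factor used by many calculators
    suggested_psu_w = system_draw_w * headroom

    amps_12v = 1600 / 12          # current if the full 1,600 W were pulled on the 12 V rail
    print(f"suggested PSU: ~{suggested_psu_w:.0f} W, 12 V rail at full rating: ~{amps_12v:.0f} A")
    # roughly 1,600 W and ~133 A, in line with the figures above.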
 
It’s quite straightforward really. PC components that use more power typically produce more heat and require more cooling. Stronger cooling takes up more space. So for example you may see a larger heatsink or even a watercooling arrangement.

I had a full tower with three 140 mm intake fans and three 140 mm exhaust fans, plus the 240 mm rad in a push/pull config. Things stayed cool.

I moved to a much smaller, simpler case with one 200 mm exhaust fan and two front intake fans that push air through the 240 mm rad for my AIO. The GPU ran 4-5 degrees cooler in this case than in the full tower, and the CPU ran a few degrees cooler as well.

Bigger isn't always better.
 
I had a full tower with three 140 mm intake fans and three 140 mm exhaust fans, plus the 240 mm rad in a push/pull config. Things stayed cool.

I moved to a much smaller, simpler case with one 200 mm exhaust fan and two front intake fans that push air through the 240 mm rad for my AIO. The GPU ran 4-5 degrees cooler in this case than in the full tower, and the CPU ran a few degrees cooler as well.

Bigger isn't always better.
Nobody said it was.
 
This reminds me of 3dfx when they ran out of ability to innovate and just started throwing more and more of the same at their cards as a way of increasing performance. It didn't go well for them. I don't think they are quite in the same boat yet, but getting there...
 
This reminds me of 3dfx when they ran out of ability to innovate and just started throwing more and more of the same at their cards as a way of increasing performance. It didn't go well for them. I don't think they are quite in the same boat yet, but getting there...

To be fair, 3dfx bet on the wrong thing (2D performance) and got bodied by newer cards that ran circles around them in 3D performance. That, and the Voodoo 5/Banshee were horrendously late to market.
 
You are missing the point by a New York mile.

The issue is not whether we should have to buy a 1000 W PSU thanks to Nvidia's pathetic inability to produce something that's power efficient, doesn't cost a kidney to buy and doesn't require a humongous PSU.

Unlike you, many of us peasants don't have unlimited funds, don't have a mansion with a nuclear reactor for a power supply, and don't get to ignore the electric bill.

We do have to pay electric bills each month and we don't care for a GPU whose power draw is 1/4 of our total house consumption!!
You are giving him too much credit. There's no proof that he can actually afford what he says he has. Might be just another troll living in grandma's basement for all we know. This is the age of social media. Anyone can pretend to be anyone and can write anything.
 
For the record, the rumours have been saying for a few months now that the core count will nearly double. It's not 600 W because they hit a wall.
 
There isn't a shortage. There's a massive inventory management problem that was brought to light by the governments shutting everything down for just 3 months.

The beauty of lean production. One of the many things that are great when everything's within or near optimal parameters, but that quickly fall apart when it isn't.

I have a feeling that in our quest to optimize everything, technology, and society along with it, is becoming less resilient and fault tolerant.


We do have to pay electric bills each month and we don't care for a GPU whose power draw is 1/4 of our total house consumption!!

Plus there's the "heat in the case" and "heat in the room" problem. On warm, let alone hot, summer days this can be an issue: crank the AC for those who have it; otherwise, sweat it out with an overheating PC and all fans on full blast.
 