RTX 4000 rumors: September release, over 800W TDP

midian182

Rumor mill: Nvidia’s upcoming RTX 4000 series of graphics cards, codenamed Lovelace, is rumored to arrive this September. But in addition to availability and price concerns, you’ll probably need a new PSU for the high-end models, which are said to consume over 800W of power.

We heard reports last year claiming the RTX 4000 cards would use some of the most power-hungry GPUs we’ve ever seen. That’s especially true of the AD102, which is rumored to have a TGP of 800W or more.

The latest rumors come from prolific leakers Kopite7kimi and Greymon55, both of whom say they’ve heard claims that the AD102 GPUs will have scarily high TGPs. The former thinks the RTX 4080 will carry a 450W TGP and the RTX 4080 Ti will be 600W. The RTX 4090, meanwhile, will come with a monstrous 800W rating.

Greymon55 is going for 450W/650W/850W for the AD102, adding that it’s unclear whether this is one model with three TGP ranges or three separate models.

The most power-hungry consumer Ampere card on the market right now is the RTX 3090 with 350W. The RTX 3090 Ti is expected to up that to 450W, though we don’t know when that much-delayed card will arrive.

We also know that companies are working on (and some have released) PCIe Gen 5 power supplies ready for next-gen cards. The 16-pin power cable is set to replace existing 8-pin 150W power cables, though the 800W+ RTX 4000 cards may need two of them. There had been theories that the RTX 3090 Ti would use the PCIe Gen 5 connector, but it seems Nvidia is saving it for Lovelace.
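For a rough sense of why two connectors might be needed, here is a quick back-of-envelope sketch. The 600W ceiling for the 16-pin connector and the 75W slot allowance are commonly cited figures rather than anything stated in this article, so treat the output as ballpark only.

```python
# Back-of-envelope: how many power cables a card of a given TGP would need.
# Assumed figures (not from the article): 150W per 8-pin cable, a commonly
# cited 600W ceiling for the 16-pin PCIe Gen 5 connector, and up to 75W
# drawn from the PCIe slot itself.
import math

SLOT_W = 75
EIGHT_PIN_W = 150
SIXTEEN_PIN_W = 600

def cables_needed(tgp_watts: int, per_cable_watts: int) -> int:
    """Minimum number of cables to cover a card's TGP after slot power."""
    return math.ceil(max(tgp_watts - SLOT_W, 0) / per_cable_watts)

for tgp in (350, 450, 600, 800):
    print(f"{tgp}W card: {cables_needed(tgp, EIGHT_PIN_W)} x 8-pin "
          f"or {cables_needed(tgp, SIXTEEN_PIN_W)} x 16-pin")
```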

These are all rumors and speculation, of course, but it’s not the first time we’ve heard similar claims. Nvidia previously said graphics card supply would improve in the second half of 2022, so let’s hope owning a capable PSU will be the only issue with Lovelace.


 
This is getting ridiculous, burning that much energy just for a game. What a waste. I feel like regulations should be used to impose a limit on consumer GPUs' max TDP.

But I guess that is anathema to a lot of people, as freedom is the most important to them. Taking care of the future we have to share not so much, somehow.
 
I thought it was common knowledge that early samples, while the design is still being finalized, will be this power hungry. Not sure why these reports keep popping up and people take it for granted that the upcoming card will literally draw double the power.

The 4080 will be just like every single Nvidia launch for the last decade: It will bump up performance on the top tier card about 20 to 30% above the 3080 and scale perfectly all the way down:

4070 delivers very similar performance to the 3080
4060 class delivers the same performance as the current 3070
4050 now becomes the new 3060 in performance

Power limits stay relatively the same around the named tiers (So a 4070 is roughly the same power as a 3070 except now it pushes 20-30% more performance for that same power).
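A toy illustration of that claim, using made-up baseline numbers and treating the 20-30% uplift as a flat 25%; nothing here is a benchmark, it just shows how the tiers would line up if the prediction holds:

```python
# Hypothetical tier shuffle: each new tier keeps roughly the old tier's power
# budget but delivers ~25% more performance. The baseline index is invented
# purely for illustration; it is not benchmark data.
AMPERE_PERF = {"3090": 100, "3080": 85, "3070": 65, "3060": 48}
UPLIFT = 1.25  # assumed gain at the same power as the previous tier

lovelace_perf = {name.replace("3", "4", 1): round(perf * UPLIFT)
                 for name, perf in AMPERE_PERF.items()}

# e.g. a hypothetical "4070" at ~81 lands close to the 3080's 85,
# while staying at roughly 3070-class power.
print(lovelace_perf)
```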

So can we stop doing the surprised Pikachu face about what's very clearly an early prototype that will get a lot better as yields improve with the finalized design?
 
But I guess that is anathema to a lot of people, as freedom is the most important to them. Taking care of the future we have to share not so much, somehow.
I would phrase my objection more along the lines that if we're going to start regulating power use per person, 1) it should be a total quota which the person can allocate as they choose; and 2) either way, personal gaming is not likely to be in the top 100 wastes of energy, and video gamers suffer enough unfair stigma as it is. When I picture someone trying to deliver this lecture to me, I envision a family of five in their large SUV making 8 ski trips a season to a cabin 4 hours away from their first home. Give me a break.
 
No doubt this is going to cost an arm and a leg.

Back in the day, you could get flagship cards like the 8800 GTX, or even the Ultra variant, for $1k here in Australia.

Nowadays you need to fork out $2k to $4k just to get into any of these top-tier cards. It's absurd how expensive these GPUs keep getting...
 
This is getting ridiculous, burning that much energy just for a game. What a waste. I feel like regulations should be used to impose a limit on consumer GPUs' max TDP.

But I guess that is anathema to a lot of people, as freedom is the most important to them. Taking care of the future we have to share not so much, somehow.

You need to understand that not everyone believes in climate change and in order to have a peaceful world one must not force their beliefs on others.
 
This is getting ridiculous, burning that much energy just for a game. What a waste. I feel like regulations should be used to impose a limit on consumer GPUs' max TDP.

But I guess that is anathema to a lot of people, as freedom is the most important to them. Taking care of the future we have to share not so much, somehow.

I mean, people also do content creation, etc., with these cards.

Like it or not, power draw for ALL electrical equipment will only go higher as time goes on. You want performance? Well, no imaginary pixies will drive that device for free...

Instead of handicapping technological advances we should be looking at better harnessing of energy itself.
 
You need to understand that not everyone believes in climate change and in order to have a peaceful world one must not force their beliefs on others.

Climate change is not a belief. You can believe in a deity or in winning the lottery, not in hard evidence. But I guess there are people out there, like flat earthers, who will deny facts even if they are right in front of them.
 
800W GPUs sound mighty sus. The 3090 already had reported instability issues with its 450-watt variants, and the 3090 Ti is rumored to be delayed for the same reason.

800W? How would you even cool that much power draw?

This is getting ridiculous, burning that much energy just for a game. What a waste. I feel like regulations should be used to impose a limit on consumer GPUs' max TDP.

But I guess that is anathema to a lot of people, as freedom is the most important to them. Taking care of the future we have to share not so much, somehow.
You can mask it as "muh freedumbs" but there is a simple fact here: people, in general, do not like it when the government inserts itself into their lives and says you cannot enjoy your entertainment.

Regulating TDP for GPUs is one hell of a slippery slope. Why not regulate how many hours you can play, like China? How much you can buy per month? Do your video game quotas have enough government support in them?

If you are concerned about "muh power use," an 800W GPU is nothing compared to the terawatts of energy wasted cooling silly looking buildings with giant glass pavilions and charging electric cars. If your concern is pollution, GPUs are nothing compared to the bunker-oil-burning shipping vessels carrying your Chinese sweatshop goods across the Pacific daily. If you're concerned with resource usage, tell people in second and third world countries to stop popping out babies like they're Pokemon cards.

If you don't want a big GPU, don't buy one. We don't need the nanny state telling us what we can and can't enjoy; it will INEVITABLY be abused.
 
This is getting ridiculous, burning that much energy just for a game. What a waste. I feel like regulations should be used to impose a limit on consumer GPUs' max TDP.

But I guess that is anathema to a lot of people, as freedom is the most important to them. Taking care of the future we have to share not so much, somehow.
You know what I have never heard mentioned when climate change is brought up...?
GPUs used for gaming. Please stop. I beg you.
 
Having an 800W GPU is not "development" or "improvement", it's just more transistors from the same bucket, which NV could've done with the release of the 3000 series.

These "rumors" are delusional and ignorant.
 
If it turns out to be true, then you'll basically need a 480mm or even 560mm radiator (assuming a standard 360mm is good for, say, 400-450W) to keep the card temperatures in check. I've no idea how that could be a feasible product.

Take a 3090. I have it on liquid (on both sides, of course). Before I slapped blocks on it, whenever I hit the *Render* button the VRAM on the backside hit 100-102C within 2-3 seconds. And that was with a large fan blowing directly at it and 13 fans inside the case. Funny thing: the warranty stickers and S/N label were dry and loose after about a week of use, and after a few days of seeing what was what, I just waited patiently for the blocks to arrive because I was genuinely scared the card would burst into flames.

Now imagine that with a card which sucks 800W instead of only 350W.
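To put some rough numbers on that, here is a quick sketch that just scales the assumption above (a 360mm radiator handling about 400-450W) to other sizes; these are ballpark figures derived from that assumption, not measurements:

```python
# Using the assumption that a 360mm radiator handles ~400-450W, estimate what
# other radiator lengths could dissipate. 140mm-based sizes are treated as
# plain length here for simplicity; all numbers are rough rules of thumb.
watts_per_120mm = (400 / 3, 450 / 3)  # ~133W to ~150W per 120mm section

for total_mm in (360, 480, 560, 720):
    sections = total_mm / 120
    low, high = (sections * w for w in watts_per_120mm)
    print(f"{total_mm}mm radiator: roughly {low:.0f}-{high:.0f}W of heat")
```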
 
It’s not the power supply I’m worried about….it’s my electric bill.

You plan on running the card 24/7?

In my city the current cost of electricity is $0.1194/kWh
If your system ran 24/7 (because you're mining), we'll say you're pulling a constant 650W while running a 4080 Ti and that's your total system power.

24 * 650 = 15,600Wh (or 15.6kWh) in a day.
15.6 * .1194 = $1.86 a day
$1.86 * 365 = $678.9 a year

Now, instead of mining 24/7, we'll say you game 4 hours a day and use the computer for low/idle tasks the remaining 20 hours. So your system stays on 24/7, but you're not maxing out the power draw 24/7.

We'll say your gaming averages out to 4 hours a day, 7 days a week, and you're pulling 650W during those 4 hours.
4 * 650 = 2.6kWh for a day
2.6 * .1194 = $0.31 a day
$0.31 * 365 = $113.15

Then we'll say you leave the computer on and it's idle or low use, so power draw is limited to maybe 150W for the remainder of the day: 4 hours of gaming a day and 20 hours of low/idle use.

20 * 150 = 3.0kWh for balance of the day
3 * .1194 = $0.36 a day
$0.36 * 365 = $131.40

$131.4 + 113.15 = $244.55 a year for power draw if I were to run a system that drew 650W 4 hours a day and averaged 150W after low use/idle draw the remaining 20 hours a day.


My current gaming system can draw upwards of 550W. I average around 2 hours of gaming a day, I don't game as much as I used to.

2 * 550 = 1.1kWh a day.
1.1 * .1194 = $0.13 a day for gaming

I only have the computer on when I'm home and awake. At night I power off the computer and it stays off until I get home from work, 5 days out of the week. We'll say low use/idle power draw is 150W and that's over the following hours:
Mon-Fri = 6 hours (6 * 150 = .9kWh)
.9 * .1194 = $0.11 a day for low use/idle during the weekdays

Sat-Sun = 16 hours (16 * 150 = 2.4kWh)
2.4 * .1194 = $0.29 a day for low use/idle on the weekends

Gaming: $0.13 * 7 = $0.91. Weekday low use/idle: $0.11 * 5 = $0.55. Weekend low use/idle: $0.29 * 2 = $0.57. That's about $2.03 a week to run my gaming computer, or roughly $105 for a year of use.


If you plan on running your computer 24/7, regardless of whether you're mining on it or not, you could see a much higher electric bill. In the end, it all depends on your overall use of your system and how often it is on and running.
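If anyone wants to plug in their own electricity rate and hours, here is a small script that reproduces the arithmetic above. The $0.1194/kWh rate and the wattages are simply the numbers used in this post; the script works per day, so the totals differ slightly from the hand-rounded figures.

```python
# Annual electricity cost for a PC, given usage "segments" of
# (hours per day, average watts). Rate and wattages are from the post above.
RATE_PER_KWH = 0.1194  # USD per kWh

def annual_cost(segments, days_per_year=365, rate=RATE_PER_KWH):
    """segments: iterable of (hours_per_day, watts); returns yearly USD cost."""
    kwh_per_day = sum(hours * watts / 1000 for hours, watts in segments)
    return kwh_per_day * days_per_year * rate

# Mining 24/7 at a constant 650W of total system draw: ~ $680/year
print(f"24/7 at 650W:       ${annual_cost([(24, 650)]):.2f}")

# Gaming 4h/day at 650W plus 20h/day idling at 150W: ~ $244/year
print(f"4h game + 20h idle: ${annual_cost([(4, 650), (20, 150)]):.2f}")
```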
 
Nothing but speculation.

And now that Russia is at war with Ukraine, silicon costs may increase yet again.

I'll believe it when I see it on the shelf at my Microcenter.

And I'll only get the highest end card possible.
 
If you plan on running your computer 24/7, regardless of whether you're mining on it or not, you could see a much higher electric bill. In the end, it all depends on your overall use of your system and how often it is on and running.
Don't forget that every watt your computer puts out as heat also has to be cooled back down in the summer. There are also other things to consider than just straight-up electricity cost. You need more expensive hardware just to power the card, and then there is the power draw on the house itself. Depending on how much you have running on one circuit, you could start tripping the breaker.
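On the breaker point, a quick sanity check. The 15A/120V circuit and the 80% continuous-load rule of thumb are assumptions on my part (typical for North America), and the load figures are purely illustrative:

```python
# How much headroom a high-draw PC leaves on a single household circuit.
# Assumptions: a North American 15A/120V breaker and the common 80% rule
# of thumb for sustained loads. Load numbers below are illustrative only.
BREAKER_AMPS = 15
VOLTS = 120
CONTINUOUS_FACTOR = 0.8

usable_watts = BREAKER_AMPS * VOLTS * CONTINUOUS_FACTOR  # 1440W

gpu_watts = 800          # the rumored worst case
rest_of_pc_watts = 250   # CPU, board, drives, fans (illustrative)
other_loads_watts = 200  # monitor, speakers, lights on the same circuit

headroom = usable_watts - gpu_watts - rest_of_pc_watts - other_loads_watts
print(f"Usable: {usable_watts:.0f}W, headroom left: {headroom:.0f}W")
```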

I'm not telling people not to buy something just because of how much power it draws; someone could power a 4090 Ti entirely with solar for all we know, but it's getting kind of ridiculous. While Nvidia is making faster cards, they're really just pumping as much power into a GPU as they can without really innovating. I don't want 800 watt GPUs. Frankly, I don't see a need for faster GPUs for gaming at this point. Until we have a display standard for 8K120, we really don't have a use for cards faster than the 3090 anyway.

Don't get me wrong, I'm all for increased performance, but I would rather see increases in power efficiency and decreases in price rather than just brute forcing their way to more performance.

There is, legitimately, currently no way to use performance past a 3090 for gaming. What would impress me in a card is 3090 levels of performance at a 3070-level TDP with a 3080 price point.
 
Did you just not get the memo? And, out of mild curiosity, what aspect of climate change do you not believe? 97% of climate scientists say climate change is real. Do you know something the rest of the world doesn't? The years 2016-2021 were among the hottest on record.

Old news. Currently:

"More than 99.9% of peer-reviewed scientific papers agree that climate change is mainly caused by humans, according to a new survey of 88,125 climate-related studies.

The research updates a similar 2013 paper revealing that 97% of studies published between 1991 and 2012 supported the idea that human activities are altering Earth’s climate. The current survey examines the literature published from 2012 to November 2020 to explore whether the consensus has changed."

Cornell: More than 99.9% of studies agree: Humans caused climate change
 
That said, GPUs are a tiny fraction of a contributor unless you're cryptomining. Personally, I don't see the need to run a GPU way outside of its efficiency zone to get 4K120 at Ultra.

But that's just me. You all spend your money on this expensive stuff so I don't have to. Keep the economy going; that's your job.
 