Nvidia RTX 4060 rumored to be more power-hungry than an RTX 3070

midian182

Staff member
Rumor mill: With Nvidia's next-generation RTX 4000 (Ada Lovelace) series edging ever closer, the rumor mill is going into overdrive with possible specs and release dates. In the case of the former, one prolific leaker has claimed that even the mid-range RTX 4060 will have considerable power demands, exceeding the current-gen RTX 3070.

Claims of the RTX 4060 having a higher TDP than the RTX 3070 come from regular leaker Kopite7kimi. He didn't mention the specific power consumption of the upcoming card, but the Ampere product's Founders Edition hits 220W while some overclocked AIB variants can reach 250W.

A TDP of 250W or more would be a big jump for a xx60 series product. The previous most power-hungry cards from the series were the RTX 3060 and GTX 760 (both 170W), followed by the RTX 2060 (160W), then the GTX 1060 and GTX 960 (both 120W). That's not counting the RTX 3060 Ti, which consumes 200W.
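For what it's worth, the xx60-class history above is easy to sanity-check. A quick sketch (all wattages taken from the article) sorts the cards by TDP and sizes the jump a rumored 250 W RTX 4060 would represent:

```python
# TDPs (watts) of Nvidia's xx60-class cards, as listed in the article.
tdp_w = {
    "GTX 960": 120,
    "GTX 1060": 120,
    "RTX 2060": 160,
    "GTX 760": 170,
    "RTX 3060": 170,
    "RTX 3060 Ti": 200,  # the Ti outlier the article sets aside
}

for card, w in sorted(tdp_w.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{card:12s} {w} W")

# A 250 W RTX 4060 would exceed the 170 W non-Ti record by this percentage:
print(round((250 - 170) / 170 * 100))  # → 47
```

That would be roughly a 47% jump over the most power-hungry non-Ti xx60 cards to date.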

We've heard plenty of rumors that the RTX 4090 will have a 600W TBP (total board power), while a flagship RTX 4090 Ti could push it to over 800W. Assuming that's true, and it's certainly starting to look that way, a mid-tier RTX 4060 card reaching 350W isn't beyond the realms of fantasy.

Elsewhere in the world of RTX 4000-series rumors, new release dates for the cards have surfaced. It was previously believed that the RTX 4090 would launch first, in August, with the RTX 4080 arriving a few weeks later, possibly in the same month. That would leave the RTX 4070 to launch in September.

The new rumors push the launch dates quite a bit further out but keep the same release order: the RTX 4090 in October, the RTX 4080 in November, the RTX 4070 in December, and the RTX 4060 in January 2023.

Remember, all of this is just rumor and speculation that should be taken with a heavy dose of salt, but we shouldn't have too much longer to wait before Nvidia reveals its plans.


 

Dimitriid

Didn't power consumption stop mattering - first for CPUs, now for GPUs?

I hope this is sarcastic, because no, actually, it *should* matter. (If it is sarcasm, disregard the rest of this post, but I'll leave it up, because others will inevitably make the same point and mean it seriously.)

1) Power usage should matter to people, since it still correlates directly with how much fossil fuel we burn to generate electricity. And even if it didn't, it's an indication of how much renewable capacity we waste on these machines. Ever priced out a set of solar panels and batteries for an 'average' household? If you have, you quickly realize you'd save a lot of money by switching to DC LED lights, HVAC in just one or two rooms of the house, and a laptop instead of a full gaming rig with a 1,500 W PSU.

2) That leads nicely into the next point: after we scaled *back* from huge power-consumption numbers, the market adjusted and high-wattage power supplies became far more expensive, simply because they were less sought after. If power requirements creep right back up thanks to careless companies like Nvidia and Intel (and, to a lesser but still relevant extent, AMD, at least on the GPU side), market forces won't adjust immediately, so expect to pay a very significant premium for a PSU that can even power something like a 12900K + RTX 4080 rig.

3) The last point is what I see as tech that just isn't ready. People tend to forgive vast amounts of extra power when the performance uplift is great, but when a product needs that much power and headroom, it's a sign of a company pushing beyond its limits to stay relevant. Think Alder Lake vs. Zen 3: in my opinion, Intel had to create, or at least accelerate, its hybrid big.LITTLE-style architecture plans just to win back some power headroom and compete with AMD, which did something similar but, in my view, more efficiently: with its 3D V-Cache, the 5800X3D clawed back a lot of the performance claims Alder Lake had taken while still requiring far less power and cooling.

So even if you don't care about 1), and are wealthy enough to shrug off point 2), I think everybody interested in tech should care about 3), since that's the only way forward. Just throwing more and more power at the problem isn't innovation; it's either stalling, or rushing tech that isn't ready by simply upping the power.
 

George Keech

I would love to see companies push harder on power efficiency. Where do the diminishing returns kick in at the high end? I suppose that's down to reviewers, but if you could cut power and clocks by 10% and only lose 1% of performance outside of benchmarks, why ship it clocked that high?
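The "10% power for 1% performance" intuition can be sketched with a toy model. The curve shape and the exponent below are hypothetical illustrations, not measurements from any review; real perf-per-watt curves vary by card and workload:

```python
# Illustrative only: a toy model of diminishing returns at the top of a
# GPU's voltage/frequency curve. An exponent of ~0.1 mirrors the
# "10% less power costs ~1% performance" intuition; it is not measured data.
def relative_perf(power_fraction: float) -> float:
    """Toy model: performance rises very slowly with the last watts."""
    return power_fraction ** 0.1

for frac in (1.00, 0.90, 0.80):
    print(f"{frac:.0%} power -> {relative_perf(frac):.1%} performance")
```

Under this (assumed) curve, dropping to 90% power costs about 1% of performance, and 80% power costs about 2%, which is why reviewers often find stock clocks sitting well past the efficiency sweet spot.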
 

hahahanoobs

I'm with kopite.
I'm just curious about the performance too. No one is going to care about an extra 80 W if the performance justifies it. Most people should be running PSUs with headroom anyway, so 80 W is nothing.

The only thing I can think of right now that would get in the way of 250 W xx60 cards would be OEMs, i.e. BOM cost, higher cooling requirements, system compatibility, etc.
 

moon982

We've heard plenty of rumors that the RTX 4090 will have a 600W TBP (total board power), while a flagship RTX 4090 Ti could push it to over 800W. Assuming that's true, and it's certainly starting to look that way, a mid-tier RTX 4060 card reaching 350W isn't beyond the realms of fantasy.
—————————

Yeah, because Moore's law can no longer deliver the die shrinks and transistor-count doubling it did in the past, GPUs are turning into space heaters.

Look at the progress of GPUs and CPUs now and it's all incremental updates. There's no reason to upgrade every two or three years anymore.

Nvidia shouldn't be bringing out the 4000-series cards now, but in two or three years, when it can shrink the process further and add transistors to offset the power.

The technology isn't there because Moore's law has been slowing for the past six years. I've been seeing this with Intel, AMD and Nvidia: they don't have the technology, so they just draw more power and up the clocks.

Intel CPUs are space heaters, and PC laptops and tablets under $800 are noisy, have terrible battery life, and struggle to buffer YouTube videos or load Google Earth without crashing, while a $300 iPad speeds right past them.

The sensationalism every tech website puts out to hype each new Intel, AMD or Nvidia release every year is terrible, because these are nothing more than incremental updates now. There's no reason for a new CPU every year or a new GPU every two years; with technology slowing down it should be every four or five years. I wish tech websites would just say so.

Intel and AMD have been playing catch-up in the race to 5 nm, after Intel was stuck on 14 nm for six years, and for the past five to eight years they've been maxing out the power draw to compensate.

I'm sure Apple could make the M1 three times faster, but battery life would be 20 minutes at most, and it would need a big tower case rather than a laptop. Apple simply holds the power draw back until the technology is there to go two or three times faster.

The 4070 is not going to be two or three times faster than the 3070, 2070 or 1070; it's nothing more than an incremental update by Nvidia. By the time the 5000-series cards come out, they'll be drawing over 1,000 watts; that's where the technology is today. A new GPU and CPU every four years is all the current pace really justifies.
 

mrSister

If the rumors are true, after "irresponsible amounts of performance" and "the more you buy, the more you save," I can't wait to hear what catchphrase Leather Jacket Man comes up with to convince everybody they want and *need* an 800 W GPU in their case XD
 

Neatfeatguy

I'm in a good spot: a 5900X with a 3080 10GB. I set the power limit to 80% and play games just fine on my 1440p FreeSync monitor. I've still got plenty of headroom if I ever want to push my system's limits.

It helps that I haven't been impressed with recent games - plagued with bugs, horseshit gameplay/performance/stories....I'm really not excited about any new game coming out after being so disappointed in a few games these past couple of years.

My setup should last me for years to come before I think about upgrading or deciding that maybe my PC gaming hobby isn't worth it anymore.
 

GoldenGoat

I'm in a good spot: a 5900X with a 3080 10GB. I set the power limit to 80% and play games just fine on my 1440p FreeSync monitor. I've still got plenty of headroom if I ever want to push my system's limits.

It helps that I haven't been impressed with recent games - plagued with bugs, horseshit gameplay/performance/stories....I'm really not excited about any new game coming out after being so disappointed in a few games these past couple of years.

My setup should last me for years to come before I think about upgrading or deciding that maybe my PC gaming hobby isn't worth it anymore.

How do you set the power limit?
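(For reference: on Windows this is usually the "Power Limit" slider in a tool like MSI Afterburner; on Linux, Nvidia's `nvidia-smi` utility has a `-pl` flag that sets the board power cap in watts. The 320 W figure below is the RTX 3080 Founders Edition's reference limit; 256 W is simply 80% of it, as an example:)

```python
# Setting a GPU power cap is done with vendor tools, e.g. on Linux, as root:
#     sudo nvidia-smi -pl 256
# The "Power Limit" slider in MSI Afterburner does the same on Windows.
# 80% of the RTX 3080 Founders Edition's 320 W reference limit:
reference_limit_w = 320
cap_w = int(reference_limit_w * 0.80)
print(cap_w)  # 256
```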
 

Morphine Child

Didn't power consumption stop mattering - first for CPUs, now for GPUs?
It didn't matter before, when electricity was way cheaper and a power-hungry PC drew 500 W total.

Now, when a single component draws that much, you'll very much feel it on your bill if you play a lot.

So yeah, it started to matter to me again as well.

And quite frankly, I find it frightening that the mid-range is reaching these heights...
 

JGert

I'm in a good spot: a 5900X with a 3080 10GB. I set the power limit to 80% and play games just fine on my 1440p FreeSync monitor. I've still got plenty of headroom if I ever want to push my system's limits.

It helps that I haven't been impressed with recent games - plagued with bugs, horseshit gameplay/performance/stories....I'm really not excited about any new game coming out after being so disappointed in a few games these past couple of years.

My setup should last me for years to come before I think about upgrading or deciding that maybe my PC gaming hobby isn't worth it anymore.
I already hit that point. I bought an Xbox Series X for $500 and it does what I need it to. It's too bad it has come to this.
 

Lew Zealand

TechSpot said:
The previous most power-hungry cards from the series were the RTX 3060 and GTX 760 (both 170W), followed by the RTX 2060 (160W), then the GTX 1060 and GTX 960 (both 120W). That's not counting the RTX 3060 Ti, which consumes 200W.

So the real question is: what was Nvidia smoking when they designed the 960 and 1060 (and 1660/S @120-125W)? Why did they cheap out on the power usage for those? Or were they cheaping out on the core count during those years so they drew lower wattage? Maybe the 1070 was the real 1060 @150W?
 

NeoMorpheus

Funny how it's bad news after bad news for the new series, yet the nvdrones are like:

"I don't care, here's my money, Lord Jensen!"

At the very least, wait for reviews, seriously...
 

Crinkles

I hope this is sarcastic, because no, actually, it *should* matter. (If it is sarcasm, disregard the rest of this post, but I'll leave it up, because others will inevitably make the same point and mean it seriously.)

1) Power usage should matter to people, since it still correlates directly with how much fossil fuel we burn to generate electricity. And even if it didn't, it's an indication of how much renewable capacity we waste on these machines. Ever priced out a set of solar panels and batteries for an 'average' household? If you have, you quickly realize you'd save a lot of money by switching to DC LED lights, HVAC in just one or two rooms of the house, and a laptop instead of a full gaming rig with a 1,500 W PSU.

2) That leads nicely into the next point: after we scaled *back* from huge power-consumption numbers, the market adjusted and high-wattage power supplies became far more expensive, simply because they were less sought after. If power requirements creep right back up thanks to careless companies like Nvidia and Intel (and, to a lesser but still relevant extent, AMD, at least on the GPU side), market forces won't adjust immediately, so expect to pay a very significant premium for a PSU that can even power something like a 12900K + RTX 4080 rig.

3) The last point is what I see as tech that just isn't ready. People tend to forgive vast amounts of extra power when the performance uplift is great, but when a product needs that much power and headroom, it's a sign of a company pushing beyond its limits to stay relevant. Think Alder Lake vs. Zen 3: in my opinion, Intel had to create, or at least accelerate, its hybrid big.LITTLE-style architecture plans just to win back some power headroom and compete with AMD, which did something similar but, in my view, more efficiently: with its 3D V-Cache, the 5800X3D clawed back a lot of the performance claims Alder Lake had taken while still requiring far less power and cooling.

So even if you don't care about 1), and are wealthy enough to shrug off point 2), I think everybody interested in tech should care about 3), since that's the only way forward. Just throwing more and more power at the problem isn't innovation; it's either stalling, or rushing tech that isn't ready by simply upping the power.

The preceding rant is written by a confirmed Intel hater.

Bottom line: 1,500 watts for six hours a day is about 3,287 kWh per year, which at six cents per kWh comes to roughly $197.24 - call it $200 per year.
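The arithmetic checks out; a quick sketch (using 365.25 days per year, which reproduces the quoted figures):

```python
# Yearly cost of a 1,500 W load running six hours a day at $0.06/kWh.
watts = 1500
hours_per_day = 6
rate_per_kwh = 0.06

kwh_per_year = watts / 1000 * hours_per_day * 365.25
print(f"{kwh_per_year:,.0f} kWh per year")              # 3,287 kWh per year
print(f"${kwh_per_year * rate_per_kwh:,.2f} per year")  # ≈ $197
```

At a more typical US residential rate of 12-15 cents per kWh, the same usage would land closer to $400-500 per year.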