Nvidia RTX 4060 rumored to be more power-hungry than an RTX 3070

With numbers like that, it's going to be an easy choice to upgrade my RX 480 to an RX 6600. Going from 150 watts to 132 watts while nearly doubling performance is a good upgrade, works with my current power supply, and keeps my system just as quiet as it has been for the past decade. The 6600 XT is also nearly as efficient and about 20% faster.

Apparently Nvidia thinks people want space heaters on their desks, but no... I'm happy to leave the "computer can effectively heat up Pop-Tarts" days in the Pentium 4 era.
 
My 3080 FE already heats my office to uncomfortable levels during winter, requiring me to open a window when it's 30-40°F outside.
I'll pass on these new space heaters.
 
More rumors and speculation based on previous rumors... and this card only comes out in January 2023? Things can change a lot in six months.

Also, there's no actual TDP figure for the 4060 in these rumors; it's just rumored to be higher than the 3070's 220-watt TDP. That could mean 225 watts, or, as the author felt the need to exaggerate, a possible 350-watt TDP. If the latter were the case, the original rumor would have said it exceeds the 3080's 320-watt TDP instead.
 
I miss the yesteryear when you could cram a decent GeForce GPU into a single-fan design* (the 970, 1070/1080, and 2070, for example)... With TDPs climbing the way they are, it feels like these higher-end niche cards are being unintentionally designed out of the market. Granted, the market such cards were aimed at has pivoted to case designs that accommodate larger cards, so fitting bigger and better cards is more or less a non-issue, but I'll still be a curmudgeon about it.

*I'm not talking about the founders edition cards
 
The preceding rant is written by a confirmed Intel hater.

Bottom line: 1,500 watts for six hours a day is about 3,287 kWh a year, which at six cents per kWh works out to $197.24, so call it $200 per year.
Depends on where you live. Parts of the EU have seen their electricity bills double in the past four years.

In my area the cost of electricity is $0.114/kWh (last I checked). If it were only $0.06/kWh, that would cut my electricity bill in half.
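If anyone wants to redo the math for their own rate, here's a rough sketch of the arithmetic from the posts above; the 1,500 W draw, six hours a day, and the two rates are just the example figures quoted, so swap in your own numbers:

```python
# Rough annual cost of a rig's power draw (figures are the examples quoted above).
def annual_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    kwh_per_year = watts / 1000 * hours_per_day * 365.25
    return kwh_per_year * price_per_kwh

for rate in (0.06, 0.114):  # the two $/kWh rates mentioned in this thread
    print(f"{annual_cost(1500, 6, rate):.2f} per year at {rate}/kWh")
```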

It should, and for some people it does, matter what kind of power draw high-end GPUs can pull.
 
I hope this is sarcastic, because no, actually, it *should* matter. (If it is sarcasm, disregard the rest of this post, but I'll leave it up because I think others will inevitably make the same point and be completely serious about it.)

1) Power usage should matter to people, since it still correlates largely with how much fossil fuel we burn to generate it, and even setting that aside, it's an indication of how much renewable capacity we waste on these parts. Ever looked into the price of solar panels and batteries for an 'average' household? If you have, you'll quickly realize: "You know, I'd save so much money by switching to DC LED lights, HVAC for just one or two rooms, and a laptop instead of a full-on gaming rig with a 1,500-watt PSU."

2) That last part leads nicely into this point: since we scaled *back* from huge power consumption numbers, it's only natural that the market adjusted and high-wattage power supplies became far more expensive simply because they were less sought after. If we creep right back up on power requirements thanks to careless companies like Nvidia and Intel (and, to a lesser but still relevant degree, AMD, at least on the GPU side), market forces won't adjust immediately, so expect to pay a very steep premium for a PSU that can even power something like a 12900K + 4080 build.

3) The last point is what I see as simply bad tech that isn't ready: people tend to forgive vast amounts of extra power if the performance uplift is great, but if a product needs a lot of power and headroom, it's a sign of a company pushing beyond its limits to stay relevant. Think Alder Lake vs. Zen 3: in my opinion, Intel basically had to create, or at least accelerate, its hybrid big/little architecture plans just to gain a bit more headroom in power usage and compete with AMD, which did something similar but, in my opinion, more efficiently: with its 3D V-Cache, the 5800X3D clawed back a lot of the performance wins Alder Lake had taken while still requiring far less power and cooling.

So even if you don't care about 1) and can flex about being wealthy enough to shrug off point 2), I think everybody interested in tech should care about 3), since that's the only way forward. Just throwing more and more power at the problem isn't innovation; it's either stalling or rushing tech that isn't ready by simply upping the power.
I was being sarcastic. However, looking at many discussions, power consumption mysteriously stopped mattering for a lot of people once the "wrong" brands became the less efficient ones.

I also remember how some reviews went to great lengths pointing out the true (overall) cost of higher power consumption, but nowadays it's more of a side note.
 
I think that last part ties into one of the points I made: since any company can fall behind the competition at some point, you can find hot chips from every brand: hot AMD CPUs and then hot Intel CPUs, hot AMD GPUs and hot Nvidia GPUs too.

For Nvidia, however, I feel it's been a bit of a self-inflicted wound: AMD hasn't reclaimed significant GPU market share in years, across many generations, and the drive toward more frequent GPU releases, and hence more frequent ramping-up of power requirements, has been engineered entirely by Nvidia, as if they decided: "We need to start selling new products even if our designs and our partners' foundries aren't quite ready for a significant jump; it's been too long without flexing our marketing machine and obsoleting old cards."
 
Good engineering = same TDP + more performance.
But someone at Nvidia isn't following that simple rule. Power consumption this new generation is going to be appalling.
So if there are cards in the lineup that have the same TDP as Ampere cards and better performance, we can agree that they're well engineered?
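To make "same TDP + more performance" concrete, performance per watt is the usual yardstick. Here's a minimal sketch; the card names and relative-performance scores are made-up placeholders, not measured results:

```python
# Performance per watt as a quick efficiency check.
# Card names and relative-performance scores are hypothetical placeholders.
cards = {
    "last_gen_card": {"relative_perf": 100, "tdp_watts": 220},
    "next_gen_card": {"relative_perf": 140, "tdp_watts": 220},
}

for name, card in cards.items():
    perf_per_watt = card["relative_perf"] / card["tdp_watts"]
    print(f"{name}: {perf_per_watt:.2f} perf/W")

# Same TDP with higher perf/W is the "good engineering" case described above;
# performance bought purely with more watts leaves perf/W flat.
```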
I have solar panels so I don't care. They produce more than I can consume.
Same, and the heat isn't an issue for me either. I don't get how some people say their graphics card makes their room unbearably hot; I live in a very hot country and my 3080 has had virtually no effect on my room's temperature in any month of the year.
I can see why people don't want the extra heat, I'm just saying that for some people this is very much a non-issue.
 
I wonder how all this power hunger will translate to laptop gaming? It looks like the PC/laptop rift will widen further instead of narrowing.

A mobile 4060 would be so down-clocked that I'd wager AMD's Dragon will match it while using a quarter of the power. Nvidia's brain-dead, let-it-rip approach will see it become less and less relevant in laptops that aren't gaming-specific. Already the laptop 3060 Ti is barely faster than the 3050 because of the low clocks needed to keep thermals in check. Laptops aren't getting any more voluminous, so mobile Lovelace will have to be severely crippled to manage heat.
 
Wow, a mid-range card pushing 350W? I feel it would already be pushing it for a mid-range xx60-series card to draw 250W. It looks like Nvidia is pushing clock speeds really hard to stay ahead of the competition, to the point that the card operates way past its "sweet spot" clock. I do wonder what chip we can expect in the mobile/laptop market, since AD103 and AD104 both appear to be quite power hungry and not really suited to a laptop.
 
I think 350W is totally false. I mean, the same as a 3090, really? Who even comes up with this stuff? Heck, even "more than a 3070" could be false for all we know. Only time will tell; the rumor mill is certainly in overdrive. I'll keep my expectations tempered until the companies themselves release the specs.
 
After reading this article I'm glad to have recently bought two 12GB GeForce RTX 3080 cards. It looks as though the RTX 40 series will be power-hungry monsters.
 
According to coretek on YouTube, ngreedia's next gen will be much more ray-tracing oriented, which I think means a lot of RT cores. If that's the case, the power usage increase would be quite well justified. In the 3000 series, RT was just a demo, but if it becomes a real thing, with fully ray-traced games and faster CG rendering, it would be the way to go even at a high power cost. That doesn't mean I'd ever buy any ngreedia product, as the company is much too powerful and not on our side. Better to promote the competition.
 
He was being sarcastic. It's because so many people used to whine about the power consumption of Radeon cards and FX processors as their reason for buying GeForce and Core, respectively. Now that Radeon cards and Ryzen processors are the more power-efficient options, suddenly those same people act like it's no big deal. It's a double standard that's plain as day. I think he's just calling people out on their BS.
 
Here the cost of a kWh is 0.282 EUR from 0-1,600 kWh, 0.30 EUR from 1,600-2,000 kWh, and 0.33 EUR above 2,000 kWh.

In addition, there are various taxes, obscure fees, surcharges, VAT, and even taxation on the taxes paid, which raise a 3,000 kWh power bill to the vicinity of 1,265 EUR per four months.

It has gotten so bad lately with the energy crisis that the Greek government is actually reimbursing some of the money paid.

Mining stopped being profitable here, power-cost-wise, back in February, I think.

When you pay a power bill like that, you can understand that a 350W 4060, or any 350W card, is quite out of the question.

I have a Kill A Watt meter, and my whole system's power consumption when gaming is about 250W. Both my CPU and GPU (a 1080) are undervolted and overclocked.

There's no way in hell I'd ever buy a GPU with a 350W TDP or anything higher, and I imagine many consumers in the EU are in the same boat, as our power costs are insane compared to the practically free power they have in the USA.
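For what it's worth, here's a rough sketch of how that tiered tariff adds up on the base per-kWh rates alone; all the taxes, fees, and surcharges mentioned above are left out, so the real bill lands well above this:

```python
# Tiered tariff from the rates quoted above (EUR per kWh), base energy charge only.
TIERS = [
    (1600, 0.282),         # first 1,600 kWh
    (2000, 0.30),          # 1,600-2,000 kWh
    (float("inf"), 0.33),  # above 2,000 kWh
]

def base_charge(kwh: float) -> float:
    total, lower = 0.0, 0.0
    for upper, rate in TIERS:
        if kwh <= lower:
            break
        total += (min(kwh, upper) - lower) * rate
        lower = upper
    return total

print(f"{base_charge(3000):.2f} EUR base charge for 3,000 kWh")  # ~901 EUR before taxes
```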
 
Yeah, I think the sarcasm was clarified in a subsequent post, no worries.
 

And you still have to wait until next year for AMD to bring out new cards, while Nvidia will be bringing theirs out this fall. Then we'll see whether the power increase is true or not.
 

Not sure what you mean by "more ray-tracing oriented"? We already have ray tracing support.
 
We've heard plenty of rumors that the RTX 4090 will have a 600W TBP (total board power), while a flagship RTX 4090 Ti could push it to over 800W. Assuming that's true, and it's certainly starting to look that way, a mid-tier RTX 4060 card reaching 350W isn't beyond the realms of fantasy.
—————————

Yeah, because Moore's Law can no longer deliver the shrinks it did in the past and double the transistor count, GPUs are now becoming space heaters.

When you look at the progress of GPUs and CPUs now, it's all incremental updates. There's no reason to upgrade every two or three years anymore.

Nvidia shouldn't be bringing out 4000-series cards now, but in two or three years, when they can shrink the process further and add more transistors to offset the power.

The technology isn't there because Moore's Law has been slowing down for the past six years. I've been seeing this with Intel, AMD, and Nvidia: they don't have the technology, so they're just drawing more power and upping the clocks.

Intel CPUs are space heaters, and PC laptops and tablets under $800 are noisy, have terrible battery life, struggle to buffer YouTube videos, and crash loading Google Earth, while the $300 iPad just speeds past them.

The sensationalism from every tech website hyping up every new GPU and every new Intel or AMD CPU each year is terrible, because these are nothing more than incremental updates now.

There's no reason for a new CPU every year or a new GPU every two years anymore.

With technology slowing down, it should be every four or five years, not every year. I wish more tech websites would just tell the truth.

Intel and AMD have been playing catch-up in the race to 5nm, with Intel stuck on 14nm for six years.

And even over the past five to eight years they've been maxing out the power draw.

I'm sure Apple could make the M1 chip three times faster, but battery life would be no more than 20 minutes at most, and it would have to live in a very big tower case rather than a laptop. Apple is simply holding back the power draw until the technology is there to make it two or three times faster.

The 4070 is not going to be two or three times faster than the 3070, the 2070, or the 1070; this is nothing more than an incremental update from Nvidia. I wish tech websites would stop the sensationalism and tell the truth: technology is slowing down when it comes to GPUs and CPUs.

By the time the 5000-series cards come out they'll be drawing over 1,000 watts; that's where the technology is today. I wish tech websites would tell the truth that technology is slowing down and there's no reason to upgrade every year or two anymore.

They should just bring out a new GPU and a new CPU every four years, the way technology is today.

Well said, but shareholders don't care about that.
 
Unless you already have something like a 1000W or 1300W PSU, you're looking at a power supply upgrade for these new high-end Nvidia cards.
 
When I say much more ray-tracing oriented, I mean that the RTX cards we have right now have extremely poor RT performance, because RT was intended more for marketing than for actual use. But soon (or not so soon), games will be RT-only. For that, we need much better RT performance, and that could explain such a power gap.
 