I have solar panels, so I don't care. They produce more than I can consume.
The main problem is not the power bill but the heat generated inside your computer case. Got it?
Depends on where you live. Places in the EU have seen their electricity bills double in the past four years.

The preceding rant is written by a confirmed Intel hater.
Quote: "1,500 watts, six hours a day at six cents per kWh, will cost $200 per year."
3,287 kWh and $197.24 per year, bottom line.
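For anyone who wants to check that figure, a minimal sketch of the arithmetic, using the same numbers as above (1,500 W, six hours a day, six cents per kWh, 365.25 days per year):

```python
# Annual energy use and cost from wattage, daily hours, and electricity rate.
# Numbers match the post above; adjust for your own rate and usage.
watts = 1500
hours_per_day = 6
rate_per_kwh = 0.06        # USD per kWh
days_per_year = 365.25

kwh_per_year = watts / 1000 * hours_per_day * days_per_year
cost_per_year = kwh_per_year * rate_per_kwh

print(f"{kwh_per_year:,.0f} kWh/year -> ${cost_per_year:.2f}/year")
# about 3,287 kWh and roughly $197 per year
```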
Quote: "I hope this is sarcastic, because no, actually, it *should* matter. (If it is, disregard the rest of this post, but I'll leave it up because I think others will inevitably make the same point and will be serious about it.)

1) Power usage should matter to people, since it still largely correlates with how much fossil fuel we burn to generate it, and even if that didn't matter, it's still an indication of how much renewable capacity we waste. Ever tried to look into the pricing of a set of solar panels and batteries for an 'average' household? If you have, you'll quickly realize: 'You know, I would save so much money by switching to DC LED lights, HVAC in just one or two rooms of the house, and a laptop instead of a full gaming rig with a 1,500-watt PSU.'

2) That last part leads nicely into this point: since we scaled *back* from huge power-consumption numbers, the market adjusted and high-wattage power supplies became far more expensive simply because they were less sought after. If we're creeping right back up on power requirements thanks to careless companies like Nvidia and Intel (and, to a lesser but still relevant extent, AMD, at least for its GPUs), market forces won't adjust immediately, so expect to pay a very significant premium for a PSU that can even power something like a 12900K + 4080 build.

3) The last point is what I see as tech that simply isn't ready: people tend to forgive a lot of extra power if the performance uplift is great, but if a product needs a lot of power and headroom, it's a sign of a company pushing beyond its limits to stay relevant. Think Alder Lake vs. Zen 3: in my opinion, Intel had to either create or accelerate its hybrid big.LITTLE-style architecture plans just to get a bit more headroom in power usage and compete with AMD, which did something similar but, in my opinion, more efficiently: with 3D V-Cache, the 5800X3D clawed back a lot of the performance lead Alder Lake had taken, while still requiring far less power and cooling.

So even if you don't care about 1), and can flex on us about being wealthy enough to shrug off point 2), I think everybody interested in tech should care about 3), since that's the only way forward. Just throwing more and more power at the problem isn't innovation; it's either stalling, or rushing tech that isn't ready by simply upping the power."

I was being sarcastic. However, looking at many discussions, power consumption mysteriously stopped mattering for many once the wrong brands were less efficient.
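To put rough numbers on point 1 of the post quoted above, here is a minimal sketch comparing the annual energy of a big gaming rig against a laptop and what that implies for solar sizing. Every figure here (draws, hours, panel yield) is an illustrative assumption, not something from the thread:

```python
# Hypothetical comparison: annual energy use of a high-draw gaming rig vs. a
# gaming laptop, and roughly how many solar panels each would need to offset.
def annual_kwh(avg_watts: float, hours_per_day: float) -> float:
    """Annual energy in kWh at a given average draw and daily usage."""
    return avg_watts / 1000 * hours_per_day * 365

rig_kwh = annual_kwh(avg_watts=600, hours_per_day=4)     # assumed rig draw under load
laptop_kwh = annual_kwh(avg_watts=60, hours_per_day=4)   # assumed laptop draw

panel_kwh_per_year = 1.6 * 365   # assumed ~1.6 kWh/day average from one 400 W panel

for name, kwh in (("rig", rig_kwh), ("laptop", laptop_kwh)):
    print(f"{name}: {kwh:.0f} kWh/yr, ~{kwh / panel_kwh_per_year:.1f} panels to offset")
```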
I think that last part goes to one of the points I made: since any company can fall behind the competition at various points, you can find hot chips from any brand: hot chips from AMD CPUs, then hot chips from Intel; hot chips from AMD GPUs and also from Nvidia GPUs.
I also remember how some reviews went to great lengths to point out the true (overall) cost of higher power consumption, but it's more of a side note.
Quote: "Good engineering = same TDP + more performance. But someone at Nvidia is not following that simple rule. Power consumption this new generation will be appalling."
So if there are cards in the lineup that have the same TDP as Ampere cards and better performance, we can agree that they're well engineered?
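"Same TDP + more performance" is really just performance per watt. A tiny sketch of how you might compare two cards on that metric; the FPS and power figures below are placeholders, not real benchmark results:

```python
# Compare two cards by performance per watt (higher is better engineering by
# the rule quoted above). Placeholder numbers only.
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Average benchmark FPS divided by measured board power."""
    return avg_fps / board_power_w

ampere = perf_per_watt(avg_fps=100, board_power_w=320)      # hypothetical Ampere card
successor = perf_per_watt(avg_fps=160, board_power_w=320)   # hypothetical same-TDP successor

print(f"perf/W uplift at the same TDP: {successor / ampere:.2f}x")  # 1.60x
```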
Same here, and the heat isn't an issue for me either. I don't get how some people say their GFX card makes their room unbearably hot; I live in a super hot country and my 3080 has had virtually no effect on the temperature of my room in any month of the year.
I wonder how all this power hunger will translate to laptop gaming. It looks like the PC/laptop rift will widen further instead of narrowing.
Quote: "Wow, a mid-range card pushing 350 W? I feel it would be pushing it to have a mid-range card from the xx60 series draw 250 W."
I think 350 W is totally false. I mean, the same as a 3090, really? Who even comes up with this stuff? Heck, even "more than a 3070" could be false for all we know. Only time will tell; the rumor mill is certainly in overdrive. I'll keep my expectations tempered until the companies themselves release the specs.
He was being sarcastic. It's because so many people used to whine about the power consumption of Radeon cards and FX processors as their reason for buying GeForce and Core, respectively. Now that Radeon cards and Ryzen processors are the more power-efficient ones, suddenly those same people act like it's no big deal. It's a double standard that's plain as day. I think he's just calling people out on their BS.
Yeah, I think that was clarified in a subsequent post, no worries.
Quote: "Already the 3060 Ti laptop is barely faster than the 3050 due to low clocks to keep thermals in check. Laptops aren't getting more voluminous, so mobile Lovelace will have to be severely crippled to manage heat."
A mobile 4060 would be so down-clocked that I guarantee AMD's Dragon will probably match it while using a quarter of the power. Nvidia's brain-dead let-it-rip approach will see it become less and less relevant in non-gaming-specific laptops.
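A rough way to see why mobile parts end up so far behind their desktop namesakes: dynamic power scales roughly with clock and the square of voltage, so fitting a desktop-class chip into a laptop power budget means cutting both hard. The clock and voltage ratios below are illustrative guesses, not leaked specs:

```python
# First-order dynamic power scaling: P ~ C * V^2 * f.
# Estimate mobile power from a desktop board power after downclocking/undervolting.
def scaled_power(desktop_power_w: float, clock_ratio: float, voltage_ratio: float) -> float:
    """Approximate power after scaling clock and voltage (capacitance unchanged)."""
    return desktop_power_w * clock_ratio * voltage_ratio ** 2

desktop_tbp = 350   # the rumored desktop figure discussed in the article
mobile_estimate = scaled_power(desktop_tbp, clock_ratio=0.70, voltage_ratio=0.85)
print(f"~{mobile_estimate:.0f} W")   # ~177 W, still well above a typical laptop GPU budget
```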
According to coretek on YouTube, ngreedia's next gen would be much more ray-tracing oriented, which I think would mean a lot of RT cores. If that's the case, it would be a quite well-justified power increase. In the 3000 series, RT was just a demo, but if it becomes a real thing, with fully ray-traced games and faster CG rendering, it would be the way to go even at a high power cost. That doesn't mean I'd ever buy any ngreedia product, as the company is much too powerful and not on our side. Better to promote the competition.
We've heard plenty of rumors that the RTX 4090 will have a 600W TBP (total board power), while a flagship RTX 4090 Ti could push it to over 800W. Assuming that's true, and it's certainly starting to look that way, a mid-tier RTX 4060 card reaching 350W isn't beyond the realms of fantasy.
—————————
Yeah, because Moore's Law can't keep shrinking transistors and doubling the count the way it used to, and GPUs are now becoming space heaters.

When you look at the progress of GPUs and CPUs now, it's incremental updates. There's no reason to upgrade every two or three years anymore.

Nvidia shouldn't be bringing out the 4000-series cards now, but in two or three years, when they can shrink the process further and add more transistors to offset the power.

The technology isn't there because Moore's Law is slowing down and has been for the past six years. I've been seeing this with Intel, AMD, and Nvidia: they don't have the technology, so they just draw more power and up the clocks.

Intel CPUs are space heaters, and PC laptops and tablets under $800 are noisy, have terrible battery life, struggle to buffer YouTube videos, and crash loading Google Earth, while the $300 iPad just speeds past them.

The sensationalism every tech website engages in to hype up every new GPU and every new Intel or AMD CPU each year is just terrible, because these are nothing more than incremental updates now.

There's no reason for a new CPU every year or a new GPU every two years anymore. With technology slowing down, it should be every four or five years, not every year. I wish tech websites would just tell the truth.

Intel and AMD have been playing catch-up in the race to 5 nm, while Intel was stuck on 14 nm for six years. And even over the past five to eight years, they have been maxing out the power draw.
I'm sure Apple could make the M1 chip three times faster, but your battery life would be no more than 20 minutes at most, and it would have to sit in a very big tower case rather than be used as a laptop. Apple simply holds back the power draw until the technology is there to go two or three times faster.
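For what it's worth, the battery-life side of that claim is simple arithmetic: capacity in watt-hours divided by average draw in watts. The 100 Wh figure is roughly the common airline limit for laptop batteries; the draws are assumed for illustration:

```python
# Runtime in hours = battery capacity (Wh) / average power draw (W).
def runtime_hours(battery_wh: float, avg_draw_w: float) -> float:
    return battery_wh / avg_draw_w

battery_wh = 100   # roughly the largest battery commonly allowed on a plane
print(f"{runtime_hours(battery_wh, 30):.1f} h at a 30 W average draw")    # ~3.3 h
print(f"{runtime_hours(battery_wh, 200):.1f} h at a 200 W average draw")  # 0.5 h
```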
The 4070 is not going to be two or three times faster than the 3070, the 2070, or the 1070; this is nothing more than an incremental update by Nvidia. I wish tech websites would stop with the sensationalism and tell the truth: technology is slowing down when it comes to GPUs and CPUs.

When the 5000-series cards come out, they'll be drawing over 1,000 watts; that's where the technology is today. I wish tech websites would tell the truth that technology is slowing down and there's no reason to upgrade every year or two.

They should just bring out a new GPU and a new CPU every four years, given where technology is today.
Quote: "Not sure what you mean by more ray-tracing oriented? We already have ray-tracing support."
When I say much more ray-tracing oriented, I mean that the RTX cards we have right now have very poor RT performance, because it was intended more for marketing than for actual use. But soon (or not that soon), games will be RT-only. For that, we need much better RT performance, and that could explain such a power gap.
Oops, I didn't see it. My bad.