Next-gen GPUs look big and hungry, and that's bad news

My 2060 is power limited to 180 W.

The 3090 I got to play around with seemed capped at 430, or at least that's the most I (and my brother who currently uses it) saw it use.
Some exotic 3090s do more, with BIOSes allowing 450 to 500 watts. Compare that to next gen, where that's the baseline for the high-end flagships 😳.
I'm able to use a 750 watt 80 Plus Platinum SFX PSU for my 3090 XC3 Ultra Hybrid at 2 GHz, with a 9900KS at 5.1 GHz all-core and 4 GHz G.Skill RAM.
Might have to take out my old Antec 1200 PSU for next gen.
 
A 6900XT can easily spike to 500 watts too. I have seen 650 watt spikes on some custom 6900XT AIO boards... and 550-600 on some 3090s... This is on 3x8-pin cards obviously; hell, I think there are even 4x8-pin custom cards...

Undervolting is a thing though... a 3090 and a 6900XT can easily be kept below 400 watt spikes while keeping performance on par with a factory OC.

Top cards are pointless for most people anyway; both the 3090 and 6900XT have TERRIBLE performance per dollar. Why pay like 100% more for 5-10% more performance and higher power draw?
Yes, I have seen my ASRock Phantom Gaming Radeon 6900XT spiking to 500W. You are right.
 
It appears that I am learning something new here. When you all say “spike” do you mean like a split second peak or spiking and then staying very high for a longer time? The worst peak I can find during gaming in a review is around 450 watts, and it was for under a second.

Also, are these “peaks” actual GPU draw tests or total draw at the wall?
 
It appears that I am learning something new here. When you all say “spike” do you mean like a split second peak or spiking and then staying very high for a longer time? The worst peak I can find during gaming in a review is around 450 watts, and it was for under a second.

Also, are these “peaks” actual GPU draw tests or total draw at the wall?
Yes, peaks for a very short time.
 
Next gen GPUs??? I'm still waiting for this generation of GPUs to be available at MSRP.
 
It also doesn't help that it will push your electricity bill up by a fair bit if you game regularly, and it will mean you have to spend money on a larger power supply, where getting good 80 Plus efficiency is important since your load at idle (I assume) would be well below max capacity.
It's not going to be a significant increase to your electric bill.
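As a rough sanity check, here's a minimal sketch of the math with made-up numbers: a hypothetical extra 150 W of GPU draw, 20 hours of gaming a week, and 15¢/kWh (none of these are from anyone's actual bill):

```python
# Back-of-envelope monthly bill impact of a hungrier GPU.
# All inputs below are illustrative assumptions, not measurements.
extra_watts = 150        # extra draw vs. the old card, in watts
hours_per_week = 20      # gaming hours per week
rate_per_kwh = 0.15      # electricity price in $/kWh

extra_kwh_per_month = extra_watts / 1000 * hours_per_week * 52 / 12
extra_cost_per_month = extra_kwh_per_month * rate_per_kwh
print(f"~{extra_kwh_per_month:.0f} kWh/month extra, about ${extra_cost_per_month:.2f}/month")
# ~13 kWh/month extra, about $1.95/month
```

Even if you double those figures it stays well under $10 a month.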

This isn't even out and already banned in the great state of California 🤣🤣🤣
It's not banned in California since its memory bandwidth will likely be well above the minimum.
 
It appears that I am learning something new here. When you all say “spike” do you mean like a split second peak or spiking and then staying very high for a longer time? The worst peak I can find during gaming in a review is around 450 watts, and it was for under a second.

Also, are these “peaks” actual GPU draw tests or total draw at the wall?
My previous computer setup had an ASUS Thor PSU which showed how much power the system was pulling, and even with a 5950X, a 3090, and 64GB of RAM it rarely pulled over 300W. It would only pull full power if the computer was running CPU and GPU stress tests at the same time. Most real-world applications do not stress all of a computer's components.
 
OK. Let's put things into perspective.

A 350W 3090 vs a 365W GTX 590. Sure, the 590 had two GPUs on it, but you couldn't play a single new game on it today. 24GB of VRAM vs 3GB. The 3090 does the same amount of work as four 1080 Tis using ONLY 350W, not 1kW+, and it doesn't choke on very high-res renders. I have no issue with one card drawing 350W if it can do the work of multiple VGAs from the previous generation.
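For scale, a quick sketch of that comparison, assuming roughly 250 W per 1080 Ti and taking the "work of four cards" claim above at face value:

```python
# One 3090 at 350 W vs. four 1080 Tis at ~250 W each,
# assuming (per the post above) roughly equal total throughput.
power_3090_w = 350
power_quad_1080ti_w = 4 * 250   # ~1000 W

print(f"3090 draws {power_3090_w / power_quad_1080ti_w:.0%} of the power "
      f"for the same work, i.e. ~{power_quad_1080ti_w / power_3090_w:.1f}x the perf/watt")
# 3090 draws 35% of the power for the same work, i.e. ~2.9x the perf/watt
```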

A 500W 3090? That's ludicrous. Anyone who OCs a 3090 is insane.

As a curiosity: I have one 3D garden model which chokes the 3090 to death. I thought the 1080 Tis were weak, but it seems the number of reflections, surfaces, leaves and all the other organic matter is so enormous that it's enough to bring even a 3090 to its knees when doing a live render in the viewport.

Exactly. Since SLI/CF no longer exists, these cards should be compared to earlier generations' SLI/CF setups, not single cards. They also cost the same as SLI/CF from earlier generations. If you don't like the price or power consumption, buy a video card that fits your budget and the amount of power you want, instead of complaining that top performance costs $$$ and uses a lot of power. I worry more about actually getting a card @MSRP.
 
Maybe a lot of this is because the buzz around cards, even from the media, is about pure speed; sometimes price-to-performance comes up, but in reality, although power draw is benchmarked, very rarely are these things front and centre.

I think more focus should be on efficiency, e.g. this PC is 2x more powerful as a unit but pulls 5 times more power. I think as a whole rig this is important.
 
It also doesn't help that it will push your electricity bill up by a fair bit if you game regularly, and it will mean you have to spend money on a larger power supply, where getting good 80 Plus efficiency is important since your load at idle (I assume) would be well below max capacity.

To be fair, the most crucial component for system stability is your PSU, especially if you overclock.

I remember the first component to ever die on me was my crappy no-name PSU, and it took the motherboard with it. :( Never again, I said! I'll gladly spend the extra cash for a premium PSU; they come with 10-year warranties now.

I bought a Seasonic Titanium-efficiency PSU 7 years ago and haven't had a single problem (it has seen its way through 2 upgrades during that time). Just buy it and forget about it.
 
Power consumption was already an area of concern for Ampere and serious concern at that:
PC World: Do you need a new power supply for Nvidia’s GeForce RTX 3080?
Nvidia's RTX 3000 Power Supply Requirements Amp Up PSU Shortage Concerns
RX 6000-series cards aren't exactly light on the power usage either but with the RX 6800XT using about the same juice as the RTX 3070, it was hard to complain about. Now it's easy as hell to complain about if it's going to be north of 400W.

This isn't even really an advancement in tech, it's just souping up what's already there. It's like taking a 2.5L L4 engine that puts out 150hp and then succeeding said engine with a 5.0L V8 that puts out 300hp but uses twice the fuel. That's not an advancement in tech, that's just using more of what you already have.
 
Also doesn't help that it will push your electricity bill up by a fair bit if you game regularly, and will mean you have to spend money on a larger power supply where getting good 80 plus efficiency is important as your load at idle (I assume) would be well below max capacity.
It depends on where you live of course. In Ontario, our hydro rates depend on the time of day and the season:
SUMMER: [chart: summer time-of-use rate schedule]

WINTER: [chart: winter time-of-use rate schedule]

Off-Peak: 8.2¢/kWh
Mid-Peak: 11.3¢/kWh
On-Peak: 17¢/kWh
(Prices are in CAD, Weekends and Holidays are 100% Off-Peak/Green)


So, yeah, we'll be alright with the power use with an overall average of about [($0.082*108)+($0.113*30)+($0.17*30)]/168 = 10¢/kWh CAD
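For anyone who wants to check that arithmetic, here's the same weighted average as a quick script (using the 108/30/30 hour split over a 168-hour week from the line above):

```python
# Weighted-average Ontario TOU rate over a 168-hour week (CAD cents/kWh).
rates_cents = {"off-peak": 8.2, "mid-peak": 11.3, "on-peak": 17.0}
hours       = {"off-peak": 108, "mid-peak": 30,   "on-peak": 30}   # sums to 168

avg = sum(rates_cents[k] * hours[k] for k in rates_cents) / sum(hours.values())
print(f"~{avg:.1f} cents/kWh")   # ~10.3 cents/kWh
```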
I know that Europeans pay more than double what we do for reliable electricity. People with privatised power grids are really screwed because, as Texas demonstrated, a privatised grid is unreliable and (at least in Texas), they pay literally 50% more on average than people in Ontario at 15¢/kWh CAD. For some of these people (don't know exactly if it's a lot or a little), that increased power consumption will be a factor in their choice of PC parts.
 
Graphics processing is overtaking the CPU. I've predicted for years that things are going to flip to where the mobo supports a socketed GPU and the CPU will move to be an add-in card.
 
This isn't even out and already banned in the great state of California 🤣🤣🤣


How long are y'all gonna keep posting this "joke", aka incorrect and badly reported BS?

DIY isn't affected AT ALL!

It was literally only one PC from one company, because they didn't want to spend the extra money to redesign their PSU to fall within spec.

Just stop with this dumb joke.
 
Both nvidia and AMD are colluding to disregard low-end GPUs and force everyone in that segment (which is a lot of people) to turn towards the used market... unfortunately.


You can call it "colluding", but the fact is that the market that was once only inhabited by the typical "gamer/geek/tech/etc" is now a much more mainstream pastime, and with that explosive growth in popularity the days of having to price everything at bottom-barrel levels are over... Heck, they used to throw in 3 brand-new AAA games just to get us to upgrade (at the same prices they had held for generations, at that), but today the market dictates through basic supply and demand that they can no longer operate this way and still maximize their business.

Unless you actually think a publicly traded company is going to go out of its way to sabotage its success?

The market (popularity) of pc gaming is what's pushed the bottom out of getting brand new silicon as well as the huge performance gains we've been seeing.

A used $400 card from one generation back or so, for $200, is still likely to be better than that $200 GPU you bought 5 years ago by a long shot.

It sucks to be in that spot, but as we've seen the market grow and grow, it's essentially moving up tiers and the lower ones are going to fall off, because at the end of the day there is only so much silicon to go around, and to pretend these companies would go out of their way to make LESS money is illogical.
 
A used $400 card from one generation back or so, for $200, is still likely to be better than that $200 GPU you bought 5 years ago by a long shot.
Please tell me when you find that unicorn $400 GPU (ex. 2060 Super/ 5700XT) for $200 used, or even $300. Today they are even more expensive than they were brand new, by a factor of 2x or more, on the used market.

There is no solution for the low end anymore, not even the used market. That is nvidia's and AMD's reasoning in their decision to eliminate new low-end GPUs, but it does not mean it's viable.

It's as bad as it can be and it does not look like it will get better soon, or ever; even if it does get better at some point, it will not be much better than it is now.

Unfortunately we will never go back to 2019 state of things, not only GPUs and prices, but everything.
 
Power consumption was already an area of concern for Ampere and serious concern at that:
PC World: Do you need a new power supply for Nvidia’s GeForce RTX 3080?
Nvidia's RTX 3000 Power Supply Requirements Amp Up PSU Shortage Concerns
RX 6000-series cards aren't exactly light on the power usage either but with the RX 6800XT using about the same juice as the RTX 3070, it was hard to complain about. Now it's easy as hell to complain about if it's going to be north of 400W.

This isn't even really an advancement in tech, it's just souping up what's already there. It's like taking a 2.5L L4 engine that puts out 150hp and then succeeding said engine with a 5.0L V8 that puts out 300hp but uses twice the fuel. That's not an advancement in tech, that's just using more of what you already have.
It depends. Most rumours put the performance at around 2.5x which should in theory make it more efficient in terms of perf/watt.
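To put a rough number on that: a minimal sketch where the 2.5x figure is the rumour and the 350 W → 450 W board-power jump is my assumption, not a spec:

```python
# Perf/watt check under assumed numbers: rumoured 2.5x performance uplift
# against a hypothetical 350 W -> 450 W board-power increase.
perf_uplift = 2.5
old_power_w, new_power_w = 350, 450

perf_per_watt_gain = perf_uplift / (new_power_w / old_power_w)
print(f"~{perf_per_watt_gain:.2f}x perf/watt")   # ~1.94x perf/watt
```

So even if the absolute draw goes up, efficiency could still improve if those rumours hold.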
 
Those are defective boards. I can't even see how a GPU that has a 300-350 watt TDP could survive that kind of power draw. And the 3090s are only about 45 watts higher max.


I'm not posting that as AMD vs Nvidia. This link is an excellent write-up on power draw.
I said spikes, and yes, they can;

620 watt spike

545 watt spike

508 watt spike

531 watt spike

So yeah..

They will spike even higher for people who overclock without undervolting.
 
Please tell me when you find that unicorn $400 GPU (ex. 2060 Super/ 5700XT) for $200 used, or even $300. Today they are even more expensive than they were brand new, by a factor of 2x or more, on the used market.
That's a fact. I was damn lucky that I got my RX 5700XT last August. If I hadn't found it for $90 CAD cheaper than the going rate, I might still be on my R9 Fury. The R9 Fury played all my games well enough to enjoy, but going forward it could have been difficult, because there's no way I'd be willing to bend over and take it for the prices that are out there right now.
 
I just upgraded my GPU (RTX 3080) and CPU (5600X), not to mention RAM (4000MHz CL16) that needs higher voltage. My 750W PSU seems to be managing, but just to play it safe I ordered an 850W EVGA, and now I'm wondering if I should have got the 1000W to handle any future upgrades.
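For what it's worth, here's a rough headroom sketch; the component wattages and the 1.5x spike allowance below are ballpark assumptions, not measurements of this build:

```python
# Rough PSU headroom check for an RTX 3080 + 5600X build.
# Component draws and the transient multiplier are assumed ballpark figures.
gpu_w  = 320    # RTX 3080 board power (spec TGP)
cpu_w  = 80     # 5600X package power limit is roughly this under all-core load
rest_w = 60     # motherboard, RAM, fans, drives, etc.
spike  = 1.5    # allowance for short GPU power excursions

sustained  = gpu_w + cpu_w + rest_w
worst_case = gpu_w * spike + cpu_w + rest_w

for psu_w in (750, 850, 1000):
    print(f"{psu_w} W PSU: sustained {sustained/psu_w:.0%}, "
          f"spike {worst_case/psu_w:.0%} of capacity")
```

By that math the 850W already has comfortable margin; the 1000W mostly buys future-proofing.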
 
I said spikes, and yes, they can;

620 watt spike

545 watt spike

508 watt spike

531 watt spike

So yeah..

They will spike even higher for people who overclock without undervolting.
Yes, I know; someone already pointed it out.
 