Nvidia GeForce RTX 4090 Review: Next-Gen GPU Monster

Ohnooze

Posts: 445   +821
Since when did power demand become a much less important issue?
I mean, it's talked about, but it seems to matter less these days.
Are we just going to keep making cards larger and throwing more power at them to get the gains? Seems weird to me.
 

AlaskaGuy

Posts: 744   +612
From the Nvidia site. The RTX 4080 16GB looks to be a beast.

 

Bobbydpue

Posts: 404   +267
I truly and sincerely hope that the titles of the upcoming RDNA3 reviews are as good and positive as this one, but I know better; some salt or a pinch of negativity will be added, as usual.
When you say a pinch of negativity, do you mean objective criticism? Did you miss the part where the price of the 4090 was criticized?
 

Bobbydpue

Posts: 404   +267
Since when did power demand become a much less important issue?
I mean, it's talked about, but it seems to matter less these days.
Are we just going to keep making cards larger and throwing more power at them to get the gains? Seems weird to me.
If you look at the amount of work done for the amount of power used, modern hardware is a lot more efficient than it was 10 years ago.
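To put a rough number on that point, here is a perf-per-watt sketch. The scores and wattages below are purely illustrative assumptions, not benchmark data:

```python
# Rough perf-per-watt comparison with illustrative numbers (assumed, not measured):
# say a 2012-era card scores 100 "perf units" at 195 W, and a modern
# card scores 1200 units at 450 W.
old_perf, old_watts = 100, 195
new_perf, new_watts = 1200, 450

old_eff = old_perf / old_watts   # perf units per watt, older card
new_eff = new_perf / new_watts   # perf units per watt, modern card

# even though absolute power rose, work done per watt rose far more
print(f"efficiency improved {new_eff / old_eff:.1f}x")  # → 5.2x
```

With those made-up figures, total power more than doubles but efficiency still improves over 5x, which is the "amount of work per watt" argument in a nutshell.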
 

Geralt

Posts: 1,320   +2,149
The main problem with this card is the size. I have a full tower with an EATX mobo and I'm still tight on length. It's very thick too, covering my second PCIe x8 slot, where I have an Asus RAID adapter. And it's heavy, needing a support stick to prevent damage to my mobo. It's a hefty brick, really. Impractical. And the third-party cards are even bigger! People with a mid tower... good luck installing this monstrosity.
It's hot and power hungry too, even though Steven tried hard to hide that fact. And it's extremely expensive, even though Steven again tried hard to justify its exorbitant price. Paying this much for a card just to play games is totally insane. There are no miners inflating prices now. Nonetheless, the author seems to find the price "reasonable".
All in all, it takes too much effort to get and own a card like this, hefty, bulky, hot and power-hungry, only to play games at 5000967 fps.
The article, as usual, is biased toward Nvidia. This happens on many tech sites anyway. Performance is not the sole factor when judging a graphics card.
Waiting for some common sense from RDNA 3.
 

Ohnooze

Posts: 445   +821
If you look at the amount of work done for the amount of power used modern hardware is a lot more efficient than 10 years ago.
But we are still using a lot more power. When AMD's Vega series came out, everyone was up in arms because it required a 500 W PSU minimum. I'm just using this as an example, but that was in 2017... only 5 years ago. We are up to 850 W now and people are justifying it. Now, the Vega series wasn't exactly stellar, but the power consumption was not acceptable.
I remember about 3 years ago, on a PC hardware forum, asking what PSU I should buy, and everyone said anything over 600 W was overkill because cards would only get more efficient. Thankfully I did not take their advice and went with 850 W. I'm not one of those people overly concerned with power, to be honest, but it seems like they need to rein it in a bit. Are we just going to keep making cards bigger and more power hungry to get the biggest gains possible? No wonder the gains are so drastic... wrong direction, imo.
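For the PSU-sizing question, a common back-of-the-envelope approach is to add the GPU's board power to the rest of the system and leave headroom for transients. The numbers below are assumptions for illustration, not a spec:

```python
# Back-of-the-envelope PSU sizing (rule of thumb, not a specification):
# total sustained draw = GPU board power + rest of system,
# then leave ~30% headroom for transient spikes and efficiency.
gpu_watts = 450        # e.g. RTX 4090 board power
system_watts = 250     # assumed CPU + motherboard + drives + fans

sustained = gpu_watts + system_watts   # 700 W sustained under full load
recommended = sustained * 1.3          # add ~30% headroom

print(f"sustained ~{sustained} W, recommend ~{round(recommended)} W PSU")
```

Under those assumptions you land around 900 W, which is roughly why 850-1000 W units get recommended for this class of card.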
 

EdmondRC

Posts: 447   +655
But we are still using a lot more power. When AMD's Vega series came out, everyone was up in arms because it required a 500 W PSU minimum. I'm just using this as an example, but that was in 2017... only 5 years ago. We are up to 850 W now and people are justifying it. Now, the Vega series wasn't exactly stellar, but the power consumption was not acceptable.
I remember about 3 years ago, on a PC hardware forum, asking what PSU I should buy, and everyone said anything over 600 W was overkill because cards would only get more efficient. Thankfully I did not take their advice and went with 850 W. I'm not one of those people overly concerned with power, to be honest, but it seems like they need to rein it in a bit. Are we just going to keep making cards bigger and more power hungry to get the biggest gains possible? No wonder the gains are so drastic... wrong direction, imo.
I guess, in a way, those options are there if you don't mind a small reduction in performance. Both RTX 30 and RTX 40 can run very efficiently with less power. Both are pushed beyond their efficiency curves to maximize performance at the cost of power consumption. If efficiency is a bigger concern, a little tweaking gets you much better power consumption and thermals while still retaining 85-95% of overall performance. This clearly works, because these same chips run in laptops at a third the power while usually maintaining something like 65-70% of their desktop counterparts' performance. So it's not like you absolutely have to have a 1000-watt PSU to run an RTX 4090; you could definitely run it on a 750-watt, and perhaps even a 650-watt PSU, if you drop the power limit and sacrifice some performance. Probably not recommended, but there's no reason it can't be done.
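The efficiency-curve point can be sketched with arithmetic. The 90%-performance figure is taken from the ballpark in the post above; the 320 W cap is an assumed example, not a measured configuration:

```python
# Illustrative power-limiting trade-off (assumed figures, not measurements):
# stock card at full board power vs. a power-limited configuration
# that retains ~90% of performance, per the ballpark range above.
full_perf, full_watts = 1.00, 450        # stock RTX 4090, perf normalized to 1.0
limited_perf, limited_watts = 0.90, 320  # assumed ~90% perf at an assumed ~320 W cap

full_eff = full_perf / full_watts        # perf per watt, stock
limited_eff = limited_perf / limited_watts  # perf per watt, power-limited

gain = limited_eff / full_eff - 1        # relative efficiency improvement
print(f"perf/watt gain from power limiting: {gain:.0%}")
```

With those assumptions you give up 10% performance but gain roughly a quarter more performance per watt, which is why running below the stock limit is attractive for efficiency-minded builds.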
 

Geralt

Posts: 1,320   +2,149
I guess, in a way, those options are there if you don't mind a small reduction in performance. Both RTX 30 and RTX 40 can run very efficiently with less power. Both are pushed beyond their efficiency curves to maximize performance at the cost of power consumption. If efficiency is a bigger concern, a little tweaking gets you much better power consumption and thermals while still retaining 85-95% of overall performance. This clearly works, because these same chips run in laptops at a third the power while usually maintaining something like 65-70% of their desktop counterparts' performance. So it's not like you absolutely have to have a 1000-watt PSU to run an RTX 4090; you could definitely run it on a 750-watt, and perhaps even a 650-watt PSU, if you drop the power limit and sacrifice some performance. Probably not recommended, but there's no reason it can't be done.
If that's the case, I'd rather buy another card that is less hefty and less costly, and the problem is solved without fiddling with performance settings. RDNA3 might be that solution, maybe. We need to see what AMD brings to the table.
 

EdmondRC

Posts: 447   +655
If that's the case, I'd rather buy another card that is less hefty and less costly, and the problem is solved without fiddling with performance settings. RDNA3 might be that solution, maybe. We need to see what AMD brings to the table.
Well, yes, a smaller die with fewer cores can run at a faster clock and lower power. So the 250-watt 3070 is going to run a higher clock than a 3080 with its power reduced to 250 watts. That said, the 3080 would still significantly outperform the 3070 based on what I have seen of the 3080's efficiency, but the delta would shrink and price-to-performance would make the 3070 the clear winner.

As for AMD, we will just have to wait and see. I truly hope they can offer some strong competition. Nvidia clearly pulled out all the stops with the 4090 to have the most powerful card, and I think they will keep that crown. AMD has an opportunity, though, to offer a more balanced lineup and a better price/performance ratio. If they could also introduce their own version of ML upscaling and upgrade their RT, they could really bring the heat.
 

meric

Posts: 398   +398
Seriously, the rumors for this new architecture were bad, but boy, I'm impressed. I really did not expect this much from Nvidia. At best I was expecting an inefficient architecture with huge power usage and a smaller performance leap, but it turns out Ada is even better than the last gen. I know a 450 W TDP doesn't look good, but perf/watt is stellar. Still, I'd like to see very low-power sub-$200 cards from Nvidia/AMD this gen.
 

Marco Mint

Posts: 30   +46
Meanwhile in Europe, where I will be buying my next build, an Asus 6900XT is currently €740 and an Asus 4090 is €2400. When I think about that, I just find it ridiculously funny.
 

loki1944

Posts: 699   +523
Eh, not really, unless you're OK with DLSS Balanced or lower in games that support it. It lacks the grunt and VRAM to avoid stuttering/reduced texture quality and maintain at least 60 FPS in many games at 4K.
Eh, really; you can see it in the AnandTech review, which was more comprehensive and included 4K.