Nvidia GeForce RTX 4090 Review: Next-Gen GPU Monster

The 3080 10GB is still the best buy from RTX 30... The 4080 16GB is going to be the worst deal imaginable

3080 owners only have two real options, IMO: either say to hell with it, go all in and get a 4090, or wait for the 5080 in 2024, when the price/performance ratio will be back in sync for the 80-class cards.

...but the one thing they shouldn't do is go from their current 3080 to a 4080. Like you say, terrible deal.

 
"Quite incredibly, this has seen transistor count increase by 170% from 28.3 billion to an insane 76.3 billion."

Really? If you multiply 28.3 by 1.7 you get 76.3? Awesome math skills in this article.
 
Nvidia should have launched the 4080s as well. They must think the 4090 will dominate RDNA3 and are waiting to see AMD's price points for the mid-to-high-end cards.

Honestly, all this "gamesmanship", pun intended, with staggered releases is slowing down purchases. I was ready to buy, but now I must wait to see what RDNA delivers and then the 4080s. So, I'm likely looking at a Black Friday buy. If the 12G 4080 can deliver 80% of the performance at 50% of the price of a 4090, then maybe it's a good buy.

Hopefully, AMD will deliver similar performance at $50-150 less. Then you might get 4080 performance around $650-750. Even that is a high price.
 
You must live in Europe LOL. No one cares about electricity cost for a PC in the USA, granted even here prices have skyrocketed.
I don't know about everyone else here, but my primary complaint about heavy power use is connected to heat generation. Sweating and gaming aren't a good combo.

It's interesting that the actual cost of electricity/energy is very, VERY cheap; environmental regulations and taxes just make it otherwise. Not long ago I was paying 7c/kWh, and that's been heavily increased by coal bans, environmental regulations, massive taxation, etc.
Coal bans? No need to ban it. Solar and wind cost less for new plants. The free market has already spoken on this topic. Check new solar and wind projects in conservative states like Texas.

In a free market the price of electricity would probably be 1 or 2c/kWh. In the EU it's 15-30 times that: 1,500-3,000% inefficiency caused by leftists, environmentalists and the govt, making us all poor every day.
Leftists? LOL You really think a free-market, profit-first corporation would sell you electricity at $0.02/kWh? Last I checked, corporations are in the business of making money, not losing it.

Edit: Some data
 
The review from der8auer is super interesting. He takes a closer look at performance at lower power limits. From his data, it looks like you could make a 300W card with a normal 300W cooler and two regular power connectors and still get awesome performance, no need for all this fancy stuff. It would be interesting to know how much money could be saved that way. Those coolers can't be cheap, and the 4090 is already well into diminishing-returns territory at 450W, so the even more expensive top-of-the-line AIB models that I assume will allow the full 600W probably won't gain much from it.

Constraining 3090 and 6900 cards to 250W was the same story: you lose almost nothing and get a much quieter, cooler card.
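You can even try this at home without touching the BIOS. A minimal sketch of a software power cap, assuming the nvidia-ml-py (pynvml) Python bindings are installed and the script runs with admin/root rights; the 300W target just mirrors the number above:

```python
# Sketch: cap the GPU's power limit in software, the same idea der8auer
# tested. Assumes nvidia-ml-py (pynvml) is installed and we have admin
# rights; the driver clamps requests to the card's allowed range.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementLimit, nvmlDeviceSetPowerManagementLimit,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    print(f"Current limit: {nvmlDeviceGetPowerManagementLimit(gpu) / 1000:.0f} W")
    nvmlDeviceSetPowerManagementLimit(gpu, 300_000)  # value is in milliwatts
finally:
    nvmlShutdown()
```

Same effect as dragging the power slider down in Afterburner, just scriptable.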
 
I'm disappointed with these results. Completely...

...because it IS jaw-dropping: excellent performance and efficiency, 4K60 and so on. Now it is completely justified for early adopters, e-peen chasers and no-lifers to pay $2,000 for this piece of tech, a sum that is too far to reach or fathom for those of us who have more dire needs and can't spare that much for a single GPU.
Look at it this way: the more people buying the 4090, the fewer people buying up the 3000-series cards all the retailers are trying to unload right now.

I normally buy the top-end of every 2nd or 3rd generation and then ride that as long as it remains viable. I have a 1080 Ti and was planning on buying the top-end 4000-series card, but I won't, on principle, at its MSRP. Might go for a 3080/Ti/3090 instead. Or switch teams and pick up something from Sapphire, now that EVGA is out of the game.
 
"Quite incredibly, this has seen transistor count increase by 170% from 28.3 billion to an insane 76.3 billion."

Really? If you multiply 28.3 by 1.7 you get 76.3? Awesome math skills in this article.

It is 170% more, so you multiply by 2.7.

28.3 × 2.7 = 76.41

In other words, 100% more than 100 is 200, i.e. 100 × 2 = 200.

Percentages are hard, I know.
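For anyone still unconvinced, a two-line sanity check in Python, using only the figures from the article quote:

```python
# "Increase by 170%" means new = old * (1 + 1.70), not old * 1.70.
old, new = 28.3e9, 76.3e9          # transistor counts from the quote
print(f"{(new - old) / old:.0%}")  # -> 170% increase
print(f"{new / old:.2f}x")         # -> 2.70x multiplier
```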

 
Monster indeed.
I laughed at Rich's statement during the DF review regarding these and rasterization: 'nvidia have solved graphics'

Waiting for the 4080 16GB numbers, and hopefully for prices to come down in the next 6 or so months.
 
Look at it this way: the more people buying the 4090, the fewer people buying up the 3000-series cards all the retailers are trying to unload right now.

I normally buy the top-end of every 2nd or 3rd generation and then ride that as long as it remains viable. I have a 1080 Ti and was planning on buying the top-end 4000-series card, but I won't, on principle, at its MSRP. Might go for a 3080/Ti/3090 instead. Or switch teams and pick up something from Sapphire, now that EVGA is out of the game.
Yup, going to do just the same; the 1080 Ti sometimes lacks performance at 1440p. Too bad we're trailing Huang's plan: drop the miners' volume of Amperes on the crowd at the prices they should have been selling for two years ago.
 
3080 owners only have two real options, IMO: either say to hell with it, go all in and get a 4090, or wait for the 5080 in 2024, when the price/performance ratio will be back in sync for the 80-class cards.

...but the one thing they shouldn't do is go from their current 3080 to a 4080. Like you say, terrible deal.
Comments like this make me scratch my head.

How is Nvidia getting a free pass and winning by default, without anyone even considering AMD's offerings?
No wonder they charge what they do; they know the rabid followers will pay anything they ask.
 
Amazing work as always, excited to see how the rest of the architecture performs, and also interested to see how AMD responds. Hopefully the 7900 XT is a monster as well.
 
Wow,
Much better than I anticipated (other than price). It just shows how far behind much of the game-engine world is in making use of all the GPU horsepower we have, or, looking at it another way, how future-proof some of these top-end GPUs are likely to be. While I find $1,600 outrageous for a single GPU, there will be people who pay it. In fairness, until we see AMD's offerings there is simply no other card with this kind of performance, and with the RTX 3000 series still sitting on shelves, the pricing is only logical on Nvidia's part in order to move old inventory.

I did what I told myself I wouldn't do and bought a GPU before both AMD's and Nvidia's next-gen launches. But in fairness, I had low expectations for this next gen, my Titan Xp had started showing its age in a select few titles at 1440p, and having picked up the 6900 XT I had wanted off Newegg for $600 flat, it was a deal I couldn't pass up and should tide me over for years to come.

I am excited, however, to see how the 4080 will stack up, even though it will likely carry a hefty price tag; in fact, hundreds more than the 6800 Ultra, 7800 GTX, 8800 GTX, GTX 280, GTX 480, GTX 580, GTX 680, GTX 780, GTX 980, or GTX 1080, all of which were sub-$600 cards aside from the 780. Inflation, it's real. Anyway, RTX 4090, price aside: WOW! It would almost be exciting if it were within the realm of realistic pricing.
 
An MSRP of $1,600 is considered a GOOD value? Well then, I suppose that when the RTX 5090 comes out, if it has a similar performance delta over the 4090 as the 4090 has over the 3090, one could say that $2,200 is worth it.

Are you kidding me? This is a VIDEO CARD with an MSRP of $1,600 USD! How is that a good deal, except in comparison to inflated crypto-mining prices? It's pretty clear that Jensen just looked at what the zombies were willing to pay and thought, "I bet I could get them to pay even more than that!"

Why else would the card hit these new pricing heights?
 
I have to admit that the performance is very, very impressive, and I think it's meant for the 4K high-refresh-rate segment. I don't expect AMD to catch up with this at all, at least not with this 7000 generation, but I hope their prices will be very competitive.

In the end, GPUs like this will only be for a very small number of users. Until game devs can fully harness the horsepower of the current GPU generation, I'm happy and loyal to my 3000 series. Imagine when the time comes that some games list a 4090 as their recommended GPU; I will most probably not let my child buy that 🤣🤣🤣
 
I'm not a fan of Nvidia, but this is a mighty impressive GPU. One of the few times that reality has beaten the hype. I'm never going to buy one, of course... unless a sparkly new Witcher is released before I pop my clogs :))
 
The 3080 10GB is still the best buy from RTX 30: back to selling at MSRP, roughly half the performance of the 4090 at only 44% of the cost. I was lucky enough to pick up a 3080 10GB at MSRP in November 2020. The 4080 16GB is going to be the worst deal imaginable, and the 12GB won't be much better. At $1,200 it will be 40% less powerful than the 4090 but cost only 25% less, and less than 60% faster than the 3080 for a 71% price increase. The 3080 is a two-year-old GPU and will probably be equivalent to a 4070 or 4060 Ti, but at current pricing those GPUs would likely be $700 and $600 respectively. Nvidia must really consider the 3080 10GB pricing to have been a mistake.
I'd say the 3080 12G is the better buy. I'm seeing them at $750-ish whereas the 10G versions are still above $700. Pricing is, of course, all over the map and making absolutely no sense.

I'm thinking the 4080 16G will fall between the 3090 and 3090 Ti performance-wise, at about 80% of the 3090's MSRP. The 12G 4080 will likely fall between the 3080 Ti and 3090 at about 60% of the price of the 3090. So, really, it doesn't look that bad. Now, whether the prices are appropriate or not is a different discussion.
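If anyone wants to sanity-check the napkin math in the quoted post above, a quick sketch using only its own figures (performance normalized to the 4090 = 1.0; the 0.60 entry is the post's "40% less powerful" estimate, not measured data):

```python
# Perf-per-dollar from the quoted post's numbers: MSRP in USD and
# performance relative to the 4090 (the post's estimates).
cards = {
    "RTX 3080 10GB": (700, 0.50),
    "RTX 4080 16GB": (1200, 0.60),
    "RTX 4090":      (1600, 1.00),
}

for name, (price, perf) in cards.items():
    print(f"{name}: {1000 * perf / price:.2f} perf points per $1,000")
```

By that yardstick the 3080 10GB leads, the 4090 comes second, and the 4080 16GB lands dead last, which is exactly the post's point.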
 
Arguably the first true 4K card with ray tracing. It took a few generations but the card where you can have everything is finally here.

It's fast enough to smoke through Cyberpunk at 4K plus RT Ultra WITHOUT any upscaling involved, and you're still exceeding 60 FPS.

I know DLSS 2.0 works superbly in that game in quality mode, so enabling DLSS 3.0 and getting even higher framerates at 4K is a no-brainer. However, it's still quite an eye-opener how long it took to hit that lofty performance target and just how much graphics performance you need.
 
I'd say the 3080 12G is the better buy. I'm seeing them at $750-ish whereas the 10G versions are still above $700. Pricing is, of course, all over the map and making absolutely no sense.

I'm thinking the 4080 16G will fall between the 3090 and 3090 Ti performance-wise, at about 80% of the 3090's MSRP. The 12G 4080 will likely fall between the 3080 Ti and 3090 at about 60% of the price of the 3090. So, really, it doesn't look that bad. Now, whether the prices are appropriate or not is a different discussion.

I don't know. The 4080 16GB is nearly the size of the 3090 in core count, and it has around a 50% frequency bump. It will be significantly faster than a 3090 Ti, almost certainly in the 25-30% range. I'd expect the "4070" to be around 3090 level.
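A rough back-of-envelope for that guess, as cores times clock. The core counts and boost clocks below are my assumptions from public spec listings, and the naive product ignores IPC, memory bandwidth and cache differences, so treat it as an upper bound rather than a prediction:

```python
# Naive shader-throughput estimate: CUDA cores x boost clock (GHz).
# Spec values are assumptions from public listings, not from this review.
specs = {
    "RTX 3090":      (10496, 1.70),
    "RTX 4080 16GB": (9728, 2.51),
}
tp = {name: cores * ghz for name, (cores, ghz) in specs.items()}
print(f"4080 16GB vs 3090: {tp['RTX 4080 16GB'] / tp['RTX 3090']:.2f}x")  # ~1.37x
```

Shave off the usual scaling losses and landing 25-30% over a 3090 Ti doesn't look far-fetched.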
 
Look at it this way: the more people buying the 4090, the fewer people buying up the 3000-series cards all the retailers are trying to unload right now.

I normally buy the top-end of every 2nd or 3rd generation and then ride that as long as it remains viable. I have a 1080 Ti and was planning on buying the top-end 4000-series card, but I won't, on principle, at its MSRP. Might go for a 3080/Ti/3090 instead. Or switch teams and pick up something from Sapphire, now that EVGA is out of the game.
Yeah, well, I hoped for something as efficient and price-attractive as Pascal was. Ada's performance and efficiency seem to be about there, but not the pricing.
 