Nvidia GeForce RTX 5090 Review

Not anything terribly surprising, but thanks for the review nonetheless. The gains would be OK if it weren't for the $400 price increase, but even if it were the same price, I don't think many people will be upgrading from the 4000 series to the 5000 series. Maybe from the 3000 series, given the limited VRAM those cards got. I'd like AMD to swoop in with a great value proposition at the high end, but I just don't see that happening.
 
Thanks for the review. I'm looking forward to the 5080 review; that's the card I'm considering for my new build in April, coming from a 2070 Super which has served me admirably.
If the 5080 is poor on performance and value for money, I may delay my whole build till the next generation. I'd be looking at moving around that time, hopefully to a better home office, and it might be nice to time that with a fresh new build....

We'll see...
 
Well, now it seems like the 5070 will be a major failure.

I am not sure it will be able to beat the 4070 SUPER.

It might actually be the opposite. Gamers often buy based on price-to-performance.

To me, it doesn’t matter how the frame rates are achieved. If the result is 200 FPS, then that’s the measurable metric, and for many gamers, that’s what counts.

That said, we are moving past the era of ‘bigger, hotter, brute force’ GPUs with the rise of AI-driven tech.

Now, the 5070 at an MSRP of $549 delivers 4090-level performance in select titles using 4x DLSS Frame Generation; that's a massive win in my book. Sure, call it a marketing ploy, but there's nothing wrong with that. Regardless of how it achieves the performance, the end result is what matters, and at that price point, it's a smart move for both Nvidia and gamers.
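As a rough back-of-envelope (the 50 FPS base figure is made up, chosen so the math lands on the 200 FPS example from earlier):

```python
# What 4x DLSS Frame Generation does to displayed FPS, roughly.
rendered_fps = 50                  # frames the GPU actually renders per second (hypothetical)
generated_per_rendered = 3         # 4x mode: three AI frames per real frame
displayed_fps = rendered_fps * (1 + generated_per_rendered)
print(displayed_fps)               # -> 200, but inputs are still sampled ~50 times/s
```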

We’ll wait for DLSS 4-specific reviews to see how well it performs in practice, but if the performance increment is as significant as expected, it will be a game-changer going forward. Imagine the 60-series with a node change AND an even better DLSS 4 implementation.
 
Great review as always... it is what it is, whether anybody wants it or not...

What surprises me a little is how good the 7900 XTX is in comparison, unless you are using RT, where it sucks.

In my country it was literally half the price of the RTX 4090 last year... on the second-hand market it is almost 1/3 the price of a used 4090! Crazy...

Yes, the performance of the 7900 XTX was a nice reminder that (as long as you disable RT) it's a great-performing card, especially at 1440p.

Conveniently, that's the card I've got in my system. So no upgrades for me for a while yet.
 
As someone who plans to upgrade to 8K as soon as it becomes viable: it's pointless on screens of ~32 inches or smaller. Perhaps if you want a 60" monitor then 8K makes sense. However, it won't make sense until we see 8K120 displays. And the major reason I haven't upgraded to an 8K60 display yet is that the onboard processing increases LATENCY. If you look at input lag on the same display going from 1080p to 4K, you see it go from ~2 ms to ~8-10 ms depending on the manufacturer. On 8K displays, the latency is between 70-100 ms.
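For a sense of scale, here's a quick sketch (my own arithmetic, using the rough latency figures quoted above) converting that processing latency into whole frames of delay:

```python
# Convert display processing latency into frames of delay at a given refresh
# rate. The latency figures are the rough ones quoted above and vary by panel.

def frames_of_delay(latency_ms: float, refresh_hz: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz
    return latency_ms / frame_time_ms

for label, latency_ms in [("4K, ~9 ms", 9.0), ("8K, ~85 ms", 85.0)]:
    for hz in (60, 120):
        print(f"{label} at {hz} Hz: ~{frames_of_delay(latency_ms, hz):.1f} frames behind")
# 4K at 60 Hz is roughly half a frame behind; 8K at 60 Hz is ~5 frames behind,
# and ~10 frames behind at a hypothetical 120 Hz.
```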

8K displays have a long way to go before they are game-capable. For now, they're only good for watching movies.

I almost bought the Samsung QN800C last year. Best Buy let me connect my laptop to it and test it out. Even in game mode, the latency was TERRIBLE.

Totally agree with the state of the market. But consider this: unless an interest in gaming beyond 4K is demonstrated through customer interest, which itself would be increased through benchmarks, manufacturers won’t have much incentive to make 5K, 6K, or 8K displays that are good for more than just content creation/consumption. This is exactly what happened back in the day with anything over 1200p (we’re talking the mid-'00s). People were gaming at 640p, 720p, 900/1050p, or 1080/1200p and using 1600p for professional work. I remember watching a review of the freshly launched 8800 GTX with 2560x1600 benchmarks, which were considered groundbreaking at the time. That 1600p monitor cost a whopping $999 and came with a miserable 30 Hz refresh rate BECAUSE the market for those resolutions was strictly professional!

Is >4K gaming currently a niche? Sure. But it’s still very interesting from an academic standpoint. In addition to seeing just what 32GB of VRAM is capable of producing relative to the prior gen, those higher resolutions would also enable reviewers and readers to study DLSS/FSR/XeSS behaviors more closely than before, and for those technologies to further differentiate from each other.

So. Pour one out for the future, eh?
 
I suppose this makes me feel alright about my 4090. I will be skipping the 5000 series.
I was going to go 50-series but started looking at last-gen builds last night... a bit pissed, TBH, because I thought I could have it all in this gen, with great video and AI performance. Was hoping to train models and game... but if I have to choose... I'll use the money I save to farm out any AI compute I need.

Or maybe just wait a few months for the real release of the TIs.
 
I so much wanted this card to be a beast so I could use the 240 Hz refresh rate of my 57-inch monitor, but apparently it won't be able to deliver good FPS at 7680x2160 @ 240 Hz.
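For scale, a quick calculation of the raw pixel throughput that panel demands (the numbers are just the resolution and refresh rate named above):

```python
# Raw pixel throughput needed for the 57-inch panel mentioned above.
width, height, hz = 7680, 2160, 240
pixels_per_frame = width * height          # ~16.6 million pixels
print(pixels_per_frame / (3840 * 2160))    # -> 2.0: two full 4K panels side by side
print(pixels_per_frame * hz / 1e9)         # -> ~3.98 gigapixels/s to fill 240 Hz
```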

Feels like the old days with the GTX cards, each generation barely pushing 10-15% more performance.
They said AI would make everything better. Well, it didn't! Damn you, AI!!!

1 more thing AI ruined....
 
A touch off topic, since the review did not cover DLSS (coming in a later article). This is my best understanding of DLSS 3. I am not up to speed on how DLSS 4 changes things, but I assume the concept is the same. I don't claim to be an expert, and cutting through the marketing BS is not easy.

Deep Learning Super Sampling (DLSS) uses "AI" to boost gaming performance and visual quality. It consists of two main features:

1) Upscaling
- Renders the game at a lower resolution, then uses a trained neural network to upscale it to a higher resolution.
- Enhances anti-aliasing and detail, allowing higher framerates without requiring as much GPU power as native resolution.

2) Frame Generation
- Inserts extra frames that are AI-generated rather than fully rendered. This can effectively double (or more) the displayed framerate.
- Crucially (and the contentious part), it works concurrently with the GPU’s rendering. Once a new real frame is finished, DLSS holds it back, generates an intermediate frame between it and the previous real frame, and displays the generated frame first. Meanwhile, the GPU is busy rendering the subsequent “real” frame.

Because these "AI frames" are extrapolations, they rely on motion vectors, optical flow, and depth data to guess how the scene and user inputs evolve between rendered frames. Although user commands influence the game engine’s updates (and thereby the motion vectors), the AI’s predictions still reflect slightly older data, so Frame Generation does not reduce input lag. In fast-paced games, this can lead to a sense of detachment—your inputs are processed, but the AI frame you see may not perfectly capture your latest actions.

Despite this limitation, DLSS Frame Generation provides a substantial framerate boost as perceived by the player. Coupled with DLSS Upscaling, it can enable higher resolutions and potentially smoother visuals, balancing real-time performance and image fidelity through AI-driven techniques. How useful DLSS is to you really depends on your desired "balance".
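As a toy illustration of the interpolation idea described above, here is a naive numpy sketch of my own; real Frame Generation uses a trained network plus dedicated optical-flow hardware and handles occlusions, so this captures only the bare warping concept:

```python
import numpy as np

# Naive sketch: warp the older of two rendered frames halfway along its
# per-pixel motion vectors to synthesize an in-between frame.

def warp_halfway(frame: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Shift every pixel half of its motion vector (nearest-neighbor scatter)."""
    h, w = frame.shape
    out = np.zeros_like(frame)
    ys, xs = np.mgrid[0:h, 0:w]
    # motion[..., 0] = dx, motion[..., 1] = dy, in pixels per rendered frame
    tx = np.clip(np.rint(xs + 0.5 * motion[..., 0]).astype(int), 0, w - 1)
    ty = np.clip(np.rint(ys + 0.5 * motion[..., 1]).astype(int), 0, h - 1)
    out[ty, tx] = frame  # holes and occlusions are simply ignored in this toy
    return out

# A 4x4 "frame" with one bright pixel moving 2 px right per rendered frame:
frame_a = np.zeros((4, 4))
frame_a[1, 0] = 1.0
motion = np.zeros((4, 4, 2))
motion[..., 0] = 2.0                  # dx = 2, dy = 0 everywhere
print(warp_halfway(frame_a, motion))  # bright pixel appears at column 1, halfway
```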
 
So we can expect 25-30% more performance for an equivalent price increase... Lucky we didn't get +50% more performance; the price would have been even worse.

When I see how mediocre 4K performance is, activating DLSS to play at it is the same as rendering natively at 1440p. $2000 for that... damn.
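For reference, the commonly cited DLSS per-axis scale factors bear this out (the factors are my assumption, not from the review); Quality mode at a 4K output renders internally at exactly 2560x1440:

```python
# Internal render resolution for DLSS at a 4K output, using the commonly
# cited per-axis scale factors.
out_w, out_h = 3840, 2160
for mode, scale in [("Quality", 2 / 3), ("Balanced", 0.58), ("Performance", 0.5)]:
    print(f"{mode}: {round(out_w * scale)} x {round(out_h * scale)}")
# Quality -> 2560 x 1440, exactly the "native 1440p" point made above.
```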

This card leaves a very bitter taste. Next.
 
It will sell well to the AI guys.
Yep. Lots of startups will probably use these cards. Cheaper than cloud (long term) and cheaper than the professional cards, with enough memory that you can actually run decently sized models without much compromise (if not on 1 card, then on 2). Still, compared to the 3 grand AI "supercomputer" that Nvidia also launched, I'm not sure if this card will still sell well compared to that. I guess it will depend on performance.
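A rough weights-only sizing sketch supports that (my own back-of-envelope; the 4 GB overhead allowance for KV cache and activations is a guess):

```python
# Which model sizes fit in the 5090's 32 GB of VRAM? Weights only, with a
# rough allowance for KV cache / activations.
VRAM_GB, OVERHEAD_GB = 32, 4
for precision, bytes_per_param in [("fp16", 2), ("int8", 1), ("4-bit", 0.5)]:
    max_params_b = (VRAM_GB - OVERHEAD_GB) / bytes_per_param
    print(f"{precision}: ~{max_params_b:.0f}B parameters fit on one card")
# fp16 -> ~14B, int8 -> ~28B, 4-bit -> ~56B; roughly double those across two cards.
```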
 
It was never going to be worth the money and it was never going to matter. People with all the money and only desire for the fastest card will buy it. That's it.

The 5070 Ti is the card I want to see. Maybe AMD has a big spanner to throw with the 9070 XT, if it comes in near a 4070 Ti and there are plenty of choices under $600.
I agree, except that Nvidia really isn't that smart. With all those new pay contracts for those in the sports and entertainment business, the real market for the best, they should be charging $3,000 at least.
 
An 80 here is a joke, guys. A 35% power increase. A $400 price increase. And a 27% performance increase. I would personally rather have the 4090, just for the power consumption alone. I can only imagine if they had ever released the full AD102; it would probably get similar performance with less power. This card might be the most powerful GPU available, but it is obviously not designed for gamers. This should have gotten a 40, not an 80.
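Running the comment's own numbers, plus the $1,600 to $2,000 MSRP change mentioned elsewhere in the thread:

```python
# Efficiency check using the figures above: +27% performance, +35% power,
# and a $1,600 -> $2,000 MSRP bump.
perf, power, price = 1.27, 1.35, 2000 / 1600
print(f"perf/watt vs 4090:   {perf / power:.2f}x")  # ~0.94x, i.e. ~6% worse
print(f"perf/dollar vs 4090: {perf / price:.2f}x")  # ~1.02x, essentially flat
```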
 
$2000. Proof positive that the old saying is as true now as it was in the past: a fool and their money are soon parted.
 
4090 owners rejoice. NVIDIA has done all of you a great favor by letting you know there is absolutely no need at this time for you to upgrade your current GPUs so feel free to close your wallets and put away your credit cards.

This is what I refer to as a "stockholder's release." It's intended to make the shareholders, and nobody else, happy. Yay, NVIDIA released a new product on their usual two-year cadence; look at those AI purchases and the stock value go up. Except that given this graphical performance uplift, I would expect the AI uplift to be equally bad, which may cause AI companies to seriously reconsider whether it's worth purchasing thousands more RTX 5090 units for minimal gains over their current hardware.

I didn't think NVIDIA would do worse than the RTX 2000 series release over the much-lauded GTX 1000 series, but here we are. Really not sure why they thought they could pull the wool over everyone's eyes with this one. There's no way they could pull off a true generational uplift without a new node. Even layman "normie" gamers have some inkling of this.
 
Don't worry about the disappointing performance increase of the 5090. We still have the 5070, which is much faster than the 4090, according to Jensen's math 😁
 
Keep in mind that the $1,600 price for the RTX 4090 was a polite fiction. You couldn't actually buy the card at that price; they cost substantially more in the real world. If $2,000 is the actual price of the RTX 5090, the comparison with the 4090 will look a lot better.

In any case, the 90 series cards are for people who don't care about the price. It's delivering enough performance improvement to make it a useful upgrade if you have money to burn, or for building a new top-of-the-line system.
 
Considering delaying a gen here too, or going back a gen... my GTX 1080 is still holding up in the games I play... I am not as excited about the 5080, and I think I'm feeling that way mainly due to the VRAM.
 
Good review. I really appreciated that you didn't mix everything together with DLSS 3/4 and frame generation.
 
All things considered, temperatures seem good, especially if you compare the size of the coolers between the 4090 and 5090. I like that they are using these bigger, probably 120 mm fans. Even on cheaper cards, those tiny fans can produce an immense amount of noise; speaking from experience owning a 3070.
 
And probably the best part is that they will sell like hot cakes at MSRP if the startup doesn't need them anymore. GPUs have become Apple products; they hold their price just as well.
 