Nvidia GeForce RTX 5090 Review

My context was my upgrade path (3080 -> 4080 -> 5080), with my interest in the 4080 being Frame-Gen. If MFG isn’t for you, then it’s not for you, simple as that. Based on the MFG performance metrics so far, I stand corrected.

Raw performance as the primary focus is no longer feasible, as highlighted in commentary like Digital Foundry’s 5090 review. Anyone who’s been paying attention to what AI has been doing in tech would understand why this is the direction GPUs are heading. It also makes perfect sense for Nvidia to continue pushing DLSS, as they’re leading in the area they know best.

The hesitation toward DLSS is understandable, just as it was when it was first introduced. The 60-series will benefit from both a node change and even greater DLSS features, but there’s no regressing from innovations like MFG. Again, see my context: I have no interest beyond the 5080; based on the metrics, it feels like a sensible move from the 4080 given which features interest me.
"Raw performance as the primary focus is no longer feasible" - it very much still is feasible. Maybe not the way we used to see it, with large generational performance jumps every year, but it can still be done.

But as we both know... the money is not there, it's in AI hardware. It's why Nvidia puts more and more R&D into AI for servers, technology which then trickles down to consumer cards.

We are looking at a ~30% performance increase after well over 2 years, achieved just by increasing power draw and core count, with almost no improvement in value ($/FPS)... it's beyond pathetic.
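For illustration, the value metric above is just price divided by average framerate. A minimal sketch with entirely hypothetical prices and framerates (made-up placeholders, not benchmark figures):

```python
# Hypothetical illustration of the $/FPS value metric discussed above.
# All prices and framerates below are made-up placeholders, not measurements.

def dollars_per_fps(price_usd: float, avg_fps: float) -> float:
    """Cost per frame of average performance: lower is better value."""
    return price_usd / avg_fps

old_gen = dollars_per_fps(1200.0, 100.0)  # e.g. a previous-gen card
new_gen = dollars_per_fps(2000.0, 130.0)  # e.g. a ~30% faster successor

# A ~30% performance uplift at a much higher price can still mean
# *worse* value per dollar:
print(f"old: ${old_gen:.2f}/FPS, new: ${new_gen:.2f}/FPS")
```

With these placeholder numbers, the newer card is ~30% faster yet costs more per frame, which is the kind of value stagnation being complained about.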

I have no problem with the future of GPUs being AI, but we are not there yet for gaming. Raster will still be king for another 2-3 generations, with AI augmenting it (be it with fake frames or upscaling). To put things into perspective, it took ray tracing 6-7 years to become prevalent in new games and worth investing in hardware-wise. Maybe next-gen games will start implementing neural rendering... maybe.

Don't get me wrong, I don't think your way of thinking is bad at all. But I'm hesitant to treat your point of view as a general view of the market right now, since you have a much more specific need for AI hardware than most people. I personally want better AI hardware too, but I'm still bummed by how little VRAM we are getting (besides the 5090), and I'm more interested in the laptop version of the 5080/5070 Ti (depending on the price). The old M1 MacBook with 32GB of unified RAM that I use at work can handle large models that the expensive next-gen Windows laptops I want to buy this year can't :(
 
This doesn’t read like a review for an 8/10 product.

I was thinking that too. It sounds like a 5 out of 10.

This is how I kind of read it: Hey... it's a nice upgrade if you've got the cash to burn... even though we expected MUCH better... faster... cooler... cheaper... but you know... it's not as bad... I suppose, if you game at 4K.


Anyways, this GPU should have been at most $1,600.
 
I installed my 5090 Founders Edition today and updated the drivers immediately.
I am glad they reduced the size of the card. 3 slots was ridiculous. I also like how they cleaned up the power interface and made the card itself sleeker and smoother. That said: it's shocking to me that a game as old as Cyberpunk is still giving this card a run for its money at 4K on the highest settings (Psycho).

I hear so many complaints about DLSS, but the reality is that this is exactly what the AI revolution is supposed to be about. The AI revolution is supposed to offer things like Ray Tracing, DLSS, and noise cancelling - and then bring that performance to lower-end machines (laptops and desktops), eventually making it to small-form-factor desktops.

That's what Intel was playing at with their latest CPU and their integrated graphics.

Eventually we should have 5090 performance WITHOUT a video card. AI upscaling and all the trimmings.
 