Nvidia launches RTX 50 Blackwell GPUs: $2,000 RTX 5090, $1,000 RTX 5080, RTX 5070 / Ti are $549 and $749

I'm planning a big upgrade this year that I intend to last 7-10 years. I'm coming from a Ryzen 3600X and an RTX 2070 Super; I just need to figure out whether I'll be getting a 5080 or a 5090, and whether it's really worth the extra cost....
 
Like it or not... this is the future:

1. Games are only getting more demanding with 4K/8K, ray tracing, massive open worlds, and complex physics.
2. Traditional brute-force rendering is hitting limits: power, heat, manufacturing.
3. AI is stepping in to handle the load smarter, not harder.
4. RTX Neural Shaders are a game-changer; it’s like pre-rendered cutscenes but happening in real time. Crazy efficient and ridiculously good-looking.
5. NVIDIA is leaning hard into neural rendering and frame generation, and honestly, they’re miles ahead in AI.
6. Blackwell GPUs are the start of GPUs that are smaller, cooler, and smarter, with AI doing the heavy lifting.

This is NVIDIA, after all, the leader in AI. Whether you love it or hate it, this is the direction gaming GPUs from NVIDIA are heading. While adoption rates for DLSS 4 may be slower at first, they’re bound to pick up over time. With this approach, I can see near life-like visuals in games becoming a reality much sooner than with traditional rendering methods.
 
No matter how you put it, rendering three generated frames will add more latency than rendering one, so I'm sceptical of multi-frame generation. The other overall enhancements look pretty good - I also like the DLSS override option going native. Pricing was as expected for the 5090, with the 5080 priced lower than expected.
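As a rough sketch of why that's true (assuming simple interpolation where the newest rendered frame is held back while the generated frames are displayed, and ignoring generation cost and Reflex-style mitigation), the extra latency grows with the number of inserted frames:

```python
# Back-of-the-envelope latency model for frame interpolation.
# Assumptions: generated frames are interpolated between two real frames, so the
# newest real frame must be held back while the in-between frames are shown;
# generation cost, queueing, and Reflex are ignored.

def extra_holdback_ms(base_fps: float, inserted_frames: int) -> float:
    """Added display latency from inserting N generated frames between real frames."""
    real_frame_time = 1000.0 / base_fps                  # ms between real frames
    return real_frame_time * inserted_frames / (inserted_frames + 1)

for base_fps in (30, 60):
    for n in (1, 3):                                     # 2x FG vs. 4x multi frame gen
        shown_fps = base_fps * (n + 1)
        print(f"{base_fps} fps rendered, {n} inserted -> {shown_fps} fps shown, "
              f"~{extra_holdback_ms(base_fps, n):.1f} ms extra hold-back")
```

Under that simplified model, three inserted frames always hold the real frame back longer than one, which is the point being made.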
 
I'm planning a big upgrade this year that I intend to last 7-10 years. I'm coming from a Ryzen 3600X and an RTX 2070 Super; I just need to figure out whether I'll be getting a 5080 or a 5090, and whether it's really worth the extra cost....
Considering the price difference, it should be a simple pick: either the best, or almost the best at half the price.
 
Upon reflection, Nvidia's primary purpose at CES was to tout Blackwell's AI performance and dominance. Even when introducing the pricing, the primary metric given was AI TOPS. Even the gaming portion of the presentation was focused on AI. According to Nvidia, the future of gaming is AI.

Personally, I would rather games not advance visually as quickly if it means you have to use frame generation to get decent framerates. I have had an RTX 4080 for about a year and a half now; I have tried FG in many games and still turn it off every time. I would rather play a game at 60 fps with FG off than at 90-ish fps with FG on; it just doesn't look or feel good to me. It's not just in my head, either: there have been several times that I accidentally turned it on using the optimization feature in the Nvidia app, almost immediately noticed it in game, and turned it back off. Now Nvidia is touting 3 out of every 4 frames being generated! I was really hoping that RTX 50 and DLSS 4 would bring something better to the table than FG, but it looks like Nvidia is not only keeping FG, it's doubling down on it as "performance".

Nvidia just gave game developers even better reasons not to optimize games to run smoothly and efficiently. Now 30 fps is good enough to claim 120 fps performance! FG is contributing to the death of AAA gaming, in my opinion. The entire reason to buy a PC is getting muddied: is FG 120 fps worth it versus 30 fps on a console? And even the consoles will be in on the FG game soon enough. I wouldn't even care if FG brought real performance and lower latency, but it doesn't; it increases latency and makes games feel sluggish even while presenting high frame counts to the screen.
 
If the 5070 had 16GB of memory it would crush all before it. 12GB is not really enough for Indiana Jones, for example, yet that GPU is almost certainly fast enough to run the game with everything turned up at 1440p. A 4070 Ti Super does so with performance to spare, but only because it has 16GB.

Launching a $550 card short of memory for existing high-profile titles means it will have a short lifespan.
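For a rough sense of why 12GB gets tight, here is a back-of-the-envelope VRAM budget; every figure below is an illustrative assumption for a demanding ray-traced title at 1440p ultra, not a measurement from Indiana Jones:

```python
# Illustrative VRAM budgeting with assumed round numbers (not measurements).
GIB = 1024 ** 3

def render_target_bytes(width: int, height: int, targets: int, bytes_per_pixel: int = 8) -> int:
    """Approximate footprint of G-buffer and post-processing render targets."""
    return width * height * bytes_per_pixel * targets

budget_gib = {
    "render targets (2560x1440, ~10 buffers)": render_target_bytes(2560, 1440, 10) / GIB,
    "texture pool, ultra textures (assumed)": 8.0,
    "geometry + ray-tracing BVH (assumed)": 2.0,
    "upscaler / frame-gen working set (assumed)": 0.5,
    "OS, driver, other apps (assumed)": 1.0,
}

total = sum(budget_gib.values())
for item, gib in budget_gib.items():
    print(f"{item:44s} {gib:5.2f} GiB")
print(f"{'total':44s} {total:5.2f} GiB  (tight on 12 GB, comfortable on 16 GB)")
```

Shave the assumed numbers however you like; the point is that the texture pool dominates, so the memory limit bites before the shader performance does.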
 
Upon reflection, Nvidia's primary purpose at CES was to tout Blackwell's AI performance and dominance. Even when introducing the pricing, the primary metric given was AI TOPS. Even the gaming portion of the presentation was focused on AI. According to Nvidia, the future of gaming is AI.

Personally, I would rather games not advance visually as quickly if it means you have to use frame generation to get decent framerates...

AI is the future. Period.

GPU tech is always moving forward, and honestly, people will find something to complain about no matter what. The whole "FG is just a crutch" argument? Yeah, sure, but isn’t that basically the same as saying, "Faster GPUs mean devs don’t bother optimizing their games?" This has been happening forever, hasn’t it?

1. FG is still early tech.
Remember when DLSS 1.0 came out? It wasn’t great, but AMD quickly followed with FSR, and the competition pushed both to improve. Then came DLSS 3 Frame Gen, and AMD introduced their own version with FSR 3 FG. The tech keeps evolving because it has to.

2. AI is the only scalable way forward, IMO (Keyword: Scalable)
As I said earlier, games are getting ridiculous: 4K, ray tracing, open worlds, complex physics. GPUs can’t brute-force their way through all that anymore because of power and thermal limits.
AI is stepping in to handle the load smarter, not harder, and it’s the only way to keep pushing performance without breaking hardware.

3. It’s not just about FG.
Everyone’s stuck on FG, but AI is doing way more this time: RTX Neural Shaders, texture compression, and movie-like visuals in real time. To me, this was unimaginable just a year or two ago.
FG is just one piece of the puzzle; AI is the bigger picture.

4. FG is optional. Nobody is forcing you to use it.
Don’t like FG? Turn it off. Native performance is still there.
But for gamers pushing 4K or ray tracing to the max, FG provides smoother gameplay without needing to sell a kidney for a top-tier GPU.

5. Future consoles will follow suit.
It'll make better visuals accessible to everyone. Imagine the difference between a PC game with DLSS 4 + FG + Neural Rendering versus the same console game without it; the gap could be massive, so it makes sense that consoles will adopt it.
 
AI is the future. Period.

GPU tech is always moving forward, and honestly, people will find something to complain about no matter what. The whole "FG is just a crutch" argument? Yeah, sure, but isn't that basically the same as saying, "Faster GPUs mean devs don't bother optimizing their games?" ...
The problem isn't that FG exists; it's that it's being touted as performance when it clearly isn't. At best, it's picture stabilization or frame smoothing: its primary purpose is to give games a smoother appearance. It should be looked at more as an alternative to motion blur than as additional FPS.

However, even game devs are including FG in their system requirements. To hit 60 fps with RT at 4K, the requirements will say you need a 4080, but then you find out that means DLSS upscaling (okay, that's fine) and FG. Developers absolutely lean on FG to meet performance targets instead of spending the time optimizing their games to deliver better performance. For AAA developers, FG means less development time, which means less money spent on development. I'm saying I would rather see visual downgrades in AAA games than FG, and I think a lot of people are with me on that. And yes, FG is horrible; I'm sorry, it is.
 
AI is the future. Period.

GPU tech is always moving forward, and honestly, people will find something to complain about no matter what. The whole "FG is just a crutch" argument? Yeah, sure, but isn't that basically the same as saying, "Faster GPUs mean devs don't bother optimizing their games?" ...
Never in history have we had such an absurd waste of hardware potential as we have now. I say with certainty that most AAA games, especially those made in UE5, could be running at 2-3x better performance with refined optimization.

It's time to admit reality: studios will do anything to save time and money, and they don't give a damn about innovating or creating a polished product. The most played game of the decade will probably be GTA VI, and there will be almost no RT built into it. Period.
 
The problem isn't that FG exists; it's that it's being touted as performance when it clearly isn't.
Absolute rubbish. FG gives smoother performance and better visuals -- the reason the vast majority of people buy video cards in the first place. They are no more "fake frames" than DLSS is "fake resolution" or lighting effects are "fake eye candy". Honestly, you people hear these memes and run with them nonsensically. Anyone familiar with the rendering pipeline knows there are already dozens of other places where intermediate results are reused across multiple frames rather than recalculated from first principles, trading a trivial measure of theoretical accuracy for better performance.
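As one concrete, simplified example of the kind of reuse being described here: temporal accumulation, the core trick behind TAA and most real-time ray-tracing denoisers, carries most of each pixel's value over from previous frames instead of recomputing it. A minimal sketch (assuming a static camera, so no motion-vector reprojection is needed):

```python
import numpy as np

def temporal_accumulate(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average: keep most of last frame's result, blend in the new sample."""
    return (1.0 - alpha) * history + alpha * current

rng = np.random.default_rng(0)
ground_truth = np.full((4, 4), 0.5)        # the "correct" shaded value for a tiny image
history = rng.random((4, 4))               # arbitrary starting history buffer

for frame in range(60):
    # Each frame only produces a cheap, noisy estimate (e.g. one ray per pixel)...
    noisy_sample = ground_truth + rng.normal(0.0, 0.2, size=ground_truth.shape)
    # ...and the history buffer does the heavy lifting by reusing past frames' work.
    history = temporal_accumulate(history, noisy_sample)

print("mean error after 60 frames:", float(np.abs(history - ground_truth).mean()))
```

Most of what ends up on screen each frame was computed on earlier frames; the new sample only nudges it.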
 
Never in history have we had such an absurd waste of hardware potential as we have now. I say with certainty that most AAA games, especially those made in UE5, could be running at 2-3x better performance with refined optimization.

It's time to admit reality: studios will do anything to save time and money...
Your post reminds me of an editorial I read in a major computing magazine in the early 1980s... the editor was literally frothing-at-the-mouth livid over PCs being manufactured with more than 64K of memory. Not GB or MB, but kilobytes. His reasoning was that all worthwhile programs could be written to consume less memory, and having more than that would simply lead to shoddy programming.
 
You will absolutely see games getting pumped up from 30 fps using these crutches; Star Wars Outlaws, FF16, Wukong, and a few other new games legit only look their best if DLSS is going full throttle.

I wish these tech sites would quit beating around the bush and just admit that when Nvidia tosses out a new bone, it's gonna get abused immediately; AMD and Intel, even Sony, are literally chasing shadows to keep up.

The truth is, to keep graphics-addled gamers content, these companies are going to rely on DLSS, FSR, XeSS, and other abbreviations to prop their games up.
 
Your post reminds me of an editorial I read in a major computing magazine in the early 1980s... the editor was literally frothing-at-the-mouth livid over PCs being manufactured with more than 64K of memory. Not GB or MB, but kilobytes. His reasoning was that all worthwhile programs could be written to consume less memory, and having more than that would simply lead to shoddy programming.
It's not 1980. We are no longer moving forward. Look around: is the internet, and software in general, written to be more efficient or less efficient than before? They talk all the time about the "green agenda" while wasting billions of dollars in resources on inefficient software.

At this point we have much more performance to extract just by optimizing software for current hardware than any minimal advance in lithography will allow. lol
 
Like ray tracing and PhysX before it, DLSS 4 is another nVidia technology that, love it or not, is (at least in part) the new standard. nVidia claims the RTX 5080 is twice the performance of the 4080 when all of its latest technology is enabled, and even when we say (like I have) that we don't care about ray tracing and want native resolution, we now see new games requiring ray tracing and some defaulting to DLSS. nVidia is a strong brand for AMD and Intel to go up against toe to toe.

At $549 for the RTX 5070, be truthful, nVidia fanboys: how cheap would the RX 9070 XT have to be for you to jump ship and buy AMD over the 5070? Can AMD even sell it that cheap? Personally, I feel AMD is going to have to release an RX 9080 XTX that can firmly stand its ground against an RTX 5080 Ti. I believe a flagship card does in fact matter (for the marketing of it all, and for game devs), even when it's popular to say that no one buys those fast high-end cards, which simply is not true.

Well played, nVidia. I still hate the jacket, Jensen, but well done.
 
Your post reminds me of an editorial I read in a major computing magazine in the early 1980s... the editor was literally frothing-at-the-mouth livid over PCs being manufactured with more than 64K of memory. Not GB or MB, but kilobytes. His reasoning was that all worthwhile programs could be written to consume less memory, and having more than that would simply lead to shoddy programming.

And look at where we are with Windows 11...
 
Just a thought: with only one 5090 in its data center, Nvidia can now simultaneously power two "5080"-tier GeForce NOW subscriptions. Handy.

It would be more reasonable to give the 5090 the name of a professional product (cf. Threadripper), as it caters to many professional users. But then it wouldn't function anymore as the halo product and as a status symbol for wealthy gamers. (Yes, now blame me for not being able to afford a 5090. I will crawl back underneath a stone in a minute.)
 
I see no reason to move from my 3080Ti.

If AMD is only looking to land around 7900 XT/XTX performance with the 9070 XT, and the 5070 Ti from Nvidia (without DLSS and the FG bullsh!t) has maybe 20% gains over the 4070 Ti while costing $800 after taxes, there is no reason for me to upgrade.

I'm priced out of the xx80 and up cards from Nvidia, and even if a 9070 XT comes in around the 7900 XTX in rasterization performance and is priced at $500, I'm not paying $500 for a roughly 15-20% bump in performance.

I guess I'll wait and see what kind of results come out of the reviews before I can say whether my decision to upgrade is on hold or not.
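To put that kind of upgrade math in concrete terms, here is a simple cost-per-percent-of-uplift sketch; the prices and uplift percentages are hypothetical placeholders roughly in line with the numbers above, not benchmark results:

```python
# Hedged upgrade-value sketch; all prices and uplift percentages are
# hypothetical placeholders, not measured results.

def dollars_per_percent(price_usd: float, uplift_pct: float) -> float:
    """Cost of each 1% of extra performance for a prospective upgrade."""
    return price_usd / uplift_pct

scenarios = {
    "hypothetical $800 card, ~20% faster than what I have": (800, 20),
    "hypothetical $500 card, ~15% faster than what I have": (500, 15),
}

for label, (price, uplift) in scenarios.items():
    print(f"{label}: ~${dollars_per_percent(price, uplift):.0f} per 1% gained")
```

Either way it works out to tens of dollars per percentage point of performance, which is exactly the "not worth it" calculation above.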
 
Absolute rubbish. FG gives smoother performance and better visuals -- the reason the vast majority of people buy video cards in the first place. They are no more "fake frames" than DLSS is "fake resolution" or lighting effects are "fake eye candy". Honestly, you people hear these memes and run with them nonsensically. Anyone familiar with the rendering pipeline knows there are already dozens of other places where intermediate results are reused across multiple frames rather than recalculated from first principles, trading a trivial measure of theoretical accuracy for better performance.

The problem with your logic is that most people PLAY their video games, they don't watch them, so casual players will NEVER understand the loss of in-game responsiveness that FAKE FRAMES cause.

Frame gen is great if you're watching an animated cut-scene and don't want choppy frames, because that's not aesthetically pleasing; nobody likes watching a choppy movie. But frame gen is absolutely laughable and a 100% gimmick for anyone who plays multiplayer games, because your character feels sluggish and unresponsive to what's on the screen; there's no consistency, and the timing of shots and aiming is off, because you are seeing FAKE FRAMES.

Upscaling is only good for single-player, character-focused games, because it doesn't work well in fast-paced games where an enemy is just a few moving pixels silhouetted in a window 100 yards away. DLSS doesn't render such detail, or alters it, because it doesn't know the difference between scenery and someone hiding.

Understand: for most players, and for those who play and compete with friends, none of these gimmicks matter; they are all turned off.

Most of the millions of people who play COD every week do not even use Vsync/Gsync/Freesync, because tearing isn't an issue for them. They understand that it's the latest game information being written over the older information ripping across the screen, which is much better to a gamer than having that frame delayed to look extra pretty because of lemming status and marketing.

People buy "smoother performance" in the form of raw raster; they don't spend $800+ for a latency-heavy experience that looks great from the couch.

Again, fake frames are great for sit-down movies or casuals who pause their game often, and horrid for anyone who plays games online with others. But I don't know too many of those people running out and grabbing a 4080/4090 for casual gameplay; those types of people don't care about frames or latency and would be better suited to a budget card, upscaling to their heart's content.

And that was the point of upscaling: for the budget cards... and casuals.
 
Absolute rubbish. FG gives smoother performance and better visuals -- the reason the vast majority of people buy video cards in the first place...
FG is terrible; I have tried it and I turn it off every single time. It makes the game feel sluggish even if it increases the "framerate". I would rather play the game at lower frame rates than turn it on. For me, that means it fails.
 
The problem with your logic is that most people PLAY their video games, they don't watch them, so casual players will NEVER understand the loss of in-game responsiveness that FAKE FRAMES cause.
100%.

I wouldn't consider myself a hardcore gamer, somewhere between casual and hardcore. FG feels terrible. Maybe it's okay on a 240 Hz monitor with 120 rendered frames, but going from 60 to 90-100 fps (on my 4080, FG doesn't double the frame rate; it's more like 1.5x-1.7x) feels worse and less responsive than just playing at 60 fps, so I always turn it off. I haven't played a single game with the feature where I didn't end up turning it off. The only game that felt somewhat okay with FG was Black Myth: Wukong, but even there I ultimately preferred 60 fps over FG and adjusted my settings accordingly.
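That less-than-2x scaling is what you would expect if generating the extra frame costs GPU time of its own. A hedged sketch (all frame times below are assumptions, not measurements from a 4080) of why 2x frame generation tends to land well short of 2x displayed frames:

```python
# Why 2x frame generation often shows up as ~1.5-1.7x rather than 2x:
# the generated frame (and the optical-flow work behind it) takes GPU time
# away from real rendering. All numbers are illustrative assumptions.

def fg_displayed_multiplier(render_ms: float, fg_cost_ms: float) -> float:
    """Displayed-fps multiplier for 2x interpolation when FG work shares the GPU."""
    base_fps = 1000.0 / render_ms
    # Each real frame now also pays the generation cost, slowing real rendering,
    # but two frames reach the screen per real frame.
    fps_with_fg = 2 * 1000.0 / (render_ms + fg_cost_ms)
    return fps_with_fg / base_fps

for render_ms in (16.7, 11.1):             # ~60 fps and ~90 fps base render times
    for fg_cost_ms in (3.0, 5.0):          # assumed per-frame generation overhead
        mult = fg_displayed_multiplier(render_ms, fg_cost_ms)
        print(f"{render_ms:.1f} ms render + {fg_cost_ms:.1f} ms FG -> {mult:.2f}x displayed fps")
```

The higher the base frame rate and the heavier the generation cost, the further the multiplier falls below 2x.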
 
Performance improvements thanks to AI inference.

They can't beat physics until they switch from silicon to light-based transistors.

We'll be stuck here for a few more years until light-based transistors mature enough to overcome the limits of resistance-based transistor physics.

Nvidia is extending the lifecycle of its legacy products, which I see as a positive for existing owners. This is so it can allocate foundry capacity to higher-profit AI.

A mixed bag overall, and it gives GPU competitors time to catch up while we wait for new US foundry capacity.

Stuck here for a few more years.
I think we will see larger products for at least two decades; 3D chips, in my opinion, are a more probable future: a cube with cooling in between, made of many layers of silicon with a fast interconnect. There are major issues with light-based transistors, especially when you need 100 billion of them for a 5090: we need to scale them down, we need to cool them, and we need to power them efficiently. I see these things more as powerful interconnects than as next-gen CPU/GPU tech. I hope I am wrong, but I believe things will get bigger, much bigger.
 
Wow, I knew Nvidia was using frame gen to prop up their performance graphs with the 40 series, but inflating them even more with the 50 series and multi-frame generation is suuuuuuper scummy. The non-DLSS comparisons seem to show a 10-20% improvement in gaming performance at best. We've definitely plateaued when it comes to raw GPU power.

Improvements to DLSS upscaling for past-generation cards are appreciated. Since new games are running worse and worse in the pursuit of ever so slightly "prettier" graphics, it's unfortunately required these days.

I'm interested to see real-world benchmarks, but it looks like my 4080 will be safe until at least the 7000 series, if not longer.
 
It's not 1980. We are no longer moving forward. Look around: is the internet, and software in general, written to be more efficient or less efficient than before? They talk all the time about the "green agenda" while wasting billions of dollars in resources on inefficient software.

At this point we have much more performance to extract just by optimizing software for current hardware than any minimal advance in lithography will allow. lol
People really don't understand how poorly these game studios are being run and how bloated with overhead they are. It's just like Hollywood productions these days; I think it was estimated that Disney spent $645M on Andor? At this point it's money laundering. AAA game studios are essentially the same: HR and DEI departments getting paid big bucks, while the studios have to rely on contractors to push games across the finish line because their internal devs can't get it done.

The results are games like Star Wars Outlaws, which has some pretty moments but overall doesn't look as good as a game like MGS5, which released on PS3 and PS4 nearly 10 years ago. Games are not getting better; they peaked around 2015. Modern games don't really look any better than they did back then, but they almost always perform worse and have worse gameplay. Sure, they have RT and new effects, but from a quality and artistic standpoint, we've gone backwards, not forward. I mean as a whole; there are still gems here and there, but the industry is moving in the wrong direction.
 