More promises, fake frames and RT games at 30fps on expensive products.
Thank you, Huang.
Considering the price difference it should be simple to pick: either the best, or almost the best at half the price. I'm planning a big upgrade this year that I intend to last 7-10 years. I'm coming from a Ryzen 3600X and RTX 2070 Super; I just need to figure out whether I'll be getting a 5080 or a 5090, and whether the 5090 is really worth the extra cost...
Upon reflection, Nvidia's primary purpose at CES was to tout Blackwell's AI performance and dominance. Even when introducing pricing, the primary metric given was AI TOPS. Even the gaming portion of the presentation was focused on AI. According to Nvidia, the future of gaming is AI.
Personally, I would rather games not advance visually as quickly if it means you have to use frame generation to get decent framerates. I have had an RTX 4080 for about a year and a half now, and I have tried FG in many games and still turn it off every time. I would rather play a game at 60 fps with FG off than at 90ish fps with FG on; it just doesn't look or feel good to me. It's not just in my head, either: there have been several times that I accidentally turned it on using the optimization feature in the Nvidia app, almost immediately noticed it in game, and turned it back off. Now Nvidia is touting 3 out of every 4 frames being generated! I was really hoping that RTX 50 and DLSS 4 would bring something better to the table than FG, but it looks like Nvidia is not only keeping FG, it's doubling down on counting it as "performance".
Nvidia just gave game developers even better reasons not to optimize games to run smoothly and efficiently. Now 30 fps is good enough to claim 120 fps performance! FG is contributing to the death of AAA gaming, in my opinion. The entire reason to buy a PC is getting muddied: is FG 120 fps worth it versus 30 fps on a console? And even the consoles will be in on the FG game soon enough. I wouldn't even care if FG brought real performance and lower latency, but it doesn't; it increases latency and makes games feel sluggish even while presenting high frame counts to the screen.
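To put rough numbers on the "30 fps claiming 120" point, here's a quick back-of-the-envelope sketch (Python; the latency model is my own simplification of an interpolating frame generator, not Nvidia's published figures):

```python
# Rough numbers for multi-frame generation. Assumptions are mine, not
# Nvidia's published pipeline: FG interpolates between two rendered
# frames (so it must hold one back), and input is only sampled when a
# real frame is rendered.

def fg_numbers(base_fps: float, generated_per_rendered: int) -> None:
    render_interval_ms = 1000.0 / base_fps
    displayed_fps = base_fps * (generated_per_rendered + 1)
    print(f"{base_fps:.0f} fps rendered -> {displayed_fps:.0f} fps displayed; "
          f"input still sampled every {render_interval_ms:.1f} ms, "
          f"plus ~{render_interval_ms:.1f} ms of buffering latency")

fg_numbers(30, 3)  # "4x" mode: 3 of every 4 displayed frames are generated
fg_numbers(60, 1)  # classic 2x frame generation
```

The displayed counter quadruples, but the game still samples your input at the rendered rate, which is why it can feel like 30 fps dressed up as 120.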
The problem isn't that FG exists, it's that it's being touted as performance when it's clearly not. At best, it's picture stabilization or frame smoothing. Its primary purpose is to give a smoother appearance to games; it should be looked at more as an alternative to motion blur than as additional FPS. However, even game devs are including FG in their system requirements: to get 60 fps with RT at 4K, they'll say you need a 4080, but then you find out that means DLSS upscaling (okay, that's fine) and FG. Developers absolutely lean on FG to meet targets instead of spending the time optimizing games to deliver better performance. For AAA developers, FG means less development time, which means less money spent on development. I'm saying I would rather see visual downgrades to AAA games than FG, and I think a lot of people are with me on that. And yes, FG is horrible. I'm sorry, it is.
AI is the future. Period.
GPU tech is always moving forward, and honestly, people will find something to complain about no matter what. The whole "FG is just a crutch" argument? Yeah, sure, but isn’t that basically the same as saying, "Faster GPUs mean devs don’t bother optimizing their games?" This has been happening forever, hasn’t it?
1. FG is still early tech.
Remember when DLSS 1.0 came out? It wasn’t great, but AMD quickly followed with FSR, and the competition pushed both to improve. Then came DLSS 3 Frame Gen, and AMD introduced their own version with FSR 3 FG. The tech keeps evolving because it has to.
2. AI is the only scalable way forward, IMO (Keyword: Scalable)
As I said earlier, games are getting ridiculous: 4K, ray tracing, open worlds, complex physics. GPUs can’t brute-force their way through all that anymore because of power and thermal limits.
AI is stepping in to handle the load smarter, not harder, and it’s the only way to keep pushing performance without breaking hardware.
3. It’s not just about FG.
Everyone’s stuck on FG, but AI is doing way more this time: RTX Neural Shaders, texture compression, and movie-like visuals in real time (see the toy sketch after this list). To me, this was unimaginable just a year or two ago.
FG is just one piece of the puzzle; AI is the bigger picture.
4. FG is optional. Nobody is forcing you to use it.
Don’t like FG? Turn it off. Native performance is still there.
But for gamers pushing 4K or ray tracing to the max, FG provides smoother gameplay without needing to sell a kidney for a top-tier GPU.
5. Future consoles will follow suit.
Consoles will adopt it too, and that will make better visuals accessible to everyone. Imagine the difference between a PC game with DLSS 4 + FG + Neural Rendering and the same game on a console without it. The gap could be massive, so it only makes sense that consoles will follow.
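Since point 3 mentions texture compression, here's the toy sketch I promised: the core idea of a "neural texture" is to store a tiny network's weights instead of texels and decode colors on the fly. This is purely illustrative Python with random weights, not the actual RTX Neural Shaders API (a real one would be trained to reproduce a specific texture):

```python
import numpy as np

# Toy "neural texture": a tiny MLP mapping (u, v) -> (r, g, b).
# The "texture" is just the weights, so storage cost is the parameter
# count, independent of resolution. Random weights here; a real neural
# texture is trained, and production schemes add feature grids.

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 32)), np.zeros(32)   # input layer
W2, b2 = rng.normal(size=(32, 3)), np.zeros(3)    # output layer

def sample(u: float, v: float) -> np.ndarray:
    h = np.maximum(np.array([u, v]) @ W1 + b1, 0.0)  # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # sigmoid -> RGB in [0,1]

print(sample(0.25, 0.75))          # decode one texel on the fly
params = W1.size + b1.size + W2.size + b2.size
print(f"{params} parameters vs. {4096*4096*3:,} bytes for raw 4K RGB")
```

Obviously a real network has to be big enough to reproduce the texture faithfully, but the point stands: the memory footprint scales with the model, not the resolution.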
Never in history have we had such an absurd waste of hardware potential as now. I say with certainty that most AAA games, especially those made in UE5, could be running at 2-3x better performance if they had refined optimization.
Absolute rubbish. FG gives smoother performance and better visuals -- the reason the vast majority of people buy video cards in the first place. They are no more "fake frames" than DLSS is "fake resolution" or lighting effects are "fake eye candy". Honestly, you people hear these memes and run with them nonsensically. Anyone familiar with the rendering pipeline knows there are already dozens of other places where intermediate results are reused across multiple frames rather than recalculated from first principles, trading some trivial measure of theoretical accuracy for better performance.
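To make the temporal-reuse point concrete, here's a minimal sketch of temporal accumulation, the exponential-moving-average pattern behind TAA-style techniques and many denoisers (Python stand-in; the blend factor is a ballpark value I chose, and real engines add reprojection and history rejection on top):

```python
import numpy as np

# Temporal accumulation: each displayed pixel is mostly recycled history,
# only partially refreshed by the current frame's (noisy) samples.
# This is the standard exponential-moving-average form; real engines
# reproject the history buffer by motion vectors before blending.

ALPHA = 0.1  # fraction of "new" information per frame (typical ballpark)

def accumulate(history: np.ndarray, current: np.ndarray) -> np.ndarray:
    return (1.0 - ALPHA) * history + ALPHA * current

history = np.zeros((4, 4, 3))              # stand-in framebuffer
for _ in range(60):                        # one second at 60 fps
    current = np.random.rand(4, 4, 3)      # stand-in rendered frame
    history = accumulate(history, current)
# At steady state, ~90% of each displayed value came from prior frames.
```

By the "fake frames" logic, every TAA'd pixel on screen is already mostly "fake". Nobody complains, because it looks better.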
Your post reminds me of an editorial I read in a major computing magazine in the early 1980s: the editor was literally frothing-at-the-mouth livid over PCs being manufactured with more than 64K of memory. Not GB or MB, but kilobytes. His reasoning was that all worthwhile programs could be written to consume less memory, and having more than that would simply lead to shoddy programming.
It's time to admit reality: studios will do anything to save time and money...
It's not 1980. We are no longer moving forward. Look around: is the internet, and software in general, written more efficiently or less efficiently than before? They talk all the time about the "green agenda" while wasting billions of dollars in resources on inefficient software.
FG is terrible. I have tried it, and I turn it off every single time. It makes the game feel sluggish even if it increases the 'framerate'. I would rather play the game with lower frames than turn it on. For me, that means it fails.
100%. The problem with your logic is that most people PLAY their video games, they don't watch them, so casual players will NEVER understand the lack of in-game response that FAKE FRAMES cause.
I think we will see larger products for at least two decades; 3D chips are, in my opinion, a more probable future: a cube with cooling in between, comprising many layers of silicon with a fast interconnect. There are major issues with light transistors, especially when you need 100 billion of them for a 5090: we need to scale them down, we need to cool them, and we need to power them efficiently. I see them more as a fit for powerful interconnects than for next-gen CPU/GPU tech. I hope I am wrong, but I believe things will get bigger, much bigger.
They can’t beat physics until they switch from silicon to light transistors.
We’ll be stuck here for a few more years until light-based transistors mature enough to overcome the physical limits of resistance-based transistors.
Nvidia is extending the lifecycle of their legacy products, which I see as a positive for existing owners. This is so they can allocate foundry capacity to higher-profit AI.
A mixed bag overall, and it gives GPU competitors time to catch up while we wait for new US foundry capacity.
People really don't understand how poorly these game studios are being run and how bloated with overhead they are. Just like Hollywood productions these days: I think the estimate was that Disney spent $645M on Andor? At this point it's money laundering. AAA game studios are essentially the same, with HR and DEI departments getting paid big bucks. The studios are having to rely on contractors to push games across the finish line because their internal devs can't get it done. The result is games like Star Wars Outlaws, which has some pretty moments but overall doesn't look as good as a game like MGS5, which released on PS3 and PS4 nearly 10 years ago now. Games are not getting better; they peaked around 2015. Modern games don't really look any better than they did back then, but they almost always perform worse and have worse gameplay. Sure, they have RT and new effects, but from a quality and artistic standpoint we've gone backwards, not forwards. I mean as a whole; there are still gems here and there, but the industry is moving in the wrong direction.
At this point we have much more performance to extract just by optimizing software for current hardware than any marginal advance in lithography will allow. lol
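For what it's worth, here's a deliberately extreme toy demonstration of that headroom (Python/NumPy; the speedup here is far bigger than the 2-3x claimed for games, since an interpreted loop is a worst case, but it shows how much can sit on the table with the hardware held constant):

```python
import time
import numpy as np

# Extreme toy case of "same hardware, better software": identical math,
# naive loop vs. a cache- and SIMD-friendly library call.

N = 2_000_000
data = np.random.rand(N)

t0 = time.perf_counter()
total = 0.0
for x in data:                         # naive interpreted loop
    total += x * x
t1 = time.perf_counter()

t2 = time.perf_counter()
total_vec = float(np.dot(data, data))  # same sum of squares, vectorized
t3 = time.perf_counter()

print(f"naive: {t1 - t0:.2f}s  vectorized: {t3 - t2:.4f}s  "
      f"speedup ~{(t1 - t0) / (t3 - t2):.0f}x")
```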