Nvidia says frame generation and upscaling will require us to rethink benchmarks

The very low latency you get from real rasterized frames is all that matters for eSports, so no amount of Frame Gen is welcome there.

The low latency you get from real rasterized frames at 60-80fps is all that matters in most other games, so no amount of Frame Gen is welcome below that level of non-FG performance.

Above that, I wouldn't mind 144 fps (my monitor's limit) locked with FG in non-eSports, but only with 72fps native with 2xFG, not 36fps native with 4xFG. The latency of 36 fps is clearly crap. And it's pretty hilarious to see a couple posters here defending FG with no nuance and all name-calling like they're on the school playground. LOL that ain't gonna convince anyone to do more than ignore what they say.
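To put rough numbers on the latency point above (back-of-the-envelope only; real frame-gen pipelines add some extra queueing and pacing latency on top of the raw frame interval):

```python
# Rough frame-time arithmetic for the examples above (illustrative only;
# real frame-gen pipelines add extra queueing/pacing latency on top).

def frame_time_ms(fps: float) -> float:
    """Time between frames in milliseconds."""
    return 1000.0 / fps

for native_fps, fg_factor in [(72, 2), (36, 4)]:
    displayed_fps = native_fps * fg_factor       # both hit 144 "fps" on the display
    input_interval = frame_time_ms(native_fps)   # input is only reflected per rendered frame
    print(f"{native_fps} fps native x{fg_factor} FG -> {displayed_fps} fps displayed, "
          f"~{input_interval:.1f} ms between real frames")

# 72 fps native x2 FG -> 144 fps displayed, ~13.9 ms between real frames
# 36 fps native x4 FG -> 144 fps displayed, ~27.8 ms between real frames
```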
 
Welcome, 24 minutes in, and this is your first comment? Bold. Calling DLSS "fake framerate" is a lazy, stereotypical take. DLSS isn't about making the engine faster, it's about leveraging AI to create playable experiences in games that would otherwise choke even the best hardware.

Ignoring DLSS in benchmarks makes no sense because that’s how most gamers will actually play. It’s like testing a turbocharged car and ignoring the turbo. DLSS isn’t “cheating”, it’s maximizing efficiency. And for the vast majority of users, the tiny latency it adds is a non-issue when weighed against the performance gains.

And your take on the 5070 vs. 4090? How are you so confident when there are literally no independent benchmarks yet? All we have are Nvidia’s slides. Sure, those should be taken with some skepticism, but outright dismissing the 5070’s potential just sounds ignorant.

Honestly, your whole argument screams being stuck in the past. Tech evolves whether you like it or not, and clinging to old ideas of “native rendering only” just makes you sound like someone who hasn’t kept up with how GPUs work today. DLSS and AI-driven performance are the future, and dismissing them outright shows a complete lack of understanding of where gaming is headed. Maybe it’s time to get with the program instead of clinging to outdated notions that no longer apply.
We can extrapolate what performance will look like in the same way they can extrapolate what their fake frames will look like. We look at how other generations worked with it on and off.

Our fake proof is as good as your fake frames.

Also, I'm bookmarking this comment and going to reply to it with every benchmark that comes out confirming what we already know.


Cheers!
 
I think if frame generation had been (more accurately) marketed as "motion smoothing" it would go down a lot better.

As it stands, FG doesn't really help if you're starting at an "unplayable" framerate like sub 30 FPS (or even 60 FPS in twitch based scenarios). The time between actual rendered frames is way too large for the inserted interpolated frames to respond to input properly.

What you get is a smoothed displayed motion, but moving around it feels like you're dragging your mouse through molasses or some mouse acceleration feature is turned on somewhere.

At much higher FPS when the time between rendered frames is very small, the actual added latency is small enough to generally brush off.

So FG is great for getting your already high FPS higher, but it's not useful where most people might really want it, which is starting at a low FPS and making a game playable. There, settings like DLSS and even FSR are king, because those do increase framerate without actually increasing latency as well, and most people are willing to sacrifice some image quality if it means the difference between playable and not.
 
It's quite simple. Run more than one benchmark: a traditional raster fps run, with 1% lows, etc., and a separate one that examines frame-generated fps. There's no need to pivot to a "new and improved" benchmark if it only benefits certain cards and games. Nvidia has leaned so hard into its latest tech that it wants, hell, needs it to benefit from tailor-made testing methods. But benchmarking isn't about the numbers. It's about the improvement in the numbers from hardware/driver refreshes, and it needs to test the same features/parameters each time to be really useful.
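For what it's worth, the raster side of that isn't hard to produce from a frame-time capture; a minimal sketch, assuming you already have per-frame times in milliseconds from a tool like PresentMon (and noting that outlets compute 1% lows slightly differently):

```python
# Minimal sketch: average fps and 1% lows from a list of frame times (ms).
# Assumes you already captured per-frame times with a tool like PresentMon.

def summarize(frame_times_ms: list[float]) -> tuple[float, float]:
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # "1% low" here = average fps over the slowest 1% of frames
    # (methodologies differ between outlets; this is one common variant).
    worst = sorted(frame_times_ms, reverse=True)
    slice_1pct = worst[:max(1, len(worst) // 100)]
    low_1pct_fps = 1000.0 * len(slice_1pct) / sum(slice_1pct)
    return avg_fps, low_1pct_fps

avg, low = summarize([6.9, 7.1, 7.0, 12.5, 7.2, 7.0, 25.0, 7.1])
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")  # avg: 100 fps, 1% low: 40 fps
```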
 
Welcome, 24 minutes in, and this is your first comment? Bold. Calling DLSS "fake framerate" is a lazy, stereotypical take. DLSS isn't about making the engine faster, it's about leveraging AI to create playable experiences in games that would otherwise choke even the best hardware.

Ignoring DLSS in benchmarks makes no sense because that’s how most gamers will actually play. It’s like testing a turbocharged car and ignoring the turbo. DLSS isn’t “cheating”, it’s maximizing efficiency. And for the vast majority of users, the tiny latency it adds is a non-issue when weighed against the performance gains.

And your take on the 5070 vs. 4090? How are you so confident when there are literally no independent benchmarks yet? All we have are Nvidia’s slides. Sure, those should be taken with some skepticism, but outright dismissing the 5070’s potential just sounds ignorant.

Honestly, your whole argument screams being stuck in the past. Tech evolves whether you like it or not, and clinging to old ideas of “native rendering only” just makes you sound like someone who hasn’t kept up with how GPUs work today. DLSS and AI-driven performance are the future, and dismissing them outright shows a complete lack of understanding of where gaming is headed. Maybe it’s time to get with the program instead of clinging to outdated notions that no longer apply.

Kid's doing tricks on it.
 
Everyone should know by now that Jensen always wildly exaggerates the performance gains.

Even back in the day, during the reveal of the GTX 1080, he literally said it was twice as fast as the Titan X, when in reality it was nowhere near that figure.

Sounds like he's been a consumer a-hole for a minute now, that's unfortunate.
 

If you guys want to see the actual performance of the 5090 in cyber**** then here you go.

They let him adjust graphics settings.
 
You're falling for it bud.
Considering my current hardware config, I am absolutely loving it. The next upgrade will definitely be the 50 series, also thanks to AMD not competing in the high end anymore, while Nvidia clearly has the tech stack locked down for the next few years.
 
And your take on the 5070 vs. 4090? How are you so confident when there are literally no independent benchmarks yet? All we have are Nvidia’s slides. Sure, those should be taken with some skepticism, but outright dismissing the 5070’s potential just sounds ignorant.
Nvidia has not made dramatic architectural changes in a loooong time outside of RT and the fake-frame stuff. Therefore you can predict quite accurately how the 5000 series performs the traditional way against the 4000 series just by looking at specs. The 5070 has much less memory, far fewer CUDA cores, much less memory bandwidth... Unless Nvidia pulls off the biggest leap it has ever made, the 5070 has no chance against the 4090.

You don't need any independent benchmarks to realize that. For comparison, people predicted the first Ryzen's IPC quite well just by looking at the architecture slides AMD presented. And this case is much easier.
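If you want the napkin math: the figures below are the announced paper specs as I recall them, so treat them as assumptions and double-check the official spec sheets, and remember that real-world scaling is never perfectly linear with cores or bandwidth.

```python
# Crude spec-ratio napkin math (illustrative; figures are assumed paper specs,
# verify against official spec sheets; real scaling is far from linear).

specs = {
    "RTX 4090": {"cuda_cores": 16384, "mem_gb": 24, "bandwidth_gbps": 1008},
    "RTX 5070": {"cuda_cores": 6144,  "mem_gb": 12, "bandwidth_gbps": 672},
}

a, b = specs["RTX 5070"], specs["RTX 4090"]
for key in a:
    print(f"{key}: 5070 has {100 * a[key] / b[key]:.0f}% of the 4090")
# cuda_cores: ~38%, mem_gb: 50%, bandwidth_gbps: ~67%
```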
 
More or less a fair performance triangle. DLSS does hurt both image quality and latency. For example, comparing DLSS with native should only be done at the same IQ. If DLSS Quality on Ultra gives 50% more FPS than native on Ultra, but the IQ is the same as native on Very High, then the FPS should be compared to the Very High FPS benchmark. If that ends up with DLSS having only 10% more FPS than native, with 50% more latency, then people can make a more informed decision about their settings.
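A made-up worked example of that comparison (the fps numbers are invented purely to illustrate the arithmetic, not measured):

```python
# Invented numbers purely to illustrate the matched-IQ comparison above.
native_ultra_fps = 60
dlss_quality_ultra_fps = 90    # +50% vs native Ultra
native_very_high_fps = 82      # the preset whose image quality roughly matches DLSS Quality

naive_gain = dlss_quality_ultra_fps / native_ultra_fps - 1
matched_iq_gain = dlss_quality_ultra_fps / native_very_high_fps - 1
print(f"vs native Ultra: +{naive_gain:.0%}, vs IQ-matched Very High: +{matched_iq_gain:.0%}")
# vs native Ultra: +50%, vs IQ-matched Very High: +10%
```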
 
Let's just summarize a bit here.

Some people are calling those who don't accept frame gen or DLSS as valid benchmark measurements dumb.
Some people who think DLSS and frame gen are the natural next step call everyone else dumb.

Some people don't like how DLSS and frame gen are being lumped into the same bracket.


Fact is... is the card faster in pure raster than the 4090? The answer is yes, by around 30%.
Is the card faster at ray tracing? The answer is yes; thanks to a massive increase in RT cores, ray tracing performance is up by as much as 40%.

Nothing else really matters. You can further boost framerates by adding DLSS, and throw in frame gen if the latency is an acceptable tradeoff... for you. No one is forcing anyone to use either option. Stick with DLAA and native resolution if you want to.

As for the worth... it's worth 2000 dollars because people are willing to pay 2000 dollars. Welcome to capitalism.
 
For games that do not employ frame generation higher frame rates bring two benefits:
1. lower latency
2. smoother display

When games run at low frame rates you start to notice:
1. that it takes time for the camera to change according to your input (latency)
2. the slide show (because there is a big visual difference between two consecutive frames)

What frame gen does is alleviate point 2 by creating frames, and exacerbate point 1 by increasing the actual latency. So, in effect, for people and games that value latency, frame generation is fake frames. For people and games that are not sensitive to latency but to visual transitions, frame generation is an improvement.

Frame rate was a good indication of smoothness and latency, and render quality was usually comparable between cards at the same settings. The problem with FG is that we now need to assess latency and render quality separately. People who understand these things value the technology for what it brings (a smoother display). But we have to admit that marketing is abusing it (tricking unsuspecting buyers) by comparing normal framerate with FG framerate. In light of these marketing tricks, users are right to call it Fake Frames. If Nvidia had called it smooth display or whatever and only counted rendered frames, thus keeping latency comparable between framerates, nobody would have called it Fake Frames.
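In benchmark terms that just means reporting three numbers instead of one headline figure; here is a rough sketch of what such a result could look like (the field names and values are my own invention, not any reviewer's actual format):

```python
# Sketch of reporting the three things separately (field names and values invented).
from dataclasses import dataclass

@dataclass
class FGBenchmarkResult:
    rendered_fps: float    # frames actually produced by the game engine
    displayed_fps: float   # rendered + generated frames sent to the display
    latency_ms: float      # click-to-photon latency, measured separately

    def headline(self) -> str:
        return (f"{self.displayed_fps:.0f} fps displayed "
                f"({self.rendered_fps:.0f} rendered), {self.latency_ms:.0f} ms latency")

print(FGBenchmarkResult(rendered_fps=62, displayed_fps=118, latency_ms=48).headline())
```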

This. Great layman's explanation on your part for everyone, well done. Both are good technologies and have been around for longer than we think, in more devices than we think. (Using motion smoothing on your TVs or smartphones, anyone?) The technologies are not the issue; the issue is exactly what you point out in your post -- deceptive marketing that is clearly intended to get consumers to believe they are paying for and getting more than they actually are. The idea that a 5070 could be as good as a 4090 is absolutely ludicrous, and for NVIDIA to have the gall to state it publicly, even with the DLSS 4 disclaimer, is sheer hubris as well as deceptive.
 
Basically telling reviewers to "CHANGE" their "EDITORIAL NARRATIVE"...

If you don't get what I am saying, then you should not even comment here...
 
For games that do not employ frame generation higher frame rates bring two benefits:
1. lower latency

Stopped there... because frame gen RAISES latency, it doesn't lower it. You would know if you were using it on a Switch game on your TV. You can only use that garbage on a single-player RPG with Switch-like graphics quality.

Any online gaming experience, you need to turn it off. Any FPS, racing, fighting or high-paced game, you can't use it.

Not to mention the high number of artifacts created in a single generated frame. You are not raising your image fidelity, you are degrading it to an extent that it should never be used unless you're forced to because your game is running at 20 FPS.
 
More or less a fair performance triangle. DLSS does hurt both image quality and latency. For example, comparing DLSS with native should only be done at the same IQ. If DLSS Quality on Ultra gives 50% more FPS than native on Ultra, but the IQ is the same as native on Very High, then the FPS should be compared to the Very High FPS benchmark. If that ends up with DLSS having only 10% more FPS than native, with 50% more latency, then people can make a more informed decision about their settings.
You are missing the point.

At 2160p, DLSS Quality is upscaled from 1440p... so it is in reality 1440p.
At 2160p, DLSS Performance is upscaled from 1080p... so it is in reality 1080p.
At 2160p, DLSS ULTRA Performance is upscaled from 720p... so it is in reality 720p.
(Chart: DLSS render resolutions by quality mode.)
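If you want to work out the internal resolution for any mode yourself, the commonly documented default scale factors are easy to apply (individual games can override them, so treat these as the usual defaults rather than guarantees):

```python
# Internal render resolution for a given DLSS mode (commonly documented
# default scale factors; individual games can override these).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(mode, render_resolution(3840, 2160, mode))
# Quality (2560, 1440), Balanced (2227, 1253),
# Performance (1920, 1080), Ultra Performance (1280, 720)
```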
 
We've been rethinking benchmarks ever since ray tracing debuted.

Nvidia doesn't care about how reviewers benchmark.

They set the goal post.

Either way, y'all are about to line up and drop thousands of dollars on these cards thanks to psychological obsolescence so you can play a bunch of lackluster games.
 
I say test everything. Test native, test DLSS, and FG... If you want native you can make a decision; if you want AI upscaling, you can decide on that too. One doesn't have to replace the other.
 
Stopped there... because frame gen RAISES latency, it doesn't lower it. You would know if you were using it on a Switch game on your TV. You can only use that garbage on a single-player RPG with Switch-like graphics quality.

Any online gaming experience, you need to turn it off. Any FPS, racing, fighting or high-paced game, you can't use it.

Not to mention the high number of artifacts created in a single generated frame. You are not raising your image fidelity, you are degrading it to an extent that it should never be used unless you're forced to because your game is running at 20 FPS.
I said the opposite of what you imply:
For games that do not employ frame generation higher frame rates bring two benefits:
1. lower latency
 
I can remember when the leather jacket said the 3090 was the world's first 8K gaming card too.
Now he wants to 'game' the benchmarks.
There's only so much BS you can swallow.
 
We've been rethinking benchmarks ever since ray tracing debuted.

Nvidia doesn't care about how reviewers benchmark.

They set the goal post.

Either way, y'all are about to line up and drop thousands of dollars on these cards thanks to psychological obsolescence so you can play a bunch of lackluster games.
Nvidia absolutely does care about reviewers' benchmarks.
They sent this very site a nasty email about the lack of praise for DLSS when it was first implemented.
Of course Techspot has highlighted the benefits of DLSS ever since.
 
Also, whoever is into high-refresh-rate gaming is in it because they want every bit of competitive advantage possible, and those gamers loathe the latency added by frame gen; they prefer to play on the lowest quality settings with draw distance on ultra.

As someone who used to play semi-professional Counterstrike: I remember when we had to take pre-match screenshots of a smoke grenade to prove we didn't set settings so low that smoke grenades didn't render properly.

But yes: I would say that even 1080p is still used in competitive formats, though nowadays 1440p/low is more typical. But it really only matters if your reaction speeds are sub-5ms anyway (I can still manage a measured 8ms on a "good day"; it sucks getting old).
 
As someone who used to play semi-professional Counterstrike: I remember when we had to take pre-match screenshots of a smoke grenade to prove we didn't set settings so low that smoke grenades didn't render properly.

But yes: I would say that even 1080p is still used in competitive formats, though nowadays 1440p/low is more typical. But it really only matters if your reaction speeds are sub-5ms anyway (I can still manage a measured 8ms on a "good day"; it sucks getting old).
This is honestly why I like the PvP in EVE. It's not twitch-based, but strategy and cooperation matter. The satisfaction of simply participating in battles that end up costing as much as a car is indescribable. It's not like me getting 100+ kills in a CoD4 match, but it is a second life that seems to matter. These big battles usually happen once or twice a month, with the epic, newsworthy battles happening maybe 2-3 times a year. I've gotten doctor's excuses because I'll get phone calls from my corp lead saying "we need to hop on" and then I end up staying for 3 days straight battling over star systems.

I use a 65" 4k TV as a monitor simply for this purpose. The large format would make sense if you were familiar with the game. I'm hoping 8k120 becomes a thing soon so I can go larger.
 
(The following story is as real as any of the AI generated frames)
I ran into Mr. Huang the other day as he was repainting his office. He was going at a fantastic, never-before-seen pace with his roller, but some of the painted surfaces were showing small problems like minimal dripping or uneven coverage. I asked him why he wasn't taking a little more time to ensure proper coverage and better overall quality. His answer was: Sorry, I don't have enough paint, so I'm using DLSS 4 to finish the office before the paint runs out.
 