Nvidia says frame generation and upscaling will require us to rethink benchmarks

This is honestly why I like the PvP in EVE. It's not twitch-based, but strategy and cooperation matter. The satisfaction of simply participating in battles that end up costing as much as a car is indescribable. It's not like me getting 100+ kills in a CoD4 match, but it is a second life that seems to matter. These big battles usually happen once or twice a month, with the epic, newsworthy ones happening maybe 2-3 times a year. I've gotten doctor's excuses because I'll get a phone call from my corp lead saying "we need you to hop on," and then I end up staying for 3 days straight battling over star systems.

I use a 65" 4k TV as a monitor simply for this purpose. The large format would make sense if you were familiar with the game. I'm hoping 8k120 becomes a thing soon so I can go larger.
How are things in EVE these days? I got out a decade ago, and haven't heard anything about it for some time.
 
And I'm sure Jensen was going to add "change the benchmarks to favor Nvidia," but someone talked some sense into him for once.

Maybe Nvidia would like benchmarks to add "effectiveness as space heater," too. 🤣
 
It's not that hard to explain. Upscalers (today) are good for getting *most* of the visual fidelity of a higher resolution, specifically so you can actually get an acceptable response time while also getting most of the visuals.
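To put a number on that tradeoff, here's a rough sketch of how much rendering work upscaling actually saves at a 4K output; the per-axis render scales are the commonly cited DLSS preset values and can vary by game and version:

```python
# Rough sketch of how many pixels the GPU actually shades per frame when
# upscaling to a 4K output. The per-axis render scales are the commonly
# cited DLSS preset values (assumption: they can vary by game and version).
OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K target

presets = {"Native": 1.0, "Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for name, scale in presets.items():
    w, h = int(OUTPUT_W * scale), int(OUTPUT_H * scale)
    fraction = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{name:<12} renders {w}x{h}  (~{fraction:.0%} of the output pixels)")
```

So the GPU only shades roughly 25-45% of the output pixels, which is where the frame-time headroom comes from.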

Frame generation is, functionally, for...watching games rather than playing them? It will *never* actually improve game responsiveness, only degrade it. Since most monitors now, and *all* video cards for the last 6 years or so, support VRR, the argument that frame gen is there so you can hit the max refresh of your monitor *to avoid tearing* is largely moot. It's great for those watching a presentation of gameplay. It never actually improves it for the one playing.
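A simplified back-of-the-envelope model of why that is (this assumes interpolation-style frame generation and ignores things like Reflex and render-queue tuning, so treat the numbers as purely illustrative):

```python
# Toy latency model for interpolation-based frame generation.
# Purely illustrative: real pipelines add queueing/pacing overhead and
# latency-reduction tricks, so only the direction of the effect matters here.
# To interpolate a frame between real frames N and N+1, presentation has to
# wait for N+1, so the displayed frame rate goes up while input-to-photon
# latency gets worse, never better.

def frame_gen_model(real_fps: float, multiplier: int = 2):
    real_frame_ms = 1000.0 / real_fps
    displayed_fps = real_fps * multiplier
    latency_without_ms = real_frame_ms   # roughly one real frame of render time
    latency_with_ms = real_frame_ms * 2  # plus roughly one more real frame of hold time
    return displayed_fps, latency_without_ms, latency_with_ms

for fps in (30, 60):
    shown, without_fg, with_fg = frame_gen_model(fps)
    print(f"{fps} real fps -> {shown:.0f} fps displayed; "
          f"~{without_fg:.0f} ms latency without frame gen vs ~{with_fg:.0f} ms with it")
```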
 
Calling DLSS "fake framerate" is a stereotypically lazy take. DLSS isn't about making the engine faster; it's about leveraging AI to create playable experiences in games that would otherwise choke even the best hardware.

Ignoring DLSS in benchmarks makes no sense because that’s how most gamers will actually play. It’s like testing a turbocharged car and ignoring the turbo. DLSS isn’t “cheating”, it’s maximizing efficiency.
All that is a very nice Nvidia-like pitch, but it leaves out that a turbo's effect on an engine's real power is…well…real. What DLSS (and similar technologies) brings to the table is tricking the user into thinking they're getting, for example, 4K120 when the game is in fact being rendered at 1440p30. If your GPU is running a well-optimized game and you'd just like more stable fps, or you have a handheld and want to play at a lower TDP, then these technologies benefit us; but when the line is crossed and brands start underpowering their chips in raw performance (the hard way) so that fake frames (the easy way) are used to "measure" performance, then something is very wrong with that brand.

Let me put it this way:
- Right way: "We managed to achieve 30% more fps at the same power and price. Additionally, you benefit from a new technology, DLSS 4, to increase the longevity of your card or your fps in demanding cases."

- Wrong way: "We increased the framerate by 100% using fake frames that are (in our opinion, which should be yours too) as good as real frames, while not telling you that real performance only went up 7-15% and the TDP went up along with it, so please pay accordingly."

Real frames should be compared with real frames: if card A renders at 30 fps and card B shows 90 fps of which only 30 are real, for me both have the same power (see the quick sketch below). It would be a HUGE failure if people started comparing real frames at a real resolution with fake frames at a fake resolution.
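To illustrate that comparison, here's a tiny sketch; the cards and figures are hypothetical, just to show the arithmetic:

```python
# Toy comparison of rendering throughput vs. the number on the fps counter.
# The cards and figures are hypothetical, purely to illustrate the point.
cards = {
    # name: (displayed_fps, fraction_of_displayed_frames_actually_rendered)
    "Card A (no frame gen)": (30, 1.0),
    "Card B (3x frame gen)": (90, 1 / 3),
}

for name, (displayed_fps, real_fraction) in cards.items():
    real_fps = displayed_fps * real_fraction
    print(f"{name}: {displayed_fps} fps displayed, {real_fps:.0f} fps actually rendered")
# Both cards render 30 real frames per second; only the displayed number differs.
```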
 