RTX 4090 hits 2.85GHz boost clock in Cyberpunk 2077 demo

The takeaway: it can't run 2077 maxed out at 4K very well.
In fairness to the 4090, Cyberpunk 2077 at 4K with every setting on maximum (plus all RT options enabled and RT lighting set to Ultra) is a complete slideshow on most hardware anyway. I've just done some in-game testing myself, and on an i7-9700K + 2080 Super system it averaged 1.5 fps with no DLSS. The only way to make it even remotely playable was to enable Ultra Performance DLSS, which brought the average frame rate up to 46 fps.

Obviously, there's no way to know exactly which section of the game Nvidia was demonstrating, but for a comparable reference point, I got 20 fps at 1440p with no DLSS, settings at maximum, and RT at Ultra. Getting similar performance figures (i.e. 'just under 60 fps') required DLSS Performance (52 fps) or Ultra Performance (70 fps), but the loss in visual quality just isn't worth it. DLSS Quality keeps the pretties looking pretty, but it's not very playable at 11 fps.
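For a sense of why Ultra Performance looks so much rougher than Performance or Quality, here's a minimal sketch of the internal resolutions each DLSS mode renders at before upscaling. It assumes the commonly cited per-axis scale factors (roughly 2/3 for Quality, 1/2 for Performance, 1/3 for Ultra Performance); the exact ratios are Nvidia's and aren't confirmed in the article.

```python
# Approximate internal render resolutions for each DLSS mode.
# Scale factors are the commonly cited per-axis ratios (an assumption here,
# not something stated in the article) applied to the output resolution.

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Return the rough resolution DLSS renders at before upscaling to (width, height)."""
    scale = DLSS_SCALE[mode]
    return round(width * scale), round(height * scale)

for out_w, out_h in [(3840, 2160), (2560, 1440)]:
    for mode in DLSS_SCALE:
        w, h = internal_resolution(out_w, out_h, mode)
        print(f"{out_w}x{out_h} {mode}: renders at ~{w}x{h}")
```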
 
You know what would be neat? If developers would code their sims and strategy games to let AIs draw upon GPU cycles. High-end video cards wouldn't just be for 'mah fps' maniacs any more.
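As a rough illustration of the idea, here's a minimal sketch (not from any shipping game; every name and number in it is made up) that uses the CuPy library to score thousands of AI unit/target pairs on the GPU in one parallel pass, the kind of bulk evaluation a sim or strategy game's AI could offload to spare GPU cycles.

```python
# Hypothetical sketch: offloading a strategy game's AI target scoring to the GPU.
# Uses CuPy (a NumPy-compatible GPU array library); falls back to NumPy on CPU
# so the sketch still runs without a CUDA-capable card.
try:
    import cupy as xp
except ImportError:
    import numpy as xp

NUM_UNITS = 5_000    # AI-controlled units picking targets (made-up numbers)
NUM_TARGETS = 2_000  # candidate targets per unit

# Made-up per-pair features: how threatening each target is and how far away it is.
threat = xp.random.rand(NUM_UNITS, NUM_TARGETS)
distance = xp.random.rand(NUM_UNITS, NUM_TARGETS) * 100.0

# Made-up scoring heuristic: prefer high-threat targets that are close by.
scores = threat / (1.0 + distance)

# Every unit evaluates every candidate in one parallel pass on the GPU,
# then picks the index of its best-scoring target.
best_target = scores.argmax(axis=1)
print(best_target[:10])
```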
 
I'm holding out to see what AMD brings as well.
I imagine a lot of the huge gains we'll see with RDNA 3 will involve using FSR 2 on their GPUs.
And that'll be different from what Nvidia is doing here how?
Well, we'd all welcome the open-standard approach, which benefits even Nvidia cards from four or five generations back, versus Nvidia's closed standard that only benefits the RTX 4000 series. I hope FSR 3.0, or whatever they call it, leapfrogs DLSS 3.0. Nvidia is long overdue for an ego correction!
 
Ok. The fact that Nvidia are using the exact same game to market their RTX 4000 series as they used to market the RTX 3000 series speaks volumes about modern PC gaming, and Nvidia's unhealthy relationship with it at the moment.
 
I'm holding out to see what AMD brings as well.
I imagine a lot of the huge gains we'll see with RDNA 3 will involve using FSR 2 on their GPUs.
And that'll be different from what Nvidia is doing here how?
Until we see what AMD's cards bring to the table, I can't tell. Nevertheless, most Intel and Nvidia advances rely on higher energy consumption and "tricks".

FSR and DLSS are "tricks" that I find excellent when you've already squeezed out all the juice (or hit thermal limits on a laptop, for example) and need a little extra. But it seems most companies are trying to accelerate the tricks (much cheaper, much less R&D) instead of improving raw performance and *additionally* improving the tricks.
 