RTX 4090 hits 2.85GHz boost clock in Cyberpunk 2077 demo

Daniel Sims

In brief: Team green has been releasing more information about its new Ada Lovelace graphics cards in the days since unveiling them. A new Cyberpunk 2077 demo reel showcases the potential of the RTX 4090 along with the benefits of DLSS 3, which include more than just higher framerates.

Nvidia sent RTX 4090 review samples to some publications, bundled with a demo video showing Cyberpunk 2077 running on the GPU. Metrics in the demo show the press just how well Nvidia's new flagship handles the game. It also demonstrates DLSS 3's effects on performance, heat, and energy consumption.

At native 1440p without DLSS, the RTX 4090 runs Cyberpunk at just about 60fps with every ray tracing setting at maximum. DLSS 3 quality mode, which upscales from an internal resolution of 1080p, almost tripled the framerate to 171fps. Nvidia claims DLSS 3 can quadruple framerates compared to native resolution, and perhaps performance mode, which on a 1440p monitor would upscale from 720p, could deliver on that promise.
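As a rough sanity check, the pixel and framerate math works out as follows (a back-of-the-envelope sketch using only the figures quoted above, not any additional measurements):

# Sketch of the demo's numbers using the resolutions and framerates quoted above (illustrative only)
native = (2560, 1440)        # native 1440p output
quality = (1920, 1080)       # internal resolution cited for quality mode
performance = (1280, 720)    # internal resolution for performance mode on a 1440p monitor

pixels = lambda res: res[0] * res[1]
print(pixels(native) / pixels(quality))       # ~1.78x fewer pixels rendered in quality mode
print(pixels(native) / pixels(performance))   # 4.0x fewer pixels in performance mode

native_fps, quality_fps = 60, 171
print(quality_fps / native_fps)               # ~2.85x, i.e. "almost tripled"
print(native_fps * 4)                         # 240fps is what Nvidia's claimed 4x would look like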

What's also impressive is that the 4090's boost clock reaches 2.8GHz at stock settings, noticeably above the GPU's official boost clock of 2.52GHz. Nvidia's earlier claim that the 4090 hit 3.0GHz in lab tests was somewhat bold, but the Cyberpunk demo indicates that speed could be within reach with smart overclocking.

Another promising metric for overclocking is that DLSS 3 kept the 4090's temperature below 53C during the demo. The 35 games currently planned to support the feature (including Cyberpunk, A Plague Tale: Requiem, Microsoft Flight Simulator, and Spider-Man Remastered) could boost overclocking potential by keeping the Lovelace GPUs relatively cool.

Furthermore, DLSS 3 appears to lower wattage by as much as 25 percent. Early rumors caused users to fear the 4090 could eat up to 600W before Nvidia's unveiling confirmed it just needs 450W. However, the Cyberpunk demo shows DLSS 3 slicing 110W off that number.

The new feature appears to improve performance-per-watt, which is good news for those in regions with spiking energy costs. Unfortunately, DLSS 3 is only available on RTX 4000 GPUs. Nvidia researcher Bryan Catanzaro admitted that DLSS 3 could theoretically come to the earlier RTX 3000 and 2000 cards. However, he doesn't think they would benefit as much from it since the technology currently relies on the Optical Flow Accelerators Nvidia introduced with Lovelace.
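Putting the demo's framerate and power figures together gives a rough idea of that efficiency gain (a sketch that simply divides the numbers above; it assumes the 171fps and the roughly 340W draw occurred together, which the demo implies but does not state outright):

# Back-of-the-envelope performance-per-watt from the figures above (illustrative only, not measured data)
native_fps, native_watts = 60, 450        # native 1440p framerate at the card's rated 450W
dlss_fps, dlss_watts = 171, 450 - 110     # quality-mode framerate with ~110W shaved off

print(110 / 450)                                              # ~0.24, the "as much as 25 percent" reduction
print(native_fps / native_watts)                              # ~0.13 fps per watt at native resolution
print(dlss_fps / dlss_watts)                                  # ~0.50 fps per watt with DLSS 3
print((dlss_fps / dlss_watts) / (native_fps / native_watts))  # roughly 3.8x better efficiency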


 
Based on those numbers, Cyberpunk should be able to hit 4K 120fps in DLSS 3 quality mode (1440p internal resolution) with RT and max settings.
It's still at least one generation away from RT being enjoyable imo, especially at a price premium of $1,600.
 
I can already see the posts "RX 7900 hit 2.84 GHz running Cyberpunk" and the rabid ones will be "we told you AMD sucks, master jensen, here is my contribution to your coffers!"

I mean, look at this article, it's the second with the same subject in less than 3 days.

 
Who is playing Cyberpuke competitively..? Who is buying a $1200 card to replay a 2-year-old game that streamers finished in 11 days..?
 
I can already see the posts "RX 7900 hit 2.84 GHz running Cyberpunk" and the rabid ones will be "we told you AMD sucks, master jensen, here is my contribution to your coffers!"

I mean, look at this article, it's the second with the same subject in less than 3 days.
I'm starting to think that you dislike Nvidia.
 
I was curious - the 3090 apparently idles at about 30-60 watts (5-second search) - I suppose these manufacturers have something similar to Intel's CPUs to handle minor non-gaming tasks
 
It's not even the same game it was at release. I went in at patch 1.5 and it was honestly pretty amazing. And it keeps getting better.
I'm still wondering how he thinks CP2077 is played competitively. Do people think because DLSS raises frame rates that only competitive titles benefit?

"Early rumors caused users to fear the 4090 could eat up to 600W"

Fears went up to at least 800W.
 
I can already see the posts "RX 7900 hit 2.84 GHz running Cyberpunk" and the rabid ones will be "we told you AMD sucks, master jensen, here is my contribution to your coffers!"

I mean, look at this article, it's the second with the same subject in less than 3 days.
It'll hit over 3GHz at stock.
 
Optical Flow Accelerator....

Or simply frame interpolation, widely used in TVs and software for a long time.
 
The takeaway: It can't run 2077 maxed at 4k very well.

That is right. I think there are a lot of takeaways here:

- nvidia relies heavily on DLSS ("tricks") to decrease power consumption and increase framerate. I find this approach GREAT for mobile scenarios with very limited TDP/cooling, but they are using it as if it were the standard.

- nvidia is setting DLSS as the standard to sell a new generation

- nvidia is trying to convince you that the "new color" is DLSS and that the quality is as good as the real thing

So, the main takeaway is that the new chips are no revolution... let's wait for AMD.
 
Who is playing Cyberpuke competitively..? Who is buying a $1200 to replay a 2 year old game, that Streamers finished in 11 days..?

I paid 1K for a 3080 FTW3 last year in order to play Control (a year+ old game at that point) and RDR2 (over 2 years old at that point) in 4K (with DLSS assistance, of course).

Was it worth it? If I equate an hour of playtime to a dollar, I've easily got my 1000 worth in just Control, RDR2, and CP2077 alone (the three main games I've played where DLSS and RT, when applicable, could be used). FH5 4K maxed out. B4B maxed out.

I know someone that got a 3060ti primarily to play WreckFest maxed out at high framerates at 1440p, a game that hit early access in 2014 and officially launched in 2018.
With the amount of hours they've put into it since, I'd say they got their money's worth too.

Point being people will spend whatever they feel is reasonable if they feel they'll be able to get their money's worth out of it.



 
That is right. I think there are a lot of takeaways here:

- nvidia relies heavily on DLSS ("tricks") to decrease power consumption and increase framerate. I find this approach GREAT for mobile scenarios with very limited TDP/cooling, but they are using it as if it were the standard.

- nvidia is setting DLSS as the standard to sell a new generation

- nvidia is trying to convince you that the "new color" is DLSS and that the quality is as good as the real thing

So, the main takeaway is that the new chips are no revolution... let's wait for AMD.

I'm holding out to see what AMD brings as well.
I imagine a lot of the huge gains we'll see with RDNA3 will involve using FSR2 on their GPUs.
How will that be any different than what Nvidia is doing here?
 
People buying this card play at 4K+. They don't care about 1440p. This card, without the resolution tricks I dislike, is still struggling with RT. This RT technology has to be one of the biggest cheats ever. Three generations in and still no card is fully ready for it.

Eh, depends on implementation.
A 1060 can run ray tracing just fine, the sacrifice being that the world has to be made of voxels.
But Teardown is the only game I'm aware of right now using a ray-traced renderer over rasterization.
 
In this thread: people that don't realize upscaling tech is going to be the way forward in games for all GPU vendors.

Why they don't realize that is beyond me, after we went through the same thing with T&L and the various Shader Model 3 support issues across generations of GPUs back in the day as those became the standard for graphics. It's the same thing that happened then.
 
I paid 1K for a 3080 FTW3 last year in order to play Control (a year+ old game at that point) and RDR2 (over 2 years old at that point) in 4K (with DLSS assistance, of course).

Was it worth it? If I equate an hour of playtime to a dollar, I've easily got my 1000 worth in just Control, RDR2, and CP2077 alone (the three main games I've played where DLSS and RT, when applicable, could be used). FH5 4K maxed out. B4B maxed out.

I know someone that got a 3060ti primarily to play WreckFest maxed out at high framerates at 1440p, a game that hit early access in 2014 and officially launched in 2018.
With the amount of hours they've put into it since, I'd say they got their money's worth too.

Point being people will spend whatever they feel is reasonable if they feel they'll be able to get their money's worth out of it.

Right^ and I know people who've spent bigly on a 3090 for their flight sim... they have specific reasons for their purchase, but such people are the outliers.


Unfettered gameplay is what got us GPUs in the first place, and what continues to push GPU technology is the pursuit of the CRT experience and the hardware needed for that kind of gameplay and smoothness. (Proprietary game features & software are just marketing.)

Otherwise a rational person would wait to see AMD's release and comparison shop.
 
Fixed it - "It's theoretically possible that with additional research and engineering that we could get this technology (DLSS 3) working on other cards, but then we wouldn't make as much money. The current version only works on 4000-series cards."
 