Cyberpunk 2077 DLSS + Ray Tracing Benchmark

I am disappointed Nvidia hasn’t equipped their hero card to run ray tracing at native resolution. DLSS might be good, but it is not the same as running at native; it's a crutch Nvidia is using to cover up high GPU prices and the fact that they skimped on ray tracing cores.
That’s fine for a price-competitive midrange card, but the 3090, for example, should have had more RT cores and actually stood out as the hero card for a game like Cyberpunk with RT.
If people haven’t heard, Steve and Tim have just been blacklisted from getting pre-release Nvidia reference cards because they voiced their opinion that ray tracing is just too much of a performance hit with current cards. Regardless of whether you agree, that is obscene behaviour; Nvidia PR has lost the plot and deserves to be called out on it.
 
Running on an 8700K and an RTX 3070, Ultra ray tracing settings @ 1440p, nothing changed to custom, and I seem to be getting far better FPS than this benchmark suggests I should.

FPS is usually around 59-62, with 54-55 FPS on frequent occasions, and it dips very rarely into the upper 40s, not even during combat or heavy scenes, just very randomly for no apparent reason.

I am good with that. If it went any lower I would look at more tweaking.
 
It won't if you turn off RT (something that AMD users won't use anyway)

Also, claiming that the RTX 3070 is outdated because of something that won't run on AMD GPUs is laughable. By the same logic, all AMD Radeon cards are already outdated because they can't use RT at all in this game. Even if they get RT support in the future, they won't perform as well as Nvidia, and the lack of DLSS will make it unplayable anyway.




All you need is to slightly lower the texture setting to fix the VRAM issue if you use RT. Otherwise, turn RT off.
What's laughable is that you can't read. Nowhere did I say the RTX 3070 is outdated because of ray tracing; I didn't mention ray tracing once. It's because 8GB of VRAM is limiting, something AMD cards don't suffer from since they have more VRAM. Big releases in the near future will probably require more than 8GB of VRAM, which will make the RTX 3070 struggle. Ray tracing and DLSS are something AMD can implement in the future; Nvidia can't add more VRAM onto their existing cards. Time to stop imagining things no one said.
 
It's not, actually. It's a minorly noticeable change for a huge hit in performance. Regardless of how pretty it makes things, it's not worth it....yet.
Nope you’re incorrect. It is worth it. It’s worth the performance hit for the better visuals. Anyone who thinks otherwise simply hasn’t played the game with RT on. The performance hit isn’t actually that big, I find if I turn RT and DLSS on it runs slightly better than it does with both of those off.

Ray tracing is incredible, I don’t understand how any PC gaming enthusiast would want to turn such a thing off..

“minorly noticeable” Bwahahahahaha. No, just no. If you think that it’s probably time to either get glasses or stop playing games. What a ridiculous comment!
 
The conclusion is that even the latest Nvidia cards are crap at the tech they developed and are now so heavily marketing. The first-gen RT cards were simply bad, and now with the 2nd gen it is still not as it should be. Plus, Nvidia skimped on the VRAM, which is made obvious by Cyberpunk.
 
Ray tracing is incredible, I don’t understand how any PC gaming enthusiast would want to turn such a thing off..

^ Is ray tracing also supposed to massively blur that bottom poster out (left of the "Microwave In Use" sign)? In fact, half the bottom shot, from the Microwave sign itself to the rubber tubes, looks artificially softer than it should. That's not quite my idea of a "generational graphical upgrade" if the only way you can get it playable is to rely on blur-inducing DLSS as a crutch for playable frame rates, blurring half the shot out in the process...
 

^ Is ray tracing also supposed to massively blur that bottom poster out (left of the "Microwave In Use" sign)? In fact, half the bottom shot, from the Microwave sign itself to the rubber tubes, looks artificially softer than it should. That's not quite my idea of a "generational graphical upgrade" if the only way you can get it playable is to rely on blur-inducing DLSS as a crutch for playable frame rates, blurring half the shot out in the process...
Tbh I’m not sure I found that on a Google search. But I do find your response amusing, the RT mode image is far better in general but you pointed out the one bit that looks worse. How miserable do you want to be about this technology mate? And why? Are you upset because it’s not dirt cheap?

Pretty much every reviewer states that RT on is a big difference. TechSpot's own Tim states in this article: “ray tracing elevates the game’s visuals and provides a noticeable improvement”. Digital Foundry went quite a bit further in their praise for the RT settings.

At the end of the day it’s subjective. If you don’t care for cutting-edge visuals in games then don’t turn it on. But don’t bury your head in the sand and go around making false claims that it makes little or no difference to visuals, or that DLSS is too blurry to be playable, etc. Let others enjoy it.

In fact, I’m wondering what visual improvements could be made to 3D-generated environments to get someone like you excited, if real-time ray tracing of light doesn’t pique your interest. And why are you here reading benchmarks on an article specifically for users of ray tracing hardware?
 
Laughable that people here actually think RT isn’t worth turning on. They clearly haven’t played it with RT on. The difference is night and day!
Really? I was amazed at the screenshots and their differences; it changes the whole concept of "Game Play!" The plot, the character development, and the flow of the games just hinges on the Ray Tracing and DLSS. ; )

I mean, Nvidia is forcing the card reviewers to get on their knees and suck on something just to be able to review their product, and then, to force the reviewers to write long articles on worthless trivial "enhancements," like ray tracing, rather than overall graphical (rasterization) performance, is horse manure.
 
Tbh I’m not sure I found that on a Google search. But I do find your response amusing, the RT mode image is far better in general but you pointed out the one bit that looks worse. How miserable do you want to be about this technology mate? And why? Are you upset because it’s not dirt cheap?
I simply pointed out the obvious, significant texture downgrade (that poster looks worse than some 90s games' textures) in just one single post, vs your 7x posts here cheering it on and calling everyone who doesn't like it a 'loser', 'dirt poor' or 'someone like you'. It certainly isn't me who's overly emotionally invested in RT to a seriously unhealthy degree here...
 
Nope you’re incorrect. It is worth it. It’s worth the performance hit for the better visuals. Anyone who thinks otherwise simply hasn’t played the game with RT on. The performance hit isn’t actually that big, I find if I turn RT and DLSS on it runs slightly better than it does with both of those off.

Ray tracing is incredible, I don’t understand how any PC gaming enthusiast would want to turn such a thing off..

“minorly noticeable” Bwahahahahaha. No, just no. If you think that it’s probably time to either get glasses or stop playing games. What a ridiculous comment!
I played with RT and DLSS on with a 3080 and was getting under 50 FPS most times. Turn off RT and I'm constantly over 100 FPS. It may be worth it to you for some extra shinies (we all know simple people are attracted to shiny objects), but to me, I'd rather have over 100 FPS.
 
TBH I don't see why people hype so much about the RT stuff. According to this review it's not as breathtaking as some people lead us to believe...
Ahem....

From a visual standpoint, we think Cyberpunk 2077 is the best showcase we've had so far of what ray tracing can bring to the table. The game looks excellent with it enabled. The reflections setting noticeably improves graphics quality in our opinion, with more realistic lighting also possible through the other settings.

This is not a Shadow of the Tomb Raider where often the differences are negligible, ray tracing in this game is a strong visual upgrade on the regular rasterization settings. It’s also a title that makes sense to have ray tracing in, as part of this game’s key focus is offering next-gen visuals and it’s not a fast paced competitive shooter like Fortnite where ray tracing is nearly pointless (or at least it is when you lose so much performance).
 
FYI, the optimal recommended settings from GeForce Experience for my 3090 are max settings, RT set to Psycho and DLSS set to Performance, at 4K resolution. I am still trying to find the best picture quality for a 60 FPS experience. Setting DLSS from Performance to Auto yielded better image quality while still holding 60 FPS. Initially I set the resolution to 1600p, but the drop from 4K is definitely noticeably inferior with DLSS set to Quality. I could have the next-gen PS6 console experience today at 35 FPS, but I'd rather compromise on some features to get a locked 60 FPS; for me, 4K max settings with RT set to Psycho and DLSS set to Auto seems to be that compromise for now.
Update: DLSS on Auto drops to Ultra Performance in demanding scenes and looks horrible.
 
I played with RT and DLSS on with a 3080 and was getting under 50 FPS most times. Turn off RT and I'm constantly over 100 FPS. It may be worth it to you for some extra shinies (we all know simple people are attracted to shiny objects), but to me, I'd rather have over 100 FPS.

Still, the fact we're getting past ray tracing having <5 FPS is an achievement. Long term, we want all lighting to be RT since it's much more accurate and does certain effects (reflections/refractions) that the normal rasterizer can't really simulate well.
 
TBH I don't see why people hype so much about the RT stuff. According to this review it's not as breathtaking as some people lead us to believe. I mean, the game offers realistic shadows and reflections without RT already; RT just makes these more crisp and noticeable. Don't get me wrong, I don't completely disregard the tech. Any visual improvement is good for a better gaming experience; I just don't like the performance hit compared to what little visual improvement it offers. DLSS tech is impressive, though. RT paired with DLSS can offer a good experience, as the performance hit is not that big. I hope CDPR improves the game with better performance optimisations in the future.
You should watch the Linus/DF videos about ray tracing in this game. The difference is like going from a 2010-era game to actual life-like reflections and shadows, so ray tracing is well worth it. It is not worth it in the eyes of Radeon users because they can't have it, so they try to sugarcoat it by saying it doesn't make a difference. Yeah, nice try.
 
Still, the fact we're getting past ray tracing having <5 FPS is an achievement. Long term, we want all lighting to be RT since it's much more accurate and does certain effects (reflections/refractions) that the normal rasterizer can't really simulate well.
Very true. The best games still don't approach the realism of your average Hollywood film played at 24 FPS at 1080p. The difference is those films use ray tracing: sometimes the artificial ray tracing of high-end CGI, but more often the ray tracing done by the real world itself.
 
Still, the fact we're getting past ray tracing having <5 FPS is an achievement. Long term, we want all lighting to be RT since it's much more accurate and does certain effects (reflections/refractions) that the normal rasterizer can't really simulate well.
Since the RTX 3090 gets 19 FPS at max everything without DLSS at 4K, we will need more than triple the RT performance to get to 4K 60 FPS. The best-quality DLSS setting gets 35 FPS at 4K, so there we still need nearly double that performance.
I don't see a 2x performance jump coming for at least 2 to 3 more years, especially when current generational leaps are around 50% gains. E.g. RDNA 3 will likely show a similar performance improvement over RDNA 2 as RDNA 2 did over its predecessor. That improvement would have to be an annual thing to make the 2 to 3 year mark.
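The scaling math above can be sanity-checked in a few lines. A minimal sketch: the 19 FPS (native 4K) and 35 FPS (DLSS Quality 4K) figures are the benchmark numbers quoted in the post, and the assumption that frame rate scales roughly linearly with GPU performance is mine:

```python
# Sanity check: how much faster would the GPU need to be to hit 60 FPS?
# Assumes FPS scales roughly linearly with raw GPU performance.

def required_speedup(current_fps: float, target_fps: float = 60.0) -> float:
    """Multiplier on GPU performance needed to reach target_fps."""
    return target_fps / current_fps

native = required_speedup(19)   # native 4K, no DLSS
dlss_q = required_speedup(35)   # 4K with DLSS Quality

# ~3.16x for native ("more than triple"), ~1.71x for DLSS Quality
# ("nearly double"), matching the post's figures.
print(f"native 4K: {native:.2f}x, DLSS Quality: {dlss_q:.2f}x")
```

At roughly 50% gains per generation, two back-to-back generations compound to about 2.25x, which is where the "2 to 3 more years" estimate for the DLSS Quality case comes from.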
 
Very true. The best games still don't approach the realism of your average Hollywood film played at 24 FPS at 1080p. The difference is those films use ray tracing: sometimes the artificial ray tracing of high-end CGI, but more often the ray tracing done by the real world itself.
We are definitely hitting a plateau; currently RT is brute-forced. If they come up with more efficient APIs and let the hardware catch up, in 2 to 3 years we will start to see games plateau into ultra-realistic real-time scenes. On one side we have the battle of the APIs, Vulkan vs DXR, and on the other we have RTX vs RDNA, with Intel putting pressure on the other two from fear of the unknown.
 
Ahem....

From a visual standpoint, we think Cyberpunk 2077 is the best showcase we've had so far of what ray tracing can bring to the table. The game looks excellent with it enabled. The reflections setting noticeably improves graphics quality in our opinion, with more realistic lighting also possible through the other settings.

This is not a Shadow of the Tomb Raider where often the differences are negligible, ray tracing in this game is a strong visual upgrade on the regular rasterization settings. It’s also a title that makes sense to have ray tracing in, as part of this game’s key focus is offering next-gen visuals and it’s not a fast paced competitive shooter like Fortnite where ray tracing is nearly pointless (or at least it is when you lose so much performance).

I think they are being ironic, considering there is next to no difference in the screenshots...
 
Nope you’re incorrect. It is worth it. It’s worth the performance hit for the better visuals. Anyone who thinks otherwise simply hasn’t played the game with RT on. The performance hit isn’t actually that big, I find if I turn RT and DLSS on it runs slightly better than it does with both of those off.

Ray tracing is incredible, I don’t understand how any PC gaming enthusiast would want to turn such a thing off..

“minorly noticeable” Bwahahahahaha. No, just no. If you think that it’s probably time to either get glasses or stop playing games. What a ridiculous comment!
People should stop playing video games if they think ray tracing is barely noticeable? Lmao, this is how Nvidia fanbois "think".

Why don't you go stare at the sun outside and admire how realistic the lighting is?
 
After over 10 hours of playtime I disagree, turning off RT is not worth the extra frames in this game, it might be in other titles but this is a slower paced story game where the atmosphere and visuals are more important than super high speed frame rates. I usually don’t tolerate low fps in games but if you can get 50-60 with RT on then that’s better than say 144 with RT off.

And the videos don’t really do it justice, you need to run it on your system to see it, the reflections, the shadows cutting through the reflections, the far more natural lighting. I remember seeing one of the NPCs - Evelyn for the first time in a neon lit strip club wearing a shiny jacket and the reflections on it are incredible, if you turn RT off it looks so basic by comparison, like you’re playing a cartoon of the ray traced version of the game.

I’d still play it without RT because the game is great but if you have those RT options turned on it takes it to a whole new level.

As an old gamer and well, old person in general, one thing that I have learned is that everything new is a novelty and with time, it wears off.

When we are used to RT, in the near future, it will simply be "one more thing" in a game and nothing more.

What really matters here is, that for now, Nvidia is using the RT bullet point as a hook to keep you in their hardware and keep the competition out.

The reality is and proven by this game, the RT hardware is simply not there yet. We are at least a couple more GPU generations away from a reasonably priced and performant card.

What is worse, look at the numbers and see how little difference in FPS there is between the 3080 and 3090, yet the price difference between them is insane.

That's Nvidia for you, and I don't blame them; I blame the sheep that continue walking themselves to the butcher.

This is the Crysis game for this decade, and it will take a while before it's playable, like Crysis is now.
 
As an old gamer and well, old person in general, one thing that I have learned is that everything new is a novelty and with time, it wears off.

When we are used to RT, in the near future, it will simply be "one more thing" in a game and nothing more.

What really matters here is, that for now, Nvidia is using the RT bullet point as a hook to keep you in their hardware and keep the competition out.

The reality is and proven by this game, the RT hardware is simply not there yet. We are at least a couple more GPU generations away from a reasonably priced and performant card.

What is worse, look at the numbers and see how little difference in FPS there is between the 3080 and 3090, yet the price difference between them is insane.

That's Nvidia for you, and I don't blame them; I blame the sheep that continue walking themselves to the butcher.

This is the Crysis game for this decade, and it will take a while before it's playable, like Crysis is now.
Ah right I see why so many people lie and claim they can’t tell the difference with RT on. They don’t want Nvidia to have advantage. Hopefully when AMD gets it these fanboys will start enjoying cutting edge graphics again.

Also, if you describe people who buy these cards now as “sheep walking themselves to the butcher”, then maybe, just maybe, you aren’t an enthusiast? I happily spend the money they want, and I'm no sheep.

Clearly there are a lot of what you call sheep or I would call enthusiasts as these new Nvidia cards have seen unprecedented demand.
 
I'll reiterate. If you have eyes you can see a blatant and large improvement with RT turned on. Having working eyes does not make you an Nvidia fanboy.

I've realised why there are so many people lying in the comments section here and falsely claiming that RT doesn't make much difference: it's because the multi-billion-dollar corporation they fanboy over doesn't have this advantage yet.

The cancer of fanboyism has blinded people literally.
 
I simply pointed out the obvious, significant texture downgrade (that poster looks worse than some 90s games' textures) in just one single post, vs your 7x posts here cheering it on and calling everyone who doesn't like it a 'loser', 'dirt poor' or 'someone like you'. It certainly isn't me who's overly emotionally invested in RT to a seriously unhealthy degree here...
I haven’t called anyone any of those things; please don’t put words in my mouth. Maybe you should re-read my posts? Emotionally invested in RT? Nope, I am just a PC gaming enthusiast. I’m guessing you don’t really understand what we are all about. We love PC gaming and we love cutting-edge visuals, especially when there is a pandemic and there’s nothing else to do.

If you think posting a few times on a forum is “emotionally invested to a seriously unhealthy degree” then you are clearly devoid of passion, or enthusiasm for anything. A few posts is nothing mate, just see the things I do and the money I spend on what I’m actually emotionally invested in - like my girlfriend!

Seriously why are you here? You clearly have no love for RT or modern PC gaming so why the F are you commenting on an article exclusively about it? Do you even have an RTX card?
 