Cyberpunk 2077 Benchmarked

DLSS is a gimmick. It's just upscaling, so why even bother? Just play at the lower resolution, without upscaling, to start with. Looks like those of us running older cards, like the RX480 and 580, are out of luck. Maybe we can play at 720p. :) Doesn't matter to me, though. I won't pay the obscene prices they ask for modern games anyway. I'll wait a year or two, until the price comes way down.

New AAA titles have been $60 for as long as I can remember. I remember SNES games being $60!
 
So, minimal visual differences between the medium and ultra presets but a huge ~40% performance gap means the medium preset could just as well be renamed high, and ultra renamed epic or some other fancy word. The same thing was seen in RDR2, where medium settings were more like high and high was more like ultra. Good to know. PC gamers should get smarter about tweaking graphics settings rather than complain that their $1,000+ GPUs can't run the game with maxed-out settings.
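To put that gap in concrete terms, here's a quick back-of-the-envelope illustration; the FPS figures below are purely hypothetical examples, not numbers from the article's charts:

# Illustrative only: what a ~40% performance gap between presets means in practice.
ultra_fps = 45.0               # hypothetical average FPS on the ultra preset
medium_fps = ultra_fps * 1.40  # roughly 40% faster on medium, per the figure quoted above

print(f"Ultra:  {ultra_fps:.0f} fps")
print(f"Medium: {medium_fps:.0f} fps (+{(medium_fps / ultra_fps - 1) * 100:.0f}%)")

In other words, a 40% difference can be the gap between a borderline experience and something that feels genuinely smooth, which is why it matters more than the "minimal" visual change suggests.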
 
DLSS is a gimmick. It's just upscaling, so why even bother? Just play at the lower resolution, without upscaling, to start with. ...
DLSS is no gimmick, in some games like control and death stranding it looks better than native. It doesn’t in Cyberpunk but it looks a hell of a lot better than just lowering the resolution. And that might change, control was updated some time down the line after it was released to get better DLSS so I’m hoping the same happens here. It is based on deep learning after all.

DLSS is not perfect but it’s better to run a game with DLSS on at a playable frame rate than it is to either lower the resolution or just put up with poor performance. For me and my 2080 I wouldn’t be able to turn on RT in this game without DLSS so I certainly appreciate it!
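For anyone wondering why DLSS isn't the same thing as just lowering the resolution: the game renders internally at a reduced resolution and then reconstructs the frame up to your output resolution. Here's a rough sketch using the commonly cited per-axis scale factors for each DLSS mode (approximate general figures, not values confirmed for Cyberpunk specifically):

# Rough sketch: internal render resolution implied by each DLSS mode.
# Scale factors are the commonly cited per-axis values; treat them as approximate.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# Example: 4K output with DLSS Performance renders internally at roughly 1080p,
# then the reconstructed frame is presented at the full 3840x2160.
print(internal_resolution(3840, 2160, "Performance"))        # ~(1920, 1080)
print(internal_resolution(3840, 2160, "Ultra Performance"))  # ~(1279, 719)

The difference versus simply dropping your resolution is that the reconstruction step uses motion vectors and previous frames to rebuild detail at the output resolution, which is why it usually looks noticeably better than the raw internal resolution would.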
 
... I won't pay the obscene prices they ask for modern games anyway. I'll wait a year or two, until the price comes way down.
Or just do what I do, get your games from CDKeys. LOL
 
I got two 3090 FTW3 this morning.

I’m gonna see how much better they run Cyberpunk.

Sold my 3090FE
 

Agreed. Considering the RX 590 and GTX 1060 are listed as options for recommended graphics cards on Steam, I felt as though cards like the RX 570/580/590 were noticeably absent from the benchmarks.

Although from what we've seen, we can conclude that those cards will all struggle if you crank the settings and resolution up.

If one can live with 30 fps, a stock RX 580 can do just a hair over 30 at 1080p / extreme. If you need 45+ FPS you'll need medium settings.

Gamers Nexus had the 580 in their benchmarks. Pretty bad; even though mine OCs extremely well and performs above 590 levels, I think I'll need a new card ... :(
 
So much ray tracing eye candy was promised that I reckon the game's own baked lighting can easily be confused for some of that feature.

"ray traced lighting" barely seems to justify the price. Looking at this:


Overall the image already looks impressive without that effect, and the effect itself is easily missed if you don't know where the light source is. Looked at with a critic's eye, ray tracing needs a lot more horsepower than these cards can provide here, and the small visual impact mostly exposes the limits of the added effect itself.

A tint of gray that appears in a corner, or that little gray shade on the boxes, is kind of relative: if the box is shinier or cleaner, it appears brighter by itself.
The lighting was baked into the scene by the developers, so it is their artistic expression, and a few extra effects don't seem to justify the price and the marketing this generation. If it's a matter of taste rather than necessity, though, any eye candy is welcome.

PS. The game is available on GeForce Now service.
Digital Foundry says the difference between RT on and off is night and day, so not sure what you're trying to say. That RT isn't worth it? The perf hit sucks but it is def worth it.
 
I have a good 1070, so reading between the lines it looks like 1080p for me until the new cards appear in volume next year.
 
Oops, too late. All performance metrics in this article are rendered moot. New patch just dropped. Tiny size, LONG run time. From the amount of drive thrashing this tiny patch causes, it's digging stuff out and replacing a ton of code. This is clearly a pure fix patch, but performance will likely be affected.
 
Looks like those of us running older cards, like the RX480 and 580, are out of luck. Maybe we can play at 720p. :) ...

I'm playing it on a bit of an older PC with a 4GB RX 480 at 1080p at medium settings. It's not blazing fast by any means but it's definitely playable, and I haven't noticed any real drops below 30 fps at this point. The only thing unusual about my machine is that it has 32GB RAM, but that shouldn't matter much here.

Your other option is to play it on Stadia (yeah I know). I haven't tried it yet but by all reports it actually runs pretty well there. I've used Stadia for a few other games and find it a surprisingly decent experience. I picked up Cyberpunk for my PC though because of the inevitable mods down the road.
 
The game needs DLSS to run well with good settings over 1080p.

I disagree. My 1080 Ti is running the game at 4K with the High settings preset without issue. No major frame drops; FPS is pretty stable, all things considered. It holds at a steady 40ish FPS, which is more than sufficient for this type of game.
 
I have an RTX 3080 with an AMD 3600X. At 4K I can run it all maxed out with DLSS Ultra Performance and get 62 fps average; tune it down to ultra and I get 65-67 fps. This is EVGA's FTW Ultra and this seems to be the best combination. At 4K Ultra Performance, while it has some detail loss on mesh objects, it still looks better than any 1440p setting. At 1440p with everything maxed out, even Psycho, and DLSS Performance it runs pretty well at around 69 fps on average, but turning Psycho off only gets me an additional 3 fps, to about 72, and in neither case am I seeing drops below 60 fps. That's still disappointing since I have a 144Hz monitor, so the game is not great for 1440p, but Nvidia has the 4K 60 fps down in this game.
 
Steve specifically addressed this in the article:
"This first feature will cover Cyberpunk 2077 performance with all current and previous generation GPUs from AMD and Nvidia."
And with good reason too:
"So in order to create this article we’ve made nearly 500 benchmark runs, each lasting 60 seconds. Forget about sleep... "All work and no play makes Steve a dull boy.""
He's only human and it's quite clear that most of the cards that are 2 generations old and older aren't going to give very interesting results in this title. I believe that the GTX 1080 Ti would be the exception to this as it tends to perform about as well as an RTX 2070 Super but if Steve included that, you'd have people whining about other cards as well like the GTX 1060/70/80, the RX 470/570/580/590, the R9 Fury/Fury-X and the GTX 980/980 Ti.

He has to draw the line somewhere. The last time I saw him do something like this, his eyes were all bloodshot in his Hardware Unboxed video. He was clearly ready to drop.
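Just to put that quote in perspective, here's a quick tally of the pure capture time implied by those numbers (assuming the stated ~500 runs at 60 seconds each, and ignoring loading screens, settings changes, and reruns):

# Back-of-the-envelope: total capture time for the article's benchmark pass.
runs = 500            # "nearly 500 benchmark runs" per the quote above
seconds_per_run = 60  # "each lasting 60 seconds"

total_hours = runs * seconds_per_run / 3600
print(f"~{total_hours:.1f} hours of pure benchmarking")  # ~8.3 hours before any overhead

And that's just the time the benchmarks are actually running; in practice the overhead in between easily multiplies it.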
I think it's due to the fact that tech media are paid and supplied with the latest hardware, and I imagine companies such as Nvidia wouldn't be happy if an owner of a 10-series card or older saw that jumping to a 2060 isn't a big enough jump. There's a **** ton of 2060 and 2070 Super cards on the market, and the same goes for the 5700 XT and 5600 XT. At the end of the day time moves on, and Steve did say he would return to the game with older cards if there was enough interest, which tbf would probably push a lot of people to go out and upgrade. I still think he's the best PC tech hardware journalist; no one puts in the amount of effort he and his colleagues put in.
 
My 1080 Ti is running the game at 4K with the High settings preset without issue. ... It holds at a steady 40ish FPS ...

According to the benchmarks above, at 4K medium settings an RTX 2080 Super averages 39 FPS with a minimum of 33 FPS.

You're running a slower card at higher settings. 40 FPS, you say? Just an observation.
 
... I still think he's the best PC tech hardware journalist; no one puts in the amount of effort he and his colleagues put in.
I completely agree. Which is why the fact that nVidia cut him off pisses me off to no end. It actually makes me proud that I've only used ATi hardware since 2008 because it means that I've done nothing to enable those bastards for twelve years and counting. For some reason, nobody has posted it here on TechSpot but all the other forums are abuzz with it. I posted a thread in the GPU forum about it.

I'd recommend anyone check it out:
It seems that nVidia has just blacklisted Steve Walton and Tim Schiesser.
 
Just played Cyberpunk for nearly 24 hours with no real break (just paused the game to eat) and I've gotta say, it's nowhere near as glitchy as the media makes it out to be.

I'm running an old spare GTX 970 as well with basically no issues; I've had the odd glitch but nothing game-breaking or really even noticeable.

What I will say is I question the copies influencers got, I watched a few streams and YouTubers and they have way more issues than I've had. Really enjoying the game though! I actually think it's lived up to the hype personally.
 
Awesome article, Steve. Thank you!

Sadly, I just sold my old rig, which included my 1080 Ti. :/ Bought a Gigabyte 1060 OC off eBay for $125 to get by. But on a 3440x1440 monitor I'm going to wait until I can actually buy a 3080/3080 Ti. Looks like that's going to be sometime next year.

Speaking of which, I was surprised that you didn't include the ever-so-popular 1080 Ti in this review. But I just looked at the 2080 results and subtracted a few frames to get a general idea.

 
So basically we need to wait for next-gen graphics cards to hit 60 FPS @ 1440p. Thanks for the good review; I'll do the same, and by then the Game of the Year edition will be out too :p
 
According to the benchmarks above, at 4K medium settings an RTX 2080 Super averages 39 FPS ... You're running a slower card at higher settings. 40 FPS, you say?

Remember the 2000 series gets a few extra graphical options that are "off" in my case as they aren't supported on 1000 series HW; that's probably the difference.
 