Cyberpunk 2077 updated system requirements revealed, including specific resolutions and...

Polycount

Highly anticipated: Cyberpunk 2077 is just a few weeks away, and though we know quite a bit about the game now, many fans have been clamoring for more detailed hardware requirements -- they've wanted spec suggestions that target different resolutions, and more importantly, different ray-tracing quality levels. Fortunately, thanks to CD Projekt Red itself and the folks over at Nvidia, we have all of those details and more today.

We'll start with the updated non-RT system requirements. As a quick side note, the Minimum and Recommended tiers haven't changed since CDPR last published hardware requirements, so there's no need to panic if you have a lower-end rig: a GTX 780 and a Core i5-3570K will still net you a playable experience on Low settings.

Since we've already reported on those, we won't retread that ground here. However, CDPR has included new "High" and "Ultra" non-RT recommendations, which are well worth listing.

Let's begin with the High hardware suggestions, which are as follows (everything above Minimum recommends 70GB of SSD space):

  • Target Resolution: 1440p
  • Graphics Settings: Ultra
  • OS: Windows 10 64-bit
  • CPU: Intel Core i7-4790 or AMD Ryzen 3 3200G
  • RAM: 12GB
  • GPU: RTX 2060 or Radeon RX 5700 XT

We're not sure what framerate this hardware-settings combo is aiming for, but to be safe, we're assuming everything above Minimum will result in 60 FPS gameplay most of the time.

Anyway, next up, we have the Ultra hardware recommendations:

  • Target Resolution: 2160p/4K
  • Graphics Settings: Ultra
  • OS: Windows 10 64-bit
  • CPU: Intel Core i7-4790 or AMD Ryzen 3 3200G
  • RAM: 16GB
  • GPU: RTX 2080 Super, RTX 3070, or Radeon RX 6800 XT

Now we move on to Cyberpunk 2077's ray tracing system requirements. Notably, all of the specs we're about to list assume you're running DLSS -- without the AI upscaling tech enabled, expect your framerate to take a significant dive, especially at 4K with RT on Ultra.

Here are the minimum requirements for RTX in Cyberpunk 2077:

  • Target Resolution: 1080p
  • RTX Settings: RT Medium
  • OS: Windows 10 64-bit
  • CPU: Intel Core i7-4790 or AMD Ryzen 3 3200G
  • RAM: 16GB
  • GPU: RTX 2060

No real surprises here. 16GB of RAM is to be expected, and AMD's GPUs are cut from here on out, since the game's ray tracing won't run on them at launch. Moving on to the RT High suggestions:

  • Target Resolution: 1440p
  • RTX Settings: RT Ultra
  • OS: Windows 10 64-bit
  • CPU: Intel Core i7-6700 or AMD Ryzen 5 3600
  • RAM: 16GB
  • GPU: RTX 3070

The system requirements are ramping up quickly here, and notably, the 2080 Ti isn't listed anywhere, not even as an alternative to the 3070. That says a lot about Ampere's superior RT rendering performance, I suppose, but it's a shame for folks like me who purchased Nvidia's last-gen flagship.

Finally, we have CDPR's RT Ultra recommendations:

  • Target Resolution: 4K
  • RTX Settings: RT Ultra
  • OS: Windows 10 64-bit
  • CPU: Intel Core i7-6700 or AMD Ryzen 5 3600
  • RAM: 16GB
  • GPU: RTX 3080
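
For convenience, here's a minimal, illustrative sketch that encodes the five tiers above as plain data and looks one up by target resolution and ray tracing preference. The tier contents are transcribed from the lists above; the lookup helper itself is a hypothetical convenience, not an official CDPR or Nvidia tool.

```python
# Illustrative only: tier data is transcribed from the lists above; the lookup
# helper is a hypothetical convenience, not an official tool.

REQUIREMENT_TIERS = [
    {"tier": "High",       "resolution": "1440p", "rt": None,
     "cpus": ["Core i7-4790", "Ryzen 3 3200G"], "ram_gb": 12,
     "gpus": ["RTX 2060", "RX 5700 XT"]},
    {"tier": "Ultra",      "resolution": "2160p", "rt": None,
     "cpus": ["Core i7-4790", "Ryzen 3 3200G"], "ram_gb": 16,
     "gpus": ["RTX 2080 Super", "RTX 3070", "RX 6800 XT"]},
    {"tier": "RT Minimum", "resolution": "1080p", "rt": "Medium",
     "cpus": ["Core i7-4790", "Ryzen 3 3200G"], "ram_gb": 16,
     "gpus": ["RTX 2060"]},
    {"tier": "RT High",    "resolution": "1440p", "rt": "Ultra",
     "cpus": ["Core i7-6700", "Ryzen 5 3600"], "ram_gb": 16,
     "gpus": ["RTX 3070"]},
    {"tier": "RT Ultra",   "resolution": "2160p", "rt": "Ultra",
     "cpus": ["Core i7-6700", "Ryzen 5 3600"], "ram_gb": 16,
     "gpus": ["RTX 3080"]},
]

def tier_for(resolution, want_rt):
    """Return the first published tier matching a resolution and RT preference."""
    for tier in REQUIREMENT_TIERS:
        if tier["resolution"] == resolution and (tier["rt"] is not None) == want_rt:
            return tier
    return None  # no published tier for this combination

# Example: what does CDPR suggest for 1440p with ray tracing on?
print(tier_for("1440p", want_rt=True)["tier"])  # -> "RT High"
```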

And there we have it -- all of Cyberpunk 2077's newly updated official system requirements, listed for your convenience. One thing you'll notice throughout is how modest the CPU requirements are relative to other modern open-world games.

For a game set in an incredibly dense city with potentially hundreds of NPCs walking around, I can't help but be a bit surprised, especially given Watch Dogs: Legion's atrocious CPU optimization. Still, I'm not one to look a gift horse in the mouth, and I sincerely hope Cyberpunk 2077's real-world performance matches its technical requirements.

Fortunately, we won't have to wait long to find out. Cyberpunk 2077 releases on December 10, and our own Steven Walton and Tim Schiesser will be putting the game through its paces in their own benchmarks, so stay tuned for that.

Permalink to story.

 
The system requirements are ramping up quickly here, and notably, the 2080 Ti isn't listed anywhere, not even as an alternative to the 3070. That says a lot about Ampere's superior RT rendering performance, I suppose, but it's a shame for folks like me who purchased Nvidia's last-gen flagship.

Not to worry, Nvidia’s posted an article about Cyberpunk 2077 on their site today that DOES mention the 2080 Ti, and claims the card is good to go for Ultra settings at 2560x1440, with Ray Tracing also set to Ultra.
 
So it seems that Raytracing requirements for games will be in a tier of their own from now on? I must say Cyberpunk with raytracing looks phenomenal, but where are all the other games?
 
So it seems that Raytracing requirements for games will be in a tier of their own from now on? I must say Cyberpunk with raytracing looks phenomenal, but where are all the other games?
Yes, RT isn't all-or-nothing, it seems. In Control, RT Medium gives you the great reflections that really make the game, especially the office areas, stand out. It costs far less, performance-wise, than RT High, which adds in more RT features but honestly doesn't look that much better than Medium and costs about 20 fps at 1440p on the 30 series cards. Certain RT features let you turn off other costly lighting effects too: if a game has RT ambient occlusion, it should automatically turn off regular AO in games that support it. Obviously the RT version costs more. You just have to decide which features to turn on and off.
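
A minimal sketch of that settings interplay, assuming a simple key-value graphics config; the setting names here are invented for illustration and aren't pulled from any actual game:

```python
# Hypothetical sketch: when RT ambient occlusion is enabled, the raster
# (screen-space) AO it replaces can be switched off to claw back performance.
# The setting names are invented for illustration.

def resolve_lighting_settings(settings):
    resolved = dict(settings)
    if resolved.get("rt_ambient_occlusion"):
        # RT AO supersedes SSAO, so paying for both just wastes frames.
        resolved["ssao"] = False
    return resolved

print(resolve_lighting_settings({"rt_ambient_occlusion": True, "ssao": True}))
# -> {'rt_ambient_occlusion': True, 'ssao': False}
```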

AMD talked about this in an interview: they believe that since games will be developed for consoles and easily ported to PC from the XSX, for instance, RT effects in games will be well suited to AMD cards. What I'm interested to see is whether AMD having the same architecture across all these platforms will greatly benefit its optimization on PC. TBH, it could help Nvidia as well: if AMD hardware is more or less automatically optimized, the PC teams can focus their optimization on Nvidia cards, which are much more prevalent and will be for quite some time. So we'll see. It would be nice if Nintendo and Nvidia made a deal and released a next-gen console so Nvidia would have a leg in the console race, but right now they can't even get 30 series cards to market fast enough, so there's no expectation of that happening.
 
Surprised that no one's mentioned that developers are *already* recommending 10GB of VRAM for the best performance, maxing out the 3080's frame buffer only weeks after launch.

Doesn't bode well for early 3080 adopters. I'd take this as a strong plus for the 6000 series, considering these cards realistically have to last years and requirements never go down...
 
Seems they're earning more money from Nvidia for favoring their hardware than from actually selling the game. Maybe they should release a completely free version that only runs on Nvidia cards. Or better yet, not charge players who use AMD hardware, as compensation for slowing down their gameplay.
 
So all this extra time "polishing" the game is basically BS. All they've done is spend time gimping it on AMD PC hardware, so no RT will work on RX 6800/6800 XT GPUs. Meanwhile, I'm assuming RT will work fine on the console hardware... or won't it?
 
I wonder what that "High" setting will actually look like. With a GTX 1070, that's what I'm aiming for, and if it can deliver 1080p/60 with no drops, I'll be happy :)

(unless Ultra looks much better... in which case I'll either have to commit seppuku or go on a quest for a next-gen card)
 
Seems they're earning more money from Nvidia for favoring their hardware than from actually selling the game. Maybe they should release a completely free version that only runs on Nvidia cards. Or better yet, not charge players who use AMD hardware, as compensation for slowing down their gameplay.

Paid to favor Nvidia hardware? ‘Slowing down’ players on AMD hardware?

Excuse me, what evidence am I missing?
 
Honestly, looking at GPU tiers and their prices, consoles make a lot of sense. I have a 1050 Ti now, and I'm not looking forward to 1080p Low. I also have an Xbox One S, and while the 1050 Ti is stronger than the X1S, the whole console cost almost the same as the card alone. A 5700 XT costs almost as much as the newly arrived consoles. I know a console won't achieve the same results, but still.
 
So all this extra time "polishing" the game is basically BS. All they've done is spend time gimping it on AMD PC hardware, so no RT will work on RX 6800/6800 XT GPUs. Meanwhile, I'm assuming RT will work fine on the console hardware... or won't it?

Post edited to reflect new info... So yeah, timed exclusives for a feature are now a thing for both AMD and Nvidia: Godfall for AMD, CP2077 for Nvidia.
 
Oh, OUCH! So yeah, ALL of RDNA2 loses ray tracing for Cyberpunk 2077 at launch: Big Navi AND both consoles. It's not just the Radeon RX 6000 cards missing the ray tracing cut, apparently; it's ALL RDNA2 chips. Only Turing and Ampere can run it.

This, AND Godfall for AMD, are both now looking like "timed exclusive ray tracing" titles, and that is now a thing. Yuck.

I picked right for CP2077: I have an RTX 3090, so I'm ready to rock it. But I don't like Radeon 6000 users being left out, and even worse, now the consoles are screwed for ray tracing too! Godfall is not a fair trade for Radeon owners, and timed exclusives for a feature like this just feel yucky...

News quote -

"Additionally, AMD GPU owners won’t be alone in waiting for these enhanced visuals. Next-gen consoles will be able to run Cyberpunk 2077 as soon as it launches, but ray tracing will only come to said consoles via an update at a later date. "

 
Oh, OUCH! So yeah, ALL of RDNA2 loses ray tracing for Cyberpunk 2077 at launch: Big Navi AND both consoles. It's not just the Radeon RX 6000 cards missing the ray tracing cut, apparently; it's ALL RDNA2 chips. Only Turing and Ampere can run it.

This, AND Godfall for AMD, are both now looking like "timed exclusive ray tracing" titles, and that is now a thing. Yuck.

I picked right for CP2077: I have an RTX 3090, so I'm ready to rock it. But I don't like Radeon 6000 users being left out, and even worse, now the consoles are screwed for ray tracing too! Godfall is not a fair trade for Radeon owners, and timed exclusives for a feature like this just feel yucky...

News quote -

"Additionally, AMD GPU owners won’t be alone in waiting for these enhanced visuals. Next-gen consoles will be able to run Cyberpunk 2077 as soon as it launches, but ray tracing will only come to said consoles via an update at a later date. "

They did the exact same thing with DX9 features and things like PhysX back in the day.

Incidentally, this is why buying a GPU for "ray tracing" is stupid. By the time the tech is widespread, nothing from the first few generations will be able to run it well, if at all, and there will likely be newer revisions to the standard. The first DX9 cards were utterly worthless a few years later, and by the time DX11 was widespread, the first DX11 GPUs -- the Fermi-based GeForce 500s and the Evergreens from AMD -- were utterly obsolete.
 