Cyberpunk 2077's system requirements are finally here

Polycount

Posts: 2,593   +557
Staff member
Highly anticipated: Cyberpunk 2077's third (but not final) Night City Wire info dump dropped today, and it's given us some juicy new details about the game. We got a more in-depth look at Night City's various districts and their respective gangs, and -- perhaps most importantly for us PC enthusiasts -- we finally know what sort of rig we'll need to run Cyberpunk 2077 on November 19.

We'll focus on the system requirements here since that's arguably the information PC players have been waiting for the longest. Before we list them, a quick side note: some of our readers will be pleased to know that, while Windows 10 is the recommended OS for 2077, the game will run just fine on Windows 7.

Now, onto the minimum requirements:

  • Processor: Intel Core i5-3570K or AMD FX-8310
  • Memory: 8 GB RAM
  • Graphics: NVIDIA GeForce GTX 780 or AMD Radeon RX 470
  • DirectX: Version 12
  • Storage: 70 GB available space
  • Additional Notes: SSD recommended

Those are surprisingly modest hardware demands for a game with as much detail and density as Cyberpunk 2077.

Many games with poorer graphics and smaller worlds ask for much more from players on the low-end, so it's nice to see CD Projekt Red is taking optimization seriously (though, of course, we'll need to play and benchmark 2077 ourselves to be certain).

Now, on to the recommended hardware configuration:

  • Processor: Intel Core i7-4790 or AMD Ryzen 3 3200G
  • Memory: 12 GB RAM
  • Graphics: NVIDIA GeForce GTX 1060 or AMD Radeon R9 Fury
  • DirectX: Version 12
  • Storage: 70 GB available space

Cyberpunk 2077's recommended requirements are equally impressive. 12 GB of RAM is very easy to come by these days, and the GTX 1060 is one of the most common GPUs out there.

You'll also want an SSD to run Cyberpunk 2077 at its recommended settings; that requirement isn't included in the list above, but it does appear at the end of the Night City Wire 3 video.

It's not as if the game won't boot if you run it on a hard drive at higher settings, but CD Projekt Red seems to feel that an SSD is important enough to a smooth gameplay experience that it warranted an official requirement.

And it's not hard to see why. The developer has said from the start that Cyberpunk 2077 is an immense open world RPG with no loading screens (barring death re-loads and fast travel). With that in mind, it's probably quite the technical challenge to reduce pop-in and retain quality draw distances on much slower, 7200RPM spinning drives.

As is often the case with minimum and recommended requirement listings, it's unclear what framerate and resolution CD Projekt Red is targeting here. We'll be reaching out to the studio for clarification, but we're assuming both lists are aimed at 1080p, 30 FPS gameplay; just with different settings (and, of course, no RTX).

If you want to hop on the Cyberpunk 2077 hype train now, the game is available for pre-order on just about every publisher-agnostic digital storefront: GOG, Steam, and the Epic Games Store, to name a few.

It'll run you $60, but if you don't want to shell out that cash early, don't fret -- no in-game content will be locked behind pre-order bonuses.

 

jbc029

Posts: 105   +186
Is that 3200G right? Recommended is an older 4c/8t Intel or 4c/4t zen+ part?

That has to be a typo. The 4790 walks all over the 3200G in literally every way as a CPU.
 

Evernessince

Posts: 5,436   +6,067
Is that 3200G right? Recommended is an older 4c/8t Intel or 4c/4t zen+ part?

That has to be a typo. The 4790 walks all over the 3200G in literally every way as a CPU.
From the information I could find, they appear to be pretty close

(Please note this is a 4790K link; I couldn't find a 4790 result)

With a 2080 Ti, UBM shows 70 FPS on average with a 4790K. https://www.userbenchmark.com/PCGame/FPS-Estimates-Assassins-Creed-Odyssey/4032/586821.11601.0.0.0

Here a bench of the 3400G done by techspot:

https://www.techspot.com/review/1878-amd-ryzen-3400g/

Accounting for the 4790's reduced clocks and lack of overclocking headroom, plus the 3200G's lack of SMT, these two should be in the same neighborhood performance-wise.

Judging by the GPU specs as well, this game appears to be evenly optimized for all hardware vendors.

I think it's great they aren't requiring beefy hardware here. Many people will be able to enjoy the game without breaking the bank.
 

Morphine Child

Posts: 74   +80
That still looks a bit too modest for a game like this. Maybe on a brand new Windows install with nothing running in the background hogging the system.

Don't get me wrong, I have no doubt the game will be well optimized; I just think this is a best-case scenario.
 

Kshipper

Posts: 248   +40
TechSpot Elite
I posted yesterday under "PC Games" about this. The official site does recommend an SSD:


--------------------------------------------------

And the difference between the Ryzen 3 3200G and the Intel i7-4790 is not that much in favour of the Intel, though it is a little better, even in single-thread performance, where gaming tends to live:


and

 

Endymio

Posts: 701   +565
That has to be a typo. The 4790 walks all over the 3200G in literally every way as a CPU.
Not going to get into that debate, but I believe you're misunderstanding the context. They're not recommending a 3200g. They're recommending a 3200g or better.
 

Kshipper

Posts: 248   +40
TechSpot Elite
Not going to get into that debate, but I believe you're misunderstanding the context. They're not recommending a 3200g. They're recommending a 3200g or better.
I wasn't misunderstanding... I was just surprised that the recommended (or better) system requirements are so low. I didn't expect that. Someone had mentioned that an Intel i7-4790 decimates a Ryzen 3 3200G, and that is not true. It has some advantage, but it doesn't decimate.
 

Endymio

Posts: 701   +565
I wasn't misunderstanding ..I was just surprised that the recommended (or better) system requirements are so low.
I was directing that not at you, but the prior poster. The selection of the 3200g -- or any particular CPU, for that matter -- as a baseline recommendation in no way implies any other processor is better or worse.

I agree with you that the requirements are surprisingly low. But obviously most gamers are going to want more than a 1080p@30hz experience out of this box.
 

Evernessince

Posts: 5,436   +6,067
I posted yesterday under "PC Games" about this. The official site does recommend an SSD:


--------------------------------------------------

And the difference between the Ryzen 3 3200G and the Intel i7-4790 is not that much in favour of the Intel, though it is a little better, even in single-thread performance, where gaming tends to live:


and

Umm, the links you provided show the 3200g is better than the 4790...

1600624261415.png

1600624298817.png

Of course that's just a single benchmark; I do think the 4790 should be a bit better overall. That said, they are close enough that it shouldn't really matter.

Not going to get into that debate, but I believe you're misunderstanding the context. They're not recommending a 3200g. They're recommending a 3200g or better.
It also establishes minimum and recommended performance baselines. If they say CPU X is recommended for Intel and CPU Y for AMD, they are saying that any CPU at or above X and Y performance-wise will run the game at the recommended level. I don't see what he is misunderstanding; he is debating the performance levels of two CPUs which CD Projekt Red's system requirements imply are roughly similar.
 

Endymio

Posts: 701   +565
I don't see what he is misunderstanding; he is debating the performance levels of two CPUs
It's rather simple. He said the recommendation of "A" must be a typo, because processor "B" is faster. Logically, we have only two possibilities.

1. He is wrong. In which case: no typo.
2. He is right. In which case: still no typo, as the recommendation in no way implies anything about the performance of any other AMD processor.

The conclusion is a non sequitur; thus, he obviously misunderstood the context.