Ubisoft announces Far Cry 5's PC requirements

midian182


After all the anticipation, delays, and a whole lot of controversy, Far Cry 5 will arrive on PC in just over two months’ time. Ahead of the March 27 launch date (all formats), Ubisoft has announced the game’s system requirements. Rather than just giving minimum and recommended specs, the company has revealed what it will take to run the game in 4K at both 30 and 60 frames per second.

The baseline hardware demands are quite low, provided you're willing to run the game at 720p with the video settings dropped to their minimums. All it takes is an Intel Core i5-2400 or AMD FX-6300 alongside an Nvidia GTX 670 or AMD R9 270, plus 8GB of RAM.

For most gamers, it’s the 1080p/high settings/60 fps recommendation that matters most. The specs for this tier are actually pretty reasonable, especially for a game that looks as good as Far Cry 5: a Core i7-4770 @ 3.4 GHz or AMD Ryzen 5 1600 @ 3.2 GHz (or equivalent), a GTX 970 or AMD R9 290X, and 8GB of RAM.

Skipping 1440p entirely, Ubisoft has also announced a 4K/30 fps configuration at the high video preset. This asks for a Core i7-6700 @ 3.4 GHz or AMD Ryzen 5 1600X @ 3.6 GHz (or equivalent), a GTX 1070 or AMD RX Vega 56, and 16GB of RAM.

If you’ve got a killer rig and want to enjoy Far Cry 5 in 4K at 60 fps with the video settings cranked up to ultra, you’ll need a Core i7-6700K @ 4.0 GHz or AMD Ryzen 7 1700X @ 3.4 GHz or equivalent, GTX 1080 SLI or RX Vega 56 CFX, and 16GB of RAM.

The PC version of the game comes with a benchmarking tool that includes a meter showing how much video memory your configuration is using. Ubisoft says multi-GPU setups will “significantly improve performance,” and there are a host of other options, including various aspect ratios, resolution scaling, and field-of-view settings.
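
As an aside for the technically curious: a VRAM meter like the one in the benchmark tool can be built on a query Windows itself exposes. The sketch below is purely illustrative, not Ubisoft’s code, and assumes nothing about the game’s internals; it reads each GPU’s current dedicated memory usage through DXGI’s IDXGIAdapter3::QueryVideoMemoryInfo (Windows 10, C++):

    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        // Create a DXGI factory and walk the system's graphics adapters.
        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            ComPtr<IDXGIAdapter3> adapter3;
            if (FAILED(adapter.As(&adapter3))) continue; // QueryVideoMemoryInfo needs DXGI 1.4

            // The "local" segment group is dedicated VRAM on a discrete card.
            DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
            if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                    0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
                DXGI_ADAPTER_DESC1 desc = {};
                adapter->GetDesc1(&desc);
                wprintf(L"%s: %llu MB used of a %llu MB budget\n",
                        desc.Description,
                        info.CurrentUsage / (1024 * 1024),
                        info.Budget / (1024 * 1024));
            }
        }
        return 0;
    }

The Budget figure is how much video memory the OS is currently willing to give the process, so an in-game meter would typically track its own allocations against that number rather than against the card’s total VRAM.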

Here’s the full list of settings:

MINIMUM CONFIGURATION: 
OS: Windows 7 SP1, Windows 8.1, Windows 10 (64-bit versions only) 
PROCESSOR: Intel Core i5-2400 @ 3.1 GHz or AMD FX-6300 @ 3.5 GHz or equivalent 
VIDEO CARD: NVIDIA GeForce GTX 670 or AMD R9 270 (2GB VRAM with Shader Model 5.0 or better) 
SYSTEM RAM: 8GB 
Resolution: 720p 
Video Preset: Low

RECOMMENDED CONFIGURATION (60 FPS): 
OS: Windows 7 SP1, Windows 8.1, Windows 10 (64-bit versions only) 
PROCESSOR: Intel Core i7-4770 @ 3.4 GHz or AMD Ryzen 5 1600 @ 3.2 GHz or equivalent 
VIDEO CARD: NVIDIA GeForce GTX 970 or AMD R9 290X (4GB VRAM with Shader Model 5.0 or better) 
SYSTEM RAM: 8GB 
Resolution: 1080p 
Video Preset: High

4K 30 FPS CONFIGURATION: 
OS: Windows 10 (64-bit version only) 
PROCESSOR: Intel Core i7-6700 @ 3.4 GHz or AMD Ryzen 5 1600X @ 3.6 GHz or equivalent 
VIDEO CARD: NVIDIA GeForce GTX 1070 or AMD RX Vega 56 (8GB VRAM with Shader Model 5.0 or better) 
SYSTEM RAM: 16GB 
Resolution: 2160p 
Video Preset: High

4K 60 FPS CONFIGURATION: 
OS: Windows 10 (64-bit version only) 
PROCESSOR: Intel Core i7-6700K @ 4.0 GHz or AMD Ryzen 7 1700X @ 3.4 GHz or equivalent 
VIDEO CARD: NVIDIA GeForce GTX 1080 SLI or AMD RX Vega 56 CFX (8GB VRAM with Shader Model 5.0 or better) 
SYSTEM RAM: 16GB 
Resolution: 2160p 
Video Preset: High/Ultra

Supported NVIDIA cards at time of release:

  • GeForce GTX 600 series: GeForce GTX 670 or better
  • GeForce GTX 700 series: GeForce GTX 760 or better
  • GeForce GTX 900 series: GeForce GTX 950 or better
  • GeForce GTX 10 series: GeForce GTX 1050 or better

Supported AMD cards at time of release:

  • Radeon 200 series: Radeon R9 270 or better
  • Radeon 300/Fury X series: Radeon R7 370 or better
  • Radeon 400 series: Radeon RX 460 or better
  • Radeon Vega series: any Radeon Vega series


 
No support for Intel HD? That s*cks. I don't see stunning sales on PC. Maybe they will fare better on consoles.
 
Are we safe to assume it has SLI support if its recommended GPUs for 4K/60 are GTX 1080s in SLI?
 
Are we safe to assume it has SLI support if its recommended GPUs for 4K/60 are GTX 1080s in SLI?

In most circumstances, I would say this is a stupid question. Since we are speaking about a game made by Ubisoft, I'd say you've got a legit question here.
 
No support for Intel HD? That s*cks. I don't see stunning sales on PC. Maybe they will fare better on consoles.
Intel HD??? That's horribly slow for gaming; even the latest top tier, the Intel HD 630, is still inferior to the 11-year-old 8800 GTX and eight-year-old GTS 450... Upgrade.
 
No support for Intel HD? That s*cks. I don't see stunning sales on PC. Maybe they will fare better on consoles.

You can't really play any of the Far Cry games with Intel HD, or very many other games for that matter. Not sure why you believe that gamers would be using the worst GPU on the market, but we aren't.
 
All around shady. No mention of the 1080 Ti. Higher-than-expected requirements, similar to Origins. So would it be fair to say that, like Origins, the game caps CPU usage, leaving a percentage to run the DRM? These games could be great, but are propped up by dumb DRM.

Does this mean that, like Origins, if I launch the game and then disconnect my network, the game will run on lower hardware requirements?

How about the ability to disable DRM while using an always-online mode hooked to an account that only allows one key from that account to be active at a time?

Reviewers will hopefully not gloss over the DRM. I don't want to hear about the gameplay/story until I hear about its performance.
 
Pretty standard stuff. I have a 970 myself; it's a pretty old card at this point, so I'm happy to see it can still run newer titles at 60 FPS.

That said, I probably won't be playing FC5. It looks just like the past entries, but in a different setting. Maybe when Ubisoft inevitably pulls an Assassin's Creed: Origins move with the Far Cry franchise, I'll be interested again.
 
If you don't have the requirements to play this game or FC4, just play FC3; it's all exactly the same ****. Not to mention it's Ubisoft.
I like to hop into Far Cry 2; I like the map's look, and the fire play is fun on a bun.
 
You can't really play any of the Far Cry games with Intel HD, or very many other games for that matter. Not sure why you believe that gamers would be using the worst GPU on the market, but we aren't.
Intel HD??? That's horribly slow for gaming; even the latest top tier, the Intel HD 630, is still inferior to the 11-year-old 8800 GTX and eight-year-old GTS 450... Upgrade.
Obviously, sense of sarcasm is not your strongest point. Don't worry, I didn't get it either when I was a little kid. It comes with time. What I meant is that without dedicated gaming GPUs, people will either play on what they can get within their budget (like integrated GPUs), or they will switch to consoles. That will be the end of PC gaming (with its quirks, like technology trend-setting for consoles, modding, etc.) as well as a problem for both the developers who rely on PC games and PC peripheral manufacturers, like ASUS, Acer, MSI, Razer, etc.
 
You can't really play any of the Far Cry games with Intel HD, or very many other games for that matter. Not sure why you believe that gamers would be using the worst GPU on the market, but we aren't.

Well, close to half the 38,000 subscribers on lowend gaming would disagree with you, not to mention the thousands of YouTube videos over the last 10-11 generations showing Intel's integrated graphics playing games, and their millions of combined views. People play games on what they have; some people have to have a laptop and can't afford anything more than a $300-400 one. Just about every game released from 2000-2011 will play on an HD 520/620 with dual-channel RAM at 720p.
 
You can't really play any of the Far Cry games with Intel HD, or very many other games for that matter. Not sure why you believe that gamers would be using the worst GPU on the market, but we aren't.
Intel HD??? That's horribly slow for gaming; even the latest top tier, the Intel HD 630, is still inferior to the 11-year-old 8800 GTX and eight-year-old GTS 450... Upgrade.
What I meant is that without dedicated gaming GPUs, people will either play on what they can get within their budget (like integrated GPUs), or they will switch to consoles. That will be the end of PC gaming
Consoles cost as much as a good GPU, so your logic doesn't make sense. Anyone who can't afford either isn't affecting the PC gaming market anyway, because they're not in it. And if consoles were going to "be the end of PC gaming", that would have happened a long time ago.

You thought you were being sarcastic? You reaffirmed in your second post that you actually are disappointed that FC5 doesn't support Intel HD; "sarcasm" is when you say the opposite of what you mean. So... no.
 
Consoles cost as much as a good GPU, so your logic doesn't make sense. Anyone who can't afford either isn't affecting the PC gaming market anyway, because they're not in it. And if consoles were going to "be the end of PC gaming", that would have happened a long time ago.

You thought you were being sarcastic? You reaffirmed in your second post that you actually are disappointed that FC5 doesn't support Intel HD; "sarcasm" is when you say the opposite of what you mean. So... no.
Thank you, I had the same in mind. Also, I have always sold my older GPUs to fund the newest; I only had to pay about $550, including taxes, for both of my current GTX 1080 Tis in SLI when they first came out, after selling my SLI GTX 980 Tis ($900 for the pair).
 
Thank you, I had the same in mind. Also, I have always sold my older GPUs to fund the newest; I only had to pay about $550, including taxes, for both of my current GTX 1080 Tis in SLI when they first came out, after selling my SLI GTX 980 Tis ($900 for the pair).
$900? Wow, that's huge. I guess that's the one upside to this whole mining thing.

With SLI 1080 Tis, are you using 4K? I have a single 1080 Ti and the Alienware 3440x1440, 120 Hz G-Sync monitor, and it has been brilliant.
 
$900? Wow, that's huge. I guess that's the one upside to this whole mining thing.

With SLI 1080 Tis, are you using 4K? I have a single 1080 Ti and the Alienware 3440x1440, 120 Hz G-Sync monitor, and it has been brilliant.
Yes, I play on a 4K TV with good input lag for gaming (21 ms @ 4K HDR 4:4:4). I bought the 1080 Tis when they first came out, before prices blew up because of the mining craze and when 980 Tis could still be sold for more than they go for today (they average about $300 now).
 