Ubisoft reveals Assassin's Creed: Valhalla's PC requirements and features

Polycount

Highly anticipated: Ubisoft has plenty of games coming down the pipeline, such as Immortals Fenyx Rising and Far Cry 6, but Assassin's Creed: Valhalla is arguably the most highly anticipated of them all. For the first time, the action-RPG series will let players take on the role of a Viking as they raid and pillage the lands of England.

Today, we got seven minutes of fresh gameplay for Valhalla, in addition to the first official system requirements for the game.

If you're particularly hyped about the game, that means you can finally start planning your next upgrade or simply determine whether or not your existing rig will be enough to get you by.

There are many suggested computer configurations to cover here, so bear with us. We'll start with the 1080p recommendations and then touch on Ubisoft's sole 1440p and 4K suggestions. As a quick side note, you'll need a DirectX 12-compatible GPU across the board.

The first 1080p setup targets 30 FPS at Low settings:

  • Processor: AMD Ryzen 3 1200 or Intel i5-4460
  • RAM: 8GB
  • Video Card: AMD R9 380 or GeForce GTX 960 4GB
  • Storage: 50GB HDD (SSD Recommended)
  • OS: Windows 10 64-bit

The second 1080p setup also shoots for 30 FPS, but at the High settings preset:

  • Processor: AMD Ryzen 5 1600 or Intel i7-4790
  • RAM: 8GB
  • Video Card: AMD RX 570 or Nvidia GeForce GTX 1060
  • Storage: 50GB SSD
  • OS: Windows 10 64-bit

The final 1080p recommendation hopes to help you achieve 60 FPS at High settings:

  • Processor: AMD Ryzen 7 1700 or Intel i7-6700
  • RAM: 8GB
  • Video Card: AMD Vega 64 or Nvidia GeForce GTX 1080
  • Storage: 50GB SSD
  • OS: Windows 10 64-bit

If you want to game at 1440p and 60 FPS, this is what Ubisoft recommends:

  • Processor: AMD Ryzen 5 3600X or Intel i7-8700K
  • RAM: 16GB
  • Video Card: AMD RX 5700 XT or Nvidia GeForce RTX 2080 Super
  • Storage: 50GB SSD
  • OS: Windows 10 64-bit

Finally, here's Ubisoft's 4K, 30 FPS system suggestion:

  • Processor: AMD Ryzen 5 3600X or Intel i7-8700K
  • RAM: 16GB
  • Video Card: AMD RX 5700 XT or Nvidia GeForce RTX 2080 Super
  • Storage: 50GB SSD
  • OS: Windows 10 64-bit
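For a quick sanity check against the tiers above, they can be encoded as plain data. This is just an illustrative sketch (the preset names and the `presets_cleared` helper are our own, not Ubisoft's): RAM is the only spec that compares cleanly with arithmetic, so the GPU pairs are kept as labels rather than ranked.

```python
# Ubisoft's published Valhalla presets, encoded as data.
# GPU/CPU tiers aren't numerically comparable, so they stay as labels.
VALHALLA_PRESETS = [
    {"name": "1080p/30 Low",  "ram_gb": 8,  "gpu": "R9 380 / GTX 960 4GB"},
    {"name": "1080p/30 High", "ram_gb": 8,  "gpu": "RX 570 / GTX 1060"},
    {"name": "1080p/60 High", "ram_gb": 8,  "gpu": "Vega 64 / GTX 1080"},
    {"name": "1440p/60 High", "ram_gb": 16, "gpu": "RX 5700 XT / RTX 2080 Super"},
    {"name": "4K/30 High",    "ram_gb": 16, "gpu": "RX 5700 XT / RTX 2080 Super"},
]

def presets_cleared(ram_gb):
    """Return the names of presets whose RAM requirement is met."""
    return [p["name"] for p in VALHALLA_PRESETS if ram_gb >= p["ram_gb"]]

print(presets_cleared(8))   # the three 1080p tiers
print(presets_cleared(16))  # all five tiers
```

All five tiers also share the 50GB storage and Windows 10 64-bit requirements, so RAM and GPU are the specs that actually separate them.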

Assassin's Creed: Valhalla arrives on November 10 for Xbox One, PlayStation 4, and PC. It will also arrive on the PlayStation 5 and Xbox Series X. You can pre-order the game now on your platform of choice for $60.


 
Innovation and technological progress in game development has gone completely stagnant for years. This game doesn't look any better than Origin or Odyssey, but needs a beefier rig to achieve worse performance? lol

Previous games listed older processors and GPUs, and only 1080p. If you want to game at 4K, you need better hardware than what could play 1080p two years ago at the same framerate. It's not rocket science, and it's not as if new lighting technology doesn't creep into revisions of every game engine.

But that aside, I agree with your comment overall. Star Citizen seems to be the only game that is striving for "no compromise", and last I heard the ETA on that was Q4 2037. However, some tradeoff between the two ends of the spectrum would be nice. Greedy publishers are partly to blame for the lack of innovation.

I watched a video the other day of a 2013 Nvidia tech demo touting new particle-based "water" technology for use in games, with one comment stating "wow! so this is the future of water effects in games!" ... Here we are 7 years later, and we still get the same crappy, unrealistic water techniques in games.
 
Innovation and technological progress in game development has gone completely stagnant for years. This game doesn't look any better than Origin or Odyssey, but needs a beefier rig to achieve worse performance? lol
Games coming out now that were already in development before the next-gen consoles will look last-gen in terms of graphics. I'd say we have to wait a few years before we get Unreal Engine 5 visuals.
 
New games are nothing but pushing the envelope for graphics. But ......... gameplay is soooooo empty, devoid of innovation and soul.

New system requirements for tech demos? No thanks.

Good thing old games are still a blast to play.

 

Life as a gamer with a low budget for hardware definitely used to be harder than it is now. When the GTX 280 came out, you either had one, or had fantasies of playing "X" new game on last generation's hardware that usually just turned out to be painful. Now, "next-gen" seems to come every couple of years. The rate of improvement of both software and hardware, instead of an exponential leap every 5-7 years, has reached a point where we're slowly just improving on what we already have. We're getting close to the point where further improvement on existing graphics engines and hardware will be impossible without many new and expensive technologies being developed.

That said, I really like where things are at now. My entry-level HP Omen's GTX 1050 can run RDR2 in the mid-40s at 1080p and I get to enjoy it. My rig is definitely outdated, but it's even playable on older hardware. Or if you can afford an insane build, you can play it at 4K, 60 fps (surely not quite yet tho, lol). Even when progress has gotten painfully slow, R&D is always full speed ahead, so I feel like we can all get a piece of the pie. Instead of being excluded completely, I can play pretty much what I want, and join the 4K club later if I feel like I need to. You never know, though: maybe we'll start seeing games that are "4K exclusive" or something like that. Sounds like an Apple thing.

Just my 2 cents, but being at the back of the line used to mean that you just couldn't play. With the hardware industry's never-ending efforts to deliver max resolution at max fps, this is a wonderful time to be alive. My 2005-era $5000 gaming beast was indeed awesome, and actually still kind of useful in 2015, but needed $500+ in GPU upgrades yearly! Actually, I bet the dual R9 280X would still run something, idk. I think I still have at least another year before my entry-level Omen just can't keep up, and by then, the newer-gen entry-level Omen will be dirt cheap. I live in a harsh world and own very few possessions, but I manage to do more with very little than I could 10 years ago. A true next-gen technology (think Skynet, not marginally faster Xboxes) would crush a lot of poor souls under its magnificent boots.
 
Looks like a great game at 1440p for me (R5 3600X & RX 5700 XT). The only thing is, and I'm ashamed to say it, I've never played an Assassin's Creed title before. I may have to pick up some of the older ones to get some idea of the lore before getting this one.
 
Life as a gamer with a low budget for hardware definitely used to be harder than it is now. When the GTX 280 came out, you either had one, or had fantasies of playing "X" new game on last generation's hardware that usually just turned out to be painful.
Well, even back then, it depended on what you wanted and what you had. Back then, there were a lot of people still using 4:3 CRT monitors and getting decent frame rates at those resolutions was possible with GeForce cards like the 8800 GTS or GTX, 9800 GT and Radeons like the HD 3870 and HD 4830.

However, if you had a 16:10 panel and wanted to game at 1920×1200, then you had a bare minimum requirement of the GeForce 9800GTX+ (aka the GTS 250) or Radeon HD 4850 to get decent frame rates at medium settings. If you wanted great frame rates at medium settings or decent frame rates at high settings, you'd have been looking at a GeForce GTX 260 or Radeon HD 4870 (which I had).

When it came to getting great frame rates at high settings at 1920×1200, the GTX 280 stood alone for a good while (until the HD 4870x2 came along anyway). There is no denying that.

I'll never forget the battle between the HD 5970 and GTX 295, two cards that I would never have been able to afford at the time (and even if I could, I would never have been able to rationalise either of them), but it sure was fun to dream, eh? :D

Looking back now reminds me of why I always had the policy of "the first upgrade of a video card is to get a second one of the same type," because back then, SLI and Crossfire worked brilliantly. I added a second XFX HD 4870 1GB to my first and was actually in good stead for quite a while (at least, according to this graph for Arkham Asylum from the HD 5970 review):
[Graph: Batman: Arkham Asylum frame rates, from the HD 5970 review]
 
I watched plenty of the vids. The game looks a lot like Odysseys. Graphics and seas during storms are amazing, but the game looks like a big DLC for the last game.
 
So basically, I have the CPU to play 60FPS@1080 High, & almost enough CPU for 60FPS@1440p/30FPS@4K (Ryzen 5 3600)...but my R9 380 is going to keep me at 30FPS@1080 Low?

Well, I don't have a lot of interest in this game series, so I can't justify the budget for a new GPU at this time. Having to fork over essentially $400 just to play one new game at the resolution/FPS that I can already play my current games at is not a gamer-friendly position for a company, IMHO.
 