Alan Wake 2 will boot on GeForce GTX 10 and Radeon 5000 GPUs (running it is another story)

Daniel Sims

The big picture: A Remedy developer set off alarms by admitting that Alan Wake 2 doesn't officially support graphics cards older than Turing and RDNA 2. Early post-launch testing shows that GPUs predating mesh shaders will still boot the game, but with generally unacceptable performance. For example, cards like the Radeon RX 5700 XT and GeForce GTX 1080 Ti fare far worse than weaker GPUs from later generations.

If you are using a graphics card older than Nvidia's GeForce RTX 20 or AMD's Radeon RX 6000 series – even an enthusiast-tier model – you will need to upgrade to play the PC version of Alan Wake 2. Benchmarks following the game's launch reveal why, and explain the somewhat shocking system requirements.

According to the spec sheet, Alan Wake 2 requires an RTX 2060 or Radeon RX 6600 at the absolute minimum. Someone with a GTX 1080 Ti or Radeon RX 5700 XT might surmise that those GPUs can handle the game because the aging flagships perform similarly to their mainstream successors in most other titles. However, Alan Wake 2 isn't like most other games.

Shortly before launch, a Remedy developer explained that the game doesn't officially run on GPUs that lack mesh shader support. The developer later deleted the tweet due to immediate backlash, much of it likely from users afraid that their older GPUs wouldn't run Alan Wake 2 at all.

TechPowerUp and Digital Foundry successfully started the game on cards like the 5700 XT, but encountered a dialog box warning that the hardware doesn't support all critical features. Players can dismiss the notice and proceed, but suffer a significant performance cost.
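Remedy hasn't detailed how that startup check works, but the warning is consistent with the standard Direct3D 12 capability query a game can run before rendering anything. The sketch below is not Remedy's code, just an illustration of how a DX12 title can test for mesh shader support at launch and decide whether to show such a notice:

```cpp
// Sketch: detecting mesh shader support in Direct3D 12.
// Illustrative only -- not Alan Wake 2's actual startup code.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS7, &options7, sizeof(options7))))
    {
        // Older runtimes/drivers don't recognize OPTIONS7 at all.
        return false;
    }
    return options7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;
}

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::puts("No D3D12 feature level 12_0 device available.");
        return 1;
    }

    if (!SupportsMeshShaders(device.Get()))
    {
        // Pascal (GTX 10) and RDNA 1 (RX 5000) cards land here: the game
        // can still continue, but the fast geometry path is unavailable.
        std::puts("Warning: this GPU does not support mesh shaders.");
    }
    return 0;
}
```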

Multiple outlets found that the Radeon RX 5700 XT and GeForce GTX 1080 Ti – the most robust GPUs without mesh shaders – struggle to reach 30 fps at native 1080p on the lowest graphics settings. Newer cards that support mesh shaders but typically perform similarly or worse in other games, like the RTX 2060 or GTX 1660 Ti, manage Alan Wake 2 noticeably better.

All hope may not be lost for users unable to upgrade beyond Pascal or RDNA 1. Remedy admitted that modders could theoretically add support for technologies preceding mesh shaders, but results may be limited.

Remedy's use of mesh shaders probably plays a significant role in the title's impressive level of geometric detail, which remains apparent at medium and low settings. Mesh shaders let games cull polygons players can't see using techniques that are newer and more advanced than vertex or geometry shaders, conserving horsepower for the polygons that remain visible.
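For a sense of what that culling looks like, mesh-shader pipelines typically split a model into small clusters ("meshlets") of a few dozen triangles and reject entire clusters that fall outside the view frustum or face away from the camera. The plain C++ sketch below is only a conceptual illustration of that per-cluster test, not Remedy's implementation; the bounding-sphere and normal-cone data it assumes mirrors what meshlet-building tools such as meshoptimizer generate, and real engines run this test on the GPU in a task/amplification shader.

```cpp
// Conceptual meshlet-culling sketch: cull whole clusters of triangles
// instead of testing every polygon individually.
#include <array>
#include <cmath>

struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Plane { Vec3 n; float d; };  // dot(n, p) + d >= 0 means "inside"

struct Meshlet
{
    Vec3  center;      // bounding-sphere center
    float radius;      // bounding-sphere radius
    Vec3  coneAxis;    // average triangle normal of the cluster
    float coneCutoff;  // cosine of the cluster's normal-cone half-angle
};

bool MeshletVisible(const Meshlet& m,
                    const std::array<Plane, 6>& frustum,
                    Vec3 cameraPos)
{
    // 1) Frustum test on the cluster's bounding sphere.
    for (const Plane& p : frustum)
        if (Dot(p.n, m.center) + p.d < -m.radius)
            return false;

    // 2) Backface cone test: if every triangle in the cluster faces away
    //    from the camera, skip the whole cluster.
    Vec3 toMeshlet = { m.center.x - cameraPos.x,
                       m.center.y - cameraPos.y,
                       m.center.z - cameraPos.z };
    float len = std::sqrt(Dot(toMeshlet, toMeshlet));
    if (len > 0.0f)
    {
        Vec3 dir = { toMeshlet.x / len, toMeshlet.y / len, toMeshlet.z / len };
        if (Dot(m.coneAxis, dir) >= m.coneCutoff)
            return false;
    }
    return true;
}
```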

Requiring new technologies to play the most advanced PC games is natural as times change. Despite the enduring popularity of some GeForce GTX 10 series GPUs in Steam surveys, the cards are around seven years old, so it shouldn't be surprising that the time for playing high-end games on them is ending. The situation regarding the Radeon RX 5000 series is somewhat more concerning because that generation is only around four years old.

Meanwhile, users with newer mid-range cards like the GeForce RTX 3070 or Radeon RX 6700 XT probably shouldn't worry about Alan Wake 2's performance overall. The game's most demanding ray tracing settings may be out of reach, but the medium and low presets without ray tracing still look great.

Digital Foundry explains that all of Alan Wake 2's graphics settings, including the lowest, employ advanced features like global illumination and software-based ray tracing akin to Unreal Engine 5's Lumen system. Furthermore, the PlayStation 5 version achieves 60 frames per second by combining the PC edition's medium and lowest settings at 1440p with FSR 2 set to balanced mode.
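For context on what "FSR 2 balanced at 1440p" actually means, AMD's published scaling ratio for balanced mode is 1.7x per axis, so a 2560x1440 output is reconstructed from an internal resolution of roughly 1505x847. The quick back-of-the-envelope calculation below works that out for each FSR 2 quality mode; the 1440p output target comes from the report above, and the ratios are AMD's documented values.

```cpp
// Back-of-the-envelope: internal render resolution for FSR 2 quality modes
// at a 2560x1440 output target.
#include <cstdio>

int main()
{
    const int outW = 2560, outH = 1440;

    struct Mode { const char* name; float scale; };
    const Mode modes[] = {
        { "Quality",           1.5f },
        { "Balanced",          1.7f },
        { "Performance",       2.0f },
        { "Ultra Performance", 3.0f },
    };

    for (const Mode& m : modes)
    {
        // FSR 2 renders at (output / scale) per axis, then reconstructs.
        int w = static_cast<int>(outW / m.scale);
        int h = static_cast<int>(outH / m.scale);
        std::printf("%-17s -> %dx%d internal\n", m.name, w, h);
    }
    return 0;
}
```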

Alan Wake 2 is undoubtedly an advanced, demanding game, but it is playable on a broad range of recent hardware.


 
I remember when the 1080ti debuted and it was THE card for 4k performance... Now it can't even run a game at 1080p lol
 
Pascal is a 2016 architecture. The RX 5700 is from 2019 and was already behind the curve even then. It was obvious which way the wind was blowing; you bought it knowing full well it lacked features that Nvidia's 2019 architectures had. You accepted that then because it was cheaper for raster-only performance.

Your alternative to the PC version is consoles released at the end of 2020. Things get old. That's life.
 
"Remedy developer explained that the game doesn't officially run on GPUs that lack mesh shader support. The developer later deleted the tweet due to immediate backlash, much of it likely from users afraid that their older GPUs wouldn't run Alan Wake 2 at all."

Gamers embarrass themselves often. I know I'm not shocked that they took the word "unofficially" completely the wrong way. With the warning tweet deleted, when it doesn't work, they'll blame the developer anyway. Rinse and repeat.
 
All I wanna know about this game... Does the flashlight chew up batteries as fast as in the original?
 
I don't see a big problem with this, and I understand that from a software development perspective it's a lot of work; building around the better technology is always the right thing to do. But I would appreciate it if a fallback option were present for users of older hardware :(
 
lel :D yes, but not with today's prices :( people don't wanna waste so much money on a video card :( especially when the only thing new cards really give you is RT and DLSS :D

Sure, 4090s are stupid expensive, but Intel's cards and the mainstream RX 6000 cards should be reasonably priced, especially on the used market
 
After 7 years it's time for an upgrade.
2015 GPUs can still play the latest titles just fine; no reason to upgrade just because of this...sounds more like a reason to avoid this game if they are this bad at analysis and/or optimization.
 
lel :D yes, but not with today's prices :( people don't wanna waste so much money on a video card :( especially when the only thing new cards really give you is RT and DLSS :D

I have a 1080 Ti. Before the 7700/7800 XT were released, I was honestly lost as to what to upgrade to. Those two are the best upgrade path considering power consumption, performance, and price. Spending more than $900 on a GPU is a big no for me.

But I don't need to upgrade yet, since all I play are non-AAA or older AAA games. I barely have time to play anymore. I've been watching YouTube let's plays of recent AAA games.

Maybe I'll upgrade after GTA VI is released (or when my 1080 Ti dies), and choose a card that can run it maxed out at >75 fps on 3440x1440 without breaking the wallet. I don't even mind waiting longer.
 
sounds more like a reason to avoid this game if they are this bad at analysis and/or optimization.

The irony here is that mesh shaders are, in fact, an extremely efficient tech specifically for optimizing the rendering of huge amounts of geometry. Like the article said, even cards that are slower than a 1080Ti (e.g. GTX 1660Ti or RTX 2060) manage to outperform it because of actual hardware support for the tech.
 
2015 GPUs can still play the latest titles just fine; no reason to upgrade just because of this...sounds more like a reason to avoid this game if they are this bad at analysis and/or optimization.
But it's not bad analysis? Nvidia makes up 86 percent of gaming GPUs, and their last three generations have all supported mesh shaders. It's not new tech.

The 1080ti is a 7 year old card. That's the difference between the GeForce 2 and the GeForce 8000 series. Imagine a game in 2007 being derided for not supporting GeForce 2 chips. Absolute nonsense.

For AMD users this is a bigger pickle, but the 5700xt was generations behind, we knew that when it came out. The tradeoff was better rasterization. Now it's time to upgrade to modern chips.

Games are going to move on sooner or later.
 
Eventually this was going to happen. I also read that the game isn't as poorly optimized as people thought, it just really pushes the high-end stuff. Gamers with the high-end stuff always complain their cards aren't being utilized; gamers with low-end hardware from five years ago complain the devs didn't optimize because they can't run the game. There's always going to be someone who isn't pleased. Yes, sometimes games are just poorly optimized, but that doesn't sound like the case here. It's like Control, a showcase for Nvidia hardware that you can also play with the new AMD cards, but you're just not going to get the best visual experience.
 
This game is not to my liking. Watched a tech guy stream it and I was about to vomit. Also light/shadows blah blah and thus high sys req. No dice.
 
2015 GPUs can still play the latest titles just fine; no reason to upgrade just because of this...sounds more like a reason to avoid this game if they are this bad at analysis and/or optimization.

Yes, let's blame "bad optimization" and halt any furthering of rendering technologies to appease those with aging hardware.
But then those same people will complain about how games don't look any better than they did 10 years ago, since nearly 10-year-old hardware and its limitations would have to remain the standard.

Someone phone up NV and AMD, have them halt any production on new hardware and software technologies, and do the same for every game dev to let them know Pascal/RDNA1 is the limit, and all future releases must be sure to appease owners of those GPUs.

Same dumb statement that was made when DX7 and hardware T&L came out and only a handful of GPUs at the time supported it. But you like parallax mapping in games, right?
 
But it's not bad analysis? Nvidia makes up 86 percent of gaming GPUs, and their last three generations have all supported mesh shaders. It's not new tech.

The 1080ti is a 7 year old card. That's the difference between the GeForce 2 and the GeForce 8000 series. Imagine a game in 2007 being derided for not supporting GeForce 2 chips. Absolute nonsense.

For AMD users this is a bigger pickle, but the 5700xt was generations behind, we knew that when it came out. The tradeoff was better rasterization. Now it's time to upgrade to modern chips.

Games are going to move on sooner or later.

This is spot on.
(Bold added)
 
How come a 2018 game like Red Dead Redemption 2 or 2020's Cyberpunk 2077 looks good on a 1080 Ti at 1440p at 60+ fps, while Alan Wake 2 on the same GPU looks like crap at 360p upscaled at 20+ fps? How come no angry mob screaming "sabotage by nVidia", similarly like when AMD were trying to make Starfield playable, You screamed "AMD sabotage"? Instead everybody says "It's time to upgrade from Your COVID $400 GTX 1650s". I'll tell You why: You're all nVidia shills and You deserve to pay through the nose.
I saw My friend play RDR2 on a GTX 970 at 1440p low-medium and It looked great. If You won't boycott that game, and the UE5 failures like it, You're making Your bed for the foreseeable future.
 
How come no angry mob screaming "sabotage by nVidia", similarly like when AMD were trying to make Starfield playable, You screamed "AMD sabotage"…
FSR2 AND DLSS are supported in Alan Wake 2.

AMD specifically made sure DLSS wasn’t in Starfield. That’s the difference. That was AMD trying to stop Nvidia’s (superior) tech getting into a game it sponsors.

From what I've seen, Alan Wake 2 runs quite well on AMD hardware as well; it doesn't look biased to me. Things like path tracing and ray reconstruction only really exist on Nvidia hardware right now.
 
Yes, let's blame "bad optimization" and halt any furthering of rendering technologies to appease those with aging hardware.
But then those same people will complain about how games don't look any better than they did 10 years ago, since nearly 10-year-old hardware and its limitations would have to remain the standard.

Someone phone up NV and AMD, have them halt any production on new hardware and software technologies, and do the same for every game dev to let them know Pascal/RDNA1 is the limit, and all future releases must be sure to appease owners of those GPUs.

Same dumb statement that was made when DX7 and hardware T&L came out and only a handful of GPUs at the time supported it. But you like parallax mapping in games, right?
Unfortunately AMD has more to lose from badly optimized games, imo. If the current consoles can't brute-force their way to a subjectively pleasing experience, gamers will eventually want to go the PC route, and statistically speaking there is an 85% chance they'll go the Nvidia route, so there is that. Nvidia is also in a better position for brute-forced gaming thanks to the free advertisement of the 4090 (which just went up in price before the holiday season). The crazy part is that not only are RT games taking a hit (which is acceptable to most), but rasterization performance is affected too. Badly optimized games are also bad for the resale value of older hardware. Unfortunately, nothing will change until there is an intervention by a big player, like AMD once did with the Mantle API. Developers are always in a time crunch, only for their hard work to be review bombed and have sales suffer within the window needed to be considered a successful launch.
 
Native resolution may be on the low side, but it still looks decent on a 4K TV. Performance-wise, although not perfect, it's not too shabby either. The game holds up well on the PS5 given how demanding the requirements are. I'm fine without RT.

I myself am still using the good old 1080 Ti, but it's starting to get long in the tooth for recent games. In many I have to turn the settings down to medium or low and still can't hit 60 fps.

GPU prices are ridiculous here in Australia, and since I own a PS5, I'll be playing this game on it. My days of trying to keep up with top-of-the-line hardware are past at this point in my life.
 