Alan Wake II assumes everyone will use upscaling, even at 1080p

Daniel Sims

In context: Remedy Entertainment is known for pushing hardware to its limits, as most recently demonstrated by its intensive use of ray tracing. However, even with that resource-intensive feature disabled, Alan Wake II's system requirements rank it among the most demanding titles of 2023.

Remedy has unveiled an exhaustive system requirements sheet for "Alan Wake II," which will surely intimidate many. The sheet completely omits GPUs predating Turing and RDNA 2, signaling the end of the Pascal and RDNA 1 era. Notably, upscaling technologies like DLSS and FSR are listed as requirements.

Update (10/23): Over the weekend, Redditors caught a Twitter comment from a Remedy developer who said the minimum graphics card requirements are explained by the need for mesh shader support, which is absent from previous-generation GPUs (GeForce GTX 1000 and Radeon RX 5000 series). The somewhat controversial tweet was later deleted, but it read: "I'm not sure it will run without it [mesh shader]. In theory the vertex shader path is implemented but had lots of perfs issues and bugs, we just dropped it. Meaning it might be possible to bring it back with a mod but don't expect miracles."
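For readers wondering how a game distinguishes these GPUs: mesh shader support is an optional Direct3D 12 feature that applications can query at runtime via CheckFeatureSupport. Below is a minimal C++ sketch of such a check; it is purely illustrative, not Remedy's actual code.

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    // Create a D3D12 device on the default adapter.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12-capable GPU found.");
        return 1;
    }

    // Mesh shader support is reported through the OPTIONS7 feature
    // block (available in Windows SDK 19041 and newer).
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    const bool meshShaders =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7))) &&
        opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;

    std::puts(meshShaders
                  ? "Mesh shaders supported (Turing / RDNA 2 or newer)."
                  : "Mesh shaders not supported (e.g. Pascal, RDNA 1).");
    device->Release();
    return 0;
}
```

On GPUs older than Turing and RDNA 2, this query reports D3D12_MESH_SHADER_TIER_NOT_SUPPORTED, which is consistent with the cards omitted from the requirements sheet.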

While the game's specifications seem high, they align with recent high-end releases such as A Plague Tale: Requiem, Lords of the Fallen, Immortals of Aveum, and Forspoken. To achieve gameplay at 1080p and 60 frames per second, somewhat beefy graphics cards like the GeForce RTX 3070 or Radeon RX 6700 XT are necessary. However, a notable point of concern is Remedy's apparent assumption that players won't be running Alan Wake II at native resolution.

Remnant II previously raised eyebrows when its developers disclosed the game was optimized for DLSS or FSR. However, Alan Wake II goes a step further, dedicating an entire system requirements section to upscaling. Every one of the six performance tiers includes upscaling modes, even for a 1080p output resolution.


The 1080p 60fps spec calls for medium graphics and upscaling in performance mode, meaning half the output resolution on each axis. That setting is typically intended for 4K displays, upscaling from a 1080p internal resolution, but engaging performance mode at a 1080p output means running Alan Wake II at an internal 960 x 540. And that's without ray tracing.
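For reference, each upscaler quality mode renders internally at a fixed fraction of the output resolution per axis. The short C++ sketch below works through the arithmetic; the scale factors are the commonly published DLSS/FSR defaults, so treat them as approximations that individual games may tweak.

```cpp
#include <cstdio>

// Per-axis render scale for common upscaler quality modes.
// These are the widely published DLSS/FSR defaults; exact
// ratios can vary from game to game.
struct UpscaleMode { const char* name; double scale; };

int main() {
    const UpscaleMode modes[] = {
        {"Quality",     2.0 / 3.0},  // ~67% per axis
        {"Balanced",    0.58},       // ~58% per axis
        {"Performance", 0.50},       // 50% per axis -> 1/4 of the pixels
    };
    const int outW = 1920, outH = 1080;  // 1080p output target

    for (const UpscaleMode& m : modes) {
        // Internal resolution = output resolution * scale, per axis.
        std::printf("%-11s: %4d x %d internal\n", m.name,
                    int(outW * m.scale + 0.5), int(outH * m.scale + 0.5));
    }
    return 0;
}
```

Performance mode at a 1080p output lands on the 960 x 540 figure above; at a 4K output, the same mode renders internally at 1920 x 1080.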

The game's recommendation for the lowest of three ray tracing profiles is similar to the 1080p 60fps spec, but with the frame rate expectation lowered to 30. Alan Wake II's cutting-edge path-tracing mode emerges in the two highest performance profiles, where AMD GPUs disappear in favor of high-end RTX 4000 series cards. By the way, Nvidia is bundling the game with high-end RTX 4000 GPUs until November.

On the bright side, aside from GPU performance, all other system specs are relatively standard. The 90GB storage space requirement isn't modest but aligns with contemporary standards for AAA games. All performance tiers require 16GB of RAM, unlike some recent releases that recommended 32GB for 4K gameplay.

Alan Wake II is set to debut exclusively on the Epic Games Store on October 27. Remedy has not yet indicated a release date for Steam.


 
Though it's hard to tell from that linked video, it doesn't look like the kind of groundbreaking graphics that need upscaling. More along the lines of: it's not optimized and they're trying to cover their A$$. I could very well be wrong, but that's what I'm going with until I see more.
 
Though it's hard to tell from that linked video, it doesn't look like the kind of groundbreaking graphics that need upscaling. More along the lines of: it's not optimized and they're trying to cover their A$$. I could very well be wrong, but that's what I'm going with until I see more.
I was thinking the opposite: it looks amazing, everything from the lighting to the polygon count.

That said, I was a fan of upscaling tech because I felt it would extend the life of older cards. The idea that upscaling is a requirement for playable frame rates on modern high-end graphics cards is just unacceptable.
 
This is the future of gaming Nvidia wanted: there is no native resolution, there is no game optimization, and all games require absurdly powerful systems for no good reason other than, well, they can demand that of gamers stuck in the Nvidia "Go buy a new GPU to get a newer version of DLSS" cycle.

So I feel I should keep repeating this, because these stories are what I'm talking about when I say it: Nvidia is poison to PC gaming, and at this rate it will end up being a near-lethal poison.
 
This is the future of gaming Nvidia wanted: there is no native resolution, there is no game optimization, and all games require absurdly powerful systems for no good reason other than, well, they can demand that of gamers stuck in the Nvidia "Go buy a new GPU to get a newer version of DLSS" cycle.

So I feel I should keep repeating this, because these stories are what I'm talking about when I say it: Nvidia is poison to PC gaming, and at this rate it will end up being a near-lethal poison.
Agreed, but the core reason is unrealistic schedules: devs take shortcuts and don't optimize anything because they don't even get time to fix game-breaking bugs. Nvidia is an enabler, no doubt.
 
This is the future of gaming Nvidia wanted: there is no native resolution, there is no game optimization, and all games require absurdly powerful systems for no good reason other than, well, they can demand that of gamers stuck in the Nvidia "Go buy a new GPU to get a newer version of DLSS" cycle.

So I feel I should keep repeating this, because these stories are what I'm talking about when I say it: Nvidia is poison to PC gaming, and at this rate it will end up being a near-lethal poison.

I like my 4090
 
Remedy has usually delivered on both the gameplay and graphics sides. They have been in Nvidia's "TWIMTBP" camp, including Nvidia-centric RT in Control. All RT at the time was Nvidia-centric, but I was still able to play Control on AMD GPUs. Now they're in line with Nvidia's "Moore's law is dead, no more real performance increases, just upscale and fake more frames" BS. Still looks like I can play Alan Wake II on AMD if I want.

What's more important to me, and relevant to the graphics issue, is the Epic exclusivity. By the time the exclusivity is over and I consider buying the game, I'll likely have a faster GPU anyway.

With the growing handheld market, developers hopefully have an incentive not only to push graphics boundaries, but to include options to run on lower-powered chips.
 
Ahh yes, Remedy has a bad history of this: pushing hardware way too far and relying on upscaling, long before DLSS was a thing. I remember Alan Wake on Xbox 360, possibly the lowest-resolution title of that entire generation at 960 x 540. I remember people praising the visuals when all I could see was a fuzzy mess on my HDTV. Drove me nuts.

Then Quantum Break on Xbox One, yet again stealing a lot of pixels: 1280 x 720. Another blurry title, when virtually everything else comparable that generation was at least 900p. I have to emphasize: both were single-player, closed-environment games that also suffered very poor, sub-30 FPS performance despite their low resolutions!

At least DLSS does work rather well, but still: an RTX 4080 for 4K with ray tracing, and that's in "performance" DLSS mode. Natively that's only 1920 x 1080 before the image is upscaled to 4K.
 
Agreed, but the core reason is unrealistic schedules: devs take shortcuts and don't optimize anything because they don't even get time to fix game-breaking bugs. Nvidia is an enabler, no doubt.
This is interesting, because I would take it a step further. It's unsubstantiated, but it still tracks: Nvidia pushing DLSS heavily for a couple of years now lines up rather well with game publishers looking to cut costs and dev time as much as possible. If you look at the entire production cycle, the biggest spot with diminishing returns is the time (and hence money) it takes to optimize a game versus how much better performance you end up with afterwards.

So while gamers foolishly thought "Oh cool, free performance, I want upscaling tech!", most games in production since the release of the original DLSS were probably ordered to cut development time by 20 or 30% or even more: "You won't need to optimize too much; we talked with Nvidia (or AMD) and decided to include upscaling, so as long as it runs OK at as low as 720p, we can upscale it to 1080p near launch with just ML upscaling."

So even if this wasn't the explicit purpose, or the way Nvidia initially marketed DLSS to its game-publisher business partners, they've got to at least hint at it if not outright say it at this point: "Save a ton of money on dev time near release by not optimizing nearly as much; we can help with this BS upscaling."
 
Vermintide 2 just added DLSS, Reflex, and FSR 2 via patch 5.1.0. Does anyone know why game latency would go up from 4 ms to 20 ms after the patch with the same in-game settings? Does adding DLSS to a game change anything in the code even with DLSS off?
 
This is the future of gaming Nvidia wanted: there is no native resolution, there is no game optimization, and all games require absurdly powerful systems for no good reason other than, well, they can demand that of gamers stuck in the Nvidia "Go buy a new GPU to get a newer version of DLSS" cycle.

So I feel I should keep repeating this, because these stories are what I'm talking about when I say it: Nvidia is poison to PC gaming, and at this rate it will end up being a near-lethal poison.
Go buy a console then, and pretend AMD's GPU in the XSX is producing native 4K at the cost of a 4070.

 
Go buy a console then, and pretend AMD's GPU in the XSX is producing native 4K at the cost of a 4070.


Lol, he said nothing about consoles, yet you're meat-shielding a trillion-dollar company and implying that blurry mess on PC is somehow a good thing? Just because the Xbox is crap doesn't excuse trash DLSS visuals and FPS performance, thanks to absent dev optimization, on PCs that cost twice as much.
 
It feels like ALL the 2023 AAA games came out in this poor state. Is this some kind of scam to make gamers upgrade? This might just be the beginning of the end of AAA gaming.
 
That said, I was a fan of upscaling tech because I felt it would extend the life of older cards. The idea that upscaling is a requirement for playable frame rates on modern high-end graphics cards is just unacceptable.

This is my take as well; expecting it on high-end cards is unacceptable. These technologies are for extending the life of older cards or keeping integrated graphics like the 5700G's iGPU usable.
 
Something I don't understand is why modern games at low settings look worse than games made 10+ years ago, yet still run worse.

Screenshots from the newly released Lords of the Fallen look absolutely awful. I haven't watched any actual gameplay video, but the screenshots some sites used for their benchmark tests look terrible. It could just be the screenshots themselves that suck and the game might be beautiful, but if the screenshots I saw look abysmal, why would I want to buy and play it? Pictures from Shadow of Mordor look better, and that's nearly a ten-year-old game.

These new "flashy" games that require a crap ton of horsepower to run them and then they can't so you have to resort to DLSS/FSR/XeSS to get the game to run at a good playable framerate is awful.
 