Alan Wake 2 will boot on GeForce GTX 10 and Radeon 5000 GPUs (running it is another story)

How come? I'll tell you why: you're all Nvidia shills and you deserve to pay through the nose.
I saw my friend play RDR2 on a GTX 970 at 1440p low-medium and it looked great. If you won't boycott that game, and UE5 failures like it, you're making your bed for the foreseeable future.

You need to up your game; this weak **** troll bait won't cut it this far down the thread. This has nothing to do with Nvidia tech, since the "issue" is a vendor-agnostic software feature that affects both AMD and Nvidia. Try running this game on a pre-RX 6000 series card with no full support for DirectX 12 Ultimate, and you can enjoy the same slideshow that pre-Turing GTX cards do.
 
AMD specifically made sure DLSS wasn’t in Starfield. That’s the difference. That was AMD trying to stop Nvidia’s (superior) tech getting into a game it sponsors.
That's absolutely not true, Starfield is getting DLSS, it just wasn't available at launch. But here's the thing about Starfield: it sucks as a game.
 
I remember when the 1080 Ti debuted and it was THE card for 4K performance... Now it can't even run a game at 1080p lol
Graphics have improved drastically over time.
In the latest games and demos it still doesn't look real, but you can see it getting there.
 
The 5000 series released just over four years ago. While there was no true "high-end" card in the 5000 lineup, the 5700 XT was the best AMD card at the time. Four years seems kind of quick to start obsoleting it already.
 
That's absolutely not true, Starfield is getting DLSS, it just wasn't available at launch. But here's the thing about Starfield: it sucks as a game.
That's not true either. It was never officially confirmed, but we do have reason to believe it was deliberately removed for launch. There was a tweet with a snippet of an interview done by Digital Foundry that basically confirmed, from three of the devs, that they were told to remove DLSS; the tweet was taken down pretty quickly but you can find screenshots of it out there.

It was taken down and never confirmed, but the devs have been incredibly tight-lipped on the subject ever since. If you're going to that much effort to make sure certain members of your team don't get interviewed again, I think we can at least speculate that not including DLSS was a decision made by management.

Edit: I'm aware John claims it wasn't about Starfield, but he refused to put the tweet back up or clarify what he was talking about. From what I can find, we never learned what he was referring to, just that he's "personally spoken to three devs that he knew had to remove DLSS due to sponsorships".
 
That's not true either. It was never officially confirmed, but we do have reason to believe it was deliberately removed for launch. There was a tweet with a snippet of an interview done by Digital Foundry that basically confirmed, from three of the devs, that they were told to remove DLSS; the tweet was taken down pretty quickly but you can find screenshots of it out there.

It was taken down and never confirmed, but the devs have been incredibly tight-lipped on the subject ever since. If you're going to that much effort to make sure certain members of your team don't get interviewed again, I think we can at least speculate that not including DLSS was a decision made by management.

Edit: I'm aware John claims it wasn't about Starfield, but he refused to put the tweet back up or clarify what he was talking about. From what I can find, we never learned what he was referring to, just that he's "personally spoken to three devs that he knew had to remove DLSS due to sponsorships".
Well, I'm not going to take your word for it, but I will assume what you're saying is true for now. Now, AMD paid a LOT of money to help develop Starfield; I don't have a problem with them using it to showcase FSR 3.0. Further, Nvidia has done this crap all the time. Anyone remember the "Nvidia: The Way It's Meant to Be Played" crap? I'll give you that DLSS is better, but it isn't THAT much better than FSR.

But here's the thing that REALLY makes me mad about upscaling tech. I'm a 4K gamer, and while I'm not at 4K120, I consider anything that's 75+ FPS smooth enough for my eyes. Upscaling tech is now being used by devs to create unoptimized games; dropping from max to high settings and using Quality or Balanced upscaling is now required for me to get 1080p60. So while I love the idea of upscaling tech, its purpose has been corrupted and I wish it had never been created.
 
Well, I'm not going to take your word for it, but I will assume what you're saying is true for now. Now, AMD paid a LOT of money to help develop Starfield; I don't have a problem with them using it to showcase FSR 3.0.
They didn't showcase FSR 3.0 though, just ordinary FSR 2.
Further, Nvidia has done this crap all the time. Anyone remember the "Nvidia: The Way It's Meant to Be Played" crap?
So two wrongs make a right? At no point was I standing up for Nvidia's past wrongdoings.
I'll give you that DLSS is better, but it isn't THAT much better than FSR.
From my experience, it is THAT much better. For me, it's how much more stable the image is; FSR has way too much shimmering and ghosting for my liking, and DLSS has none of that. Also, it doesn't make sense for any developer not to include DLSS: 85%+ of PC gamers are using Nvidia cards, and if you've gone to the effort of putting FSR in, it's practically no more effort to put DLSS in, and it will benefit more people on PC than FSR will (in modern games).
But here's the thing that REALLY makes me mad about upscaling tech. I'm a 4K gamer, and while I'm not at 4K120, I consider anything that's 75+ FPS smooth enough for my eyes. Upscaling tech is now being used by devs to create unoptimized games; dropping from max to high settings and using Quality or Balanced upscaling is now required for me to get 1080p60. So while I love the idea of upscaling tech, its purpose has been corrupted and I wish it had never been created.
I talked about this in another comment thread here on TechSpot recently: games have been using TAA for years to hide what they actually look like underneath. DLSS and FSR are simply more advanced versions; this has been happening for a long time now.

I don't know if you play Cyberpunk or not, but you can add a line to an .ini file to turn off the built-in TAA, disable the other upscalers, and you'll realise that the game is incredibly "jaggy". There are a few games over the years I could point you to that allow you to turn off TAA; DLSS and FSR actually do a really good job considering what they have to work with.
 
I don't know if you play Cyberpunk or not, but you can add a line to an .ini file to turn off the built-in TAA, disable the other upscalers, and you'll realise that the game is incredibly "jaggy". There are a few games over the years I could point you to that allow you to turn off TAA; DLSS and FSR actually do a really good job considering what they have to work with.
Anti-aliasing, oversimplified, is upscaling that is then downsampled to smooth things out. Those jaggies are what's referred to as "aliasing".
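For anyone curious, the "render big, then average down" idea being described there is basically supersampling AA. A minimal conceptual sketch (plain numpy, nothing engine-specific) looks like this:

```python
import numpy as np

def ssaa_downsample(supersampled, factor=2):
    """Conceptual supersampling AA: the scene is rendered at factor x the
    target resolution, then each factor x factor block of samples is
    averaged down to one output pixel, smoothing the jagged edges
    ("aliasing")."""
    h, w, c = supersampled.shape
    return (supersampled
            .reshape(h // factor, factor, w // factor, factor, c)
            .mean(axis=(1, 3)))

# e.g. a 2160p render averaged down to a 1080p output
frame_hi = np.random.rand(2160, 3840, 3)
frame_out = ssaa_downsample(frame_hi, factor=2)
```

That brute-force approach is exactly why true supersampling is so expensive, and why temporal tricks like TAA took over.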
From my experience, it is THAT much better. For me, it's how much more stable the image is; FSR has way too much shimmering and ghosting for my liking, and DLSS has none of that. Also, it doesn't make sense for any developer not to include DLSS: 85%+ of PC gamers are using Nvidia cards, and if you've gone to the effort of putting FSR in, it's practically no more effort to put DLSS in, and it will benefit more people on PC than FSR will (in modern games).

Yeah, if you put them in performance mode. Realistically, quality or higher is identical at 4K for me. To be perfectly honest, my TV does a better job of upscaling than either FSR or DLSS. Nvidia isn't even fully supporting DLSS across its generations of GPUs. I say this constantly: you're sacrificing quality for compatibility. If I have a 3090, I don't want to buy a 40-series card just to get access to all of DLSS. DLSS made around the 50 series will likely not fully support the 40 series.

So, maybe 85% of gamers have Nvidia cards, but 85% of those gamers don't have full-on DLSS support.

Nvidia is also starting to piss people off, and I'm one of them. I've been buying the green team's GPUs for 19 years; I bought my first AMD card this year.

They didn't showcase FSR 3.0 though, just ordinary FSR 2.
To be perfectly fair, aside from frame generation, FSR 3.0 doesn't really offer much over 2.0. It's been my experience that frame gen adds 1.5 to 2 frames of input lag, and that's noticeable at lower FPS.


On one final note, Starfield is a garbage game. I dumped around 70 hours into it thinking "it'll get better" and it just didn't. Bethesda's last few releases have honestly sucked. Fallout 4 was atrocious unless you had a brain tumor, and Skyrim had the "level, loot and leave" combo down, but the game didn't offer anything groundbreaking and the story was lackluster at best.

Then we have Starfield, which frankly feels incomplete. I will say that a BIG reason Bethesda decided to take AMD's money was that Microsoft was pushing them to release Starfield. On top of that, all consoles use AMD GPUs, so prioritizing DLSS didn't really make sense. I wouldn't be surprised if they got an order to cut DLSS after it was already implemented, although I personally doubt that. I will say I bet they did put money on FSR 2.0 because it is compatible with consoles and would be required for smooth play on Xbox. So more PC users may use DLSS, but more gamers use AMD GPUs.
 
All bells and whistles but no substance. New games nowadays are so...meh.

Mostly rushed to make use of the graphics card tech, making even new cards crawl with all settings maxed out, yet the gameplay is lacking.
 
Anti-aliasing, oversimplified, is upscaling that is then downsampled to smooth things out. Those jaggies are what's referred to as "aliasing".
Yes, I know, and I'm specifically talking about TAA, Temporal Anti-Aliasing. It works the same way DLSS and FSR do, by combining information from past frames to smooth things over.
DLSS and FSR go a step further by reconstructing the image at a higher resolution.
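To put "combining information from past frames" in concrete terms, the core of a TAA accumulation step is roughly the following (a conceptual numpy sketch only; real implementations also clamp or clip the history colour against the current frame's neighbourhood to limit ghosting, and DLSS/FSR feed these same inputs into much smarter reconstruction):

```python
import numpy as np

def taa_accumulate(history, current, motion_vectors, alpha=0.1):
    """One conceptual TAA step: reproject last frame's accumulated result
    along the motion vectors, then blend a small amount of the new frame
    into it (an exponential moving average over time)."""
    h, w, _ = current.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Where was each pixel last frame?
    prev_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    prev_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    reprojected = history[prev_y, prev_x]
    # Mostly history, a little of the new (jittered) frame each time.
    return (1.0 - alpha) * reprojected + alpha * current
```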

Make no mistake, TAA has been used for years to hide how low-res games are actually rendered. It's forced on by default in a fair few games (worth saying, some game engines call their tech something different, like TSR in Unreal Engine or TSAA in CryEngine).

At the end of the day, it's practically the same inputs used for DLSS and FSR; they just do a much better job of not only anti-aliasing but also upsampling the image.

If you want a laugh, you can edit an .ini file for Cyberpunk that disables the built-in TAA, then turn all upscaling off in the settings, and man, it is jaggy heaven.
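For reference, the tweak I mean is the one that's been passed around in community guides: drop a user .ini under the game's engine\config\platform\pc folder with something like the snippet below. The section and key name are quoted from memory of those guides and may have changed across patches, so treat this as a pointer rather than gospel:

```
; Cyberpunk 2077 - toggle reported by community guides to disable the
; built-in TAA (exact key may vary by game version)
[Developer/FeatureToggles]
Antialiasing = false
```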
Nvidia isn't even fully supporting DLSS across its generations of GPUs. I say this constantly: you're sacrificing quality for compatibility. If I have a 3090, I don't want to buy a 40-series card just to get access to all of DLSS. DLSS made around the 50 series will likely not fully support the 40 series.

So, maybe 85% of gamers have Nvidia cards, but 85% of those gamers don't have full-on DLSS support.
I completely understand the DLSS naming scheme is rubbish, but the 20 series can use DLSS to upscale like they always have. As new cards have come out, they have more and better tensor cores, so Nvidia are pushing new technology to take advantage of the hardware. DLSS 3.5, which is Ray Reconstruction, actually works fine on the 20 series:
[attached screenshot]

You're basically arguing that they shouldn't improve the hardware and add features, to keep people who bought the previous gen happy. In my mind, that's the opposite of progress.
Yeah, if you put them in performance mode. Realistically, quality or higher is identical at 4K for me. To be perfectly honest, my TV does a better job of upscaling than either FSR or DLSS.
Sorry, you're saying that if you lower the game's resolution to 1440p and leave everything else at default in the engine, your TV does a better job of upscaling and you get the same performance as using DLSS? Because I think you'd be the first to point this out.
Nvidia is also starting to piss people off, and I'm one of them. I've been buying the green team's GPUs for 19 years; I bought my first AMD card this year.
I really wanted to build a computer for someone with a 7800 XT this year, but everyone who's come to me to build a computer has requested Nvidia cards, annoyingly.
To be perfectly fair, aside from frame generation, FSR 3.0 doesn't really offer much over 2.0. It's been my experience that frame gen adds 1.5 to 2 frames of input lag, and that's noticeable at lower FPS.
Yeah, I'm with you there. Even on a 4090, I don't like frame gen in first-person games at all, even at high framerates. In third-person games like The Witcher 3, as long as the framerate is nice and high, I can get some benefit out of it. It's a rarely used feature though, to be honest.
On one final note, Starfield is a garbage game. I dumped around 70 hours into it thinking "it'll get better" and it just didn't. Bethesda's last few releases have honestly sucked. Fallout 4 was atrocious unless you had a brain tumor, and Skyrim had the "level, loot and leave" combo down, but the game didn't offer anything groundbreaking and the story was lackluster at best.

Then we have Starfield, which frankly feels incomplete.
You aren't the first person to say this to me. I had quite a few friends who were really looking forward to Starfield, pre-ordered it and all, convinced it would be amazing. After a month of playing, they've all said the same thing: "I really wanted to like it, I really wanted to love it, but it's just crap".

When I was watching people play it, I was surprised at how often you're quick-travelling or in a load screen. Why they haven't figured out how to go into buildings or large rooms without a load screen yet is baffling.
I will say that a BIG reason Bethesda decided to take AMD's money was that Microsoft was pushing them to release Starfield. On top of that, all consoles use AMD GPUs, so prioritizing DLSS didn't really make sense. I wouldn't be surprised if they got an order to cut DLSS after it was already implemented, although I personally doubt that. I will say I bet they did put money on FSR 2.0 because it is compatible with consoles and would be required for smooth play on Xbox. So more PC users may use DLSS, but more gamers use AMD GPUs.
Oh, I don't doubt any of that. My issue, really, is that DLSS uses the exact same inputs from the game engine as FSR, so if you've developed the game for FSR, DLSS and XeSS are a doddle to implement as an extra option; there's almost no reason not to include the other options if you've developed the game for one of them.
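Roughly speaking, the per-frame data the engine has to hand over is the same small set of buffers whichever vendor's upscaler consumes it. Purely as an illustration (this is not the actual DLSS, FSR or XeSS API, just a sketch of the typical inputs):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class UpscalerInputs:
    """Illustrative bundle of the per-frame inputs a temporal upscaler
    typically needs from the engine (not any specific SDK's real types)."""
    color: np.ndarray           # low-res rendered frame, HxWx3
    depth: np.ndarray           # depth buffer, HxW
    motion_vectors: np.ndarray  # per-pixel motion in pixels, HxWx2
    jitter: tuple[float, float] # sub-pixel camera jitter for this frame
    exposure: float             # scene exposure, for HDR-aware blending
```

Wire those up once for FSR and adding the other upscalers is mostly plumbing.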
 
Oh boy xD I am one of the ones who won't drop mega cash on a shiny new GPU, but I did get a 1080 Ti for cheap not so long ago ^__^ I will play this I guess when it gets patched

Yes, I get it, the 1080 Ti is an old boy now, but he has 3584 cores backed by a mighty 352-bit memory interface attached to 11 gigs of GDDR5X RAM; a card with these specs just shouldn't turn into e-waste :<
:D
 
I will play this I guess when it gets patched

You're in for a mighty long wait to play this on that GPU then, as it'll likely never happen.

Why would Remedy waste the time and money to totally rewrite the rendering pipeline for a completed game and degrade visual quality, when they chose to use mesh shaders specifically for the visual fidelity boost they provide?
 
Aye, I hear ya, it will likely be some ultra pro die-hard with time to burn ^_^ not the game devs themselves.
 
FYI, the RX 6800 by XFX fell to $379 and gets around 50 to 70 fps at 1440p ultra settings with FSR 2 set to Quality in Alan Wake 2.

XFX Speedster SWFT319, Radeon™ RX 6800 Core Gaming Graphics Card with 16GB GDDR6, AMD RDNA™ 2 (RX-68XLAQFD9) https://a.co/d/fNJMiJQ
 
I purchased (pre-ordered) an EVGA GTX 1080 Ti OC. It was a brilliant GPU. It could be overclocked to a level that actually made a real difference in games. It's still alive as a dedicated F@H GPU - big points!!

EVGA got out of the GPU business after the RTX 30-series cards.

To the topic of its useful life (not reliability, but how it can do in games at 2K, 4K, etc.):
Sadly I never gamed with it after upgrading to an ASUS Strix RTX 3080 Ti Gaming OC (also a very impressive card), but it's all relative to the time, of course.

Since approximately the end of 2020 (or whenever the RTX 3080 Ti was released) I have not used it. But up to that time, at 1440p, there was literally no game (and I spend a fortune on Steam) it couldn't manage, and manage well.

Doom Eternal had, at the time, a hidden graphics setting that would only appear with cards over 8GB. The GTX 1080 Ti had 11GB, back in 2017! I tried it, and after really overclocking the card to the max it would run well at that "secret" Nightmare setting. Of course frame rates suffered, no more than 50 fps, but there was no stutter or any other problem. That amazed me. So I played on the next setting down and manually maxed every graphics option. Perfect: fps was stable, approximately 72 fps at 2K.

Now, at least two years later, I can only speculate, but it would be useless for recent 4K games, of course!!
None of the new DLSS etc. would work at all, but I reckon it would still make a good mid-level GPU at up to 1440p. For people who play games from 2020 and earlier at 2K, there's no point getting a better card. But it's next to impossible to pick one up second-hand, and I will never sell mine.

Just over four years of top-notch, no-stutter, high-settings gaming. That's impressive. New technologies have now rendered it obsolete in 2023, but for games where you can switch them off and set graphics levels custom per game, it would still manage very good frame rates. Just not at 4K. Oh, and the price... jeez, not even comparable to now.

One thing: the PC must have no bottlenecks and a balance of top-of-the-line components to match the top-of-the-line GTX 1080 Ti of that time. I think most understand this, but on PC it's pointless unless the person has reasonable knowledge. No small number of people never get the full potential out of their GPUs. Some don't even understand the basic Nvidia Control Panel well enough to tune things themselves. That's the very minimum; the same goes for all PC parts.

I think consoles are great! But since I build PCs and work with tech, I game on PC. If not, it would be console. Even disregarding that, consoles are good; PC vs. console arguments just show the ignorance of those participating. It's a totally different thing. A family car and an expensive top-notch sports car get compared because it's interesting, but that's it. Different market. Same goes for PC and console.

Sorry, it's a pet peeve of mine when people compare items that look similar but serve totally different markets.

Finally: my current card looks like it might match (adjusting for the date) my old GTX 1080 Ti, with two more years to go.
However, if cost (accounting for inflation) is included in that comparison, the way things are now, nothing, including my RTX 3080 Ti, will even come near the GTX 1080 Ti.

All things considered, I think it will be a very long time, with big market price changes needed, to get near the stellar GTX 1080 Ti. Best card I ever owned. (For its time.)

Direct comparisons between generations of hardware, especially two generations apart, are good as a matter of interest, but no more.

Any serious debate (which no one has had, I think) about which card is better is pointless. But it's interesting to follow how tech improves with time.
 
FYI, the RX 6800 by XFX fell to $379 and gets around 50 to 70 fps at 1440p ultra settings with FSR 2 set to Quality in Alan Wake 2.

XFX Speedster SWFT319, Radeon™ RX 6800 Core Gaming Graphics Card with 16GB GDDR6, AMD RDNA™ 2 (RX-68XLAQFD9) https://a.co/d/fNJMiJQ
Yeah, that is a nice price. Sadly the cheapest in the UK for one of those is £360, approx 3x what I paid for the 1080 Ti ^^ food for thought at least.
 
Anyone looking for a decent 1440p PC: Walmart has an ASUS PC for sale.
ASUS ROG Strix GT15 Gaming Desktop, Intel Core i7-12700F, NVIDIA GeForce RTX 3080, 16GB DDR4, 1TB SSD, Windows 11, G15CF-WB786
Now $1,299.00 (was $1,899.00, you save $600.00)
That's if you are looking at a solid 60 fps at 1440p native, maximum settings without RT and no upscaling, in Alan Wake 2.
Although you can probably build a better PC currently at a similar price target, like a 7800 XT, which gets you similar rasterization performance to the 3080 and is only 10 to 15% shy on RT performance (per TechPowerUp's review), paired with:
AMD Ryzen 7 7700X, MSI B650-P Pro WiFi, G.Skill Flare X5 Series 32GB DDR5-6000 Kit, Computer Build Bundle: $399.99 (was $635.96, save $235.97)

or paired with an Intel® Core™ i9-12900K, ASUS Z790-V Prime WiFi MB, 32GB DDR5 3-in-1 Combo: $399.99 (save $306)
at Micro Center.
This Black Friday should have some sweet deals.
 