Alan Wake II assumes everyone will use upscaling, even at 1080p

Back in the early 2000s I remember having no problems running any games at anything up to 500+ fps (RTCW and Wolf: ET on OpenGL, for example). The first trouble I remember was with Crysis, which didn't run well on my 9800 Pro. That game seemed to spur on the constant GPU upgrade trend.
My point is that 60 fps has never been OK with me and was never the frame rate I was aiming for. The minimum worth aiming for is 80-100 fps, and if I'm not hovering around that mark or higher, I probably wouldn't play the game. Someone suggesting 30 fps on PC is honestly just taking the p*ss.
 
Back in the early 2000s I remember having no problems running any games at anything up to 500+ fps (RTCW and Wolf: ET on OpenGL, for example). The first trouble I remember was with Crysis, which didn't run well on my 9800 Pro. That game seemed to spur on the constant GPU upgrade trend.
My point is that 60 fps has never been OK with me and was never the frame rate I was aiming for. The minimum worth aiming for is 80-100 fps, and if I'm not hovering around that mark or higher, I probably wouldn't play the game. Someone suggesting 30 fps on PC is honestly just taking the p*ss.
Can I buy pot from you? My CRT maxed out at 90 Hz in 2004. IIRC, the max you could do on a CRT over a VGA cable was 1024x768@100Hz, and even then you risked damaging the monitor. I think I ran 1280x1024@72Hz on my Sony CRT.

Above 100 Hz you would get syncing and smearing issues.
 
The fault is on the buyers who keep paying a ton of money for this messed-up code and these cards. If unoptimized games and overpriced cards sell well, even when they're a mess, why would they bother?

If people just don't buy, they'll notice and start releasing good games and well-priced cards. If people accept 540p from a mid-range PC, then it's a feast for these companies.
 
It is rather sad; we can probably only depend on id Software titles to be super-optimized going forward, and perhaps some games from Japanese developers. And everyone seems to be moving to Unreal Engine 4/5 rather than using their own engines like Frostbite, Luminous, Foundation, etc., probably because all the new Gen Z programmers/designers are most familiar with UE.
That being said, Alan Wake II does look really good in terms of asset quality and visual features, and Control looked good even on low/medium settings. People just have to learn that you can't run high or ultra settings and expect good performance all the time, at least not on current hardware.
 
Back in the early 2000s I remember having no problems running any games at anything up to 500+ fps (RTCW and Wolf: ET on OpenGL, for example). The first trouble I remember was with Crysis, which didn't run well on my 9800 Pro. That game seemed to spur on the constant GPU upgrade trend.
My point is that 60 fps has never been OK with me and was never the frame rate I was aiming for. The minimum worth aiming for is 80-100 fps, and if I'm not hovering around that mark or higher, I probably wouldn't play the game. Someone suggesting 30 fps on PC is honestly just taking the p*ss.
Lol, Crysis was a very demanding 2007 game, and the 9800 Pro was a 2003 card that was the minimum recommended GPU. Things were moving fast in that era; by 2008 most PC ports wouldn't run on anything less than Shader Model 3.0.
 
Remedy literally screwed themselves by implementing RTX. The game can't even run on enthusiast hardware at a decent resolution and frame rate.

1080p at DLSS Performance is in reality 540p...

2160p at DLSS Performance is in reality 1080p...

[Chart: AMD FidelityFX Super Resolution / Nvidia DLSS internal render resolutions]
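For anyone who wants to sanity-check those numbers, here's a minimal Python sketch using the per-axis scale factors commonly quoted for the DLSS / FSR 2 quality modes (treat the exact ratios as assumptions, since games can override them). Performance mode halves each axis, so you're actually rendering a quarter of the pixels:

```python
# Commonly quoted per-axis render scales for DLSS 2.x / FSR 2 quality modes.
# These values are assumptions for illustration; individual games may override them.
SCALE = {
    "Quality":           1 / 1.5,   # ~0.667 per axis
    "Balanced":          1 / 1.7,   # ~0.588 per axis
    "Performance":       1 / 2.0,   # 0.5 per axis
    "Ultra Performance": 1 / 3.0,   # ~0.333 per axis
}

def internal_resolution(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders at before the upscaler reconstructs the output."""
    s = SCALE[mode]
    return round(output_w * s), round(output_h * s)

# The two cases from the post:
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)   -> "1080p" is really 540p
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080) -> "4K" is really 1080p
```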
 
Sorry, but for the moment I only have one word for this: pathetic.
This is the Crysis 2 (sponsored by Nvidia) excessive-tessellation shenanigans all over again. Back then it was used to put Nvidia's GTX cards in a better light, since their hardware was faster at tessellation than the Radeons.
 
Well, my 2060S won't need to worry. I've got both earlier Wake games and amassed a grand total of 100 minutes across the two, probably split over several attempts to see if I could actually get to like them (I didn't).
 
Remedy literally screwed themselves by implementing RTX. The game can't even run on enthusiast hardware at a decent resolution and frame rate.

1080p at DLSS Performance is in reality 540p...

2160p at DLSS Performance is in reality 1080p...

[Chart: AMD FidelityFX Super Resolution / Nvidia DLSS internal render resolutions]
It'll be interesting to come back to this game 4-5 years from now and see how it runs on an RTX 6090. I don't think it's necessarily unoptimized; it's just demanding.
 
I was thinking the opposite: I think it looks amazing, everything from the lighting to the polygon count.

That said, I was a fan of upscaling tech because I felt it would extend the life of older cards. The idea that upscaling is a requirement on modern high-end graphics cards just to get playable frame rates is unacceptable.

Check out these graphics from 2015, which you could run at Ultra 1080p (no upscaling) on a GTX 1060.
 
Something I don't understand is why modern games at low settings look worse than games made 10+ years ago and still run worse.
It is intentional, to force gamers to buy new hardware.
RTX 4090 for 30 fps @ 1080p FTW! tHe wAy iT'S mEANt tO Be PlAyEd! The more you buy, the more you play! It just works!
 
It's really fascinating that they think this OK-looking game is going to get me to tolerate sub-60 FPS at literally a quarter of 1080p, with all the artifacting that comes with that, on a freakin' 3070 of all things.
 
Day by day, I have more reasons not to pay for AAA games and to just enjoy the classics I already have. My 3070 Ti will soon be the minimum requirement for 30 fps at 1080p at this rate...
 
I remember the days when people were having trouble getting a GPU at all, and not just current gen, but even one or two generations old, used. I had friends with backorders on systems because they were just looking to get a new computer with a 3080, a powerhouse of a GPU at a mostly affordable price (MSRP, that is), with roughly 30% gains over a 2080. It was a solid improvement. It had no issues playing games and even offered acceptable performance with RT enabled.

These days you can go to any online retailer, or even a local store, and pick up a $1,200 4080 that can barely run the newest games without resorting to proprietary software to boost the framerate to an acceptable 30-60 fps at 1080p.

Something is very wrong here. I'm not sure where the blame lies, or if it's a mixture of problems: on one side Nvidia, with all the new DLSS versions they keep making and the push for RT and now PT; on the other, devs thinking that if they make a game look realistic enough, the graphics will make the crap game they made actually feel fun.
 
It's fascinating that people attack Nvidia over a game that hasn't even been released yet, completely forgetting that most unoptimized titles are AMD-sponsored. There is a track record: Nvidia-sponsored games generally perform and look great. AMD-sponsored ones, on the other hand... where do I start? Forspoken? TLOU? Godfall? Jedi? Starfield? Immortals of Aveum? Shall I keep going?

But yeah, greedy Nvidia and their sponsored games forcing us to upgrade, lmao. Delusional.
 
The RT specs I get. After all, it supports path tracing on high and medium, so it's no worse than Cyberpunk 2077. But considering the hit typically taken from RT, often up to a 50% decrease in fps, you would think that raster would be a lot more forgiving than this.
 
Can I buy pot from you? My CRT maxed out at 90 Hz in 2004. IIRC, the max you could do on a CRT over a VGA cable was 1024x768@100Hz, and even then you risked damaging the monitor. I think I ran 1280x1024@72Hz on my Sony CRT.

Above 100 Hz you would get syncing and smearing issues.
CRTs have limits on both horizontal and vertical sync frequency. A decent 19" monitor will do 96-120 kHz horizontal, and most monitors do at least 160 Hz vertical. You can't hit 160 Hz at higher resolutions, but you can if you lower the resolution.

The resolutions you state point to a ~80 kHz horizontal frequency cap, so you likely didn't have a very high-end monitor, probably a 17", for which those numbers are about average (70-82 kHz). My 19" does 110 kHz and 160 Hz, and can reach 160 Hz at about 800x600 (or in fact 960x540, the resolution Alan Wake II runs at, lmao).

Smearing at 100 Hz and above is more likely a sign of a poor-quality VGA cable and/or DAC. My older machines show a sharp decline in image quality above about 75 Hz, but my modern DP-to-VGA DAC does 160 Hz fine. My typical resolution for day-to-day use is 1600x1200@85Hz.
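The arithmetic behind those caps is simple enough to sketch: every scanline, visible or blanked, costs one horizontal sweep, so the refresh rate you can reach at a given resolution is roughly the horizontal frequency divided by the total line count. A rough Python check, assuming about 5% vertical blanking overhead (real GTF/CVT timings vary a bit):

```python
# Rough CRT timing check: the reachable vertical refresh is bounded by the
# horizontal scan frequency, since every line (visible or blanked) costs one sweep.
# The 5% vertical blanking overhead is an assumption; real timings differ slightly.
def max_refresh_hz(h_freq_khz: float, visible_lines: int, blanking: float = 0.05) -> float:
    total_lines = visible_lines * (1 + blanking)
    return h_freq_khz * 1000 / total_lines

print(round(max_refresh_hz(80, 768)))    # ~99 Hz  -> an ~80 kHz tube tops out near 1024x768@100Hz
print(round(max_refresh_hz(110, 1200)))  # ~87 Hz  -> 1600x1200@85Hz fits under a 110 kHz cap
print(round(max_refresh_hz(110, 600)))   # ~175 Hz -> at 800x600 the 160 Hz vertical limit binds first
```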
 
It's fascinating that people attack Nvidia over a game that hasn't even been released yet, completely forgetting that most unoptimized titles are AMD-sponsored. There is a track record: Nvidia-sponsored games generally perform and look great. AMD-sponsored ones, on the other hand... where do I start? Forspoken? TLOU? Godfall? Jedi? Starfield? Immortals of Aveum? Shall I keep going?

But yeah, greedy Nvidia and their sponsored games forcing us to upgrade, lmao. Delusional.

ABSOLUTELY... FRIGGING WRONG!

Starfield runs well on both ecosystems.

Now tell me if that was the case when The Witcher 3 was released... or Control... or Cyberpunk... or now Alan Wake 2...

The answer is a D4MN no...
 
Do they anticipate a failure here? Usually you release on the Epic Games Store:

1) to get the big wad of cash from Epic to be a timed exclusive;

2) to control the "reviews" (do they even have a score system yet?).

So maybe they're hoping to get as much return as they can before the inevitable backlash and crap ratings appear?

Regardless, most AAA gaming is garbage now. I look at my most-played list and the vast majority are indie titles or "special" companies like Larian Studios: Satisfactory, Factorio, Valheim, Jagged Alliance 3, BattleBit Remastered, the Subnautica titles, Barotrauma, etc. So many great PC "exclusives", all vastly better than Assassin's Creed 975, COD 419, or a garbage Blizzard sequel to an IP they created 20+ years ago.
 
Vermintide 2 just added DLSS, Reflex and FSR 2 via patch 5.1.0. Does anyone know why it would cause game latency to go up from 4 ms to 20 ms after the patch with the same in-game settings? Does adding DLSS to a game change anything in the source code even with DLSS off?

DLSS can actually decrease game latency, since the game renders at a lower resolution. And AFAIK the code changes made to support DLSS don't affect the game with DLSS off.
 
I was sold on DLSS as something I could use to increase performance without sacrificing IQ. For instance, I could play Control at 4K with my RTX 3070 Ti at playable frame rates. It was great.

But now it's REQUIRED for playing games at minimum settings.
And it seems to be the trend with newer releases.

I don't mind having the OPTION to use upscaling, but it sucks that it's becoming MANDATORY.

Are RTX 4090 users expected to play games at 4K with DLSS 3.x Performance mode just to get playable frames?
 