Ultra vs. High Settings in PC Games or: Why Ultra Quality Settings are Dumb

This is the whole reason why the industry transitioned to RT: it's the only graphical fidelity mountain left to climb!

Doom Eternal (2020) looks almost identical to Doom (2016); the fact that my 1060 maxes out Doom Eternal at 1080p (80 fps) means my 1060 6GB still has several years of life left in it (thanks, easy console ports!)
 
As someone who for many years was used to playing games at potato resolution on potato hardware, how a game looks was never really much of a concern to me.

I'll opt for the highest settings possible now that I'm finally able to, but I'm not opposed to still dropping some if I have an FPS target I'm trying to hit and am still missing it.
 
That's a really good article. I can't ever get my head around the way some PC games are developed from a performance perspective. There are countless examples of poorly implemented visual features that tank performance for the sake of a minor gain in fidelity. But the obsessive in us really wants to switch them on!
Efficiency is king. If the cost/benefit ratio doesn't add up then it should be canned or traded for alternative features.
 
I think these articles are great, especially for younger readers. In reviews 30 years ago, some reviewers would go into detail on the optimal settings for most readers.
As other commenters have alluded to, some people may only know the broad brush strokes, e.g. ultra vs. high, 4K vs. 1440p.
I was never a big gamer back then - I was off doing other things - but wasn't it the antialiasing that took the big hit, 4x/4x or 16x/16x, with either the first or the second number making the big difference?
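For anyone curious why those multipliers hurt so much back then, here's a rough back-of-the-envelope sketch in Python of how the multisample count alone inflates framebuffer size at 1080p. The buffer formats are assumptions (typical RGBA8 color and a 32-bit depth/stencil buffer), and real GPUs use compression, so treat the numbers as illustrative only.

# Hypothetical estimate of framebuffer memory for MSAA at 1080p.
WIDTH, HEIGHT = 1920, 1080
BYTES_COLOR = 4   # assuming RGBA8 color
BYTES_DEPTH = 4   # assuming a 32-bit depth/stencil buffer

def msaa_framebuffer_mb(samples):
    # Each MSAA sample stores its own color and depth value.
    return WIDTH * HEIGHT * samples * (BYTES_COLOR + BYTES_DEPTH) / (1024 ** 2)

for samples in (1, 2, 4, 8, 16):
    print(f"{samples:>2}x MSAA: ~{msaa_framebuffer_mb(samples):.0f} MB")
# 1x is ~16 MB, 4x ~63 MB, 16x ~253 MB.

The AA multiplier scales memory and bandwidth roughly linearly, which is why it usually made the bigger difference, while 16x anisotropic filtering is comparatively cheap.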

For immersive games, more and more of it will be psychoacoustics, psychovisuals, and so on. Hyper detail in the distance is BS - take a $100 million animated film now and freeze-frame it: what, it's not all in focus, or it's blurred!!!! Yet it works - it's more real.
Directors of movies do it deliberately - The Good, the Bad and the Ugly: the face, the hand, the gun - you can't hear your blood pressure or heartbeat, but you could imagine it.
So get the humans right, the sound right, the focus right - put in some scene-setting touches like a big old fat blowfly landing on some drying blood, or a dog relieving itself.
Some people can sell a story with audio alone.
Some cartoonists can sell a story in a few 2D panels.

So tune your taxing games to whatever doesn't drop immersion or emotion - unless you are after a perception advantage. How many of us turned up the brightness or gamma when creeping around a dark area? It didn't make it seem more immersive, but it saved us from having our throats ripped out.


 
It is indeed quite misleading how benchmarks show results at the highest settings, so one might think their GPU is getting too slow if the FPS in the chart is low. That's not the intention, of course; benchmarks are for comparing the selected products against each other.
In practice, even a GTX 1060 still does surprisingly well in most games on reasonable settings, and something like a GTX 1080 Ti is completely fine.
 
That makes me wonder if the eye-candy emphasis may fall into the category of 'last gasp'. Could a large proportion of gamers find cost-effective solutions in 5-year-old GTX 1080s?
 
The GTX 1080 is in my main gaming rig (there's also a 5600 XT in an alternate rig, with about the same fps). I target 1440p 144Hz, but most AAA games do around 70-80 FPS on a mix of medium-high settings (no CP2077 yet, tho). Do I notice a difference? Not during gameplay, though draw distance is what I tailor the most depending on the title.

Recently, for fun, I stuck a Radeon 8570 1GB I had lying around into an i7-4790 machine and played a few older AAA games like 2013's Tomb Raider, and on these types of games:

Set everything to Lowest
Turn Textures up one notch
Turn AA on at its lowest
Turn Shadows on at their lowest

And then pick a resolution you can play at 45+ fps (800p for this setup once I OC'd the little guy to 1000 & 1350). I played about two hours of the game and frankly, it felt minimally different from my rig at home, with the occasional FPS dip into the high 30s being the biggest hint.
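To put a rough number on that last step: if a game is mostly GPU-bound, frame rate scales very roughly with pixel count, so one measured data point lets you guess which resolution clears a 45 fps target. A quick Python sketch with made-up example numbers (not measurements from this machine):

# Rough GPU-bound rule of thumb: fps scales ~inversely with rendered pixels.
def estimate_fps(measured_fps, measured_res, target_res):
    mw, mh = measured_res
    tw, th = target_res
    return measured_fps * (mw * mh) / (tw * th)

measured_fps, measured_res = 38.0, (1600, 900)   # hypothetical test run at 900p
for name, res in [("1080p", (1920, 1080)), ("900p", (1600, 900)),
                  ("800p", (1280, 800)), ("720p", (1280, 720))]:
    est = estimate_fps(measured_fps, measured_res, res)
    flag = "meets the 45 fps target" if est >= 45 else "misses it"
    print(f"{name}: ~{est:.0f} fps ({flag})")

CPU limits, VRAM pressure, and overclocks all bend the curve, so this is a starting point for picking a resolution, not a guarantee.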

Once you're in the game, and it's a *good* game, the importance of the eye candy fades quite a bit.
 
I play everything on "Ultra Settings", but that's because I play on a 55" 4K TV that does hardware upscaling. As a result, 720p looks identical to 2160p (believe me, I've tested this literally dozens of times), so there's no reason whatsoever for me not to use Ultra settings, because the biggest impact on GPU performance is always resolution.

I was fortunate in the sense that I didn't know my TV did this before upgrading to my RX 5700 XT in August 2020. Fortunate because I wouldn't have spent the money on it, as I can easily get 60+ fps in modern AAA titles with an R9 Fury at 720p. Perhaps not at "Ultra" settings, but certainly at "High", and I would have been stuck between a rock and a hard place: either I'd be using the R9 Fury until it was literally useless, or I'd be paying through the nose for a new video card. With the RX 5700 XT, I'll be able to play games at 720p for probably the next ten years once things like FidelityFX are taken into consideration. :laughing:

I have an LG OLED 4K 120Hz display, and it's known to be one of the best-scaling displays out there.

I don't know how far away you sit from your large display, but for me, at around 4 ft, I can definitely see the difference coming from lower resolutions like 720p and 1080p; once I get up to 1440p, though, the scaling really does make it nearly impossible to tell the difference between that and true 4K.

There is a softness and pixelation that comes into view around 1080p, and even though I can sit back, not think about it, and be fine, if I stop and look I can definitely see it.

Again, I know my display is one of the best out there for scaling, and while I don't necessarily disbelieve you, I have to assume you must be sitting much further back, where I could absolutely understand not noticing a difference.
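For what it's worth, the geometry backs up the distance point. Here's a small sketch of pixels per degree, assuming the 55-inch 16:9 panel mentioned above and roughly the 4 ft viewing distance mentioned here; ~60 PPD is the commonly quoted limit for 20/20 vision, and everything else is just trigonometry.

import math

# Pixels per degree for a flat 16:9 screen viewed straight on.
def pixels_per_degree(horizontal_px, diagonal_in, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)          # screen width
    fov_deg = math.degrees(2 * math.atan(width_in / 2 / distance_in))
    return horizontal_px / fov_deg

DIAGONAL_IN, DISTANCE_IN = 55, 48   # 55" panel viewed from ~4 ft
for name, px in [("720p", 1280), ("1080p", 1920), ("1440p", 2560), ("2160p", 3840)]:
    print(f"{name}: ~{pixels_per_degree(px, DIAGONAL_IN, DISTANCE_IN):.0f} PPD")

On those assumptions, 4K lands around 72 PPD, 1440p around 48, and 720p/1080p around 24-36, which lines up with the softness described here; sit far enough back and even the lower resolutions climb past the ~60 PPD mark, which would also explain the earlier poster's experience.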

It's simple. Manufacturers use the "more is better" argument.
I remember when I had a phone that could do 60/90 Hz on the screen. I set it to 60, then took it around to coworkers and said, look, here's that phone that does 90 Hz. They played with it, noting how fast and smooth it was. Then I showed them it was set to 60, switched it to 90, and had them play with it again. They couldn't tell the difference.
In some cases, that's the same argument for the super duper high end speed/graphics for most games & video cards. It's kind of useless, but, used for marketing.
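Part of why casual A/B swaps like that are easy to miss is simple arithmetic: the per-frame time saved shrinks as the refresh rate climbs. A quick illustration:

# Frame time at common refresh rates; the savings shrink at the high end.
for hz in (30, 60, 90, 120):
    print(f"{hz:>3} Hz -> {1000 / hz:5.1f} ms per frame")
# 30 -> 60 Hz saves ~16.7 ms per frame, 60 -> 90 saves ~5.6 ms,
# and 90 -> 120 saves only ~2.8 ms, so a quick hands-on swap is subtle,
# while extended play (as in the test below) makes the step back obvious.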
While I agree that in some cases it's more of a placebo effect, I've done blind testing with multiple friends using our go-to game, Destiny 2, switching between 60 fps (which all of us were used to) and 120 fps when I got my 4K/120Hz OLED.

They started at 60 fps, then went to 120 fps, and while most were kinda iffy on there being a big difference, I had them play for about 30 minutes at 120 fps and then switched them back, and they IMMEDIATELY said it felt like they were walking through quicksand.

Without my explaining what they were running at each point, every one of them said they wanted to go back to the second option and thought I had moved them to 30 fps on the third test.

It's easy to be fooled into thinking the old is as good as the new, until you experience the new for a good while; then the old definitely shows its flaws.
 
This is a good article. It highlights the fact that we don't always need very high-end GPUs, which most gamers don't have anyway due to the cost involved. I often find it very hard to spot any differences between Ultra and Very High graphical settings as well, particularly with eyes untrained for such details. In addition, if one is immersed in the game, the chances of spotting something that looks subtly better are even lower.
 
Me, looking at the screenshots: "They're the same pictures."

Honestly, at this point I'd be pretty satisfied with anything that can go High 1080p 60fps. 1440p 120fps is what I would call the actual point of luxury.
 
My brother and I have almost the same PC, but his desk is longer and I notice his games look better. Both are 1080p screens, and funnily enough, even being an inch closer to or farther from the screen makes a big difference if you're young and have really good eyesight. I believe there is a sweet spot, but ever since I was young, for some reason I've liked sitting CLOSE TO THE SCREEN.
 
The dilemma is that the more we spend on a graphics card, the more we expect to play at the latest and greatest Very High or Ultra settings. With graphics card prices through the roof, the desire to play at max settings will be even higher. (Who wants to spend $500+ and not at least give Very High or Ultra a run?)

We just need graphics card prices to go down so we can be happy with playing "only" at High settings.
NOTHING will ever "go down" (assuming we're talking about prices). We're all heading towards hyperinflation.
 
You are either sitting too far away from your TV, have your TV settings wrong, need a new HDMI cable, or need a new TV.

TV upscaling should only change lower-resolution inputs.
 
I see a lot of games gimping planar reflections with added blur or filtering, lowering the image quality to make the RT reflections look superior when they are not at all. They muddy the water to make their own look cleaner.
Sabotaging 25 years of innovation to push their agenda.
 
In fact, this is true, though I'm not sure that's exactly what they are doing. A game with ray-traced effects may have a compromised solution for ordinary rasterization. I don't know if they just focus solely on ray tracing or purposefully gimp the usual way. Either way, the end result is not that great, given that ray tracing may often look a little out of place in games and definitely hurts performance. People should use their eyes to judge the result; only the quality and visual appeal matter - the end result, not the technique behind it.
 
Well, the devs have only so much time to spend on it.

I remember when new DVDs rivaled early Blu-ray in clarity, but since they started putting more effort into the 4K transfers, the DVD has been handed off to the intern.

A developer has a finite amount of time to spend developing a game (so they have less time for optimizing, and prefer to hand it off to the intern).

The whole point of using RT is to save developers time, and your desires don't line up with theirs! Be prepared for a lot of new games with half-assed raster.
 
I've always thought of the ultra settings in games as something for benchmarkers, not players. Personally, I'm mostly about FPS; in-game stuttering sucks, especially in multiplayer. I have a high-end rig and usually turn settings down or off so I can get the most frames. It's really all about gameplay, not eye candy - in today's games I find the environment graphics just fine no matter the setting. I just want it to be smooth!
 
You are either sitting too far away from your TV, have your TV settings wrong, need a new HDMI cable, or need a new TV.
What did you think, that I didn't really look closely? I thought I was imagining things until I tried it with my face less than 30 cm from the panel surface, actively searching for differences. I'm not exactly a noob here.
TV upscaling should only change lower-resolution inputs.
I don't get what you're trying to say. The TV is a 4K/2160p TV, which means that EVERY resolution that ISN'T 2160p is a lower resolution. You didn't really think this through, did you? :laughing:
 