Ultra vs. High Settings in PC Games or: Why Ultra Quality Settings are Dumb

Great article

I am currently running an RTX 3080, and I am already doing this to get as close to a steady 120 fps as possible.

When I had an i5 750 CPU, I remember that dropping shadows from Ultra to Low would take me from 60 fps to 90 fps in Doom.
 
The dilemma is that the more we spend on a graphics card, the more we expect to play at the latest and greatest Very High or Ultra setting. With graphics card prices through the roof, the desire to play at max settings will be even higher. (Who wants to spend $500+ and not at least give Very High or Ultra a run?)

We just need graphics card prices to go down so we can be happy with playing “only” at High settings
 
I always appreciate when games provide detailed tooltips for each graphical option, mainly when they explain whether the option affects GPU or CPU performance.

It really lets you tweak the settings to fit your hardware, especially for those who know that part of their system is weak but don't know the ins and outs of every graphical option out there, or how each one relates to their system.

Thanks, Tim!
 
In previous-gen games, I mean those from the early 2000s, the High and Ultra High settings looked drastically different, so you wanted to go as high as possible.

Current-gen games aim for stable performance across platforms rather than visual differences.
 
One of the things I really like about Nvidia's GeForce Experience software is that it allows you to optimize games for your specific GPU and monitor resolution. It's not always perfect, but it's typically a good starting point. I use it whenever the game is available in the software and then tweak a little from there.
 
The dilemma is that the more we spend on a graphics card, the more we expect to play at the latest and greatest Very High or Ultra setting. With graphics card prices through the roof, the desire to play at max settings will be even higher. (Who wants to spend $500+ and not at least give Very High or Ultra a run?)

We just need graphics card prices to go down so we can be happy with playing “only” at High settings

Yep: since most software is subservient to the console life cycle (big publishers prefer the consoles because of the stricter walled garden) and that cycle is pretty long, GPU hardware vendors like Nvidia (and AMD, trying to stay competitive) have been far outpacing what the software can actually do.

Add to that the diminishing returns of increasing detail levels, which drives everything up in terms of hardware and development for barely noticeable gains, and you get modern AAA games that look acceptable to most people at Medium and pretty damn good at High, even for people wanting a luxurious experience, without touching Ultra or special features like ray tracing.

So that's yet another reason to scale back on AAA gaming. In case the predatory, systematic abuse of developer employees, driven to the point of mental and physical health issues, wasn't enough, there are pragmatic reasons that apply to you as a consumer: it affects most of you directly, and for the most part it only ever makes money for the CEOs and boards of directors at these companies.
 
I don't like this article, because render range is absolutely not a purely aesthetic issue. You also cannot demonstrate the gameplay effects of rendering more objects sooner and at a longer range by showing a still frame.

It's about when or if you get game information.

So personally, unless you're keeping draw distances fixed, you're not just adjusting the graphics but the way the game plays.
 
Quality preset settings have been dumb for years. Instead of doing the intelligent thing (turning off the big-impact effects like ambient occlusion first whilst leaving textures as high as possible), half the devs just drop everything off a cliff at the same rate whilst still leaving in all the shader-heavy stuff and the same render distance, so the game looks worse but often doesn't run that much faster. And I remember laughing when Skyrim launched and Ultra meant 8x MSAA AND FXAA simultaneously (the whole point of FXAA / SMAA is to use it in place of 4x MSAA for a performance boost).
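For anyone tuning settings by hand, here's a minimal sketch of that "drop the expensive, hard-to-notice stuff first, keep textures high" idea. The setting names and the per-setting frame-time savings are illustrative assumptions, not measurements from any particular game or engine:

```python
# Hypothetical drop order: biggest frame-time cost per unit of visible
# quality loss first; textures stay high until the very end.
# All numbers are made-up placeholders for illustration.
DROP_ORDER = [
    ("ambient_occlusion", 1.8),        # assumed ms saved per frame when lowered
    ("volumetric_effects", 1.5),
    ("shadow_resolution", 1.2),
    ("screen_space_reflections", 1.0),
    ("post_processing", 0.5),
    ("texture_quality", 0.2),          # cheap if VRAM suffices, so drop it last
]

def plan_drops(current_frame_ms: float, target_frame_ms: float) -> list:
    """Return which settings to lower, in order, to reach a target frame time."""
    dropped = []
    for name, saving_ms in DROP_ORDER:
        if current_frame_ms <= target_frame_ms:
            break
        dropped.append(name)
        current_frame_ms -= saving_ms
    return dropped

if __name__ == "__main__":
    # Example: 12.5 ms/frame (80 fps) today, aiming for ~8.3 ms/frame (120 fps).
    print(plan_drops(12.5, 1000 / 120))
```

That ordering is roughly the opposite of what a lot of presets actually do, which is the complaint above.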

Likewise, half the "Ultra" shader effects on mine, like Chromatic Abhorration, Depth of Glaucoma, etc., get disabled anyway regardless of performance, simply because they look utterly ridiculous; trying to mimic the defects of light-capturing lenses in a 100% rendered game is the least "graphically realistic" thing imaginable...
 
I play everything on "Ultra Settings", but that's because I play on a 55" 4K TV that does hardware upscaling. As a result, 720p looks identical to 2160p (believe me, I've tested this literally dozens of times), so there's no reason whatsoever for me not to use Ultra settings, because the biggest impact on GPU performance is always resolution.

I was fortunate in the sense that I didn't know my TV did this before upgrading to my RX 5700 XT in August of 2020. Fortunate because I wouldn't have spent the money on it, as I can easily get >=60 fps in modern AAA titles with an R9 Fury at 720p. Perhaps not at "Ultra" settings, but certainly at "High", and I would have been stuck between a rock and a hard place: either I'd be using the R9 Fury until it was literally useless, or I'd be paying through the nose for a new video card. With the RX 5700 XT, I'll be able to play games at 720p for probably the next ten years once things like FidelityFX are taken into consideration. :laughing:
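For a rough sense of why resolution is such a big lever, compare raw pixel counts. This ignores costs that don't scale with resolution, so treat it as a back-of-the-envelope sketch only:

```python
# Pixel counts at common render resolutions; shading work scales
# roughly (not exactly) with the number of pixels drawn.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

base = resolutions["720p"][0] * resolutions["720p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.1f}x the pixels of 720p)")
```

2160p pushes nine times the pixels of 720p, which is why rendering at a low resolution and letting the TV upscale frees up so much headroom for Ultra settings.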
 
I’ve found that turning RT on makes a bigger difference to visual fidelity than changing the quality preset.

Which is why it’s annoying when reviewers state they don’t test RT because it tanks performance but then go ahead and test on ultra settings.
 
I just read about this a little while ago. The article also said why devs add ultra modes, but of course I forgot what they said. I'll look for it though.

Frame rate has always dictated my image quality settings. Start high then start dropping.
 
I don't agree; it depends on what you want. Some people are OK playing at 30 fps with max quality, and it doesn't make them dumb; others prefer high frame rates and lower everything.
The compromise is not to lower presets but to tune each option.
 
I don't agree; it depends on what you want. Some people are OK playing at 30 fps with max quality, and it doesn't make them dumb; others prefer high frame rates and lower everything.
The compromise is not to lower presets but to tune each option.
There's nothing visually wrong with 30 fps as long as it's a creamy-smooth 30 fps. The thing is, it turns out that a faster frame rate makes your movements more accurate. I thought the higher frame rates were all BS until I saw Linus actually demonstrate the difference. I tried it out myself and he was right. All those times I was sure that I'd shot something but still missed now made sense: I did miss, but the screen didn't refresh in time to show me that I missed.
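A quick bit of arithmetic makes the "screen didn't refresh in time" point concrete. This ignores input, engine, and display latency and only looks at how long each frame sits on screen, so it's a simplified sketch:

```python
# Frame duration at common frame rates; in the worst case the result of
# an action waits roughly one full frame before it can appear on screen.
for fps in (30, 60, 120, 144):
    frame_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_ms:5.1f} ms per frame "
          f"(up to ~{frame_ms:.0f} ms before you see the result)")
```

At 30 fps that window is about 33 ms; at 144 fps it drops to about 7 ms, which is the gap that demo makes visible.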
 
I usually find that shadow quality, volumetric clouds, and particle physics are the settings you can dial down from Ultra to High for 10-15% fps gains without any noticeable image quality degradation.
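If you want to estimate the combined effect of lowering a few settings like these, one rough model is to treat each one as shaving an independent slice off the frame time. The per-setting percentages below are illustrative guesses, not benchmarks of any specific game:

```python
# Assumed frame-time savings from dropping each setting from Ultra to High.
# These are placeholder values, not measured results.
per_setting_savings = {
    "shadow_quality":    0.05,  # ~5% shorter frames
    "volumetric_clouds": 0.04,
    "particle_physics":  0.03,
}

remaining_frame_time = 1.0
for setting, saving in per_setting_savings.items():
    remaining_frame_time *= 1.0 - saving

fps_gain = 1.0 / remaining_frame_time - 1.0
print(f"Estimated combined fps gain: {fps_gain:.1%}")  # roughly +13%
```

With these made-up numbers the combined gain lands around 13%, in line with the 10-15% range above; real savings depend heavily on the game and the scene.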
 
It's simple. Manufacturers use the "more is better" argument.
I remember when I had a phone that could do 60/90 Hz on the screen. I set it to 60, then took it around to coworkers and said, "Look, here's that phone that does 90 Hz." They played with it, noting how fast and smooth it was. Then I showed them it was set to 60, switched it to 90, and had them play with it again. They couldn't tell the difference.
In some cases, that's the same argument for the super duper high-end speed/graphics for most games and video cards. It's kind of useless, but it's used for marketing.
 
One of the things I really like about Nvidia's GeForce Experience software is that it allows you to optimize games for your specific GPU and monitor resolution. It's not always perfect, but it's typically a good starting point. I use it whenever the game is available in the software and then tweak a little from there.
I was just about to post this. Yes, GeForce Experience does a reasonable job tweaking settings for gamers in order to find a balance between frame rate and visual quality. I assume AMD graphics cards have similar software. And most modern games auto-detect hardware to come up with recommended settings (Forza Horizon, for example). I find GeForce Experience does a better job sometimes, FWIW. Tweaking game settings is more important now that most of us are unable to upgrade the graphics card.
 
The better investment is the display. A game on low settings at native res on an OLED is gonna look better than the same game maxed out on most IPS displays, IMHO.
 
Those few of us who can remember the huge leaps in resolution from 320x200 (CGA) to 640x480 (VGA) and on to 800x600 (SVGA), combined with the increase in colors, probably wonder what all the fuss is about. The last 40-50 years have been quite a ride.
I do remember the moment I moved from green monochrome to 4-color CGA (not 4-bit, but 4 colors with 64,000 pixels). I thought then that the 4 colors were life-changing ("It has color now!"). But then going from 4-color CGA to 16-color EGA blew my mind. Later, 256-color VGA made me scream, "It almost looks real now!"

Of course nowadays we somehow have a few billion colors, 8.2 million pixels (with 4K) and even HDR. It really has been quite the 40-50 years and absolutely incredible to experience!
 
Some games like AC Odyssey just look best when cranked to Ultra; the world really pops when there's clutter and people and light/shadow effects everywhere.

But I'm also playing on a now-lowly 980 Ti at 1080p, which is fine for me.
 
Good article! It's refreshing to see someone actually point out how minimal the difference in quality between these settings can be. I've found that when I'm actually *playing* games (as opposed to pausing and soaking in the visuals), even Medium usually looks the same (as long as they don't turn draw distance down excessively, which a few games in the past have...in which case I could usually run Medium and then turn draw distance back up).

I've still got a GTX 650 in my home system... I'm running Ubuntu 20.04 with Steam and the Epic Games Launcher in Wine, and only a few games need Medium; even on that hardware, most can run High (and relatively few on Ultra High, which tends to be the setting that pushes this card into "wow, that's a bad framerate" territory).
 