That makes me wonder if the eye-candy emphasis may fall into the category of 'last gasp'. Could a large proportion of gamers find cost-effective solutions in five-year-old GTX 1080s?
I play everything on "Ultra Settings", but that's because I play on a 55" 4K TV that does hardware upscaling. As a result, 720p looks identical to 2160p (believe me, I've tested this dozens of times), so there's no reason whatsoever for me not to use Ultra settings, because the biggest impact on GPU performance is always resolution.
I was fortunate in the sense that I didn't know my TV did this before upgrading to my RX 5700 XT in August of 2020. Fortunate because I wouldn't have spent the money on it, as I can easily get 60+ fps in modern AAA titles with an R9 Fury at 720p. Perhaps not at "Ultra" settings, but certainly at "High", and I would have been stuck between a rock and a hard place: either I'd be using the R9 Fury until it was literally useless, or I'd be paying through the nose for a new video card. With the RX 5700 XT, I'll be able to play games at 720p for probably the next ten years once things like FidelityFX are taken into consideration.
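For a rough sense of why resolution is such a big lever on GPU load, here's a back-of-the-envelope sketch (assuming shading cost scales roughly with pixel count, and ignoring fixed per-frame work like geometry and game logic):

```python
# Pixel counts per frame at common resolutions, compared against native 4K.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

native_4k = 3840 * 2160  # 8,294,400 pixels per frame
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels  (4K shades {native_4k / pixels:.1f}x as many)")
```

By that count, 720p asks the GPU to shade roughly a ninth of the pixels of native 4K, which is why an older card can still hold 60+ fps there.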
It's simple. Manufacturers use the "more is better" argument.
I remember when I had a phone whose screen would do 60 or 90 Hz. I set it to 60, then took it around to coworkers and said, "Look, here's that phone that does 90 Hz." They played with it, noting how fast and smooth it was. Then I showed them it was set to 60, switched it to 90, and had them play with it again. They couldn't tell the difference.
In some cases, the same argument applies to the super-duper high-end speed and graphics settings for most games and video cards. It's kind of useless, but it's used for marketing.
I have an LG OLED 4K/120 Hz display, and it's known to be one of the best-scaling displays out there.
I don't know how far away you sit from your large display, but for me, at around 4 ft, I can definitely see the difference with lower resolutions like 720p and 1080p. Once I get up to 1440p, though, the scaling really does make it nearly impossible to tell the difference between that and true 4K.
There is a softness and pixelation that comes into view around 1080p, and even though I can sit back, not think about it, and be fine, if I stop and look I can definitely see it.
Again, I know my display is one of the best out there for scaling, and while I don't necessarily disbelieve you, I have to assume you must be sitting much further back, in which case I could absolutely understand not noticing a difference.
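To put rough numbers on the viewing-distance point, here's a quick pixels-per-degree sketch (assuming a 55-inch 16:9 panel viewed from about 4 ft; the ~60 PPD figure for 20/20 vision is only a rule of thumb):

```python
import math

# Rough pixels-per-degree (PPD) estimate for a 55" 16:9 panel viewed from 4 ft.
# ~60 PPD is the commonly cited threshold for 20/20 vision; below it, softness
# and pixelation start to become noticeable. These numbers are approximations.
diag_in = 55.0
distance_in = 48.0  # 4 ft viewing distance
width_in = diag_in * 16 / math.sqrt(16**2 + 9**2)   # ~47.9" wide for 16:9
fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))  # ~53 degrees

for name, horizontal_px in [("720p", 1280), ("1080p", 1920),
                            ("1440p", 2560), ("2160p", 3840)]:
    print(f"{name}: ~{horizontal_px / fov_deg:.0f} pixels per degree")
```

By that estimate, 720p and 1080p land well below ~60 PPD at 4 ft while native 4K sits above it, which lines up with the softness being visible up close but fading at longer viewing distances.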
While I agree that in some cases it's more of a placebo effect, I've done blind testing with multiple friends using our go-to game, Destiny 2, switching between 60 fps (which all of us were used to) and 120 fps when I got my 4K/120 Hz OLED.
They started at 60 fps, then went to 120 fps, and while most were kinda iffy on there being a big difference, I had them play for about 30 minutes at 120 fps and then switched them back, and they IMMEDIATELY said it felt like they were walking through quicksand.
Without my explaining at all what they were on at each point, every one of them said they wanted to go back to the second option and thought I had moved them down to 30 fps on the third test.
It's easy to be fooled into thinking the old is as good as the new, but once you experience the new for a good while, the old definitely shows its flaws.
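One way to see why the switch back feels so jarring: the gap is clearer in frame times than in fps numbers. A quick sketch:

```python
# Convert frame rates to time between frames; the jump from 60 to 120 fps
# halves the frame time (and roughly the motion-to-photon delay), which is
# what people tend to feel once they've acclimated to it.
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
```

Going from 60 to 120 fps shaves about 8 ms off every frame, while dropping from 60 to 30 adds nearly 17 ms, which is roughly why 60 fps can feel like quicksand after half an hour at 120.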
"The dilemma is that the more we spend on a graphics card, the more we expect to play at the latest and greatest Very High or Ultra settings. With graphics card prices through the roof, the desire to play at max settings will be even higher. (Who wants to spend $500+ and not at least give Very High or Ultra a run?) We just need graphics card prices to go down so we can be happy with playing 'only' at High settings."

NOTHING will ever "go down" (assuming we're talking about prices). We're all heading towards hyperinflation.
You are either sitting too far away from your TV, have your TV settings wrong, need a new HDMI cable, or need a new TV.
I see a lot of games gimping planar reflections with added blur or filtering to lower the image quality and make the RT reflections look superior when they aren't at all. They muddy the waters to make their own approach look cleaner.
Sabotaging 25 years of innovation to push their agenda.
In fact, this is true, though I'm not sure that's exactly what they are doing. A game with ray-traced effects may have a compromised solution for common rasterization. I don't know whether they just focus solely on ray tracing or purposefully gimp the usual approach. Either way, the end result is not that great, given that ray tracing can often look a little out of place in games and definitely hurts performance. People should use their eyes to judge the result; only the quality and visual appeal matter: the end result, not the technique behind it.
What did you think, that I didn't really look closely? I thought I was imagining things until I tried it with my face less than 30 cm from the panel surface, actively searching for differences. I'm not exactly a noob here.
"TV upscaling should only change lower-resolution inputs."

I don't get what you're trying to say. The TV is a 4K/2160p TV, which means that EVERY resolution that ISN'T 2160p is a lower resolution. You didn't really think this through, did you?