'Dark Souls with guns' shooter Remnant II sets a worrying precedent by making upscaling...

So much negativity in the comments here, but the game has decent requirements (Graphics: GeForce RTX 2060 / AMD Radeon RX 5700) and also very positive reviews from players on Steam, so they did well for their audience. A lot of resources went into increasing productivity, and I believe "the game was developed with upscaling in mind" probably falls into that category, but maybe they also tried to avoid some of the things that do not work well with upscaling.
 
My rig is only an AMD Ryzen 9 5900X optimized with a sweet curve setting, an AMD Radeon 7900 XTX, and 32 GB of 3800 MHz DDR4. I have played for about 3 hours and the game runs just fine.
"Only"??? If the game couldn't run fine on a set up like that then there's something seriously wrong.
 
"Only"??? If the game couldn't run fine on a set up like that then there's something seriously wrong.
I'm hearing the PS5 is rendering at 540p: incompetent developers, a cluster fudge of an engine, or a combination of both. This is why I am for game engine competition; everyone is putting all their eggs in one engine. Maybe in a few years it will be more efficient to run, but then again Unreal Engine 4 games have also been disastrous lately. Even the recommended PC specs render at 540p, and then we have some trolls laughing that you need a 4090/7900 XTX to actually enjoy the game 🤦‍♂️.
 
So it's been like 10 years and game developers still can't figure out 4K 60 FPS.

Yet all we hear about is 8K and 16K displays already.

What a joke.
They can, but then we hear crying about not wanting to run medium settings, because high and ultra exist. :)
 
So what's the bottom line here? Compared to other games that set graphical complexity based on true vs. upscaled capacity, does this approach look & play about the same, better, or worse?

As an enthusiast I might want to wade into the details of what the engine is doing, but I think for most players they're just going to care about what the final result was.
 
That sounds like backwards logic, especially if the visual benefits are minimal or nonexistent. I would take better FPS over better shadows or reflections or whatever.
 
The most ridiculous part is that the game is not even that, let's say, advanced in graphics.
I feel like optimizing has simply stopped being a concern.
It is more of a hobby, to check if your GPU can do it on max settings.
As a graphics history enthusiast, it makes me sad.
It feels like there are no more goals to create something really stunning and next gen.
 
Running smoothly at Ultra with my 8GB RTX 3080 and 11800H laptop. It might be in trouble when ray tracing is enabled later.
 
Such a gimmick; DLSS looks worse than native; nothing "upscaled" about it. Should be called downscaling quality.
Yes. Upscaling is short for resolution upscaling. So if you have upscaling turned on and your game resolution set to 1080p, it will actually render the frame at a lower resolution than 1080p and then run it through a filter to try to remove the artifacts of stretching the image to fit 1080p. The point of upscaling is to get more fps at the cost of visual quality. If the upscaling filter is good then it will look almost as good as 1080p and give you more frames per second. You are making a tradeoff though. But if your video card is struggling to be playable at 1080p then you can turn on upscaling and get a playable framerate.
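For anyone who wants to put rough numbers on that tradeoff, here is a minimal sketch of how the internal render resolution relates to the output resolution, assuming the commonly cited per-axis scale factors for the DLSS presets (actual factors can vary by game and by upscaler):

```python
# Minimal sketch: how an upscaler's internal render resolution relates to the
# output resolution. The per-axis scale factors below are the commonly cited
# values for the DLSS presets; actual values can vary by title and upscaler.
PRESET_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, preset):
    """Return the (approximate) resolution the game actually renders at."""
    s = PRESET_SCALE[preset]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    for preset in PRESET_SCALE:
        w, h = internal_resolution(1920, 1080, preset)
        print(f"1080p output, {preset}: renders at about {w}x{h}")
    # e.g. Quality lands around 1281x720 and Performance at 960x540 --
    # the kind of sub-1080p internal resolution being discussed in this thread.
```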
 
Game developers think they can get away with shoddy optimization, but the reality is that it will eventually drive people away from PC gaming. PC gaming is successful mainly because of the lower cost of hardware while achieving better performance than consoles. In addition, you can upgrade parts to try to keep up. However, at this point even a flagship less than a year old is struggling without relying on upscaling technology. And as PC component prices keep going up, at some point people will get jaded and stop chasing upgrades. I myself have decided to bow out of this race.
 
Yes. Upscaling is short for resolution upscaling. So if you have upscaling turned on and your game resolution set to 1080p, it will actually render the frame at a lower resolution than 1080p and then run it through a filter to try to remove the artifacts of stretching the image to fit 1080p. The point of upscaling is to get more fps at the cost of visual quality. If the upscaling filter is good then it will look almost as good as 1080p and give you more frames per second. You are making a tradeoff though. But if your video card is struggling to be playable at 1080p then you can turn on upscaling and get a playable framerate.
Well, to me it looks awful; not as good. This is why I do not use it.
 
Lame. You might as well just cut down the texture quality (less VRAM required) and use less detailed meshes (less VRAM and GPU power required) if, even at 1920x1080 -- ON A 4090, NO LESS! -- you're going to have to render at a lower resolution and upscale it anyway.

Straight up, given the near photorealism of some recent games (that tend to have steep requirements but not THIS steep), if this game is requiring DLSS at 1920x1080 on a 4090, it's poorly optimized, full stop.
 
Not a good early look for UE5. I think it also still boils down to a lack of time, money, and manpower as to why many games might be unoptimized. Most of the dev time is spent on the core game, art and assets, and fixing the most glaring bugs, so optimization falls by the wayside. Maybe they need to improve culling in UE5 if devs can't be bothered. Ratchet & Clank looks better and runs better than this game, and the same goes for even older games like RDR2.
 
Well, to me it looks awful; not as good. This is why I do not use it.
Depends. DLSS Quality or Balanced can look as good as, or even slightly more detailed than, native rendering with TAA. FSR 2 can also look good in games like Spider-Man, whereas in others it can be prone to artifacts and distortion. XeSS works best on Intel GPUs.
 
In the Nvidia control panel, the DLSS slider is only to increase sharpening, correct? Do we even know how much it's scaling everything?

I had this crazy (still half asleep) idea that you might be able to downscale higher resolutions to your native res by using DSR and DLSS at the same time.
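For what it's worth, that combination does work in practice: DSR (or DLDSR) raises the output target above native and the driver downscales the finished frame, while DLSS renders below that target and upscales to it. A back-of-envelope sketch of the resolutions involved, assuming an illustrative 2.25x DSR factor and the commonly cited DLSS Quality scale (neither value comes from the article):

```python
# Back-of-envelope sketch of combining DSR (supersample above native, then
# downscale) with DLSS (render below the target, then upscale). The DSR factor
# and DLSS scale below are illustrative assumptions only.
NATIVE = (1920, 1080)
DSR_FACTOR = 2.25           # DSR 2.25x means 2.25x the native pixel count
DLSS_QUALITY_SCALE = 0.667  # commonly cited per-axis factor for DLSS Quality

# DSR target: scale each axis by sqrt(factor) so the total pixel count grows by the factor
dsr_w = round(NATIVE[0] * DSR_FACTOR ** 0.5)   # 2880
dsr_h = round(NATIVE[1] * DSR_FACTOR ** 0.5)   # 1620

# DLSS then renders below that DSR target and reconstructs back up to it
render_w = round(dsr_w * DLSS_QUALITY_SCALE)   # ~1921
render_h = round(dsr_h * DLSS_QUALITY_SCALE)   # ~1081

print(f"Internal render: {render_w}x{render_h}")
print(f"DLSS output / DSR target: {dsr_w}x{dsr_h}")
print(f"Final display (downscaled to native): {NATIVE[0]}x{NATIVE[1]}")
# Net effect: you render at roughly native resolution, DLSS reconstructs a
# 2880x1620 image, and the driver downscales that back to 1080p.
```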
 
In the Nvidia control panel, the DLSS slider is only to increase sharpening, correct? Do we even know how much it's scaling everything?
Where in the Nvidia control panel is the DLSS sharpening slider? Or do you mean DSR?
 
The first option in my Nvidia control panel under "Manage 3D settings" just says Image Scaling, so I figured that was what this article is about. I have a 3080 Ti.
 
The first option in my Nvidia control panel under "Manage 3D settings" just says Image Scaling, so I figured that was what this article is about. I have a 3080 Ti.
Ah, no, that's Nvidia's driver-based Image Scaling system -- it's not DLSS, as it doesn't use neural networks or tensor cores. It's done entirely through shaders, much like AMD's FSR.
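As a rough illustration of the difference: a driver-side spatial scaler just resamples the finished frame up to display resolution and then sharpens it, with no temporal data or neural network involved. A minimal sketch of that idea using Pillow (the general concept only, not Nvidia's or AMD's actual kernels):

```python
# Rough illustration of driver-style spatial upscaling: resample a low-res
# frame to display resolution, then apply a sharpening pass. NIS and FSR 1
# use their own resampling and sharpening kernels, not Pillow's.
from PIL import Image, ImageFilter

def spatial_upscale(frame: Image.Image, target=(1920, 1080)) -> Image.Image:
    upscaled = frame.resize(target, resample=Image.Resampling.LANCZOS)  # spatial resample
    return upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=80))  # sharpen

# Example: a 1280x720 render stretched to 1080p with a sharpening pass
low_res = Image.new("RGB", (1280, 720))
display_frame = spatial_upscale(low_res)
```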

 
I'm not paying for this. I'm not paying anyone to do this. And I sure as hell won't accept such terrible performance out of my GPU. There.

It started with GPU pricing and I said "keep 'em to yourselves"; now I gotta say the same thing about this.
Even if the game were amazing, I'd probably play without paying. There's no way we should support this.
 
I'm hearing the PS5 is rendering at 540p: incompetent developers, a cluster fudge of an engine, or a combination of both. This is why I am for game engine competition; everyone is putting all their eggs in one engine. Maybe in a few years it will be more efficient to run, but then again Unreal Engine 4 games have also been disastrous lately. Even the recommended PC specs render at 540p, and then we have some trolls laughing that you need a 4090/7900 XTX to actually enjoy the game 🤦‍♂️.
I totally hear you on game engine competition, but if they're not using Lumen (as mentioned in the article) I have to suspect they're doing something wrong. I don't want to say "incompetent," but Lumen is really efficient on the newer engine versions; you'd be crazy to turn it off.
 
I totally hear you on game engine competition, but if they're not using Lumen (as mentioned in the article) I have to suspect they're doing something wrong. I don't want to say "incompetent," but Lumen is really efficient on the newer engine versions; you'd be crazy to turn it off.
True, but the game doesn't even have ray tracing and is running like Cyberpunk with path tracing maxed out. If you don't call that incompetent, I don't know what is. 😅 Maybe I'm too harsh with my judgment, but this is what I call brute-forced computing!
 