Sony publishes God of War PC system requirements, adds DLSS to Horizon Zero Dawn

Lounds

Posts: 1,080   +954
290X being on par with a GTX 960? I don't believe that for a second. The 290X was a beast.

I think that card would match the RX 570 settings.
 

Shadowboxer

Posts: 2,071   +1,650
FSR is not awful, and it supports GPUs like the GTX 1060, which is the most popular GPU out there and could greatly benefit from it. DLSS and FSR don't have to be one or the other; like in HZD, have both so your AMD hating *** can turn it off.
No, it is awful. Nobody is using it unless they absolutely have to. It ruins the image quality so much that unless your frame rates are unplayable, you leave it turned off. DLSS was just the same in its first year or so. It's not AMD hating, it's just fact. I've played quite a few games with FSR now, and on all of them the setting has remained firmly off.

This is the difference though. FSR is good for helping older, outdated rigs struggle to a playable frame rate. DLSS is more for high-end builds and people trying to hit high refresh rates.

There are several reasons why devs are prioritising DLSS over FSR. First of all, Nvidia will market your game for free if you add DLSS. Secondly, the results are far superior with DLSS 2.0 or higher, often offering improved visual quality, not worse. Thirdly, most people buying new cards today are buying RTX cards. The RTX 3090 has outsold the entire RX 6000 series by itself.

The fact is, AMD only made FSR available on GeForce because they wouldn't have a hope in hell of getting devs to implement it otherwise. And even then it's still struggling. A lot of people have heard of DLSS; not many people have heard of FSR.
 

Lionvibez

Posts: 2,621   +2,381
I think it's important to point out that DLSS has been out longer and is in its second version, whereas FSR is basically still 1.0. If you compare it to DLSS 1.0, the gap is smaller. I would like to see how much it improves given the same amount of time.

For me personally I prefer native res to either so I won't be using them.
 
This is the day my 2070 at 1440p/60 is no longer capable of ultra settings. Well, except for Cyberpunk, where I was lucky to get 45 fps at 1080p with DLSS on Performance. That's an outlier though.
 

Solokreep14

Posts: 14   +6
I feel RAM requirements are very subjective. Most developers will add a buffer to their recommendation because they have no idea what you are running in the background in addition to the game. I feel 8GB is not going to cut it nowadays for a gaming system, especially when you are pushing 1440p or higher. In any case, the minimum specs are still very acceptable.
I actually experienced 16GB of RAM not being enough in my case. This is my experience, and it was for sure a DRAM bottleneck. In Ghost Recon Breakpoint, I upped my resolution to 4K, and my RAM usage went up too, from around 9-10GB to my full 15GB out of 16GB. I got a freeze for a second, and Ubisoft sent a message in real time via the software saying that system RAM was lacking and to turn off background processes. I then realized that 16GB is the bare minimum for gaming. I'm at 32GB now and have never had that problem again, and there are plenty of games now that use a full 16GB.