Nvidia DLSS added to Crysis Remastered, System Shock remake demo

Shawn Knight

Why it matters: Nvidia has announced that DLSS is now available in two additional games: Crysis Remastered and the System Shock remake demo. Both are beautiful games that should benefit immensely from the upscaling tech, especially Crysis Remastered, which is much like the original in that it is tough on hardware.

The first game in the Crysis series debuted in 2007 and was a visual masterpiece. It was also very taxing on PC hardware of the era, giving rise to the popular "But can it run Crysis?" meme. The remastered version launched on September 18, 2020, on PlayStation 4, Xbox One and PC, and after some patching, it now seems closer to what gamers were expecting from the release.

The System Shock remake demo, meanwhile, dropped last month courtesy of Nightdive Studios. It provides players with a glimpse of what the developer has been working on over the past five years and what they can expect when the full launch takes place this summer.

The System Shock demo uses Nvidia's Unreal Engine 4 plugin to implement DLSS. Matthew Kenneally, lead engineer at Nightdive Studios, said the team was able to use the plugin to add DLSS support to System Shock "over the weekend."
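For those curious about what an integration like that involves under the hood, DLSS-style temporal upscalers take a low-resolution color frame along with the depth, motion-vector and camera-jitter data an engine already generates each frame, and return a full-resolution image. The snippet below is a simplified, hypothetical interface for illustration only (the names and types are not Nvidia's NGX SDK or the Unreal plugin's API), but it shows why an engine that already exposes those buffers for its own temporal anti-aliasing makes wiring the feature up a matter of days rather than months.

// Hypothetical, simplified view of what a DLSS-style upscaler needs per frame.
// These names and types are illustrative only; they are NOT the NGX / UE4 plugin API.
#include <cstdint>
#include <cstdio>
#include <vector>

struct Resolution { uint32_t width; uint32_t height; };

struct FrameInputs {
    std::vector<float> lowResColor;     // color rendered at the internal (reduced) resolution
    std::vector<float> depth;           // depth buffer at the same internal resolution
    std::vector<float> motionVectors;   // per-pixel motion, already produced for temporal AA
    float jitterX;                      // sub-pixel camera jitter applied this frame
    float jitterY;
};

// Stand-in for the upscaler call: takes internal-res inputs, returns output-res pixels.
std::vector<float> UpscaleFrame(const FrameInputs& in, Resolution internalRes, Resolution outputRes) {
    (void)in; (void)internalRes;  // a real implementation runs a neural network on the GPU here
    return std::vector<float>(static_cast<size_t>(outputRes.width) * outputRes.height * 4, 0.0f);
}

int main() {
    Resolution internalRes{1280, 720};
    Resolution outputRes{2560, 1440};
    FrameInputs frame{
        std::vector<float>(static_cast<size_t>(internalRes.width) * internalRes.height * 4),
        std::vector<float>(static_cast<size_t>(internalRes.width) * internalRes.height),
        std::vector<float>(static_cast<size_t>(internalRes.width) * internalRes.height * 2),
        0.25f, -0.25f};
    std::vector<float> output = UpscaleFrame(frame, internalRes, outputRes);
    std::printf("Upscaled %ux%u -> %ux%u (%zu output values)\n",
                internalRes.width, internalRes.height,
                outputRes.width, outputRes.height, output.size());
    return 0;
}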

Nvidia further noted that when The Fabled Woods drops on March 25, it’ll be DLSS-ready. This “dark and mysterious narrative short story” was developed by CyberPunch Studios and is being published by Headup. There’s a free demo of the game over on Steam if you’re interested in giving it a spin.

A full list of games that support DLSS can be found here.


 
They are remaking System Shock!? I did not know this. I will definitely have to get on board with that; hopefully it won't be like £50+ on launch.

As for DLSS: it's amazing, and I'm very glad to see it coming to more games. As consumers this is very good news for us. However, I would imagine AMD are not best pleased! Lol.
 
I'm hoping Nvidia soon, or at least eventually, decides to release a 3050 with DLSS support. Not sure they will, since such a card would eat into the profits of the higher-end ones: if it can't rasterize high-end detail natively, just drop to 720p on high settings and let DLSS upscale back up to 1440p.

But hey, it would definitely push DLSS 2.0 in a big way, and a 3050 might even escape some of the Ethereum craze, so it would be a good idea. Not sure they will, though; they might just go with a 1750 and intentionally leave DLSS support out of it.
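For a rough sense of the headroom that buys, here's the pixel math for a 1280x720 internal render upscaled to a 2560x1440 output. Just a sketch of the shading workload, mind you; the actual gains depend on the DLSS quality mode and the fixed per-frame cost of running the upscaler itself.

// Quick pixel-count comparison: native 1440p vs. rendering internally at 720p
// and upscaling. Illustrative only; real frame-time savings depend on the game,
// the DLSS quality mode, and the cost of the upscaling pass each frame.
#include <cstdio>

int main() {
    const long long native1440p  = 2560LL * 1440;  // 3,686,400 pixels shaded natively
    const long long internal720p = 1280LL * 720;   //   921,600 pixels shaded before upscaling
    std::printf("Native 1440p:  %lld pixels per frame\n", native1440p);
    std::printf("Internal 720p: %lld pixels per frame\n", internal720p);
    std::printf("Rasterized/shaded pixels: %.1fx fewer\n",
                static_cast<double>(native1440p) / static_cast<double>(internal720p));
    return 0;
}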
 
Aahh, DLSS, another little tool from Nvidia to make sure the prisoners don't escape...

It's a real shame when people complain about getting double the FPS because one company decided to make it happen. Let me guess... someone writes letters instead of sending a text or email because "I escaped the prison, they can't imprison me!"
 
It's a real shame when people complain about getting double the FPS because one company decided to make it happen. Let me guess... someone writes letters instead of sending a text or email because "I escaped the prison, they can't imprison me!"

It is possible, in my opinion, that both of you are right: DLSS *is* a very desirable feature and greatly improves the experience for gamers, yet at the same time Nvidia is going to try their best to take advantage of FINALLY having the killer feature they wanted, and use it to starve AMD out of the GPU market.

They tried this exact tactic with G-Sync and ray tracing, and AMD is doing what it always does in response: come back later with an inferior but "open source" implementation of the same thing.

Issue is:

1) AMD is, well, always second and never leading with these kinds of features
2) Even with an arguably competitive or even superior product stack on the high end right now, they're losing ground to Nvidia because of how bad their supply of cards is. And no, it's not just the pandemic or Ethereum: if everything else were normal, AMD would still be shipping a tiny fraction of the GPUs Nvidia is able to get to market.

While we can't really fault AMD for 2) in this case (in my opinion, given the TSMC constraints they must be under, their strategy of prioritizing Ryzen, Epyc and consoles is actually very sound), it does mean the situation will last long enough for many games to adopt DLSS 2.0 as the standard and for consumers to come to expect it. Nvidia probably has the killer feature they wanted, and AMD will not have a response that gets adopted by nearly as many games.
 
It is possible, in my opinion, that both of you are right: DLSS *is* a very desirable feature and greatly improves the experience for gamers, yet at the same time Nvidia is going to try their best to take advantage of FINALLY having the killer feature they wanted, and use it to starve AMD out of the GPU market.

They tried this exact tactic with G-Sync and ray tracing, and AMD is doing what it always does in response: come back later with an inferior but "open source" implementation of the same thing.

Issue is:

1) AMD is, well, always second and never leading with these kinds of features
2) Even with an arguably competitive or even superior product stack on the high end right now, they're losing ground to Nvidia because of how bad their supply of cards is. And no, it's not just the pandemic or Ethereum: if everything else were normal, AMD would still be shipping a tiny fraction of the GPUs Nvidia is able to get to market.

While we can't really fault AMD for 2) in this case (in my opinion, given the TSMC constraints they must be under, their strategy of prioritizing Ryzen, Epyc and consoles is actually very sound), it does mean the situation will last long enough for many games to adopt DLSS 2.0 as the standard and for consumers to come to expect it. Nvidia probably has the killer feature they wanted, and AMD will not have a response that gets adopted by nearly as many games.
Umm, sure you can fault them. They chose profits over making a dedicated graphics card that, and AMD knows this, won't turn much of a profit right away against Nvidia. For AMD it just makes sense; for a fan, well, you're SOL if an AMD video card is what you want. It ain't happening anytime soon.
While Nvidia's dedicated cards aren't exactly available either, they can at least be found in laptops and pre-built desktops now, way more than anything AMD has or is doing.
 
It's a real shame when people complain about getting double the FPS because one company decided to make it happen. Let me guess... someone writes letters instead of sending a text or email because "I escaped the prison, they can't imprison me!"

I'd get double fps if I cut my resolution in half too!
 
It is possible, in my opinion, that both of you are right: DLSS *is* a very desirable feature and greatly improves the experience for gamers, yet at the same time Nvidia is going to try their best to take advantage of FINALLY having the killer feature they wanted, and use it to starve AMD out of the GPU market.

They tried this exact tactic with G-Sync and ray tracing, and AMD is doing what it always does in response: come back later with an inferior but "open source" implementation of the same thing.

So please explain how Freesync is inferior to Gsync?
 
So please explain how Freesync is inferior to Gsync?
There have been some articles explaining that there's more than just the VESA stuff involved in getting a Gsync certification, that there's some actual processing being done and it's not just DRM, and that the exception is Gsync laptops, which are basically just Freesync.

But I didn't write those articles, and I don't use either of these techs, so we can just add "arguably" to "inferior"; your objection is noted and justified, imo.
 
It is possible, in my opinion, that both of you are right: DLSS *is* a very desirable feature and greatly improves the experience for gamers, yet at the same time Nvidia is going to try their best to take advantage of FINALLY having the killer feature they wanted, and use it to starve AMD out of the GPU market.

They tried this exact tactic with G-Sync and ray tracing, and AMD is doing what it always does in response: come back later with an inferior but "open source" implementation of the same thing.

Issue is:

1) AMD is, well, always second and never leading with these kinds of features
2) Even with an arguably competitive or even superior product stack on the high end right now, they're losing ground to Nvidia because of how bad their supply of cards is. And no, it's not just the pandemic or Ethereum: if everything else were normal, AMD would still be shipping a tiny fraction of the GPUs Nvidia is able to get to market.

While we can't really fault AMD for 2) in this case (in my opinion, given the TSMC constraints they must be under, their strategy of prioritizing Ryzen, Epyc and consoles is actually very sound), it does mean the situation will last long enough for many games to adopt DLSS 2.0 as the standard and for consumers to come to expect it. Nvidia probably has the killer feature they wanted, and AMD will not have a response that gets adopted by nearly as many games.
Glad you got the point of my post, unlike the drones.

You forgot PhysX, GameWorks and especially HairWorks. Hell, they even used SLI to try to force Intel into giving them an x86 license.

This is Nvidia's MO, and the sad reality is that it's working on two fronts: CUDA in the enterprise, and marketing hype that their drones are swallowing entirely.

To be honest, I would even dare say that 95% of these YouTube reviews are not entirely honest due to Nvidia bribing them.
 
Glad you got the point of my post, unlike the drones.

You forgot PhysX, GameWorks and especially HairWorks. Hell, they even used SLI to try to force Intel into giving them an x86 license.

This is Nvidia's MO, and the sad reality is that it's working on two fronts: CUDA in the enterprise, and marketing hype that their drones are swallowing entirely.

To be honest, I would even dare say that 95% of these YouTube reviews are not entirely honest due to Nvidia bribing them.

Here's the thing: this is exactly how you innovate. NVIDIA making NVIDIA-exclusive stuff on its own isn't exactly new; remember when NVIDIA first introduced pixel shaders? Those got rolled into DirectX as Shader Model 1.2. AMD made its own implementation later? Shader Model 1.4. Both got unified in Shader Model 2.0, and we ended up with a unified specification within the DirectX API.

We see the same thing with Gsync; VRR ended up getting included in the mainline HDMI 2.1 specification, in a vendor independent way. [Freesync remains optional in DP 2.0 for some reason however...]

As for some of NVIDIA's other APIs, I *again* note PhysX is proprietary, but it's an open standard that AMD is free to implement if it so chooses.
 
So please explain how Freesync is inferior to Gsync?
As far as I am aware it's not. But the standard to be certified as Gsync is tighter. One thing a lot of people don't realise is the range Freesync operates in. I have an Asus Freesync monitor and it supports Freesync between 38-144Hz. If my game drops below 38 fps then I lose the sync and get tearing. Gsync, however, requires the range to run from 0 up to the max refresh, so you will always be synced. The range per monitor can vary; cheaper Freesync monitors can have a range starting at 48Hz or higher.

There might be some differences regarding the Gsync chip on the monitor but I don’t know too much about that side of it to be honest.
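To make the range point concrete, here's a rough sketch of the frame-repetition trick (AMD calls it Low Framerate Compensation, and the Gsync module does something similar in hardware): when the game dips below the panel's floor, show each frame more than once so the effective refresh lands back inside the supported window. The numbers and logic below are purely illustrative, not how any particular driver is implemented, and the trick only works when the panel's maximum refresh is at least roughly double its minimum.

// Sketch of the frame-repetition idea behind low-framerate compensation (LFC):
// if the game's FPS falls below the monitor's VRR floor, show each frame more
// than once so the panel still refreshes inside its supported range.
// Illustrative logic only, not how any particular driver is implemented.
#include <cstdio>

// Check whether a given game FPS can be kept inside a monitor's VRR window
// by repeating frames an integer number of times.
static void report(const char* monitor, double vrrMin, double vrrMax, double fps) {
    int repeats = 1;
    while (fps * repeats < vrrMin && fps * (repeats + 1) <= vrrMax)
        ++repeats;
    const double refresh = fps * repeats;
    if (refresh >= vrrMin && refresh <= vrrMax)
        std::printf("%s: %.0f fps -> frame shown %dx, panel at %.0fHz (synced)\n",
                    monitor, fps, repeats, refresh);
    else
        std::printf("%s: %.0f fps -> no multiple fits the range, sync is lost\n",
                    monitor, fps);
}

int main() {
    // Wide range (144 >= 2 x 38): low frame rates can be multiplied back into range.
    report("38-144Hz panel", 38.0, 144.0, 30.0);  // 30 fps shown 2x -> 60Hz, synced
    report("38-144Hz panel", 38.0, 144.0, 20.0);  // 20 fps shown 2x -> 40Hz, synced
    // Narrow range (75 < 2 x 48): some frame rates fall into a gap and tear.
    report("48-75Hz panel", 48.0, 75.0, 40.0);    // 40Hz is too low, 80Hz is too high
    return 0;
}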
 
It's probably just me but it would be nice if these articles just reminded us of what the acronyms mean, especially if the acronym is in the title. I'm getting on a bit now and suffering from TAO. Maybe the site should have a page that gives a quick single paragraph explanation of the acronyms used in your articles rather than driving me to other sites for an explanation.

(TAO = total acronym overload)
 
There have been some articles explaining that there's more than just the VESA stuff involved in getting a Gsync certification, that there's some actual processing being done and it's not just DRM, and that the exception is Gsync laptops, which are basically just Freesync.

But I didn't write those articles, and I don't use either of these techs, so we can just add "arguably" to "inferior"; your objection is noted and justified, imo.
I recall a review where the reviewer couldn't justify the price of a Gsync monitor over a Freesync one, since the
As for some of NVIDIA's other APIs, I *again* note PhysX is proprietary, but it's an open standard that AMD is free to implement if it so chooses.
They didn't release all of it, and they only did it because, once again, the industry rallied behind open standards.

Thanks to that, it's pretty much dead.

But I don't understand how a customer would be happy about proprietary tech that limits your choices.
 
I recall a review where the reviewer couldn't justify the price of a Gsync monitor over a Freesync one, since the
There are things Gsync does better; it tends to have a wider refresh range, for example. It also tends to be lower latency due to the HW module, which is generally what raises the cost a bit.

Long term, HDMI 2.1's implementation will win out, since pretty much every display coming out will support it by default.
They didn't release all of it, and they only did it because, once again, the industry rallied behind open standards.

Thanks to that, it's pretty much dead.
Technically, the CPU portion of the PhysX API is *by far* the most used physics implementation in games now.

As for the GPU accelerated portion, none of the other mainstream physics APIs handles multi-object dynamics as well as PhysX did.

And as I noted: PhysX *is* an open standard. AMD is free to implement it whenever they choose to do so.
But I don't understand how a customer would be happy about proprietary tech that limits your choices.
Here's what happens, and has happened for nearly 30 years now:
1: NVIDIA notices there's some limitation that isn't being filled by existing APIs
2: NVIDIA makes an API that fills that need
3: AMD quickly comes up with its own implementation, with limitations, and makes their implementation open to try and gain support
4: Microsoft notices there's a demand for such a feature; adds it to the next revision of DirectX
5: The vendor-specific standards die, since DirectX now covers all use cases
6: OpenGL notices they are now lacking a major feature; gets to adding it a few years after everyone starts using it.

Rinse and repeat.
 
There are things Gsync does better; it tends to have a wider refresh range, for example.
From what I could find, the only place this matters is below 30Hz, and at that point I think the system has bigger problems to worry about.

It also tends to be lower latency due to the HW module
Sorry, but not true.
which is generally what raises the cost a bit.
"Cost a bit" started at a US$200 premium, now is around US$100.
Long term, HDMI 2.1's implementation will win out, since pretty much every display coming out will support it by default.
Agreed, and glad that's the case. If Nvidia had their way and the whole industry adopted their hardware solution, we would be paying more, because that's how they roll.
Technically, the CPU portion of the PhysX API is *by far* the most used physics implementation in games now.
Searched a bit on this one and it seems to be about tied with Havok.
As for the GPU accelerated portion, none of the other mainstream physics APIs handles multi-object dynamics as well as PhysX did.
Funny, it seems that nobody is using that option, though.
And as I noted: PhysX *is* an open standard.
Not all of it; read below.
AMD is free to implement it whenever they choose to do so.
Nobody but Nvidia can use PhysX on a GPU, since the GPU acceleration layer is written in CUDA. While you can translate that to portable HIP code (using https://github.com/ROCm-Developer-Tools/HIP ), that alone isn't enough. The bigger catch is that PhysX alone doesn't offer much.
Most of the features used by games (cloth, debris, breakable objects, GPU-side particles) are not part of PhysX itself; they were part of the APEX SDK and have since been moved to the Visual FX library.
The source code for the APEX SDK is incomplete; you can't get it to compile with GPU support. Without being able to recompile the APEX SDK, you can't get "PhysX support" (in the way that term is used by many games) on non-Nvidia cards.
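To make the CUDA dependency concrete, here's a generic CUDA kernel (this is not PhysX code; the kernel and names are purely illustrative) with the rough HIP equivalents noted in the comments. The runtime calls map almost one-to-one, which is why translating CUDA you have the source for is the easy part; the real blocker is the pieces, like the APEX GPU modules, whose source isn't complete enough to retranslate and rebuild.

// A generic CUDA example (NOT PhysX source code). The HIP equivalents noted in
// the comments are nearly mechanical renames, which is what the hipify tools
// automate; the hard part is code whose source you cannot rebuild at all.
#include <cstdio>
#include <cuda_runtime.h>                             // HIP: <hip/hip_runtime.h>

__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x = nullptr, *y = nullptr;
    cudaMallocManaged(&x, n * sizeof(float));         // HIP: hipMallocManaged
    cudaMallocManaged(&y, n * sizeof(float));         // HIP: hipMallocManaged
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);   // HIP: hipLaunchKernelGGL(saxpy, ...)
    cudaDeviceSynchronize();                          // HIP: hipDeviceSynchronize

    std::printf("y[0] = %.1f (expected 5.0)\n", y[0]);
    cudaFree(x);                                      // HIP: hipFree
    cudaFree(y);                                      // HIP: hipFree
    return 0;
}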
Here's what happens, and has happened for nearly 30 years now:
1: NVIDIA notices there's some limitation that isn't being filled by existing APIs
2: NVIDIA makes an API that fills that need
3: AMD quickly comes up with its own implementation, with limitations, and makes their implementation open to try and gain support
4: Microsoft notices there's a demand for such a feature; adds it to the next revision of DirectX
5: The vendor-specific standards die, since DirectX now covers all use cases
6: OpenGL notices they are now lacking a major feature; gets to adding it a few years after everyone starts using it.

Rinse and repeat.
The irony of that part is that you quoted my belief that, as a customer, I want options, and you are somehow complaining that these companies were able to bring us customers the same or a better implementation of a locked tech, without the shackles.

For example, I really loved the Arkham games and was not too happy about buying a GTX 970 to play them in all their glory.

I won't make that mistake again...
 