Crysis Remastered dev on game's top graphics setting: "no card out there" can handle it...

midian182

What just happened? Crysis Remastered arrives tomorrow (September 18) and while its recommended specs are very generous, the game's top graphics setting, called 'Can it run Crysis?', is as demanding on a PC's hardware as the notorious original game. According to one of the devs, "there is no card out there" capable of running the mode in 4K at 30 fps.

The excitement that followed Crytek's announcement of Crysis Remastered in April turned to disappointment following the fugly trailer. Responding to the criticism, the company delayed the July release date by two months to ensure the remaster met fans' expectations. Judging from the latest comparison video (top), the extra time appears to have paid off.

Earlier this month, Crytek released the game's minimum and recommended PC specs, the latter of which asks for just a GeForce GTX 1660 Ti / AMD Radeon Vega 56 and Intel Core i5-7600k or higher / AMD Ryzen 5 or higher.

The company also revealed that the game's highest graphical setting would be called 'Can it run Crysis?', a nod to the long-running meme that arrived following the original's 2007 release. Few PCs at the time could handle the FPS, especially at higher settings, and hardware still struggled with it until just a few years ago.

Speaking to PC Gamer, Project Lead of Crysis Remastered, Steffen Halbig, said the 'Can it run Crysis?' mode brings "unlimited view distances. No pop ups of assets, and no LoD changes anymore." That all sounds like it will push graphics cards to their limits; Halbig went as far as to claim, "in 4k, there is no card out there which can run it [the game] in 'Can it Run Crysis?' mode at 30 FPS."
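For readers unfamiliar with LoD systems, the minimal C++ sketch below gives a rough idea of what dropping LoD switching and view-distance culling means for a renderer. It is purely illustrative, not Crytek's code, and the 50-metre LoD step is a made-up threshold.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Mesh { /* vertex/index buffers elided */ };

struct Renderable {
    float distanceToCamera = 0.0f;   // metres from the camera
    std::vector<Mesh> lodLevels;     // lodLevels[0] = full detail, assumed non-empty
};

// Classic path: pick a coarser mesh as distance grows and skip objects beyond
// the view distance entirely. A "no LoD, unlimited view distance" mode, as
// described, always takes lodLevels[0] and never skips anything.
const Mesh* selectMesh(const Renderable& r, float maxViewDistance, bool canItRunCrysisMode) {
    if (canItRunCrysisMode) {
        return &r.lodLevels[0];                       // full detail, everywhere
    }
    if (r.distanceToCamera > maxViewDistance) {
        return nullptr;                               // too far away: culled
    }
    // One LoD step roughly every 50 m (made-up threshold for illustration).
    std::size_t lod = static_cast<std::size_t>(r.distanceToCamera / 50.0f);
    return &r.lodLevels[std::min(lod, r.lodLevels.size() - 1)];
}
```

In the normal path, distant objects get cheaper meshes or are skipped entirely; the described mode keeps every object at full detail no matter how far away it is, which is why the workload climbs so sharply at 4K.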

That's quite a bold statement, though "out there" likely refers to non-Ampere cards (the RTX 3080 only launched today). Even the crushing Microsoft Flight Simulator can reach 40 fps (high quality) at 4K in our tests. The beastly RTX 3090, meanwhile, can run Forza Horizon 4 at 78 fps at high graphics settings in 8K, according to Nvidia.

Even if Ampere is able to pass 30 fps in 'Can it run Crysis?' mode at 4K, the mode is obviously going to crush lesser cards and could become a new benchmark for testing GPUs.

Crysis Remastered lands on the Epic Games Store this September 18.


 
If that image is what they call demanding then they can keep their Switch port with brute-force computing. At this point Crysis 3 looks better, and it launched more than 7 years ago. While gamers want Crysis 4 with visuals that compete with Unreal Engine 5, Crytek is focusing on the lowest common denominator with a remaster that can potentially do more damage than good to their infamous Crysis!
 
Unoptimized, unneeded visuals

Taking this kind of mentality to its logical conclusion, we'd all still be gaming on a 150nm NVIDIA GeForce4 460 Go at 1280X720.

Even further, you don't rly need much more than 16bit Amiga 500 to game, as long as you can distinguish the toon you are controlling from the surroundings.

The world would still be playin' Double Dragon.
 
If that image is what they call demanding then they can keep their switch port with brute force computing. At this point Crysis 3 looks more better and launched more than 7 years ago. While gamers want Crysis 4 with visuals that compete with Unreal engine 5, Crytek is focusing on the lowest common denominator on a remaster that can potentially due more damage than good to their infamous Crysis!
"looks more better"? How to English.
 
Taking this kind of mentality to its logical conclusion, we'd all still be gaming on a 150nm NVIDIA GeForce4 460 Go at 1280X720.

Even further, you don't rly need much more than 16bit Amiga 500 to game, as long as you can distinguish the toon you are controlling from the surroundings.

The world would still be playin' Double Dragon.

Would it, though? Because visuals are only half the equation, something that game companies seem to forget these days. There were literally a bazillion interactive things you couldn't do on those older CPUs and forget about distinguishing really small objects on gfx chipsets of the time. That said, I would be perfectly happy with a 2560x1080 monitor forever if the games provided as much content and depth as they did back in the golden days.
 
The whole Crysis Remaster is just unoptimized and rushed, and they're even proud of calling it the newest masterpiece, or of the fact that it can't even reach 4K at 30 FPS. That's not funny or something to be proud of; it's just utterly sad.
 
It's $30, I might play it again. Of course I won't be able to run the 8K textures or 4K mode, but just to see what it would look like on my computer. I'm most interested in seeing that unlimited draw distance and the hardware-agnostic ray-tracing.
I didn't hear anything about multiplayer, though, so I'm assuming it won't have it.

It's definitely a state-of-the-art game. There is no game out there that has some of the graphical features this one has.
 
Taking this kind of mentality to its logical conclusion, we'd all still be gaming on a 150nm NVIDIA GeForce4 460 Go at 1280X720.

Even further, you don't rly need much more than 16bit Amiga 500 to game, as long as you can distinguish the toon you are controlling from the surroundings.

The world would still be playin' Double Dragon.
But he didn't say that.
 
Taking this kind of mentality to its logical conclusion, we'd all still be gaming on a 150nm NVIDIA GeForce4 460 Go at 1280X720.

Even further, you don't rly need much more than 16bit Amiga 500 to game, as long as you can distinguish the toon you are controlling from the surroundings.

The world would still be playin' Double Dragon.
heeeeeey!!! what's wrong with Double Dragon? :laughing::laughing:
 
No LoD on a map that's huge? So you'll just kill the GPU with triangles smaller than single pixels? That's called "unoptimized garbage", not "stunning visuals". Then again, that's pretty much what you can implement in two months. This remaster really is just a fiasco. They should have done the remake thing, boys.
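Some rough, illustrative arithmetic backs this up (the numbers are assumptions, not measurements): at 4K with a 60-degree vertical field of view, a 10 cm triangle edge projects to well under a pixel once it is a few hundred metres from the camera, so rasterising full-detail geometry at those distances burns GPU time for no visible gain.

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    const double pi = 3.14159265358979323846;
    const double screenHeightPx = 2160.0;            // 4K vertical resolution
    const double fovY = 60.0 * pi / 180.0;           // vertical field of view, radians
    const double edgeMetres = 0.10;                  // a 10 cm triangle edge

    for (double distance : {10.0, 100.0, 1000.0, 5000.0}) {
        // Pinhole projection: on-screen size ~ edge / (2 * d * tan(fov/2)) * screen height.
        double px = edgeMetres / (2.0 * distance * std::tan(fovY / 2.0)) * screenHeightPx;
        std::printf("%6.0f m away: %.3f px per triangle edge\n", distance, px);
    }
    // Past a few hundred metres the edge is well under one pixel, which is why
    // renderers normally swap in coarser LoDs instead of rasterising full detail.
    return 0;
}
```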
 
Crysis was the first game(apart from far cry 1) that let me cut through a tree with an assault rifle anytime I wanted, so ??
 
The game needs a lot more polygons to look really good; it seems they kept a lot of the original assets.
 
More powerful GPUs are not the answer to substandard programming. Maybe it's time for the folks at Crytek to consider another profession.
 
No card out there? Not even now, when the 3080 is out there? Actually, though, this spotlights what's wrong with Nvidia's DLSS (and its AMD counterpart). Game developers have to support DLSS. TV producers don't have to support upscaling in every brand of TV set out there. So there ought to be a setting where your computer thinks you have a 1080p monitor, and the video card just quietly upscales everything to 4K or 8K or whatever unbeknownst to applications or even the operating system. Of course, without extra hints from the software, it wouldn't do quite as good a job, but that way one could upscale any game.
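For what it's worth, the application-agnostic upscale described here is conceptually simple; the sketch below is a hypothetical single-channel bilinear upscaler, not any driver's actual implementation. The catch, as noted above, is that without motion vectors or history from the game it can only smooth pixels, not reconstruct detail the way DLSS does.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Single-channel frame for brevity; a real implementation would do this per
// colour channel (and on the GPU, in a fullscreen pass).
struct Frame {
    int width = 0, height = 0;
    std::vector<float> pixels;   // row-major, values in [0, 1]
};

Frame bilinearUpscale(const Frame& src, int dstW, int dstH) {
    Frame dst{dstW, dstH, std::vector<float>(static_cast<std::size_t>(dstW) * dstH)};
    for (int y = 0; y < dstH; ++y) {
        for (int x = 0; x < dstW; ++x) {
            // Map the destination pixel centre back into source coordinates.
            float sx = (x + 0.5f) * src.width  / dstW - 0.5f;
            float sy = (y + 0.5f) * src.height / dstH - 0.5f;
            int x0 = std::clamp(static_cast<int>(sx), 0, src.width - 1);
            int y0 = std::clamp(static_cast<int>(sy), 0, src.height - 1);
            int x1 = std::min(x0 + 1, src.width - 1);
            int y1 = std::min(y0 + 1, src.height - 1);
            float fx = std::clamp(sx - x0, 0.0f, 1.0f);
            float fy = std::clamp(sy - y0, 0.0f, 1.0f);
            auto at = [&](int px, int py) {
                return src.pixels[static_cast<std::size_t>(py) * src.width + px];
            };
            float top    = at(x0, y0) * (1 - fx) + at(x1, y0) * fx;
            float bottom = at(x0, y1) * (1 - fx) + at(x1, y1) * fx;
            dst.pixels[static_cast<std::size_t>(y) * dstW + x] = top * (1 - fy) + bottom * fy;
        }
    }
    return dst;
}

// Usage idea: render at 1920x1080, then display bilinearUpscale(frame, 3840, 2160).
```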
 
Weird that they decided to boast about their shoddy optimisation, usually developers brag about the range of hardware their code can run on.

Highlighting the fact that even the fastest GPUs can't run your game with its fairly sub-standard graphics isn't the swagger they think it is.
 
No card out there? Not even now, when the 3080 is out there? Actually, though, this spotlights what's wrong with Nvidia's DLSS (and its AMD counterpart). Game developers have to support DLSS. TV producers don't have to support upscaling in every brand of TV set out there. So there ought to be a setting where your computer thinks you have a 1080p monitor, and the video card just quietly upscales everything to 4K or 8K or whatever unbeknownst to applications or even the operating system. Of course, without extra hints from the software, it wouldn't do quite as good a job, but that way one could upscale any game.
Couldn't agree more. The stupidest thing about DLSS is that it only works in some games; it should work across at least all modern games, or say all DX12 games. No game with DLSS is remotely interesting to me, so it's a waste of silicon. Instead, they could use it to improve upscaling like TVs do: a 1080p image on those high-end 4K TVs looks beautiful.
 
No card out there? Not even now, when the 3080 is out there? Actually, though, this spotlights what's wrong with Nvidia's DLSS (and its AMD counterpart). Game developers have to support DLSS. TV producers don't have to support upscaling in every brand of TV set out there. So there ought to be a setting where your computer thinks you have a 1080p monitor, and the video card just quietly upscales everything to 4K or 8K or whatever unbeknownst to applications or even the operating system. Of course, without extra hints from the software, it wouldn't do quite as good a job, but that way one could upscale any game.
I believe you are talking about SSAA...unless I misunderstood your statement.
 
Taking this kind of mentality to its logical conclusion, we'd all still be gaming on a 150nm NVIDIA GeForce4 460 Go at 1280X720.
I guess you're too young to remember that half the reason the Crysis games ran so poorly was indeed lazy optimization rather than eye candy (a minimal sketch of the missing culling check follows the quotes below):

"That's right. The tessellated water mesh remains in the scene, apparently ebbing and flowing beneath the land throughout, even though it's not visible. The GPU is doing the work of creating the mesh, despite the fact that the water will be completely occluded by other objects in the final, rendered frame. Obviously, that’s quite a bit needless of GPU geometry processing load. We’d have expected the game engine to include a simple optimization that would set a boundary for the water at or near the coastline, so the GPU isn’t doing this tessellation work unnecessarily."

"Crytek's decision to deploy gratuitous amounts of tessellation in places where it doesn't make sense is frustrating, because they're essentially wasting GPU power—and they're doing so in a high-profile game that we’d hoped would be a killer showcase for the benefits of DirectX 11."


link
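The fix the quoted passages are asking for amounts to a cheap culling pass before the tessellation stage. The sketch below is hypothetical (the types, the coastline test, and the occlusion flag are all assumptions, not Crysis 2 code); it just shows the shape of the check: skip water tiles that lie inland of the ocean region or are fully hidden behind terrain.

```cpp
#include <vector>

struct AABB { float minX, minZ, maxX, maxZ; };   // tile bounds on the ground plane

struct WaterTile {
    AABB bounds;
    bool occludedByTerrain;   // e.g. from a coarse occlusion test done elsewhere
};

// Assumed, simplified "coastline" test: only tiles overlapping the ocean
// region can ever show visible water.
bool overlapsOcean(const AABB& tile, const AABB& oceanRegion) {
    return tile.maxX >= oceanRegion.minX && tile.minX <= oceanRegion.maxX &&
           tile.maxZ >= oceanRegion.minZ && tile.minZ <= oceanRegion.maxZ;
}

// Returns only the tiles worth submitting to the tessellation stage.
std::vector<const WaterTile*> cullWater(const std::vector<WaterTile>& tiles,
                                        const AABB& oceanRegion) {
    std::vector<const WaterTile*> visible;
    for (const WaterTile& t : tiles) {
        if (!overlapsOcean(t.bounds, oceanRegion)) continue;   // inland: skip
        if (t.occludedByTerrain) continue;                     // hidden behind terrain: skip
        visible.push_back(&t);
    }
    return visible;
}
```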
 