Capcom reveals next-gen consoles run Resident Evil Village at 4K/45fps with ray tracing...

Cal Jeffrey

In context: Sony and Microsoft have both promised their next-generation consoles would introduce casual gamers to the ray-tracing experience. What they have not said is how much RT would impact performance. It is up to developers to implement the technology and optimize it. Capcom is one of the first to reveal ray tracing's impact on frame rate.

Capcom held its Resident Evil Showcase on Thursday, during which it revealed Resident Evil Village's expected performance on next-gen consoles. The game maker had already said Village would run at 60 frames per second in 4K. What it hadn't mentioned is how much running it with ray tracing enabled would affect performance.

Unsurprisingly, both the PlayStation 5 and Xbox Series X|S take a hit to frame rate with ray tracing running. On the PS5 and Series X, players running the game in 4K with RT can expect about 45fps. The Xbox Series S runs the game at 1440p and 45fps, dropping to 30fps with RT enabled. It's a significant hit across the board, but not a terrible one.

Capcom also confirmed performance for last-gen consoles. On base-model PS4s and Xbox Ones, the best you'll get is 900p at 45fps and 30fps, respectively. The PlayStation 4 Pro and Xbox One X have settings to output 4K at 30fps or 1080p at 60fps.
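For a sense of what those targets mean in frame-time terms, here is a quick back-of-the-envelope calculation using only the figures quoted above, assuming the quoted targets hold and the GPU is the bottleneck:

```python
# Back-of-the-envelope frame-time math for Capcom's quoted targets:
# 4K/60fps without ray tracing vs. 4K/45fps with it (PS5 / Series X).

def frame_time_ms(fps: float) -> float:
    """Milliseconds the GPU has to render one frame at a given fps."""
    return 1000.0 / fps

no_rt, with_rt = 60, 45
extra = frame_time_ms(with_rt) - frame_time_ms(no_rt)

print(f"60fps budget: {frame_time_ms(no_rt):.1f} ms/frame")    # 16.7 ms
print(f"45fps budget: {frame_time_ms(with_rt):.1f} ms/frame")  # 22.2 ms
print(f"RT adds ~{extra:.1f} ms/frame, a {1 - with_rt / no_rt:.0%} fps drop")
```

In other words, the quoted hit works out to roughly a quarter of the frame rate, or about a third more rendering work per frame.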

To check out the game yourself, be sure to take Resident Evil Village for a test drive during one of the upcoming timed demo events. PlayStation 4 and 5 owners in the US will get the first crack at it. Starting tomorrow, April 17, from 5 pm to 1 am PDT, players can download a one-hour trial of the village section. A castle demo drops during the same timeframe one week later, on April 24.

If you miss those sneak peeks, don't worry. Beginning May 1, a combined village/castle demo will release on all platforms (PS4/5, XB1/X|S, Steam, and Stadia). It is still only a one-hour demo but will be available through launch day on May 7.


 
Okay, I'll give it to them, that's pretty good for consoles. At first I was like, "45fps?" then I realized my PC can't even do 4K45 with ray tracing sooooo......
 
Ray tracing is mainstream now. It's available on Xbox, PS, GeForce, and Radeon. It wouldn't surprise me if the next Switch has it. Yet for some reason TechSpot turns it off when they compare flagship GPUs.

Personally, as a tech enthusiast, I would like to know what the performance level is. I've heard reports that the consoles ray trace better than the RX 6000 series, and I'd like to see that debunked or verified by someone. The closest I've seen is Digital Foundry's coverage, where they matched settings as closely as possible in Control and it actually ran significantly better on the XSX than on the 6800XT.

However, there is barely any reporting on ray tracing on TechSpot; for some reason there is more coverage of boring work laptops than of this exciting next-gen visual technology. Guys, can we get an all-RT head-to-head with console comparisons, please, so we can get some kind of idea of the state of the tech? Right now I couldn't tell you whether a 6800XT could play my most-played game of 2021 so far: Minecraft RTX.
 
Ray tracing is mainstream now. It's available on Xbox, PS, GeForce, and Radeon. It wouldn't surprise me if the next Switch has it. Yet for some reason TechSpot turns it off when they compare flagship GPUs.

Personally, as a tech enthusiast, I would like to know what the performance level is. I've heard reports that the consoles ray trace better than the RX 6000 series, and I'd like to see that debunked or verified by someone. The closest I've seen is Digital Foundry's coverage, where they matched settings as closely as possible in Control and it actually ran significantly better on the XSX than on the 6800XT.

However, there is barely any reporting on ray tracing on TechSpot; for some reason there is more coverage of boring work laptops than of this exciting next-gen visual technology. Guys, can we get an all-RT head-to-head with console comparisons, please, so we can get some kind of idea of the state of the tech? Right now I couldn't tell you whether a 6800XT could play my most-played game of 2021 so far: Minecraft RTX.

While I agree that ray tracing is the future and will become the norm in game engines, it is currently mostly just an option rather than something built into the game experience. And I don't see that changing until last-gen consoles stop getting major support.

People are not missing out if they do not have ray tracing right now. It is still early days for ray tracing, so I'd expect games to take better advantage of current hardware in the years to come.
 
Okay, I'll give it to them, that's pretty good for consoles. At first I was like, "45fps?" then I realized my PC can't even do 4K45 with ray tracing sooooo......
There has been a seismic shift recently with those pesky consoles; they must be cheap.
 
Personally, as a tech enthusiast, I would like to know what the performance level is. I've heard reports that the consoles ray trace better than the RX 6000 series, and I'd like to see that debunked or verified by someone. The closest I've seen is Digital Foundry's coverage, where they matched settings as closely as possible in Control and it actually ran significantly better on the XSX than on the 6800XT.
They compared the 6800XT to the consoles in their photo mode benchmarking coverage, but a) the 6800XT still matched or slightly beat the consoles' worst-case ("Corridor of Doom") performance, and b) it did so while running RT at twice the resolution (the consoles use checkerboarded RT, the PC uses full res), at a higher LOD level (the consoles run below the PC's low LOD), on an older engine build, and through regular DX12 instead of specialized console APIs. So no, the XSX didn't run better than the 6800XT; the 6800XT just ran "less better" than expected. To be honest, these results aren't even directly comparable because of the differences in settings and software, which DF mentioned clearly in their footage, but even with those higher settings and a worse software stack it never ran worse. Which is underwhelming, as they said, but not worse.
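For anyone unfamiliar with the checkerboarding point above, here is a minimal sketch of the pixel math; the resolution is just an example, and the exact internal resolutions Control uses on each platform are not confirmed here:

```python
# Checkerboard rendering shades only half of the pixel grid each frame
# (alternating in a checker pattern), so full-resolution RT does roughly
# twice the per-frame ray work. 4K is used purely as an example resolution.

width, height = 3840, 2160
full_res = width * height        # every pixel traced each frame (PC)
checkerboard = full_res // 2     # half the grid traced each frame (consoles)

print(f"Full res:     {full_res:,} pixels/frame")      # 8,294,400
print(f"Checkerboard: {checkerboard:,} pixels/frame")  # 4,147,200
print(f"Workload ratio: {full_res / checkerboard:.0f}x")
```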

BTW, I just beat Control on a slightly underclocked 6900XT (which should give less than 10% better performance than a 6800XT) at high details with RT reflections, RT transparent reflections, and RT debris, and got a mostly stable 1440p60 with dips into the 50s and momentary 40s in intense combat. That would be up to 50% better than what the consoles offer, even though both my rasterization and RT quality were higher. Obviously, with an unlocked frame rate the consoles would do better, as photo mode suggests, so the real advantage over the XSX is probably closer to 20-30% most of the time (I got 70-80fps where the consoles did 50-60), but it is still there. I suspect I could gain another 10-20% if the exact console settings were available on PC.
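To show why that 20-30% figure is necessarily fuzzy, here is the spread you get just from pairing the quoted fps ranges differently; these are only the numbers from the comment above, not independent measurements:

```python
# Spread of the implied PC-vs-XSX advantage from the quoted ranges:
# PC at 70-80fps where the consoles did 50-60fps (commenter's own figures).

pc_low, pc_high = 70, 80
xsx_low, xsx_high = 50, 60

conservative = pc_low / xsx_high - 1   # PC's worst frame rate vs XSX's best
optimistic = pc_high / xsx_low - 1     # PC's best frame rate vs XSX's worst

print(f"Conservative pairing: ~{conservative:.0%} advantage")  # ~17%
print(f"Optimistic pairing:   ~{optimistic:.0%} advantage")    # ~60%
```

A 20-30% estimate sits well inside that range, which is presumably why the commenter hedges it.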

This footage can be found here; the link should point to the start of the PC/console comparison:
 
I'd have no interest in playing any game at 45fps, even with FreeSync. (I thought when I got the LG B9 that I'd be able to crank all settings to max and get the same experience at 40fps as at 60, but that's not the case.) However, if it can do 45fps at 4K with RT, then it can do 60 at 1440p, so you've still got a decent resolution and frame rate option with RT enabled.

My own feeling on RT, as someone lucky enough to get a 3080 for near RRP who has completed both Control and Watch Dogs Legion, is that it is amazing at first and then you quickly take it for granted. I personally find HDR more revelatory, and 4K over 1440p, but I found myself willing to play Legion at around 50fps with RT on at 4K rather than turn off the reflections or lower the resolution to hit 60fps. So I guess my point is that it's great, but you quickly get used to it (because SSR already does a pretty good job of faking things like reflections), so it's only worthwhile if you are already playing at 4K 60 HDR and can enable it with little enough loss of fluidity thanks to VRR.
 
Stupid choice to go with 4K/45fps for RT. I don't know anyone who would want that instead of 1440p/60fps with RT, which I'm pretty sure is possible on both consoles. If they can do 4K at 45fps, then they can do 1440p at 60fps, at least. Maybe even 1800p dynamic.

Unless you do an 800% zoom and pause like Digital Foundry, no one can tell the difference between 1440p and 4K while playing.

I really don't understand some devs...
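As a rough sanity check of the 4K-to-1440p argument made above (and two comments earlier), here is the naive pixel math. It assumes frame rate scales linearly with pixel count and the GPU is the sole bottleneck; real games scale sublinearly, so treat it as an upper bound:

```python
# Naive resolution-scaling estimate: if 4K with RT holds 45fps,
# what might 1440p manage? Assumes purely GPU-bound, linear-in-pixels
# scaling, which real games only approximate.

uhd = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440   # 3,686,400 pixels
fps_4k = 45

naive_1440p = fps_4k * uhd / qhd
print(f"4K pushes {uhd / qhd:.2f}x the pixels of 1440p")  # 2.25x
print(f"Naive 1440p estimate: ~{naive_1440p:.0f}fps")     # ~101fps
```

Even if real-world scaling delivered only a fraction of that headroom, a locked 60fps at 1440p, or the 1800p dynamic option suggested above, looks plausible.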
 
I certainly wouldn't want RT to spoil a game's fluidity. Having said that, RE should be a slow-paced game, so 45fps or not, it shouldn't matter. Instead, I think it makes more sense to play the game at a higher resolution.
 
I certainly wouldn't want RT to spoil a game's fluidity. Having said that, RE should be a slow-paced game, so 45fps or not, it shouldn't matter. Instead, I think it makes more sense to play the game at a higher resolution.
Of course it matters; every time you turn the camera in game you will see how bad 45fps is compared to 60fps. Even if it's not a fast-paced game, and even with VRR or FreeSync/G-Sync, you can still notice the slower, awkward turning and moving of the camera, and even of the character...

Maybe you've gotten too used to 30fps on consoles, but the difference is very noticeable, no matter the game. This is not a strategy game with a fixed camera angle...
 
People are not missing out if they do not have ray tracing right now.

I somewhat disagree with this. RT definitely adds graphical fidelity in the few games that support it. On the other hand, I wouldn't go out and buy an RTX card solely because it does RT better.
 
Stupid choice to go with 4K/45fps for RT. I don't know anyone who would want that instead of 1440p/60fps with RT, which I'm pretty sure is possible on both consoles. If they can do 4K at 45fps, then they can do 1440p at 60fps, at least. Maybe even 1800p dynamic.

Unless you do an 800% zoom and pause like Digital Foundry, no one can tell the difference between 1440p and 4K while playing.

I really don't understand some devs...
I agree with your fps point, but not the claim that there's no difference between 1440p and 4K. In most games I can see it quite clearly, but I sit about 3m away from a 55" screen.
 
I agree with your fps point, but not the claim that there's no difference between 1440p and 4K. In most games I can see it quite clearly, but I sit about 3m away from a 55" screen.
Is that comparing the two resolutions on the same screen, though? If so, then yes, you absolutely will notice the difference between native and scaled resolutions on a big panel. On the other hand, if one took two seemingly identical high-quality monitors, say 24" in size and with the same panel technology, and put them side by side, would everyone be able to spot which was the 1440p model and which the 4K one in the middle of a gaming session? Perhaps some would, but I suspect the majority wouldn't unless they were deliberately told to look for the differences.
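To put numbers on that 24" thought experiment, here is the standard diagonal pixel-density calculation; the panel size is just the example from the comment above, and viewing distance, which matters as much, is left out:

```python
# Pixel density (PPI) for the hypothetical 24" side-by-side comparison:
# PPI = pixel diagonal / physical diagonal in inches.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24-inch 1440p: {ppi(2560, 1440, 24):.0f} PPI')  # ~122 PPI
print(f'24-inch 4K:    {ppi(3840, 2160, 24):.0f} PPI')  # ~184 PPI
```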
 
Actual 4K or dynamic resolution?

If the consoles manage 45fps, a 3060 Ti / 3070 should be achieving 60fps.
 
I’m interested to see the comparisons on the PC with a 3080 or 3090.

You could do that, but that's comparing a $500 console to a $1,000+ GPU alone, which is useless unless it's in a PC that probably costs even more.

So it would be interesting to see how a $2k to $3k PC does vs a $0.5k console, but one should keep the price difference in mind.
 
I agree with your fps point, but not the claim that there's no difference between 1440p and 4K. In most games I can see it quite clearly, but I sit about 3m away from a 55" screen.
Sure, at that distance, if you stop and look at details, maybe you (and a few other people, while others won't) will see differences, but which is more important to you if you had to choose: playing at 4K/45fps or 1440p/60fps?

I know what I would choose. Also, this difference varies on a per-game basis; what you can see in one game you might not in another. It comes down to a lot of things, but mainly to how the post-processing is done, the TAA implementation, and whether the resolution is dynamic or static.
 
Sure, at that distance, if you stop and look at details, maybe you (and a few other people, while others won't) will see differences, but which is more important to you if you had to choose: playing at 4K/45fps or 1440p/60fps?

I know what I would choose. Also, this difference varies on a per-game basis; what you can see in one game you might not in another. It comes down to a lot of things, but mainly to how the post-processing is done, the TAA implementation, and whether the resolution is dynamic or static.
I wouldn't play at 45fps at any resolution. Yep, in some games it is hard to tell the difference; in others it's quite clear. I went down to 1440p in Watch Dogs Legion to get above 60, but found myself quite quickly going back to 4K because it just looked crisper and shinier, even without specifically looking for detail.
 
Is that comparing the two resolutions on the same screen, though? If so, then yes, you absolutely will notice the difference between native and scaled resolutions on a big panel. On the other hand, if one took two seemingly identical high-quality monitors, say 24" in size and with the same panel technology, and put them side by side, would everyone be able to spot which was the 1440p model and which the 4K one in the middle of a gaming session? Perhaps some would, but I suspect the majority wouldn't unless they were deliberately told to look for the differences.
Yep, on the same screen. I didn't realise that 1440p would look better on a native 1440p screen.
 