Capcom reveals next-gen consoles run Resident Evil Village at 4K/45fps with ray tracing...

Yep, on the same screen. I didn't realise that 1440p would look better on a native 1440p screen.
It does depend a lot on the scaling method used, but the bigger the screen, the more noticeable the difference between native and scaled will be. For example, a large 4K panel displaying a 1080p output will be interpolating one pixel into four. I use a 4K 28" monitor, but typically play games at 1440p for performance reasons (unless 4K can run fine at 60 fps, as it's a G-Sync monitor) - there's a noticeable difference to my eyes between native and scaled, even at that size.
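To put the scaling point in concrete terms, here's a toy Python sketch (purely illustrative, with made-up numbers): 1080p divides evenly into a 4K grid, while 1440p doesn't, which is why the scaler has to blend neighbouring pixels.

```python
# Toy illustration of integer vs non-integer scaling onto a 4K panel.
# Real monitor/GPU scalers use fancier filters (bilinear, bicubic, etc.),
# but the integer-vs-non-integer distinction is the key point.

PANEL_W = 3840  # 4K panel width; height behaves the same for 16:9 sources

for name, src_w in {"1080p": 1920, "1440p": 2560}.items():
    scale = PANEL_W / src_w
    if scale.is_integer():
        # 1080p -> 4K is exactly 2x: one source pixel becomes a clean 2x2 block
        print(f"{name}: {scale:.1f}x, one pixel duplicated into {int(scale**2)}")
    else:
        # 1440p -> 4K is 1.5x: output pixels straddle source pixels, so every
        # output pixel is a blend of neighbours (hence the softness)
        print(f"{name}: {scale:.1f}x, non-integer - interpolation required")
```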
 
They compared the 6800XT to consoles in their photo-mode benchmarking coverage, but a) the 6800XT still achieved the same or slightly better worst-case ("Corridor of Doom") performance than the consoles, and b) it did so while running RT at twice the resolution (consoles use checkerboarded RT, PC uses full res) + a higher LOD level (consoles run below PC's low LOD) + an older engine build + regular DX12 instead of specialized console APIs. So no, the XSX didn't run better than the 6800XT; the 6800XT just ran "less better" than expected. And TBH these results aren't even directly comparable due to the differences in settings and software, which DF mentioned clearly in their footage, but even with those higher settings and a worse software stack the 6800XT never ran worse. Which is underwhelming, as they said, but not worse.
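To put a rough number on the "twice the resolution" point, here's a quick back-of-the-envelope sketch (the 1440p RT target is just an example figure, not a confirmed console setting):

```python
# Checkerboarded RT shades only half the pixels each frame (the missing half
# is reconstructed from the previous frame), so full-resolution RT traces
# roughly twice as many pixels per frame. Example numbers only.

w, h = 2560, 1440                 # assumed 1440p RT target
full_res = w * h                  # PC: every pixel ray-traced each frame
checkerboard = w * h // 2         # console: alternating checkerboard halves

print(f"full res: {full_res} px, checkerboard: {checkerboard} px, "
      f"ratio: {full_res / checkerboard:.1f}x")
```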

BTW I just beat Control on a slightly underclocked 6900XT (which should be <10% faster than a 6800XT) at high details + RT reflections, RT transparent reflections and RT debris, and got a mostly stable 1440p60 with dips into the 50s and occasional momentary 40s in intense combat. That would be up to 50% better than what the consoles offer, even though both rasterization and RT quality were set higher. Obviously with an unlocked framerate the consoles would do better, as photo mode suggests, so the real advantage over the XSX is probably closer to 20-30% most of the time (I got 70-80fps where consoles did 50-60), but it is still there. I suspect I could gain another 10-20% if the exact console settings were available on PC.

The footage can be found here; the link should point to the beginning of the PC/console comparison:
Yikes. 40-50fps with a 6900XT? I've been playing the game at 1440p, RT on, on a 3080 and usually stay in the triple digits. I tried it with DLSS off but couldn't notice a visual difference other than a reduced framerate (though still a fair bit higher than 40-50fps). I made fun of DLSS when it was first released with the 20xx series cards, but it's an absolute monster now. If they can get it to be as effective as it is with Control, but in every game without having to add specific driver support, then it will truly be killer tech (it's amazing as it stands).

Honestly, I think variable upscaling and variable LOD will be the must-have tech going forward. Being able to ramp up detail in slow-moving scenes and decrease it in fast action sequences, where you're unlikely to notice it, is going to be a requirement in the realm of 4K and 8K displays. Otherwise you're simply wasting huge amounts of graphics performance rendering complex backgrounds that are only going to be on screen for a couple of frames. And once console gamers get a taste of what it's like playing at 120fps, they won't want to go back.
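A minimal sketch of what such a controller could look like (all thresholds and names are hypothetical, nothing from a real engine): watch the frame time and nudge the render scale to hold the fps target.

```python
# Minimal dynamic-resolution controller sketch. All numbers are hypothetical;
# real engines smooth over many frames and try to predict load, not just react.

TARGET_MS = 1000 / 60            # frame budget for a 60fps target
STEP = 0.05                      # how aggressively to react
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # clamp between 50% and 100% resolution

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    """Drop resolution when over budget (fast action), raise it back
    when there is headroom (slow scenes)."""
    if last_frame_ms > TARGET_MS:            # missed budget -> shed pixels
        scale -= STEP
    elif last_frame_ms < TARGET_MS * 0.85:   # clear headroom -> add detail
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Quiet scenes (~10ms) ramp back to full res; a heavy battle (~20ms)
# steps down toward the floor.
scale = 1.0
for frame_ms in (10, 10, 20, 22, 20, 12, 10):
    scale = update_render_scale(scale, frame_ms)
    print(f"{frame_ms:>2} ms frame -> render scale {scale:.2f}")
```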
 
It does depend a lot on the scaling method used, but the bigger the screen, the more noticeable the difference between native and scaled will be. For example, a large 4K panel displaying a 1080p output will be interpolating one pixel into four. I use a 4K 28" monitor, but typically play games at 1440p for performance reasons (unless 4K can run fine at 60 fps, as it's a G-Sync monitor) - there's a noticeable difference to my eyes between native and scaled, even at that size.
I won't buy a 4K panel for that reason. Everything performs so much better at 1440p, and displaying anything at a non-native resolution is a huge hit to fidelity. Once Nvidia and developers all get the hang of variable in-game resolutions and can scale things at DLSS quality, that might change, though. If I could use my desktop at 4K and game at 1440p for higher fps, with the same sharpness as gaming at "native" resolution, that would be perfect.

I miss that aspect of CRTs. If you needed higher FPS you'd just lower the resolution and it stayed just as sharp. I miss almost nothing else about CRTs though.
 
Okay, I'll give it to them, that's pretty good for consoles. At first I was like, "45fps?" then I realized my PC can't even do 4K45 with ray tracing sooooo......

On PlayStation it's kinda useless, but on Xbox (which has variable refresh rate) with an HDMI 2.1 OLED like mine (with VRR), it's basically going to look the same.

 
Ray tracing is mainstream now. It's available on Xbox, PlayStation, GeForce and Radeon. It wouldn't surprise me if the next Switch has it. Although for some reason TechSpot turns it off when they compare flagship GPUs.

Personally, as a tech enthusiast, I would like to know what the performance level is. I've heard reports that the consoles ray trace better than the RX 6000 series, and I'd like to see that debunked or verified by someone. The closest I've seen is Digital Foundry's coverage, where they matched settings as closely as possible in Control and it actually ran significantly better on the XSX than on the 6800XT.

However, there is barely any reporting on ray tracing on TechSpot. There is more coverage of boring work laptops than of this exciting next-gen visual technology, for some reason. Guys, can we get an all-RT head-to-head with console comparisons, please, so we can get some kind of idea of the state of the tech? Right now I couldn't tell you whether a 6800XT could play my most played game of 2021 so far - Minecraft RTX.
That's because their lead coverage is done by a YouTuber who has a strong AMD bias and has backed himself into the corner of "ray tracing sucks and isn't worth looking at" (that is, until AMD is on equal or better ground than Nvidia).

At that point, expect full coverage non-stop.
 
While I agree that ray tracing is the future and will become the norm in game engines, it is currently mostly just an option rather than something built into the game experience. And I don't see that changing until last-gen consoles stop getting major support.

People are not missing out if they do not have ray tracing ATM. These are still the early days of ray tracing, so I'd expect games to take advantage of current hardware to better effect in the years to come.
Early days does not negate the need for coverage. Early adopters deserve some reviews and comparisons.

Imagine if going to the movies in the first couple of weeks/months meant there were literally no reviews, and you just had to wait until a film was available to more people before reviews showed up.
 
Yikes. 40-50fps with a 6900XT? I've been playing the game at 1440p, RT on, on a 3080 and usually stay in the triple digits. I tried it with DLSS off but couldn't notice a visual difference other than a reduced framerate (though still a fair bit higher than 40-50fps).
It was mostly 70-80; 40-50 was the absolute worst case. But yeah, Nvidia has a lot of advantages in this game: better overall performance, a smaller RT performance hit and visually lossless DLSS. Still, it looked good enough on a 4K60 screen, so I'm not overly disappointed with the results. But I can't deny DLSS is becoming a real game changer at 4K in general and at 1440p with RT enabled.

Yikes. 40-50fps with a 6900XT? I've been playing the game at 1440p, RT on, on a 3080 and usually stay in the triple digits. I tried it with DLSS off but couldn't notice a visual difference other than a reduced framerate (though still a fair bit higher than 40-50fps). I made fun of DLSS when it was first released with the 20xx series cards, but it's an absolute monster now. If they can get it to be as effective as it is with Control, but in every game without having to add specific driver support, then it will truly be killer tech (it's amazing as it stands).

Honestly, I think variable upscaling and variable LOD will be the must-have tech going forward. Being able to ramp up detail in slow-moving scenes and decrease it in fast action sequences, where you're unlikely to notice it, is going to be a requirement in the realm of 4K and 8K displays. Otherwise you're simply wasting huge amounts of graphics performance rendering complex backgrounds that are only going to be on screen for a couple of frames. And once console gamers get a taste of what it's like playing at 120fps, they won't want to go back.
I totally agree. Control in particular could have gained a lot from dynamic resolution, and maybe even dynamic RT resolution. There were scenes where I was running at about 100fps, 66% above the 60fps target - the game could easily have rendered at 1620p or 1800p there. Meanwhile, when I did get dips below 60, it was mostly in dynamic battles full of intense visual effects that took up most of the screen, so a momentary dip to about 1200p or 1080p wouldn't have been very noticeable. As for RT resolution scaling, Control on PS5/XSX runs RT in checkerboarded mode, and while that is visible in static scenes, traversal and combat make it rather hard to notice.

Obviously it would still be worse than DLSS-like techniques, but mostly good enough, easy to implement and completely hardware-agnostic.
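For a rough sanity check on those resolution numbers, assuming frame time scales roughly linearly with pixel count (which ignores CPU and geometry costs; the 45fps input is just an example for the dips into the 40s):

```python
# Back-of-the-envelope check, assuming frame time ~ pixel count.
# Width scales with height at a fixed aspect ratio, so height scales
# with the square root of the fps headroom. Simplified on purpose.
from math import sqrt

def sustainable_height(height: int, measured_fps: float,
                       target_fps: float = 60.0) -> int:
    return round(height * sqrt(measured_fps / target_fps))

print(sustainable_height(1440, 100))  # ~1859 -> roughly the 1800p mentioned
print(sustainable_height(1440, 45))   # ~1247 -> roughly the 1200p mentioned
```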
 
Ray tracing is mainstream now. It's available on Xbox, PlayStation, GeForce and Radeon. It wouldn't surprise me if the next Switch has it. Although for some reason TechSpot turns it off when they compare flagship GPUs.

Personally, as a tech enthusiast, I would like to know what the performance level is. I've heard reports that the consoles ray trace better than the RX 6000 series, and I'd like to see that debunked or verified by someone. The closest I've seen is Digital Foundry's coverage, where they matched settings as closely as possible in Control and it actually ran significantly better on the XSX than on the 6800XT.

However, there is barely any reporting on ray tracing on TechSpot. There is more coverage of boring work laptops than of this exciting next-gen visual technology, for some reason. Guys, can we get an all-RT head-to-head with console comparisons, please, so we can get some kind of idea of the state of the tech? Right now I couldn't tell you whether a 6800XT could play my most played game of 2021 so far - Minecraft RTX.
You must be new to the site. They don't benchmark console tech here.
 