Look at the latest Unreal Engine PS5 demo. That is at 30 FPS. If they need an 8c/16t CPU (and RDNA2 GPU and a super fast SSD) to achieve 30 FPS in that, a 4c/8t CPU will struggle to hit 30 FPS.
Exactly what I am about to get to.
Check this out...
You can watch a 1080p movie at 24 frames per second, even in a compressed x264 or x265 format, with tons of action in it, on a decent low-end, low-cost computer nowadays without it stuttering on ya.
Not sure what the exact requirements for that are, but it sure isn't an Intel i9-10980XE CPU along with four SLI'd NVIDIA GeForce GTX 1080 Ti Graphics Cards mounted on an EVGA 151-SX-E299 Motherboard...
Not by a long shot!
In a lot of First-Person Shooter Games you have some type of weapon, and maybe a hand holding it, shown in the middle-bottom region of the game window, like in Wolfenstein 3D, Doom, Half-Life, etc. So there is no need whatsoever to be "rendering in realtime" the weapon and hand. All of the frames of animation for it, and it doesn't require many, can already be saved (GIF, PNG) and "blitted" over top of the rest of the rendered 3D action going on behind it. If any Game is actually rendering this in 3D then it is wasting a huge percentage of CPU/GPU power. Render only what you need to.
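A minimal sketch of what that "blitting" amounts to, using plain Python lists as a stand-in for a real framebuffer. The names (`blit`, `TRANSPARENT`) and pixel values are made up for illustration, not any real engine API:

```python
# Blit a pre-rendered weapon sprite over an already-rendered 3D frame.
# A colour-key value marks sprite pixels that let the scene show through.
TRANSPARENT = 0

def blit(frame, sprite, x, y):
    """Copy sprite pixels onto frame at (x, y), skipping transparent ones."""
    for row, sprite_row in enumerate(sprite):
        for col, pixel in enumerate(sprite_row):
            if pixel != TRANSPARENT:
                frame[y + row][x + col] = pixel
    return frame

# A tiny 4x4 "rendered scene" and a 2x2 weapon sprite with one see-through pixel.
scene = [[1] * 4 for _ in range(4)]
weapon = [[9, 0],
          [9, 9]]
blit(scene, weapon, 1, 2)  # paste it into the bottom-middle of the window
```

The point is that the weapon frames cost a handful of memory copies per frame instead of a full 3D draw; real 2D APIs (e.g. pygame's `Surface.blit`) do exactly this copy-with-transparency step in optimized code.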
And if you can display Action in 1080p Movies at 24 frames per second without any micro stutter whatsoever, then why would you need to Render and Raytrace at 60 frames per second or more?
The Timer used in the game loop should be a high-resolution timer with at least millisecond precision, so when you shoot at the 549th millisecond of a certain second of time in a game it will register in the "game's log" as Hour:Minute:Second.Millisecond, where .549 is the millisecond. If your "bullet" connected at that time with 100% kill value on that bullet, and some other player shot you with 100% kill value on their bullet but at the 731st millisecond (.731) of that same second, then your 549th-millisecond "shot" will have registered first and you get the kill.
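That kill-resolution rule can be sketched in a few lines: compare the timestamps of the lethal shots and award the kill to the earlier one, no matter which one the server happened to process first. All names here (`Shot`, `resolve_kill`) are my own illustration, not any real netcode:

```python
# Millisecond-precision hit registration: the earlier lethal shot wins.
from dataclasses import dataclass

@dataclass
class Shot:
    shooter: str
    timestamp_ms: int   # e.g. 549 = the .549th millisecond of that second
    damage: int         # 100 = instant kill

def resolve_kill(shot_a, shot_b):
    """Return the shooter whose lethal (>=100 damage) shot landed first."""
    lethal = [s for s in (shot_a, shot_b) if s.damage >= 100]
    return min(lethal, key=lambda s: s.timestamp_ms).shooter

winner = resolve_kill(Shot("you", 549, 100), Shot("rival", 731, 100))
```

With both shots lethal, `.549` beats `.731` within the same second, so `winner` is `"you"`, exactly the ordering described above.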
Even with the 8K textures they will use in the new games, the 3D in the games will still look "too dark" as most of them do, requiring the user to crank up the "brightness" value. Lighting in 3D is the true bottleneck.
If you use only 1 light source it won't look too great. Step up to 3-point lighting and it will look far better, but still far from how it looks in reality.
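A rough way to see why three-point lighting lifts a scene compared to one light: sum simple Lambert (N·L) shading over the lights. The light directions and intensities below are arbitrary illustration values, not a recipe from any engine:

```python
# Lambert shading summed over lights, clamped to 1.0 (full brightness).
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def lambert(normal, lights):
    """Sum of max(0, N.L) * intensity over all lights, clamped to 1.0."""
    total = 0.0
    for direction, intensity in lights:
        d = normalize(direction)
        total += max(0.0, sum(n * l for n, l in zip(normal, d))) * intensity
    return min(total, 1.0)

normal = (0.0, 0.0, 1.0)          # surface facing the camera
key   = ((0.5, 0.5, 1.0), 0.8)    # key light: main illumination
fill  = ((-0.7, 0.2, 1.0), 0.4)   # fill light: softens the shadows
back  = ((0.0, -1.0, 0.2), 0.3)   # back/rim light: separates from background

one_light   = lambert(normal, [key])
three_point = lambert(normal, [key, fill, back])
```

With only the key light the surface sits around 65% brightness; adding the fill and rim lights pushes it to full brightness, which is the "far better, but still far from reality" step described above.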
If you photograph a textured object (a tree with bark) with a camera from 4 feet away in real life, then place the object (the tree) you are rendering 4 feet away from the Game Camera and compare the two images. If they aren't at least 98% to 100% alike, then work on your lighting until they are.
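The "match the two photos" check above can be approximated as a per-pixel similarity score. This is a toy version using flat lists of grayscale values (a real pipeline would use a metric like SSIM over full images); all names and sample values are illustrative:

```python
# Per-pixel similarity between a real photo and the rendered frame, 0-100%.
def similarity_percent(photo, render):
    """Average per-pixel closeness for equally sized grayscale images."""
    assert len(photo) == len(render)
    diff = sum(abs(p - r) for p, r in zip(photo, render))
    return 100.0 * (1.0 - diff / (255.0 * len(photo)))

photo  = [200, 180, 90, 40]   # pixels sampled from the real photograph
render = [190, 185, 80, 45]   # the same pixels from the game render
score = similarity_percent(photo, render)
if score < 98.0:
    print("keep working on the lighting")  # the 98-100% target from above
```

Here the render scores about 97%, so by the rule above the lighting still needs work.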
That is why most of these 3D video games mimicking reality look like "video games" and not "reality"!
The higher in resolution you go for your Game and Textures, the more Processing (CPU) and Rendering (GPU) power you will need to render them.
So even if your Game Resolution is at 128,000p and your Texture Resolution is at 64K, if you still get the Lighting in the Game "wrong, or not 98-100%", then you will still have a "Dark"-looking 3D game that won't look any closer to reality than it does now, where the "Reality" in a Bluray Movie at 1080p will still blow it away!