My 2013 4770K is absolutely fine for my gaming needs, which is why I haven't upgraded yet.
Neither DDR3 nor PCI-E 3.0 is a hurdle in any way.
As for "4K," I have been playing at 3440x1440 with my Haswell CPU and a GTX 1080 since 2016.
I know you people find this hard to believe, but 3440x1440 has been a thing since last decade, and last-decade CPUs are just fine for playing at that resolution.
Good for you! I truly mean that. I just meant, eventually, the day will come... Anyway, "just buy stuff if you need it" is a good motto to live by.
But I can relate. My old i7-4790K walked a long, long way with me until I switched to AMD. I started with the 3600X, then the 5700X (which was, in my memory, the upgrade step that 'felt' biggest), and then the 7950X3D (just to pair my 4090 with a 'decent' CPU). And yes, the framerate is better (4K is my resolution), everything is better, and MT is a lot faster. But to be totally honest, the 5700X would have been sufficient for years to come, because the perceived everyday performance, the 'real' difference for the user, so to speak, is not night and day (as sometimes suggested in reviews).

The same goes for DDR5 frequencies and timings: no real difference between 6400 (what I have now) and 5600 (my old kit). You can spend so much (too much) money on faster RAM (especially Intel XMP kits), but you will not see the difference in everyday usage unless you fire up your benchmarks. I know a lot of folks who spend big on their hardware don't think so, though. They religiously hunt down any contrary opinion (in psychology, this is called cognitive dissonance, and it is a central part of our consumer society, just like sublimation and fear of missing out).

I build like 2-3 rigs every year, so I am a big hardware geek, but I try to stay reasonable. That's why I thought Steve's latest article ("CPUs don't matter for 4K Gaming? Wrong!") was so problematic: in my own experience, at 4K they mostly don't.
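For what it's worth, the paper gap between those two kits is easy to quantify. A quick back-of-the-envelope sketch (theoretical peak bandwidth only, my own simplification):

```python
# Back-of-the-envelope peak bandwidth for dual-channel DDR5
# (64 data bits = 8 bytes per transfer per DIMM). Theoretical maxima only;
# everyday workloads rarely saturate either kit, which is the whole point.
def peak_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 8 * channels / 1000  # MT/s * bytes/transfer * channels -> GB/s

bw_6400 = peak_bandwidth_gbs(6400)  # 102.4 GB/s
bw_5600 = peak_bandwidth_gbs(5600)  #  89.6 GB/s
print(f"{bw_6400:.1f} vs {bw_5600:.1f} GB/s -> {bw_6400 / bw_5600 - 1:.0%} more on paper")
```

That's roughly 14% more on paper, and you only see it in workloads that are actually bandwidth-bound, which everyday desktop use mostly isn't.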
I normally like his stuff, but that article felt forced in the wrong way (trying to be innovative by revisiting a common point and proving that what most 4K gamers, and most other hardware experts, believed to be true is actually wrong). Steve had three carefully curated game engines where the CPU somewhat matters at 4K (to make a point and force the framerates into CPU-bound situations), and even then he put a 3600 up against the fastest gaming CPU on the market, a 7800X3D. That is really stretching it, even if the 3600 was "so popular" in its day. It felt unreal and forced, and then some of the folks in the comments (still only a minority, apparently) even felt educated by the fact that a mid-range, five-year-old 6-core CPU can lose some FPS to the fastest gaming CPU at 4K. I mean, come on. The lengths to go to just to make a vain point nobody else in the benchmark industry bothers with. That was, in my opinion, ridiculous, or even misleading for inexperienced readers. Of course you should not pair a too-old CPU with a very modern GPU, I believe that too (but even then, in your case, if everything still works fine, use it and don't let the FUD and FOMO get to you).
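To put the CPU-bound vs. GPU-bound point in (entirely made-up) numbers: the delivered framerate is roughly capped by whichever side finishes a frame slower. A minimal sketch of that model, my own simplification and not Steve's methodology:

```python
# Hand-wavy bottleneck model: delivered FPS is capped by the slower of the two sides.
def delivered_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """cpu_fps_cap: frames/s the CPU can prepare; gpu_fps_cap: frames/s the GPU can render."""
    return min(cpu_fps_cap, gpu_fps_cap)

# Made-up caps, just to show the shape of the argument:
# at 4K the GPU cap usually sits below even an older CPU's cap, so the CPU barely matters...
print(delivered_fps(cpu_fps_cap=140, gpu_fps_cap=75))   # -> 75 (GPU-bound)

# ...until a curated engine (or a 4090-class GPU) lifts the GPU cap above the CPU cap.
print(delivered_fps(cpu_fps_cap=90, gpu_fps_cap=160))   # -> 90 (CPU-bound, now the CPU shows)
```

That second case is exactly what the article had to engineer to make the CPU show up in the 4K numbers at all.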
TLDR: Run that 4770K until it hurts.
