I barely have enough energy to post my experience with this question, since seeing this argument over and over makes me very tired... so if you're interested, here goes:
I have an AMD 1950X and an Intel 8700K.
When I first set up my current gaming machine, it was running with:
- 1950X @ 4 GHz
- ASRock Fatal1ty X399 Professional Gaming sTR4 AMD X399 mobo
- EVGA 1080 Ti FTW3
- Samsung 1TB 960 Pro
- 32GB G.SKILL TridentZ RGB Series (4 x 8GB) DDR4 3600
- Games at 2560x1440 @ 60-120 Hz (I switch refresh depending on how I feel)
All of my games played pretty much perfectly, or at least as expected, since I'm a hardcore realist. I didn't expect this rig to play newer AAA games at max settings at 120 fps, so I'd drop my refresh to 90, 75, or 60 depending on where I wanted to run V-Sync. Sometimes I just used Adaptive V-Sync... it depended on the game and how I was feeling that day.

The problems arose when I did something heavily single-threaded, think emulation. Newer forms of game emulation NEVER played well. You would expect that emulation on a host running at 4 GHz would give you at least 80% of the performance of a host running at 5 GHz (that's just the clock ratio), but that's not how it ever worked. It was more like 50% of the performance; there's a quick back-of-the-envelope sketch below.

Also, when playing games at 2560x1440@60 and trying to record at reasonably good quality settings at the same resolution, the recordings would consistently drop frames. I always had to scale the recording down to get smooth 60 fps recordings or streams. Many of you say that all of those threads should make you excel at recording while playing... that was not my experience.
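If you want to see that emulation gap in plain numbers, here's the back-of-the-envelope version. It's Python, the figures are my rough impressions rather than benchmarks, and the 5 GHz host is the hypothetical one from my example:

```python
# Back-of-the-envelope: what naive clock scaling predicts for
# single-threaded emulation vs. what I actually saw.
# These numbers are rough impressions, not measured benchmarks.

host_a_ghz = 4.0  # my 1950X at its all-core overclock
host_b_ghz = 5.0  # hypothetical faster host from my example

# If performance scaled purely with clock speed:
predicted = host_a_ghz / host_b_ghz
print(f"Naive clock-scaling prediction: {predicted:.0%}")  # 80%

# What it actually felt like in Cemu and similar emulators:
observed = 0.5
print(f"Observed (roughly):             {observed:.0%}")  # 50%

# The missing chunk is per-core IPC and memory latency: emulators
# hammer one or two threads, so extra cores don't save you.
```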
Due to having these issues (and my file/Plex server dying), I decided to move my 1950X and X399 mobo + RAM over to that machine and get the 8700K for my primary gaming machine. The new setup for my gaming machine ended up like this:
- 8700K @ 4.7 GHz on all cores
- ASUS ROG Strix Z370-E Gaming mobo
- EVGA 1080 Ti FTW3
- Samsung 1TB 960 Pro
- 16GB G.SKILL TridentZ RGB Series (2 x 8GB) DDR4 3200 CL14
- Games at 2560x1440 @ 60-120 Hz (I switch refresh depending on how I feel)
You'll notice that the only things that really changed are the processor, mobo, and RAM. I've gone with a relatively mild overclock of 4.7 GHz on all cores since I didn't delid and I want to keep my temps pretty low, which they are. I noticed a pretty stark difference on this rig compared to when it was running Threadripper...
First, my gaming performance was not noticeably different. We all know that as you run at higher resolutions and refresh rates, the GPU becomes the bottleneck, hence the 720p benchmarks when showing differences in CPU performance. True, no one with these procs will ever play at that resolution, but why bother with CPU benchmarks if you're just going to make the GPU the limiting factor? People complaining about this tire me... those benchmarks aren't really there to show you how fast a current game runs at 720p; they're there to give you an idea of the differences in CPU performance and how that might affect future games.
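In case the bottleneck logic isn't obvious: your frame rate is capped by whichever of the CPU or GPU takes longer per frame. A tiny sketch (Python again, with made-up illustrative frame times, not measurements from my rig):

```python
# Why low-res benchmarks isolate the CPU: effective frame rate is
# limited by whichever side takes longer on its share of a frame.
# All frame times below are made-up, illustrative numbers.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate frame rate when per-frame CPU and GPU work is pipelined."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# At 1440p the GPU dominates, so a faster CPU barely shows:
print(fps(cpu_ms=8.0, gpu_ms=14.0))  # ~71 fps, GPU-bound
print(fps(cpu_ms=6.0, gpu_ms=14.0))  # ~71 fps, the faster CPU is hidden

# At 720p the GPU's work shrinks, so the CPU difference becomes visible:
print(fps(cpu_ms=8.0, gpu_ms=4.0))   # ~125 fps, CPU-bound
print(fps(cpu_ms=6.0, gpu_ms=4.0))   # ~167 fps, now the CPU gap shows
```

That's the whole trick: shrink the GPU's share of the frame so the CPU numbers stop hiding behind it.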
Second, as I mentioned earlier, emulation saw almost a 100% increase in performance. It is a DRASTIC difference that took it from almost unplayable to almost perfect.
Third, recording was suddenly super smooth. 2560x1440@60 recordings didn't drop frames anymore, and my Twitch streams were staying at 60 fps. The 8700K does a much better job of giving me good recordings and streams.
So I've decided to leave things this way. I'm running my VMs and Plex/file services on the 1950X, and it's an overall much better experience. Hope this info helps someone trying to make a decision based on their specific needs.
FYI - the apps I typically use on my PC are:
- FFXIV
- FFXV
- The Witcher 3
- Battlefield 1
- Tekken 7
- Cuphead
- CEMU Emulator
- VMWare Workstation Pro
- OBS