Here's an idea for a future budget article - how much do Ryzen and Coffee Lake differ on older games? I'm thinking pre-2014 stuff like the BioShock trilogy, Deus Ex: Human Revolution, Half-Life 2, Portal 2, Oblivion / Skyrim, etc. Hell, it's that time of year when I'm more inclined to fire up Amnesia: The Dark Descent (2010) or SOMA (2015), or have a blast at F.E.A.R. (2005) again, than struggle to maintain interest in some recent 2017 titles. A lot of us play a wider mix of old & new games than most mainstream tech sites tend to represent, and I haven't seen a single site do something like an R3 1300X vs i3-8100 comparison for older games outside of the usual "12-game bubble" (BF1, Civ VI, GTA V, Hitman, Overwatch, Tomb Raider, Witcher 3, etc).
In theory "all old games will run fine on modern CPUs"; in practice, some open-world titles like Morrowind / Oblivion / Operation Flashpoint with large draw distances can absolutely bring a modern CPU to its knees by maxing out only 1-2 cores. In theory the Intel would win on IPC (based on Cinebench 1T scores), but the question is how well those synthetics actually translate to older games that only use 1-4 cores. It'd be interesting to do something different and pick a few oldies to test as part of a budget gaming article series.
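To make the question concrete, here's a back-of-envelope sketch of the naive prediction such an article would be testing: if a game is fully bound by one core, its fps should scale linearly with the single-thread synthetic score, and the interesting result is how much of that predicted gain shows up in practice. All numbers below are placeholders, not real benchmark results.

```python
# Naive model: a purely single-thread-bound game scales linearly
# with a 1T synthetic score. All values here are placeholders,
# NOT real Cinebench or fps figures.

def predicted_fps(baseline_fps, score_baseline, score_other):
    """Predicted fps if the game tracks the 1T synthetic ratio exactly."""
    return baseline_fps * (score_other / score_baseline)

def scaling_efficiency(measured_gain_pct, synthetic_gain_pct):
    """Fraction of the synthetic 1T advantage that shows up in-game."""
    return measured_gain_pct / synthetic_gain_pct

# Placeholder: CPU B scores 15% higher in a 1T synthetic than CPU A.
fps_a = 60.0
pred_b = predicted_fps(fps_a, 100.0, 115.0)   # naive prediction: ~69 fps

# Placeholder measured result: only a 10% in-game gain vs the 15% synthetic gap.
eff = scaling_efficiency(10.0, 15.0)
print(round(pred_b, 1), round(eff, 2))
```

An article could report that last number per game: how close to 1.0 the old open-world titles get would answer whether the Cinebench gap carries over.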