Intel 5th-Gen Core vs. 10th-Gen Core Architecture Benchmarked

The performance difference between the i7-5775C and the i3-10105F is pretty sad.

To be fair, it's former top end (consumer) vs current bottom end, but I had expected higher IPC gains. I am curious if the results are different in non-gaming workloads.

If you look at performance per $, the picture changes considerably, but that wasn't what the review was about.
 
Furthermore, if you had also told me that AMD would be dominating Intel on mainstream and high-end desktop platforms, as well as in the server market, by 2021, I would have passed out from laughter, so definitely don't listen to my long-term predictions.

This is no laughing matter; folks who bought AMD stock heavily around 2014 are rich now and are reading this from their villas in Hawaii overlooking the USN PACFLEET.

RIP.
 
Thank you Steve - really enjoyable material, at least for my geeky eyes :)
(and that bit about ALT+F4 your predictions on the future - priceless. Still chuckling :) )
 
Thank you AMD for lighting a fire under Intel's azz and waking them from their slumber of continuously ripping us off with little to no advancement gen to gen for almost 10 years.

Finally Alder Lake brings competition back and that's thanks to AMD! How ironic... 🙃
 
Continuously ripping us off? Never happened. Back when Broadwell released, consumer-grade chips were much, much cheaper. My i7-4790K was top of the stack and cost £240. AMD came along and raised all the prices. Now the cheapest Ryzen 5000 costs more than the top-end i7 of Sandy Bridge, Ivy Bridge, Haswell, etc.

Historically, AMD has price-gouged like no tomorrow. The Athlon 64 FX range cost over $1,000 for a single core, although at the time Intel's Extreme Editions cost about the same.

The more you know!
 
Any chance you can do the benchmarks at 1440p or 4K? Nobody ever believes me that the performance difference at those resolutions is limited by the GPU, not the CPU.
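For anyone skeptical, the back-of-the-envelope model is that the delivered frame rate is capped by whichever of the CPU or GPU is slower, so raising the resolution pushes the GPU's ceiling below every CPU's limit. A minimal sketch with made-up numbers, purely for illustration:

```python
# Toy bottleneck model: delivered FPS is capped by the slower component.
# All numbers below are illustrative, not measured.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The CPU prepares frames and the GPU renders them; whichever is
    slower sets the pace, so the delivered rate is the minimum."""
    return min(cpu_fps, gpu_fps)

fast_cpu, slow_cpu = 200.0, 140.0  # hypothetical frames prepared per second
gpu_limit = {"1080p": 250.0, "1440p": 160.0, "4K": 90.0}  # hypothetical GPU caps

for res, gpu_fps in gpu_limit.items():
    fast = delivered_fps(fast_cpu, gpu_fps)
    slow = delivered_fps(slow_cpu, gpu_fps)
    gap = (fast - slow) / slow * 100
    print(f"{res}: fast CPU {fast:.0f} fps, slow CPU {slow:.0f} fps, gap {gap:.0f}%")

# 1080p shows the full ~43% CPU gap; at 4K both systems sit at the
# GPU's 90 fps ceiling and the measured gap collapses to 0%.
```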
 
When I saw you demagogically pick the Broadwell part with the L4 cache, which back then made it almost as fast as Skylake 6xxx... honestly.

A very low attempt. It's like picking first-gen Ryzen parts that are actually on a different node, with the refreshed architecture and a performance boost from silicon bug fixes, and then claiming that's the real difference between the architectures over the years.

Very poor journalism.
 
Isn't it interesting that despite all the screams about how quad cores are obsolete and dead, not only is the i3 close in a surprising number of titles, but it keeps above 60 FPS for 1% lows in every game tested.

Goes to show how little CPU power you actually need to play games; all these gaming rigs with Core i9s and Ryzen 9 processors are wasting money. I think the slower DDR3 memory bus may be hurting Broadwell more than anything else.
 
It is important so AMD can get away with charging me the same amount for a six-core (5600X) that I paid years ago for an eight-core (2700).
You weren't forced to buy it; that was your choice.

AMD is not responsible for pricing after it leaves them; they set MSRP, then market demand and retailer markups take over. You could have waited and picked up a 5600X for much less now.
 
Any power numbers? I'd be curious how the old i7-5775C fared against the i3-10105F and the others, to see if that at least was any better.
 
Isn't it interesting that despite all the screams about how quad cores are obsolete and dead, not only is the i3 close in a surprising number of titles, but it keeps above 60 FPS for 1% lows in every game tested.

Goes to show how little CPU power you actually need to play games; all these gaming rigs with Core i9s and Ryzen 9 processors are wasting money. I think the slower DDR3 memory bus may be hurting Broadwell more than anything else.

So true. However, it's nice to have 120+ FPS as an option, which is more easily achievable on 6-8 cores.
 
Isn't it interesting that despite all the screams about how quad cores are obsolete and dead, not only is the i3 close in a surprising number of titles, but it keeps above 60 FPS for 1% lows in every game tested.

Goes to show how little CPU power you actually need to play games; all these gaming rigs with Core i9s and Ryzen 9 processors are wasting money. I think the slower DDR3 memory bus may be hurting Broadwell more than anything else.
In games that actually utilise cores, an 8-core CPU has 1% lows higher than the averages of a 4-core CPU. Also, take this bench with a grain of salt; he isn't really using CPU-taxing settings and areas. I know for a fact that a 7700K (4c/8t) drops to the low 30s in Cyberpunk with RTX on, and that's the average I'm talking about.
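For reference, "1% lows" are usually derived from a frame-time capture; one common method (reviewers differ, and the data below is synthetic) is to average the slowest 1% of frames and express that as FPS:

```python
# One common way "1% lows" are computed from a per-frame time capture:
# average the slowest 1% of frames and express it as FPS. Methodology
# varies between reviewers; the capture below is synthetic.
import random

def one_percent_low(frametimes_ms: list) -> float:
    """Average FPS over the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)  # longest frame times first
    n = max(1, len(worst) // 100)                # slowest 1% of samples
    return 1000.0 / (sum(worst[:n]) / n)

def average_fps(frametimes_ms: list) -> float:
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

# Synthetic run: mostly ~8 ms frames (~125 fps) plus occasional
# 20-30 ms stutters, loosely imitating a CPU-limited session.
random.seed(0)
capture = [random.uniform(7.0, 9.0) for _ in range(5000)]
capture += [random.uniform(20.0, 30.0) for _ in range(50)]

print(f"average: {average_fps(capture):.0f} fps")     # ~122 fps
print(f"1% low:  {one_percent_low(capture):.0f} fps") # ~40 fps
```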
 
So true. However, it's nice to have 120+ FPS as an option, which is more easily achievable on 6-8 cores.

I can achieve a stable 144 FPS in WoW Vanilla with my 4770K, and it's not even OC'd.

Don't fall for their propaganda that you need a 16-core, 32-thread CPU to get super FPS in games.

There are a lot of people who don't even play so-called triple-A games such as Failpank et al., and anything beyond 30 FPS is meaningless in strategy games from Paradox, e.g. Crusader Kings.

 
You weren't forced to buy it; that was your choice.

AMD is not responsible for pricing after it leaves them; they set MSRP, then market demand and retailer markups take over. You could have waited and picked up a 5600X for much less now.
MSRP for the 5600X is set by AMD at $299. The 2700 was MSRP $299. AMD is setting that, not the market.
 
AMD raised prices big time recently, but these fanboys just stick their fingers in their ears and pretend it didn't happen.
They get REALLY upset if you point out that the 3600, which commonly retailed for $159, was a whopping 2-3% slower than the 3600XT, meaning that at real-world pricing the 5600X was 20-25% faster than the 3600 in games for 90% more money.

But Intel is the big bad.
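For anyone who wants to check that value math, here's a minimal sketch using only the prices and the 20-25% uplift quoted above (the figures come from the comment, not from my own testing):

```python
# Checking the value claim above with the quoted numbers (illustrative,
# not measured): 3600 street price vs 5600X price, 20-25% gaming uplift.
r5_3600  = {"price": 159, "perf": 1.00}   # quoted common street price
r5_5600x = {"price": 300, "perf": 1.225}  # midpoint of the quoted 20-25%

extra_money = (r5_5600x["price"] / r5_3600["price"] - 1) * 100
extra_perf  = (r5_5600x["perf"]  / r5_3600["perf"]  - 1) * 100
value_ratio = (r5_5600x["perf"] / r5_5600x["price"]) / (r5_3600["perf"] / r5_3600["price"])

print(f"{extra_money:.0f}% more money for {extra_perf:.1f}% more performance")
print(f"perf per dollar vs the 3600: {value_ratio:.2f}x")
# -> roughly 89% more money for ~22.5% more performance,
#    i.e. about 0.65x the perf-per-dollar of the 3600.
```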
 