Then and Now: A decade of Intel CPUs compared, from Conroe to Haswell

Steve

Staff member

It's hard to believe 15 years have passed since I tested the Pentium 4 series for the first time. I honestly don't remember many of my experiences with the P4 range, if not because of its age then because it was a pretty rubbish series. I do have many fond memories of testing the Core 2 Duo series, however.

Six long years after the Pentium 4, we reviewed the first generation Core 2 Duo processors. Today we are going to take a look back at the Core 2 Duo and Core 2 Quad CPUs and compare them to the Nehalem-based Core i5-760 and Core i7-870, the Sandy Bridge Core i5-2500K and Core i7-2700K chips, and then to the current generation Haswell Celeron, Pentium, Core i3, Core i5 and Core i7 parts.

Read the complete article.

 
Very interesting article. I've had these CPUs:

Core 2 Duo E4500
Core 2 Duo E8400
Intel Core i5-760

I'm looking forward to upgrading once Skylake arrives.
 
I am one of those people who "claim" to be running an overclocked socket 775 CPU. I am still running a QX9650 at around 4.6GHz and it works fine for me. Lol, my work PC is better in a way: it's an i7-3770. I do want to look at an i7 one day in my home rig, as I do a lot of encoding and transcoding, and I am sure an i7 would boost performance for me dramatically, but so far, for gaming at least, there are very few games that are hindered by CPU performance, so I am happy for the time being.
 

The power consumption of a QX9650 @ around 4.6GHz would be hideous ;)

I always find it interesting when someone buys extreme high-end hardware that is essentially very poor value compared to the much cheaper alternatives, and then keeps it for a crazy long period of time.

I always imagine that those buying the $1000 EE processors upgrade every other day ;)

To me it makes more sense to buy more 'sensible hardware' and upgrade every year or two.

Anyway, thanks for commenting; it is certainly interesting to hear what readers are still working with.
 
I owned a Q9550 until April (it's now in my brother's PC); now I run an i5-4690K, paired with a GTX 970 since December '14. I don't mean to offend anyone, but while Haswell is faster, and most of the test numbers might check out, my observations are different. Nowhere in everyday usage would I assume the quad is 33-50% slower, as it is in the benchmarks. Setting aside the synthetic and productivity benches, I can see you used in-game benchmarks to test performance, because the selection is almost the same as my own when I set the two computers side by side to test them (except Crysis). And they are worthless and will just give your readers false presumptions. Playing Metro Redux, my fps never dropped below 30 apart from some hiccups; Tomb Raider was actually unplayable on the Q9550 in Mountain Village and Shanty Town, and not comfortable there on the i5 either, that game was just poorly made. Hitman in-game was a constant 60. The thing is, CPUs have been stuck since the Sandy Bridge era with mediocre incremental improvements, and Core 2 is just what most people out there would need for everyday computing, aside from 1) professionals doing video/Photoshop work and 2) gamers playing the latest sandbox/multiplayer titles. But jumping on the newest, bestest, awesomest Intel CPU will give you a headache if you expect more than 50%.
 

Calling Tomb Raider a poorly made game is quite... unsettling, considering that in some benchmarks (see Linus Tech Tips) it consistently gets the same high fps across many different processors.
 
While I agree with shopping in smaller increments, you have to take into account that some people were buying these chips cheaper, after end-of-life. I bought my Q9550 on sale in 2009 because I already had a 775 board, and when I finally switched to Haswell this year, for 50% more CPU power I paid 50% more (in the EU, so don't count it in US$). I'm a fan of bang for the buck, so I've been waiting too long to get some improvement in that respect, but GTA V and The Witcher 3 forced my hand.
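For what it's worth, the arithmetic behind that decision fits in a few lines (a sketch of my own, with the function name and the 50/50 figures from my case above purely illustrative):

```python
# Value check for a CPU upgrade (my own sketch, numbers from my case above):
# +50% performance for +50% price leaves performance-per-money flat, which is
# why only GTA V and The Witcher 3 finally forced my hand.

def perf_per_money(perf_gain_pct: float, price_increase_pct: float) -> float:
    """Ratio of the performance multiplier to the price multiplier (>1 is a win)."""
    return (1 + perf_gain_pct / 100) / (1 + price_increase_pct / 100)

print(perf_per_money(50, 50))  # 1.0 -> no bang-for-buck improvement at all
```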
 
I hoped for a clock-to-clock and core-to-core comparison, which would show the IPC progress pretty well. But it's a very good article anyway, thanks for writing it.

I wonder though what's with Broadwell - the article says that these CPUs are unavailable, but I've found some shops that have them in stock. And if they are available in Poland, they shouldn't be hard to get elsewhere.
 
I like the article, but there is one thing which bothers me: the "GTX 980 Gaming" results in Crysis 3. They were somewhat weird in the Broadwell review too, but I thought it was a one-time mistake. Here's what I mean:

the difference between the i7-4790K and the Celeron G1820 is 4.6 FPS @1680x1050 and 4.8 FPS @1920x1080.

The conclusion based on such results could be:

a) you don't really need a good CPU to play Crysis 3, since the Celeron gives almost the same results as the i7, or
b) something is bottlenecking the CPUs.

Another example, this time based on my own experience: Crysis 1 @1280x1024, details: High:
1) about 35 FPS right after the start of the "Relic" mission and about 30 FPS during the "Assault" mission on the bridge (Athlon X2 5200 @2612 MHz, 3 GB DDR2 @800 MHz and a GF GTS 450 512 MB)
2) about 42 FPS right after the start of the "Relic" mission and about 44 FPS during the "Assault" mission on the bridge
(i5 4460 @3200 MHz, 8 GB DDR3 @1600 MHz and the same GF GTS 450 512 MB).

It doesn't mean that the i5-4460 is only a bit faster than the Athlon X2 5200 in Crysis; it means that the CPU is bottlenecked by the GPU. That can be caused by many factors, e.g. resolution, settings, a GPU-heavy test scene or the GPU itself. I guess that in your case it's the GPU-heavy test scene (and maybe the settings).
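To make that concrete, here is a tiny sketch (my own, not the article's methodology) that turns the numbers above into relative uplifts:

```python
# Quick bottleneck sanity check: compare fps from two CPUs driving the same
# GPU at the same settings. If the uplift is far below the known CPU
# performance gap, the test scene is at least partly GPU-bound.

def relative_uplift(fps_slow: float, fps_fast: float) -> float:
    """Percentage fps gain from swapping CPUs while keeping the same GPU."""
    return (fps_fast - fps_slow) / fps_slow * 100

# My Crysis 1 numbers above (Athlon X2 5200 vs i5-4460, same GTS 450):
print(f"Relic:   {relative_uplift(35, 42):.0f}% uplift")  # 20%
print(f"Assault: {relative_uplift(30, 44):.0f}% uplift")  # 47%
```

A vastly faster CPU yielding only 20-47% more fps on the same GTS 450 is the signature of a GPU-limited scene.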


Same thing with Tomb Raider but I don't know the game nor the game engine, so I won't comment on that.

I wrote the above because I like Techspot and I'd like the site to improve, but maybe I've missed something and I'm totally wrong. In this case - feel free to correct me.

Greetings
 
Great article, but this type of comparison only matters to me when the clock speeds are matched up.

I hoped for a clock-to-clock and core-to-core comparison, which would show the IPC progress pretty well. But it's a very good article anyway, thanks for writing it.
Agreed.
 
Really enjoyed the article, thanks!
Anyone notice just how taxing the 4K encoding is? Wow, even on new CPUs!
 
Great article. Hope you can do a similar one for AMD CPUs.

The most expensive CPU I've ever bought was an Athlon 64 4400+, 400 US$.
The second most expensive, an i7-5820K, 380 US$. ;)

Also, it would be nice for another pair of articles with the same base as this one, but instead for Nvidia and AMD GPUs.

Greetings
 
Oh, but there was one? Even two.
https://www.techspot.com/article/928-five-generations-nvidia-geforce-graphics-compared/
https://www.techspot.com/article/942-five-generations-amd-radeon-graphics-compared/
Although I admit it would be more fun to compare 10 years of GPUs, there would probably be a problem getting those ancient cards. Compared to CPUs, top-of-the-line GPUs age faster and die like flies.
 
Great article @Steve, I love seeing a history comparison of processors and how they evolved over time. It's good to see we're making progress, but it's sad how much we have slowed down in recent times (since Sandy Bridge).
 
Great article. Is there any way to compare with CPUs from the '90s too? That would be so awesome :)
 
The advances in CPUs have benefited laptops more than desktops, and I am perfectly fine with that (if only the same could happen with phones). I think it was Ivy Bridge -> Haswell that increased battery life by several hours on ultrabooks.
 
Great article!
It really proves that the i5-2500K was one of the best and definitely most future-proof CPUs of its time, even if users didn't overclock it.
 
@Steve
Great article as always. There is something I'd like to suggest for future "[insert game name here] Benchmarked: Graphics & CPU Performance" articles. Would it be possible to include a "Hyper-Threading on/off" analysis under the "CPU Performance" section?
 
What impresses me the most is that the low-cost Celeron G1820 performed comparably to the $100 more expensive AMD A10-7870K. For most people the Celeron G1820 would work well. I still use only Core i7 CPUs in PCs where a customer uses Photoshop. The last few generations of Core i7 work superbly, performance-wise, with Photoshop.
 
Upgrading every year or two is a waste of money considering how little of a performance improvement you're getting. My work rig is an HP 8200 Elite with an i5-2400 (stock clocks) and 12GB RAM; my other work PC is an Ivy Bridge i7 (stock clocks) and it can be difficult to tell the difference, even with multiple demanding programs running.
Not saying that overall the new chips aren't quicker, they are, but it takes specific programs/benchmarks to show the difference outside of normal use.
My i7 930 @ 4.0GHz at home 'feels' just as fast, if not faster.
And that's from a business perspective; from a gaming perspective it's even less so.
I had an older comparison chart showing gaming performance between an i7 920 @ 4.0GHz vs. an i7 3770K @ 4.0GHz over 15 different games, and the difference was a few frames at most, even in demanding titles.

Spending what I did on my 930 was the best $300+ I've ever spent on a CPU; 5+ years later it's still pushing new GPUs like a boss, while I've watched people waste hundreds more by upgrading every gen or every other gen and convincing themselves it was an upgrade they needed.
Here's an interesting comparison:
http://www.anandtech.com/show/8426/...view-core-i7-5960x-i7-5930k-i7-5820k-tested/2

The 990X is still boss hog.
 
I'm still rocking an i7-860 with the Asus P55D-E Pro motherboard, and it has been flawless for 5+ years now. Thank you for the article; it is really nice to see how the performance of these chips over the years stacks up.
 
The power consumption of a QX9650 @ around 4.6GHz would be hideous ;)
Oh yeah. Anandtech's OC review of the processor fits with my experience of Yorkfield quads (and Kentsfield before them). My X6800 made an effective space heater, and unless you won the chip lottery, pushing a Q9450/Q9550/QX9650/QX9770 (all of which I've owned) to 4.2-4.5GHz pretty much means very good custom watercooling and a very stable motherboard.

I always find it interesting when someone buys extreme high-end hardware that is essentially very poor value when compared to the much cheaper alternatives and then keeps it for a crazy long period to time...I always imagine that those buying the $1000 EE processors upgrade every other day ;)
Unless money is no object, the only way to stay in the race is to upgrade as soon as the next series arrives, whether it offers tangible gains or not. Hardware devalues too rapidly to keep for any length of time, and it hits harder with the expensive parts - any perusal of eBay or similar for current pricing of a 4-5 year old top-of-the-line system (say, a 990X + Asus R3E mobo) should make sobering reading.

Thanks for another interesting read.
I hoped for a clock-to-clock and core-to-core comparison, that would show the IPC progress pretty well. But it's a very good article anyway, thanks for writing it.
That is more an academic exercise for the most part, and unless you do your benchmark homework, it doesn't take into account advances in instruction set extensions, memory controller and memory subsystem differences, or changes to a host of other architectural differences aside from the core pipeline itself (I/O controller, cache hierarchy etc.)
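A couple of lines show why the naive version of that exercise is so easy to get wrong (a sketch of my own; the scores are hypothetical placeholders, not TechSpot results):

```python
# Naive "IPC" estimate: benchmark score normalised by clock and core count.
# The scores below are hypothetical placeholders. Deliberately ignored:
# instruction-set extensions, memory controller and subsystem, cache
# hierarchy, I/O; the very factors that make such a comparison academic.

def naive_ipc(score: float, clock_ghz: float, cores: int) -> float:
    """Score per GHz per core; a crude per-clock throughput proxy."""
    return score / (clock_ghz * cores)

old = naive_ipc(score=400, clock_ghz=3.0, cores=4)  # hypothetical Core 2 Quad-class result
new = naive_ipc(score=700, clock_ghz=3.5, cores=4)  # hypothetical Haswell-class result
print(f"naive per-clock gain: {new / old - 1:.0%}")  # 50%
```

That headline 50% silently credits the whole platform (memory, caches, new instructions) to the core pipeline alone, which is exactly the objection above.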
 
The E6600 is the 2nd best CPU ever made. It may not be the fastest or most economical, but it still works great after all these years in one of my builds. It allowed the ordinary PC user to swap a steam tractor from the early 20th century for a rocket ship. Of course, after that there was the i7 920, which is the best CPU ever made. It's hard to imagine that the 920 is nearly seven years old and can still do everything a modern S2011/3 chip can. All the stuff in the HEDT segment since then is pretty much a rebrand of the original 920 with some slight tweaks and add-ons.

That's my personal take on the last 15 years of Intel CPUs. How sad I am that gone are the days when silicon CPUs offered a 50% performance increase with each generation.
 