Intel Core i3 vs. Core i5 vs. Core i7: What do you get by spending more?

Seeing the i3 do so well blew my tiny little mind. I knew the i7 in gaming wasn't vastly superior to the i5, but I had no idea the i3 did so well. I would really like to see the FX lineup added to the test.
 
I recently upgraded my mobo/CPU/RAM and went with the i5 3660. Coming from a six-year-old dual-core AMD 2850, the speed increase (along with 8 GB of DDR3 memory) was astonishing. But here's the thing: the biggest speed increase I've seen is because I have an SSD. My boot time is not much faster than before, still around 10 seconds to the login screen. Of course, many apps that I use are more responsive; converting audio formats from one to another is much quicker, unzipping archives is faster, etc. I'm not a big gamer, but I bought a nice gaming board from Gigabyte at a good price. I don't know that I could have justified spending well over $100 more for the i7. The i5 is plenty for me.
 
I have an i5 2400 in my work machine and I can notice the difference from all the i3s we deploy, especially with Office 2013 and some other programs I use daily.
If this article proves anything, it's that game developers are still behind the times, or lazy about taking the next step, when it comes to utilizing everything a top-dog CPU has to offer.
That, or they are keeping performance accessible for people with less-expensive setups.
There are games that show a distinct advantage (Tribes, Dragon Age: Inquisition, etc.), but not enough of them. This is why, five years later, I still run an i7 930 @ 4.0 GHz, and to all those who upgraded: sorry, but I'm still gaming as fast as you.

If you buy an i7 for gaming, it is because you don't want to upgrade for 5+ years. Just look at the old i7-9xx. It still performs just fine in the most modern games that utilize all 8 threads, while the old pre-Sandy Bridge i5s can't even beat a Phenom II X4. The same thing will happen in two years with the i7-4770K.
Agreed.
I've upgraded my GPU here and there to play at my resolution (1600p) but have no desire to upgrade from my 930.
Now, I understand that GPUs mostly take over at high resolutions, but will an i3 push 980s in SLI at 4K as well as an i7?
 
I agree with their conclusion that the newer Core i3 offers enough performance for gaming.

However, anyone building a new system should consider an i5 or i7 just to take advantage of DX12 performance and new game engines designed around the multi-core performance of the XB1/PS4 consoles.

As developers become more multi-core optimized, newer games in the next year or so will have an advantage on the i5/i7 over the i3.

This isn't really good advice. The main push for DX12 and things like Mantle is to remove the CPU as the limiting factor so that gaming performance is largely dictated by your video card. This means the i7 and i5 will actually offer less of a performance gain per dollar, since you would have more money to sink into your video card.

Also, the console CPUs are pathetic at best. They may have 8 cores, but even with all of them working at 100% efficiency I doubt they could out-math a new i3, probably not even a Pentium.
 
I agree with their conclusion that the newer Core i3 offers enough performance for gaming.

However, anyone building a new system should consider an i5 or i7 just to take advantage of DX12 performance and new game engines designed around the multi-core performance of the XB1/PS4 consoles.

As developers become more multi-core optimized, newer games in the next year or so will have an advantage on the i5/i7 over the i3.

This isn't really good advice. The main push for DX12 and things like Mantle is to remove the CPU as the limiting factor so that gaming performance is largely dictated by your video card. This means the i7 and i5 will actually offer less of a performance gain per dollar, since you would have more money to sink into your video card.
That is how it should be. If I want to game, I should only need to buy a higher-end GPU instead of both a high-end CPU and GPU. I hope Mantle makes it work.
 
I have an older Intel Ivy Bridge Core i5 3570K CPU and it's still more than adequate for my needs. Combine it with a decent SSD and this machine kicks ***.
 
Nah. It's still a dual-core chip, and some current and most certainly future games won't even run on dual-thread CPUs. Personally, I can't recommend Pentiums to anyone.
Every game ever released runs on dual-thread CPUs, and it will probably remain that way for a long time still. Also, measures like "how many threads it has" are meaningless. Most smartphone SoCs today have four threads, yet the dual-thread Pentiums run circles around them. Same for the console CPUs, which have eight threads but are no more powerful than an average laptop CPU (since they are equivalent to two 15W Kabini chips together).
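The point above can be sketched with a crude back-of-the-envelope formula: aggregate throughput is roughly cores × clock × per-clock performance, so a fast dual-thread chip can match or beat a slow eight-thread one. All figures below are illustrative assumptions, not measured data.

```python
# Rough single-number throughput estimate: cores x clock x relative IPC.
# This ignores memory bandwidth, turbo, and thermals; it only shows why
# counting threads alone is meaningless.

def throughput(cores, clock_ghz, relative_ipc):
    """Crude aggregate-throughput score (illustrative units)."""
    return cores * clock_ghz * relative_ipc

# Hypothetical chips: a 2-thread desktop Pentium vs. an 8-thread
# console-class low-power core at much lower clock and IPC.
pentium = throughput(cores=2, clock_ghz=3.2, relative_ipc=1.0)
console = throughput(cores=8, clock_ghz=1.6, relative_ipc=0.4)

print(pentium)  # 6.4
print(console)  # 5.12
```

With these assumed numbers, the two-thread chip edges out the eight-thread one overall, and per-thread it is several times faster, which matters for games that load one or two threads heavily.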
If you buy an i7 for gaming it is because you don't want to upgrade for 5+ years. Just look at the old i7-9xx. It still performs just fine the most modern games that utilize all 8 threads while the old pre-sandy bridge i5's can't even beat a Phenom II x4. Same thing will happen in 2 years with the i7-4770K.
Buying an i7 does not mean you won't need to upgrade in five years (compared to buying an i5). The only advantage an i7 has over an i5 is hyper-threading. The only scenario where an i7 would be any benefit is if the game can use more than four threads. Even then, the CPU performance increase would be somewhere up to 30% (what Intel claims hyper-threading can offer), but that does not translate into 30% higher FPS. The most likely scenario is that within five years your i7 will either provide no advantage over an i5, or will provide a small increase in performance IF the game uses hyper-threading. It means that if the i5 is no longer enough, the i7 isn't either. And if the i7 is enough, the i5 is as well.
Also, you can't compare the i5 and mainstream i7 of today (literally the same chip, only the i5 has hyper-threading and 2 MB of cache disabled; been this way since Sandy Bridge) to the first generation i5 and i7, which were two different chips (Lynnfield vs. Bloomfield) with different TDPs on different platforms. Their difference was more than just hyper-threading and a bit of cache.
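The "30% CPU speedup does not mean 30% more FPS" argument above can be shown with simple arithmetic: a frame's time is spent partly on the CPU and partly on the GPU, and hyper-threading only shrinks the CPU part. The 10 ms / 15 ms split below is an illustrative assumption, not a benchmark.

```python
# Simplified frame-time model: per-frame cost is a CPU portion plus a
# GPU portion (real pipelines overlap, but the trend holds).

def fps(cpu_ms, gpu_ms):
    return 1000.0 / (cpu_ms + gpu_ms)

base = fps(cpu_ms=10.0, gpu_ms=15.0)           # 40.0 FPS
with_ht = fps(cpu_ms=10.0 / 1.3, gpu_ms=15.0)  # ~44.1 FPS

gain = (with_ht / base - 1) * 100
print(round(gain, 1))  # ~10.2% FPS gain from a 30% CPU speedup
```

Under these assumptions a full 30% CPU-side speedup yields only about a 10% FPS improvement, and the more GPU-bound the game, the smaller that number gets.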
 
If you buy an i7 for gaming, it is because you don't want to upgrade for 5+ years. Just look at the old i7-9xx. It still performs just fine in the most modern games that utilize all 8 threads, while the old pre-Sandy Bridge i5s can't even beat a Phenom II X4. The same thing will happen in two years with the i7-4770K.
Forgot one important part: On top of the performance stuff I mentioned above, there is a bigger issue. Five years from now you'll be pressed to upgrade anyway due to platform improvements, not performance. The X58 platform from five years ago has no access to PCIe 3.0 and very limited access to USB 3.0 and SATA 6 Gbps (not natively supported, some motherboards used additional controllers to support those features and had few ports). Those features are now standard (and with many ports) on even the cheapest H97/Z97 motherboards you can find.
 
Personally I regretted going for the mainstream Haswell. I'm upgrading this year to the cheapest options of Haswell-E. Here are my arguments:
* I don't make use of the integrated Intel HD graphics (Haswell-E has none)
* I bought an aftermarket CPU cooler (Haswell-E CPUs include none)
* I bought a high-end, two-way SLI-capable Z87 mobo; which in the end is similarly priced to the cheapest X99 mobo options with 3-way SLI
* For such a small price difference between the most expensive mainstream i7 and the cheapest i7 Extreme, it's worth it for the 4 additional threads
Well, I agree with you, but you also have to take into account the cost of DDR4, which is currently much higher than DDR3. Yes, the i7 5820K is one heck of a bargain since you get support for 3 GPUs (28 lanes), 6 physical cores plus 6 virtual (12 threads total), and a better-designed thermal interface for better temperatures. Another thing to consider is the binning of the 5820K: while it's a 6-core, 12-thread beast, it starts at a very low clock, and from what I've seen it is all over the place when it comes to overclocking. I have seen many people report easily hitting 4.5 GHz, while others struggle at the 4.0 GHz threshold, which could limit the longevity of the chip in the long run (factoring in the voltage generally required and the lower stock clock). Overall it's still a great deal, but make sure you understand the risks involved with such a chip before taking the plunge.

For gaming, though, you're probably going to see a very minute difference until DX12 becomes mainstream. Even with exceptions included, games do not really utilize more than 4 cores properly for the most part. I own a 5930K and I love everything about that chip, but compared to my friend's i5 3570K at similar clocks (he also owns an R9 290X), with CrossFire turned off on my machine the difference is negligible at best.
 
For gaming, though, you're probably going to see a very minute difference until DX12 becomes mainstream.
The point of DirectX 12 is to reduce the workload on the CPU by reducing API overhead. That means low-end CPUs will perform closer to higher-end CPUs, not that high-end CPUs (especially the ones with higher core counts) will automatically see better usage. If Core i7 processors are already not that relevant for gaming compared to i5 and i3 processors, DirectX 12 will make them even less so.
The only thing that could make i7 processors more relevant for gaming is better multithreading support. That already came with DirectX 11, but it failed to make any impact on the market. The vast majority of DX11 developers still prefer to put most of the workload on fewer than four threads.
 
Nah. It's still a dual-core chip, and some current and most certainly future games won't even run on dual-thread CPUs. Personally, I can't recommend Pentiums to anyone.
Every game ever released runs on dual-thread CPUs, and it will probably remain that way for a long time still. Also, measures like "how many threads it has" are meaningless. Most smartphone SoCs today have four threads, yet the dual-thread Pentiums run circles around them. Same for the console CPUs, which have eight threads but are no more powerful than an average laptop CPU (since they are equivalent to two 15W Kabini chips together).
If you buy an i7 for gaming, it is because you don't want to upgrade for 5+ years. Just look at the old i7-9xx. It still performs just fine in the most modern games that utilize all 8 threads, while the old pre-Sandy Bridge i5s can't even beat a Phenom II X4. The same thing will happen in two years with the i7-4770K.
Buying an i7 does not mean you won't need to upgrade in five years (compared to buying an i5). The only advantage an i7 has over an i5 is hyper-threading. The only scenario where an i7 would be any benefit is if the game can use more than four threads. Even then, the CPU performance increase would be somewhere up to 30% (what Intel claims hyper-threading can offer), but that does not translate into 30% higher FPS. The most likely scenario is that within five years your i7 will either provide no advantage over an i5, or will provide a small increase in performance IF the game uses hyper-threading. It means that if the i5 is no longer enough, the i7 isn't either. And if the i7 is enough, the i5 is as well.
Also, you can't compare the i5 and mainstream i7 of today (literally the same chip, only the i5 has hyper-threading and 2 MB of cache disabled; been this way since Sandy Bridge) to the first generation i5 and i7, which were two different chips (Lynnfield vs. Bloomfield) with different TDPs on different platforms. Their difference was more than just hyper-threading and a bit of cache.
There are already games that refuse to run on fewer than four threads. When DA:I came out, it refused to run on anything with fewer than four threads. Maybe they changed that, but most likely not.

Also, you talk about the Bloomfield i7 (the 920, for example), which was unusually popular for an extreme platform, but there was an i7 for LGA 1156 (the i7 860, for example) that is more common! You, sir, are the one mixing things up! Even if they were two different chips, the difference is basically the same as now (cache size, HT, frequency, and the number of memory controller channels).
You could say the i5 760 was the 4670K of its day, the i7 860 -> i7 4770K, and the i7 920 -> 5820K (the lowest-end chip on LGA 2011-3). Or you can say i7 920 -> 4820K (quad-core on LGA 2011), if that makes you happy.
 
Well I agree with you but you also have to take into account the cost of DDR4 which is much higher than DDR3 is currently. Yes an i7 5820K is one heck of a bargain since you get support for 3 GPUs (28 lanes), 6 physical cores and 6 virtual (12 threads total), and the better designed thermal interface for better temps and performance. Another thing you have to take into account however is the binning process for the i7 5820k because while its a 6 core 12 thread beast, it starts very low clocked and from what I have seen it does go all over the place when it comes to overclocking. I have seem many people report easily hitting 4.5ghz on some while others stress at the 4.0ghz threshold which could limit the survivability of the chip in the long run (I am factoring in the voltage generally required and the fact that the processor is lower clocked). Overall its still a great deal, but you just have to make sure you understand the risks involved with such a chip before making the drop.

Well, right now the DDR4 point still holds true. I forgot to mention it, even though I'm considering it in the budget. I hope, though, that people interested in Skylake and DDR4 will also consider the Extreme option instead of simply discarding it beforehand as too expensive (people usually pair the most expensive X99 mobo with the most expensive Extreme CPU to prove their point).

The thing is that when I'm playing video games, most of the time I'm running another lengthy task in the background (effectively running the CPU at 100%) due to lack of available time to play. Either that, or my brother is connecting to his local user account through Remote Desktop over LAN (from a basic notebook) to do his homework (video editing) while I'm playing in my user session. Yes, two interactive Windows sessions running simultaneously. I know I'm probably in the 1% use scenario, if not simply the minority, and no one else would hit the limits of the mainstream option.

I once considered running a NAS for storage over LAN, but the need for computing power is also there. The rest of my family doesn't have the budget for additional hardware now that my siblings are starting college, so I saw the opportunity to get more out of my gaming rig, which is still a lot better than a basic notebook, regardless of whether my GPU is professional-grade or good at OpenCL/GL.
 
I love these kinds of articles. TechSpot seems to be the best at this kind of thing, followed closely by Tom's Hardware.

It's interesting to note that up to a 770/960 or 280X level of performance, an i3 is enough.

Now, if you'd be so kind, do this article with AMD CPUs: the 860K, FX 4300, 6300, 8350 and/or 9590 :D. Throw in OC results too if you have time :D

AMD makes CPUs? No, just kidding. Oh wait, seriously? Never heard of them.
 
So, if I understand the article correctly, it's saying that if you purchase a very fast GPU you don't really need a quad-core for video games?

Also, there's no mention of multi-tasking in general. I would assume anyone who keeps a lot of programs running at once would benefit from a quad-core CPU. I have 1 or 2 web browsers open almost 24/7 with a ridiculous number of open tabs, plus the Visual Studio programming environment, PDF readers, email clients, and Office programs. Wouldn't a quad-core be hugely beneficial in these scenarios as well?

I bought my first quad-core CPU literally days ago, and the article makes it sound like a waste of money.
 
Don't overthink your decision :). You will most likely benefit from choosing an i7 for work! And even so, new games can experience sudden FPS drops in some scenes (FC4, for example) if you play on an i3 (and I'm not talking drops below 60 FPS for vsync, but somewhere in the 30s). That part of the story is not shown here.
 
So, if I understand the article correctly, it's saying that if you purchase a very fast GPU you don't really need a quad-core for video games?

Also, there's no mention of multi-tasking in general. I would assume anyone who keeps a lot of programs running at once would benefit from a quad-core CPU. I have 1 or 2 web browsers open almost 24/7 with a ridiculous number of open tabs, plus the Visual Studio programming environment, PDF readers, email clients, and Office programs. Wouldn't a quad-core be hugely beneficial in these scenarios as well?

I bought my first quad-core CPU literally days ago, and the article makes it sound like a waste of money.

I can't say that in the multi-tasking scenario you describe a Core i5 will be hugely beneficial over a Core i3, at least no more so than what we have shown, unless you are encoding a video at the same time, using Photoshop heavily, and still expecting normal usage. The web browser thing is more a test of system memory than CPU performance; you can open 100 tabs if you want, and as long as you have enough system memory the performance will be much the same on a Core i3 as on a Core i5.

If you can afford a Core i5 over a Core i3, it is obviously the better choice; it is, after all, faster.

Don't overthink your decision :). You will most likely benefit from choosing an i7 for work! And even so, new games can experience sudden FPS drops in some scenes (FC4, for example) if you play on an i3 (and I'm not talking drops below 60 FPS for vsync, but somewhere in the 30s). That part of the story is not shown here.

No way. I have played Far Cry 4 extensively with a Core i3 and a Core i5 using the GTX 980, and at no point could I tell which processor was being used.
 
"The Core i3 absolutely offers the best value for PC gamers" - oh really? This poor excuse for a review did not test ANY MMOs (in huge raid settings with thousands of NPCs plus hundreds of players in the scene at the same time). If you are an MMO-oriented gamer (and here I mean real MASSIVE-multiplayer, not some solo-player-online-adventure-with-a-few-random-dudes-trying-to-gank-you type MMO), you want an i7!
 
So, if I understand the article correctly, it's saying that if you purchase a very fast GPU you don't really need a quad-core for video games?

Also, there's no mention of multi-tasking in general. I would assume anyone who keeps a lot of programs running at once would benefit from a quad-core CPU. I have 1 or 2 web browsers open almost 24/7 with a ridiculous number of open tabs, plus the Visual Studio programming environment, PDF readers, email clients, and Office programs. Wouldn't a quad-core be hugely beneficial in these scenarios as well?

I bought my first quad-core CPU literally days ago, and the article makes it sound like a waste of money.

I can't say that in the multi-tasking scenario you describe a Core i5 will be hugely beneficial over a Core i3, at least no more so than what we have shown, unless you are encoding a video at the same time, using Photoshop heavily, and still expecting normal usage. The web browser thing is more a test of system memory than CPU performance; you can open 100 tabs if you want, and as long as you have enough system memory the performance will be much the same on a Core i3 as on a Core i5.

If you can afford a Core i5 over a Core i3, it is obviously the better choice; it is, after all, faster.

Don't overthink your decision :). You will most likely benefit from choosing an i7 for work! And even so, new games can experience sudden FPS drops in some scenes (FC4, for example) if you play on an i3 (and I'm not talking drops below 60 FPS for vsync, but somewhere in the 30s). That part of the story is not shown here.

No way. I have played Far Cry 4 extensively with a Core i3 and a Core i5 using the GTX 980, and at no point could I tell which processor was being used.
Yeah, OK, it might not be as big of a difference on Nvidia cards, but on AMD graphics the FPS drops are quite noticeable. It's a known issue with their driver. There was a good article with in-game graphs not long ago; I'm sure you read it.
 
Yeah, OK, it might not be as big of a difference on Nvidia cards, but on AMD graphics the FPS drops are quite noticeable. It's a known issue with their driver. There was a good article with in-game graphs not long ago; I'm sure you read it.

So is it the GPU or the CPU causing these slowdowns?

Also, if it is a known driver issue, should we have mentioned it in the article? I don't see how this could be part of the story.
 
There are already games that refuse to run on fewer than four threads. When DA:I came out, it refused to run on anything with fewer than four threads. Maybe they changed that, but most likely not.
Also, you talk about the Bloomfield i7 (the 920, for example), which was unusually popular for an extreme platform, but there was an i7 for LGA 1156 (the i7 860, for example) that is more common! You, sir, are the one mixing things up! Even if they were two different chips, the difference is basically the same as now (cache size, HT, frequency, and the number of memory controller channels).
You could say the i5 760 was the 4670K of its day, the i7 860 -> i7 4770K, and the i7 920 -> 5820K (the lowest-end chip on LGA 2011-3). Or you can say i7 920 -> 4820K (quad-core on LGA 2011), if that makes you happy.
So are you telling me that this game would run on a 15W Kabini because it has four threads, but not on a 55W Pentium (which can be overclocked) because it has two? The number of threads is not indicative of performance.
Also, the comment I was quoting talks specifically about Bloomfield ("Just look at the old i7-9xx"). And the reason it still performs so well today is because it can be overclocked. With the possible exception of the i7-875K, that's not the case with Lynnfield processors. It does not mean a Lynnfield i7 performs better today than a Lynnfield i5 just because it has hyper-threading. Just as that doesn't work for Sandy Bridge, Ivy Bridge, and Haswell, it does not work for Nehalem.
His comment said that the i7-9xx performs well today while a Lynnfield i5 barely outperforms a Phenom II. That means that, in games, a Lynnfield i7 barely outperforms the Phenom II as well (outside of the overclockable 875K). This is not the same as saying that buying a Haswell i7 (on LGA 1150) today will make the system live longer than a Haswell i5; that will not happen just because of hyper-threading. The difference is that unlike Lynnfield, Haswell has unlocked i5 chips as well, so an i5-4690K will live just as long as an i7-4790K.
 
I can't say that in the multi-tasking scenario you describe a Core i5 will be hugely beneficial over a Core i3, at least no more so than what we have shown, unless you are encoding a video at the same time, using Photoshop heavily, and still expecting normal usage. The web browser thing is more a test of system memory than CPU performance; you can open 100 tabs if you want, and as long as you have enough system memory the performance will be much the same on a Core i3 as on a Core i5.

If you can afford a Core i5 over a Core i3, it is obviously the better choice; it is, after all, faster.

I've been happily using a Sandy Bridge and Ivy Bridge pair of dual-core CPUs for a long while now. They have served me pretty well.

I must now throw out a related movie quote answering the question, "Why did you buy the quad-core CPU?" Answer: "Because he could!" :) If I were rich, I could justify my quad-core purchase by responding, "Why not?" But I'm not. It's never a good decision to buy something that isn't a good bang for the buck. This obviously only applies to people who have to spend their money wisely.

The article does give the impression that a quad-core is the sweet spot in the world of multi-core CPUs. So, on one hand, I consider myself a victim of perpetual consumerism: purchasing for no other reason than that something is newer, faster, or more technologically advanced. In many of my purchases the benefits of a newer generation over the last are tangible and easily measurable, e.g., USB 3.0 > USB 2.0. If all I get is faster Blu-ray ripping/encoding/decoding for my extra $70.00, I think I might want my money back. I could have built two extra computers instead of one.

Even in the parts of the article that show a quad-core CPU is faster than dual-core CPUs, is it really enough to worry about? I'm thinking no at this point.
 
Even in the parts of the article that show a quad-core CPU is faster than dual-core CPUs, is it really enough to worry about? I'm thinking no at this point.
In the worst-case scenario in the article for gaming, which is Metro Redux with a GTX 980, the i5-4690 offers 20.91% more performance (at 1080p) for an 87.5% higher price. That's already bad value, and it's unrealistic that someone on a budget would be using a GTX 980 to begin with. With the GTX 960, it offers 6.62% more performance for the same 87.5% higher price. And that's the case where the jump from the i3 to the i5 is the largest.
Indeed, anything above a Core i3 is bad value. People can justify it with "I'm not on a budget, so why not" or the hypothetical future game that will finally use four or more cores effectively (which we've been hearing about for years, yet still hasn't happened), but a Core i5 isn't by any means good value for gaming.
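The value argument above can be made explicit with a performance-per-price ratio using the figures quoted (GTX 980, Metro Redux, 1080p): 20.91% more performance for 87.5% more money.

```python
# Performance-per-price check: ratio of relative performance to
# relative price; 1.0 means equal value to the cheaper part.

def relative_value(perf_gain_pct, price_gain_pct):
    return (1 + perf_gain_pct / 100) / (1 + price_gain_pct / 100)

print(round(relative_value(20.91, 87.5), 2))  # 0.64 -> worse value than the i3
print(round(relative_value(6.62, 87.5), 2))   # 0.57 with the GTX 960
```

Anything well below 1.0 means you are paying proportionally more than the extra performance you get, which is exactly the point being made about the i5 here.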
 
Hmm, Haswell-E looks like an improvement over my 2600K, albeit one running at 4.7 GHz. I might consider an upgrade this year... wait! £800+ for the CPU. I think I'll hold off a while longer :)

It must get dull doing these CPU comparisons when so many of the tests show very small improvements.

Personally I regretted going for the mainstream Haswell. I'm upgrading this year to the cheapest options of Haswell-E. Here are my arguments:
* I don't make use of the integrated Intel HD graphics (Haswell-E has none)
* I bought an aftermarket CPU cooler (Haswell-E CPUs include none)
* I bought a high-end, two-way SLI-capable Z87 mobo; which in the end is similarly priced to the cheapest X99 mobo options with 3-way SLI
* For such a small price difference between the most expensive mainstream i7 and the cheapest i7 Extreme, it's worth it for the 4 additional threads

Go for it. I did. It's a great platform.

I upgraded from a 2600K, which was no slouch to say the least, but the lack of available PCIe lanes after going SLI bugged me.

LGA 2011-3 has loads of plus points:
more RAM slots, more cores/threads, more PCIe lanes, more SATA ports, an M.2 slot, and no wasted die space taken up by the iGPU. The Haswell-E chips are absolutely huge compared to the mainstream parts; I couldn't believe the size when I saw one next to my 2600K.
 
So is it the GPU or the CPU causing these slowdowns?

Also, if it is a known driver issue, should we have mentioned it in the article? I don't see how this could be part of the story.
I'd assume you're referring to AMD's well-documented graphics driver overhead issue [benchmark chart omitted], which becomes apparent as CPU resources become strained.

I think this may be part of the rationale behind Steve and many other sites using an Nvidia card when testing CPU gaming performance, since it largely removes the graphics driver overhead from the equation. The AMD graphics driver overhead issue and the vagaries of CrossFireX/SLI scaling are also a major reason multi-GPU gaming benchmark results show such huge variances depending on the test system being used. It's also probably fair to say that the overhead issue was a significant factor in AMD pursuing Mantle, and why the speedup when using it was quite dramatic in CPU-constrained scenarios.
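The "overhead becomes apparent as CPU resources become strained" point can be sketched with a toy model: the driver charges a CPU cost per draw call, so the same per-call overhead that is invisible on a fast CPU can eat a big chunk of a slow CPU's frame budget. The draw-call counts and per-call costs below are illustrative assumptions, not measured driver figures.

```python
# Toy model of driver overhead: per-frame CPU cost of submitting draw
# calls, scaled by how fast the CPU is relative to a baseline.

def cpu_frame_ms(draw_calls, us_per_call, cpu_speedup):
    """CPU milliseconds spent per frame in the driver's submit path."""
    return draw_calls * us_per_call / 1000.0 / cpu_speedup

# The same per-call overhead costs little on a fast CPU...
fast = cpu_frame_ms(draw_calls=5000, us_per_call=2.0, cpu_speedup=2.0)  # 5.0 ms
# ...but can dominate a 16.7 ms (60 FPS) frame budget on a slower one.
slow = cpu_frame_ms(draw_calls=5000, us_per_call=2.0, cpu_speedup=1.0)  # 10.0 ms

print(fast, slow)
```

This is also why testing CPUs with a low-overhead driver isolates the CPUs themselves, while a high-overhead driver makes the results partly a driver benchmark.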


/My apologies Steve for bombarding your review thread with the work of other sites.
 