Guest
Seeing the i3 do so well blew my tiny little mind. I knew the i7 wasn't vastly superior to the i5 in gaming, but I had no idea the i3 did so well. I would really like to see the FX lineup added to the test.
Agreed. If you buy an i7 for gaming it is because you don't want to upgrade for 5+ years. Just look at the old i7-9xx. It still performs just fine in the most modern games that utilize all 8 threads, while the old pre-Sandy Bridge i5s can't even beat a Phenom II X4. The same thing will happen in 2 years with the i7-4770K.
I agree with their conclusion that the newer Core i3 offers enough performance for gaming.
However, anyone building a new system should consider an i5 or i7 just to take advantage of DX12 performance and the new game engines being designed to exploit multi-core performance on the XB1/PS4 consoles.
As developers become more multi-core optimized, newer games in the next year or so will have an advantage on the i5/i7 over the i3.
That is how it should be. If I want to game, I should only need to buy a higher-end GPU instead of both a CPU and a GPU. I hope Mantle makes it work.
This isn't really good advice. The main push for DX12 and things like Mantle is to remove the CPU as the limiting factor so that gaming performance is largely dictated by your video card. That means the i7 and i5 will actually be less of a performance gain when you consider budget, since you would have more money to sink into your video card.
Every game ever released runs on dual-thread CPUs, and it will probably remain that way for a long time still. Also, measures like "how many threads it has" are meaningless. Most smartphone SoCs today have four threads, yet the dual-thread Pentiums run circles around them. Same for the console CPUs, which have eight threads but are no more powerful than an average laptop CPU (since they are equivalent to two 15W Kabini chips together).

Nah. It's still a dual-core chip, and some current and most certainly future games won't even run on dual-thread CPUs. Personally I can't recommend Pentiums to anyone.
Buying an i7 does not mean you won't need to upgrade in five years (compared to buying an i5). The only advantage an i7 has over an i5 is hyper-threading. The only scenario where an i7 would be any benefit is if the game can use more than four threads. Even then, the CPU performance increase would be somewhere up to 30% (what Intel claims hyper-threading can offer), but that does not translate into 30% higher FPS. The most likely scenario is that within five years your i7 will either provide no advantage over an i5, or will provide a small increase in performance IF the game uses hyper-threading. It means that if the i5 is no longer enough, the i7 isn't either. And if the i7 is enough, the i5 is as well.
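To illustrate why a CPU-side gain doesn't map one-to-one to FPS, here is a rough sketch. All the frame times are hypothetical, and it assumes a simple model where the slower of the CPU and GPU paces each frame:

```python
# Rough model: each frame is paced by whichever of CPU or GPU is slower.
# All numbers below are hypothetical, for illustration only.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower stage paces the frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 12.0                  # GPU needs 12 ms per frame
i5_cpu_ms = 10.0               # i5 needs 10 ms of CPU work per frame
i7_cpu_ms = i5_cpu_ms / 1.3    # i7 ~30% faster CPU throughput (best case for HT)

print(fps(i5_cpu_ms, gpu_ms))  # ~83.3 FPS: GPU-bound, the i5 is already fast enough
print(fps(i7_cpu_ms, gpu_ms))  # ~83.3 FPS: the extra CPU speed buys nothing here
```

Only when the CPU time exceeds the GPU time does the i7's headroom show up in FPS, which is why the 30% figure rarely appears on the frame counter.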
Forgot one important part: on top of the performance stuff I mentioned above, there is a bigger issue. Five years from now you'll be pressed to upgrade anyway due to platform improvements, not performance. The X58 platform from five years ago has no access to PCIe 3.0 and very limited access to USB 3.0 and SATA 6 Gbps (not natively supported; some motherboards used additional controllers to support those features and had few ports). Those features are now standard (and with many ports) on even the cheapest H97/Z97 motherboards you can find.
Well, I agree with you, but you also have to take into account the cost of DDR4, which is currently much higher than DDR3. Yes, an i7-5820K is one heck of a bargain since you get support for 3 GPUs (28 lanes), 6 physical cores and 6 virtual (12 threads total), and the better-designed thermal interface for better temps and performance. Another thing you have to take into account, however, is the binning process for the i7-5820K, because while it's a 6-core, 12-thread beast, it starts very low clocked, and from what I have seen it goes all over the place when it comes to overclocking. I have seen many people report easily hitting 4.5 GHz on some, while others stress at the 4.0 GHz threshold, which could limit the survivability of the chip in the long run (I am factoring in the voltage generally required and the fact that the processor is lower clocked). Overall it's still a great deal, but you just have to make sure you understand the risks involved with such a chip before making the drop.
The point of DirectX 12 is reducing the workload on the CPU by reducing API overhead. That means that low-end CPUs will perform closer to higher-end CPUs, not that high-end CPUs (especially the ones with higher core counts) will automatically see better usage. If Core i7 processors are already not relevant for gaming compared to i5 and i3 processors, DirectX 12 will make them even more so.

For gaming though, you're probably going to see a very minute difference until DX12 becomes mainstream.
There are already games that refuse to run on less than four threads. When DA:I came out, it refused to run on anything with fewer than 4 threads. Maybe they changed that, but most likely not.
Also, you can't compare the i5 and mainstream i7 of today (literally the same chip, only the i5 has hyper-threading and 2 MB of cache disabled; been this way since Sandy Bridge) to the first generation i5 and i7, which were two different chips (Lynnfield vs. Bloomfield) with different TDPs on different platforms. Their difference was more than just hyper-threading and a bit of cache.
I love these kinds of articles. TechSpot seems to be the best at this kind of thing, followed closely by Tom's Hardware.
It's interesting to note that up to a 770/960 or 280X level of performance, an i3 is enough.
Now, if you'd be so kind, do this article with AMD CPUs: the 860K, FX 4300, 6300, 8350 and/or 9590. Throw in OC results too if you have time.
So, if I understand the article correctly, it is saying that if you purchase a very fast GPU, you don't really need a quad-core for video games?
Also, no mention of multi-tasking in general. I would assume that anyone who keeps a lot of programs running at once would benefit from a quad-core CPU. I have 1 or 2 web browsers open almost 24/7 with a ridiculous amount of open tabs. Plus, Visual Studio programming environment, PDF readers, email clients, Office programs. Wouldn't a quad-core in these scenarios be hugely beneficial as well?
I just bought my first quad-core CPU literally days ago and after reading the article it makes it sound like a waste of money.
Don't overthink your decision. You will most likely benefit from choosing an i7 for work! And even so, new games can experience sudden FPS drops in some scenes (FC4 for example) if you play on an i3 (and I'm not talking drops below 60 FPS for vsync, but somewhere in the 30s). That part of the story is not seen here.
I can't say that in the multi-tasking scenario you speak of a Core i5 will be hugely beneficial over a Core i3, at least no more so than what we have shown. Unless you are encoding a video at the same time and using Photoshop heavily and still expecting normal usage. The web browser thing is more a test of system memory than CPU performance; you can open 100 tabs if you want, and as long as you have enough system memory the performance will be much the same on a Core i3 as on a Core i5.
If you can afford a Core i5 over a Core i3 it is obviously the better choice; it is, after all, faster.
No way. I have played Far Cry 4 extensively with a Core i3 and a Core i5 using the GTX 980, and at no point was I able to tell which processor was being used.
Yeah, OK, it might not be as big of a difference on Nvidia cards, but on AMD graphics the FPS drops are quite noticeable. It's a known issue with their driver. There was a good article with in-game graphs not long ago; I'm sure you read it.
So are you telling me that this game would run on a 15W Kabini because it has four threads, but not on a 55W Pentium (which can be overclocked) because it has two? Number of threads is not indicative of performance.
Also, you talk about the Bloomfield i7 (the 920 model, for example), which was unusually popular for an extreme platform, but there was an i7 for LGA 1156 (the i7-860, for example) that was more common! You, sir, are the one mixing things up! Even if there were two different chips, the difference was basically the same as now (differences in cache size, HT, frequency, and the number of channels on the memory controller).
You could say the i5-760 was the i5-4670K of today, the i7-860 -> i7-4770K, and the i7-920 -> i7-5820K (the lowest-end option on LGA 2011-3). Or you can say i7-920 -> i7-4820K (the LGA 2011 quad) if that makes you happy.
In the worst-case scenario in the article for gaming, which is Metro Redux with a GTX 980, the i5-4690 offers 20.91% more performance (at 1080p) for an 87.5% higher price. That's already bad value, but it's unrealistic that someone on a budget would be using a GTX 980 to begin with. With the GTX 960, it offers 6.62% more performance for the same 87.5% higher price. And that's the case where the jump from the i3 to the i5 is the highest in performance.

Even in the parts of the article that show a quad-core CPU is faster than dual-core CPUs, is it really enough to worry about? I'm thinking no at this point.
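The value argument above can be checked with a quick calculation. Prices are normalized to the i3, and the percentages are the ones quoted in the comment:

```python
# Performance-per-dollar comparison using the figures quoted above.
# Prices are normalized so the i3 costs 1.0; only the ratios matter.
i3_price = 1.0
i5_price = i3_price * 1.875    # 87.5% higher price
i3_perf = 1.0
i5_perf = i3_perf * 1.2091     # 20.91% more performance (Metro Redux, GTX 980)

i3_value = i3_perf / i3_price
i5_value = i5_perf / i5_price
print(round(i5_value / i3_value, 3))  # 0.645: the i5 delivers ~35% less performance per dollar
```

With the GTX 960 figure (6.62% more performance), the same ratio drops to roughly 0.57, making the value case even weaker.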
Hmmm, Haswell-E looks like an improvement over my 2600K, albeit one running at 4.7 GHz. Might consider looking at an upgrade this year... wait!... £800+ for the CPU. I think I'll hold off a while longer.
Must get dull doing these CPU comparisons when a lot of the tests show very small improvements.
Personally I regretted going for mainstream Haswell. I'm upgrading this year to the cheapest Haswell-E options. Here are my arguments:
* I don't make use of the integrated Intel HD graphics (Haswell-E has none)
* I bought an aftermarket CPU cooler (Haswell-E CPUs include none)
* I bought a high-end, two-way SLI-capable Z87 mobo, which in the end is similarly priced to the cheapest X99 mobo options with 3-way SLI
* For such a small price difference between the most expensive mainstream i7 and the cheapest i7 Extreme, it's worth it for the 4 additional threads
I'd assume you're referring to AMD's well-documented graphics driver overhead issue... So is it the GPU or the CPU causing these slowdowns?
Also, if it is a known driver issue, should we have mentioned it in the article? I don't see how this could be part of the story.