Intel Core i7-3820 Review: Sandy Bridge-E for the masses

February 8, 2012, 11:17 PM

Late last year when we reviewed the new Sandy Bridge-E processors, we mentioned a more affordable version called the Core i7-3820 was coming. Although information about the chip had been revealed, the processor has yet to hit shelves and is now expected to arrive later in February. Fortunately, sample units are being passed around ahead of general availability, so we don't have to wait to see how it stacks up.

The i7-3820 is particularly intriguing because of its sub-$300 retail price -- far less than other chips in the series. For instance, the Core i7-3960X has an MSRP of $999 and sells for more like $1,049, while the i7-3930K has an MSRP of $583 and is fetching $599 at e-tail. Both are six-core CPUs operating over 3GHz with massive 15MB and 12MB L3 caches.

At roughly half the price of the 3930K, we expected Intel to butcher the i7-3820, and while that's partially true, the 3820 remains an impressive specimen with four cores operating at 3.6GHz, a 10MB L3 cache and HyperThreading support. Compared to the similarly priced i7-2600K, the 3820 offers additional L3 cache, support for PCI Express 3.0, quad-channel memory and a platform that will take as much as 32GB of system memory.

Read the complete review.




User Comments: 26

Guest said:

I'm an AMD fan, but if I had to choose between the 2600k and the 3820 I'd go with the latter, because it's bigger.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

I'm an AMD fan, but if I had to choose between the 2600k and the 3820 I'd go with the latter, because it's bigger.
What's bigger the CPU ID number or the system build price?

If I was going to spend extra money on the LGA2011 platform, I wouldn't go below a 6 core CPU.

Sarcasm Sarcasm said:

I'd rather go with a 2600K (and I did) because of how easy it is to overclock. Just hit 5GHz at 1.425v under full load and couldn't be happier.

Captain828 Captain828 said:

Regarding the Gaming Performance section, testing those games for a CPU review makes no sense.

You should be testing CPU-limited games such as GTAIV or ArmA2 or at least drop those resolutions at the lowest setting and use the lowest graphical details to remove the GPU out of the equation.

As it stands, the gaming performance page doesn't say anything to me.

About the CPU itself, I'd pick the 2600K instead.

Typo on last page: "the slightly slower i7-2500K" should read "the slightly slower i5-2500K"

Staff
Steve Steve said:

We don't bother with unrealistic gaming settings anymore for these high-end CPU articles. Now we test closer to the settings gamers are actually going to use, and having said that, at 1680x1050 the frame rates were still well over 60fps anyway. The gaming results say all they need to: the processor makes bugger all difference, at least when comparing one high-performance CPU to the next.

Also, the games you mentioned are not "CPU-limited"; they are more CPU dependent than most games, which is not to say the CPU will be the bottleneck when using a high-end graphics card. You will also find The Witcher 2: Assassins of Kings to be just as CPU dependent as any other game out there...

[link]

Crysis 2 isn't bad either...

[link]

DanUK DanUK said:

Nice review.

Just a heads up that on your Fritz Chess 13 graph it says (Lower is better) but I think it's meant to say higher?

For me I'm most concerned with the gaming performance section, and as I'm still running an i7-920, I don't feel the need to upgrade yet. Still, it's interesting to see what Intel is coming out with in the budget range. These chips are pretty powerful but also a hell of a lot more energy efficient than previous generations.

Arris Arris said:

Is it just me or does the tick (or tock, not sure which one this is) of Intel's tick/tock release schedule seem to be becoming less and less of an actual advancement?

LinkedKube LinkedKube, TechSpot Project Baby, said:

Arris said:

Is it just me or does the tick(or tock, not sure which one this is) of Intel's tick/tock release schedule seem to be becoming less and less of an actual advancement?

For consumer products maybe, but enterprise equipment will prob get the most benefit from it.

amstech amstech, TechSpot Enthusiast, said:

The reviewer says this new chip does OK in applications compared to an i7 920 but not in games? The 2600K barely outperforms the 920 (if at all) when talking games and this CPU isn't any different, so by his logic the 2600K must have been disappointing to this reviewer as well.

Guest said:

With ACTA there is no reason to upgrade PC anymore.

GeforcerFX GeforcerFX said:

At that price range it's just another slap in the face to AMD after the Bulldozer failure. I would love to see all these CPUs benched in Windows 8 though, since it handles more than 4 cores and SMT better than Windows 7. I think you would see the SB-E chips rise to their price tag, and the AMD chips actually start competing.

Sarcasm Sarcasm said:

Steve said:

We don't bother with unrealistic gaming settings anymore for these high-end CPU articles. Now we test closer to the settings gamers are actually going to use, and having said that, at 1680x1050 the frame rates were still well over 60fps anyway. The gaming results say all they need to: the processor makes bugger all difference, at least when comparing one high-performance CPU to the next.

Also, the games you mentioned are not "CPU-limited"; they are more CPU dependent than most games, which is not to say the CPU will be the bottleneck when using a high-end graphics card. You will also find The Witcher 2: Assassins of Kings to be just as CPU dependent as any other game out there...

[link]

Crysis 2 isn't bad either...

[link]

I actually agree with this methodology of testing only in real world scenarios. I mean come on, who in the world will actually buy this type of CPU to game at 800x600 resolution?

dividebyzero dividebyzero, trainee n00b, said:

The reviewer says this new chip does OK in applications compared to an i7 920 but not in games? [...] The 2600K barely outperforms the 920 (if at all) when talking games and this CPU isn't any different

That really depends on what games are being tested, and whether the system is GPU limited at the testing resolution/game IQ.

Staff
Per Hansson Per Hansson, TS Server Guru, said:

captain828 said:

...or at least drop those resolutions at the lowest setting and use the lowest graphical details to remove the GPU out of the equation.

As it stands, the gaming performance page doesn't say anything to me.

So you play your games at the lowest resolution and graphical details?

Ok, I guess this review is not very enlightening for you then...

Guest said:

I'm holding out for an 8-core, ~3GHz processor with HT before upgrading from my i7 920.

Thankfully, it's not like the i7 920 with 12GB RAM is struggling or anything, so that helps me wait.

Captain828 Captain828 said:

Steve said:

Also the games you mentioned are not "CPU-limited", they are more CPU dependent than most games [...]

So you first say they aren't CPU-limited and then that they are more dependent on the CPU than most games... doesn't that make them CPU-limited compared to other games?

Sure, you still need a decent GPU to run them, but it will matter less, just as it matters less which quad-core CPU you use to run a modern game (as this review clearly showed).

And I don't even see how the links you posted relate to this review: that review used a GTX 590, and when using more than one GPU (the 590 is pretty much two 570s in SLI) CPU overhead can increase quite a bit when all GPU cores are at full load.

sarcasm said:

I actually agree with this methodology of testing only in real world scenarios.

I agree with it as well, but when the results show a maximum difference of 20% between first and last place across 13 CPUs, there's just not much to talk about.

Also, GTAIV and ArmA 2 are actual games, so those qualify as real world scenarios in my book.

Skyrim with uGridsToLoad set to a high level can be pretty taxing on the CPU as well.

Per Hansson said:

captain828 said:

...or at least drop those resolutions at the lowest setting and use the lowest graphical details to remove the GPU out of the equation.

As it stands, the gaming performance page doesn't say anything to me.

So you play your games at the lowest resolution and graphical details?

Ok, I guess this review is not very enlightening for you then...

If you posted just to insult, it would have been better not to post at all, since I don't see anything constructive in your comment.

The only thing I was saying is that there's just not much to look at on that page, since ALL CPUs offer more than 60 FPS, except in Crysis 2, but then again no CPU offers 60 FPS in that game.

As such any extra performance doesn't net you anything tangible.

Guest said:

I'm very confused whether or not to buy a system based on the i7-3820 or wait until the Ivy Bridge processors come out. Which one will be faster and have more hardware support?

Staff
Steve Steve said:

So you first say they aren't CPU-limited and then that they are more dependent on the CPU than most games... doesn't that make them CPU-limited compared to other games?

As I was saying, no, not at all. If you were using an Athlon II with a high-end GPU then yes, certain games could be considered CPU limited. If you are using something like the Core i7-3820 or 2600K then no, not so much.

I agree with it as well, but when the results show a maximum difference of 20% between first and last place across 13 CPUs, there's just not much to talk about.

So when the real-world results are not exciting, in future we should look for ways to make them exciting by using settings no gamer is ever going to use?

---agissi--- ---agissi---, TechSpot Paladin, said:

Are you kidding? Wait for Ivy Bridge. It'll be faster, cooler, and have ~4x the graphics performance. With Virtu on the Z68 boards the CPU's GPU is utilized so the graphics card doesn't have to spool up and increase power/temps. Of course, once a higher load is demanded than what the CPU's GPU can handle, the graphics card takes it from there.

Captain828 Captain828 said:

Steve said:

As I was saying, no, not at all. If you were using an Athlon II with a high-end GPU then yes, certain games could be considered CPU limited. If you are using something like the Core i7-3820 or 2600K then no, not so much.

ArmA 2 OA - 10k distance, all maxed out @ 1080p. Try that and let me know how those CPUs scale.

Also, as I previously stated, add SLI/Crossfire in the mix and things can get CPU bound really fast.

So when the real-world results are not exciting, in future we should look for ways to make them exciting by using settings no gamer is ever going to use?

Not necessarily, but you should look for games that are going to tax the CPU more.

Leaving the gaming part out, a real-world heavy multitasking scenario would be nice in the future.

Guest said:

Here in Brazil nobody beats AMD's prices; it's cheaper to set up a dual-processor AMD system than to buy an i7, so I love AMD! Thanks to low prices and low power consumption!

hahahanoobs hahahanoobs said:

1.480v to get 4.6GHz? Is that because you were using Offset Mode? I found Offset Mode sets the idle voltage way too high... on my P8P67 EVO and 2500K anyway.

slamscaper slamscaper said:

steve said:

To be fair, the i7-3960X isn't all that impressive when gaming either,

You have to be kidding. This CPU chews through everything, including games. You make it sound as if there are way better CPUs available for gaming, when in fact the 3960X is one of the top performers.

dividebyzero dividebyzero, trainee n00b, said:

slamscaper said:

To be fair, the i7-3960X isn't all that impressive when gaming either

You have to be kidding. This CPU chews through everything, including games. You make it sound as if there are way better CPUs available for gaming, when in fact the 3960X is one of the top performers.

I think you'll find that Steve meant it from a performance-per-dollar perspective. The 2 extra cores + L3 cache + tripled price tag over the 2500K/2600K don't translate into a significant real-world gaming advantage. So while the 3960X/3930K are impressive in their own right, relative to Intel's own mainstream LGA 1155 platform they amount to a negligible increase in performance at a higher cost of power consumption and overall platform cost... and that very slight increase in performance (in those cases where it does show) is offset by the 2500K/2600K/2700K's superior overclocking ability.

From Steve's 3960X review:

For gamers there's very little to see here. The Core i7-3960X is no faster than the Core i7-2600K or even the Core i5-2500K.
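The performance-per-dollar argument above is easy to put in rough numbers. A minimal sketch: the prices are approximate MSRPs from the article, but the fps figures are invented placeholders (not benchmark data) chosen only to illustrate how near-identical gaming results flatten the value curve as price climbs.

```python
# Illustrative fps-per-dollar comparison. Prices are approximate MSRPs;
# the fps numbers are hypothetical stand-ins for "all high-end CPUs
# game about the same", per the review's gaming results.
cpus = {
    "Core i5-2500K": {"price": 216, "fps": 97},
    "Core i7-2600K": {"price": 317, "fps": 99},
    "Core i7-3960X": {"price": 999, "fps": 100},
}

# fps per dollar: a flat fps column means value falls as price rises
value = {name: d["fps"] / d["price"] for name, d in cpus.items()}

for name, v in sorted(value.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {v:.3f} fps/$")
```

Even with the most expensive chip nominally on top in raw fps, it comes out last by a wide margin once price is factored in.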

Guest said:

I'm really considering the 3820 for my build. My parts list for an i5 3570 is $1350-ish and $1550-ish for the 3820 list. In the latter build there would be two available PCIe ports at 16x, support for quad-channel RAM, USB 3.0, etc. Not to mention I'd be on a motherboard with the newer socket and chipset for later CPUs.

I'd like to hear anything anyone has to say about running Planetside 2 well. As far as I can tell, anything you can do CPU-wise is what gains frames per second. I'd love to see these chips benchmarked with it somehow.

captaincranky captaincranky, TechSpot Addict, said:

Is it just me or does the tick (or tock, not sure which one this is) of Intel's tick/tock release schedule seem to be becoming less and less of an actual advancement?
Microcenter is using the i3-3225 as a promotional football, selling it for $119.95 ($144.95 @ Newegg) if you buy it with a given selection of boards.

This chip is a dual-core, 4-thread offering, but it has the better HD 4000 graphics onboard. I bought it with a Gigabyte Z77X-UD3H (ATX) board ($149.95 (?) @ Newegg) for just a shade over $200.00.

I haven't put this together yet, but with a 22nm CPU drawing only 55 watts TDP, graphics included and all, it should be an outstanding and thrifty mainstream dynamo.

Point being, Ivy Bridge, the 22nm process aside, doesn't seem to offer too much over Sandy. Or at the very least you have to go to great lengths to bring it out.

OTOH, Ivy Bridge seems to offer an honest improvement, with respect to a general purpose mid-price machine.

As far as the memory latency issues uncovered in the review go, buying the exact same brand of memory modules (G.Skill DDR3-1600) in 4GB capacity will net you 9-9-9-24 timings, but 8GB modules come in at 10-10-10-30. I'm guessing that disparity could only get worse when spread across four 8GB modules in 4 channels.

So, where should we be laying the blame for the memory bandwidth issues in the review, on the CPU, the boards, the DIMMs themselves, or any combination of the foregoing factors?
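The timing disparity described above can be turned into absolute latency with standard DDR arithmetic; a quick sketch, using the CL9 vs CL10 figures quoted for the 4GB and 8GB G.Skill modules:

```python
# First-word CAS latency in nanoseconds for a DDR memory module.
# DDR3-1600 performs two transfers per I/O clock, so the clock runs at
# 1600 / 2 = 800MHz, and CAS latency is counted in those clock cycles.
def cas_latency_ns(cl_cycles, ddr_rate_mts):
    io_clock_mhz = ddr_rate_mts / 2           # 1600 MT/s -> 800 MHz clock
    return cl_cycles / io_clock_mhz * 1000.0  # cycles / MHz gives us, then scale to ns

print(cas_latency_ns(9, 1600))   # CL9  (4GB sticks)
print(cas_latency_ns(10, 1600))  # CL10 (8GB sticks)
```

That works out to roughly 11.25ns for CL9 against 12.5ns for CL10, so the looser 8GB timings cost a little over a nanosecond of access latency, which is real but small next to platform-level differences.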
