Intel Core i7-5775C Review: Broadwell is off to a rough start on the desktop

Steve

Posts: 3,034   +3,142
Staff member

Last month, we got our first peek at Intel's new 'Braswell' system-on-chip solutions, built on a new 14nm process and featuring Intel's new Airmont CPU cores. Broadwell, like Braswell, is built on that same 14nm process; it is the die shrink of Haswell, which was built at 22nm. Moving to a smaller process results in a smaller die and lower power consumption, though as is often the case with Intel's 'Tick' updates, a few minor tweaks have also been applied.

The advantage of this minor update is that it's compatible with the existing LGA1150 platform when using an Intel 9-series chipset. Z87 boards may also end up supporting these new processors through a BIOS update.

There are almost a dozen desktop Broadwell CPUs inbound, five of them 65W desktop parts. Today we have the flagship model on hand, the Core i7-5775C, alongside its BGA counterpart, the Core i7-5775R. The two chips are nearly identical: other than the 5775R being a BGA part, the only spec difference is its Turbo frequency, which is 100MHz higher at 3.8GHz. Price separates them too: whereas the Core i7-5775C carries a $366 MSRP, the 5775R costs slightly less at $348.

Read the complete review.

 
Page 11 "A Rough But Promising Release" is showing up with a page not found error... Otherwise thanks for the review as always. Its performance leaves a bit to be desired, but then again considering the lower clock speed and lower TDP it's not hateful either, more in line with expectations all things considered.
 
Do we know if there will be a "Broadwell-E", or will that wait for Skylake-E?

Looks like there is no reason to buy this processor at all...
 
For the sake of the iGPU benchmarks, you should include DDR3 memory frequencies. It makes a HUGE difference in Kaveri's performance. Guess these were run with 1600MHz...
 
For the sake of the iGPU benchmarks, you should include DDR3 memory frequencies. It makes a HUGE difference in Kaveri's performance. Guess these were run with 1600MHz...

Everything was tested with DDR3-2400.

Page 11 "A Rough But Promising Release" is showing up with a page not found error...

I am not sure why that is, it has always worked for me. Glad you enjoyed the review though and thanks for checking it out.
 
Consumers should stop consuming and start using what they have and be happy with it. I have a Core 2 Duo T9600 laptop that I use daily and it's enough for everything except gaming and rendering. Otherwise, you really don't need to be on the latest platform at all. But hey, what would happen to the tech companies if consumers didn't have this uncontrolled urge to buy things they don't really need?
 
Well, the tests are surprisingly in line with last year's leaks, where Intel was said to be having trouble with the 14nm node. There's an improvement in clock-for-clock performance, but several sites had problems overclocking those samples past 4.2/4.4GHz, and only at the expense of massive voltage. Add a 25% price bump over Devil's Canyon (on top of the lackluster availability) and you're better off buying a 4690K/4790K. Let's hope Skylake won't follow suit. And let's hope AMD has a rabbit up their sleeve, because it's starting to look like we're being played by Intel, who hit the brick wall of physics and just changes sockets to squeeze money out of enthusiasts.
 
Consumers should stop consuming and start using what they have and be happy with it. I have a Core 2 Duo T9600 laptop that I use daily and it's enough for everything except gaming and rendering. Otherwise, you really don't need to be on the latest platform at all. But hey, what would happen to the tech companies if consumers didn't have this uncontrolled urge to buy things they don't really need?

Not sure this website is for you ;)

A Core 2 Duo T9600 is bloody slow by today’s standards and I would hate having to do my work on such an old system, it would be slow and miserable. Plus who said everyone looking at upgrading to Broadwell would be coming from Haswell or Ivy Bridge? I am sure you have plenty of Penryn/Conroe brothers that are looking to upgrade.

Well, the tests are surprisingly in line with last year's leaks, where Intel was said to be having trouble with the 14nm node. There's an improvement in clock-for-clock performance, but several sites had problems overclocking those samples past 4.2/4.4GHz, and only at the expense of massive voltage. Add a 25% price bump over Devil's Canyon (on top of the lackluster availability) and you're better off buying a 4690K/4790K. Let's hope Skylake won't follow suit. And let's hope AMD has a rabbit up their sleeve, because it's starting to look like we're being played by Intel, who hit the brick wall of physics and just changes sockets to squeeze money out of enthusiasts.

So exactly what the conclusion said then ;)

“That said, the 5775C couldn’t even be overclocked to match the maximum Turbo frequency of the 4790K which is 4.4GHz (at least our sample couldn’t be, anyway). Just to get the CPU stable enough at 4.2GHz for testing we had to feed it quite a lot of extra voltage.”

“In the end, most consumers will be better off holding back for a few more months till Skylake arrives along with the new 100-series chipsets and DDR4 memory support.”

“Limited availability, likely the result of 14nm node's immaturity. Unimpressive performance and poor overclocking potential. For raw power (sans integrated graphics), the 4790K is cheaper and faster.”
 
I guess having a Sandy Bridge i5/i7 is still worth it today. Pff.

It's nice that this i7 Broadwell uses some 20% less energy than the i7 Devil's Canyon, but seeing how it's also slower, it's a moot point.

Sad. Roll on Zen and Skylake.
 
So exactly what the conclusion said then ;)
Yes, I'm sorry, I just felt I needed some introduction to the notion that Intel has hit a development wall. You're not the only site I've seen that had problems running it over 4.2GHz, and I have growing hopes AMD will be able to deliver something more cost-effective next year, and maybe force Intel's hand on pricing.
 
Yes, I'm sorry, I just felt I needed some introduction to the notion that Intel has hit a development wall. You're not the only site I've seen that had problems running it over 4.2GHz, and I have growing hopes AMD will be able to deliver something more cost-effective next year, and maybe force Intel's hand on pricing.

That's the dream, isn't it? Not sure it will become a reality next year though. I don't think Intel has hit a wall, I just think the move to 14nm was a little tougher than anticipated. It will appear smoother for AMD as Intel has paved the way.
 
"A Core 2 Duo T9600 is bloody slow by today's standards and I would hate having to do my work on such an old system, it would be slow and miserable. Plus who said everyone looking at upgrading to Broadwell would be coming from Haswell or Ivy Bridge? I am sure you have plenty of Penryn/Conroe brothers that are looking to upgrade. "
A T9600 with an SSD is indistinguishable from an Ivy Bridge i5 until you throw some games or rendering work into the mix. So yes, your image of a Core 2 Duo system might be one with a slow HDD, and that's not the point. In fact, most of the speediness of today's systems comes from faster storage solutions. And that was my point. For the vast majority of people, all these annual CPU launches are irrelevant because they don't change the game. They just save you a second here and there.
 
"A Core 2 Duo T9600 is bloody slow by today's standards and I would hate having to do my work on such an old system, it would be slow and miserable. Plus who said everyone looking at upgrading to Broadwell would be coming from Haswell or Ivy Bridge? I am sure you have plenty of Penryn/Conroe brothers that are looking to upgrade. "
A T9600 with an SSD is indistinguishable from an Ivy Bridge i5 until you throw some games or rendering work into the mix. So yes, your image of a Core 2 Duo system might be one with a slow HDD, and that's not the point. In fact, most of the speediness of today's systems comes from faster storage solutions. And that was my point. For the vast majority of people, all these annual CPU launches are irrelevant because they don't change the game. They just save you a second here and there.

I am not going to argue with you but if you honestly think an almost 8 year old Intel dual-core is indistinguishable from an Ivy Bridge Core i5 then I don't think I need to.

That said it would be interesting to look back at a Core 2 Duo E8300 and compare it to today's Celeron, Pentium and Core processors.

Blast from the past...
https://www.techspot.com/review/85-intel-core-2-wolfdale-vs-conroe/page2.html

CINEBENCH R10 multi-thread CPU performance
Core 2 T9600 = 5936pts
Core 2 Duo E8400 = 7209pts
Ivy Bridge Core i5 = 25,500pts

Fun fact, Pentium G3220 scores a little over 11,000pts.
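Out of curiosity, the gap those quoted R10 numbers imply can be put in ratio form. A quick sketch (my arithmetic, not figures from the review; the G3220 value is the "a little over 11,000pts" figure rounded down):

```python
# Cinebench R10 multi-thread scores quoted above (points).
scores = {
    "Core 2 Duo T9600": 5936,
    "Core 2 Duo E8400": 7209,
    "Ivy Bridge Core i5": 25500,
    "Pentium G3220": 11000,  # "a little over 11,000pts", rounded
}

baseline = scores["Core 2 Duo T9600"]
for cpu, pts in scores.items():
    # Relative multi-threaded throughput versus the T9600.
    print(f"{cpu}: {pts / baseline:.2f}x the T9600")
```

By that measure the Ivy Bridge i5 lands at roughly 4.3x the T9600's multi-threaded throughput, and even the budget G3220 is around 1.85x.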
 
iGPU Gaming Performance. Bioshock Infinite: "1440x900 Low = 64.2fps". What was the iGPU performance like on 1080p Medium for BI and other games? As nice as it is that iGPUs are giving people a wider choice of dGPU-not-required games each year, I really wouldn't personally look at them seriously until they could play at least 2010-2013 games at native 1080p resolutions on "Medium" at a stable 60fps. I'm no "Ultra snob" and happily use "Medium" on my HTPC's 750Ti, but in my experience, "Low" presets are just too ugly (there's often a far bigger downward texture quality jump from Med to Low/Very Low than upward jump from Med to High to Ultra on some games). Example for Bioshock Infinite:

High : http://images.bit-tech.net/content_images/2013/04/bioshock-infinite-performance/high-textures-b.jpg
Low : http://images.bit-tech.net/content_images/2013/04/bioshock-infinite-performance/low-textures-b.jpg

Likewise, non-native 720-900p resolutions can often give ugly scaling effects (blurred text, etc.) via monitor upscaling that can't be captured in screenshots. APUs still seem to be an "all or nothing" thing, ie, you either go all out and build a minimum 750Ti on a chip, or you pick a cheaper CPU (Haswell i3 or even a 2nd-hand Sandy/Ivy Bridge i5) then buy a 750Ti if you're on a tight low-end budget. A half-speed iGPU is still just below that cutoff threshold of acceptability for FPS games, even with settings turned down. And soon, with the GTX 950 coming out, the "low end" bar will have been raised again. It's like a rat race iGPUs cannot win, constantly lagging 2-5 years behind modern low-end cards, with a price premium that doesn't compete with a cheaper CPU + budget dGPU.

Would also be nice to see Intel make a modern "Gamer Edition" Haswell/Skylake equivalent of the i5-3350P (ie, a proper iGPU-less quad-core with a die size equal to or smaller than an i3's (with iGPU) and a lower price to match). I know this is a tech site and we're supposed to be "enthusiastic" over the next big thing, but with annual +5% clock-for-clock performance boosts, the last three generations of which show the same fps in many games when mated with a typical mid/upper-mid range single dGPU, I think many people have reached the point of finding lower, more aggressive pricing of Intel's i5 quad-cores far more attractive and newsworthy than a desktop iGPU rat race that dGPU gamers and non-gamers alike shrug their shoulders at.
 
"Integrated Iris Pro 6200 graphics steals the only edge AMD's APUs had over Intel."

Am I missing something? You are comparing an Intel processor that costs more than triple the price of the AMD processor. Are the Iris Pro 6200 graphics going to be filtered down to the i3's? If so, then AMD might be in real trouble.
 
"Integrated Iris Pro 6200 graphics steals the only edge AMD's APUs had over Intel."

Am I missing something? You are comparing an Intel processor that costs more than triple the price of the AMD processor. Are the Iris Pro 6200 graphics going to be filtered down to the i3's? If so, then AMD might be in real trouble.


Obviously the Core i7-5775C costs three times more than the most expensive APU. The point is Intel is now focused on improving graphics performance which is why we said AMD should take notice. We weren’t suggesting that the Core i7-5775C would cannibalize APU sales :)

The only reason we have suggested AMD APUs over Intel's offerings for the past 4 years has been their superior graphics performance. If that goes away, and that's how things stand now, it would be very bad for AMD. Obviously the Iris Pro 6200 graphics and future versions of it will filter down to more affordable chips, it is just a matter of time.
 
The only reason we have suggested AMD APUs over Intel's offerings for the past 4 years has been their superior graphics performance. If that goes away, and that's how things stand now, it would be very bad for AMD. Obviously the Iris Pro 6200 graphics and future versions of it will filter down to more affordable chips, it is just a matter of time.
A more powerful i3 and iGPU to go with [insert random thin HTPC build]? Yes please.
 
“That said, the 5775C couldn’t even be overclocked to match the maximum Turbo frequency of the 4790K which is 4.4GHz (at least our sample couldn’t be, anyway). Just to get the CPU stable enough at 4.2GHz for testing we had to feed it quite a lot of extra voltage.”

“In the end, most consumers will be better off holding back for a few more months till Skylake arrives along with the new 100-series chipsets and DDR4 memory support.”

“Limited availability, likely the result of 14nm node's immaturity. Unimpressive performance and poor overclocking potential. For raw power (sans integrated graphics), the 4790K is cheaper and faster.”
Excellent review @Steve, I am shocked by the results though. So you could not even get the chip up to 4.5GHz (let alone 4.4GHz, which was the 4790K's Turbo as you said)??? Or how much voltage do you think would be necessary (have you played with it a bit more)? That is really disappointing because it makes this more of a no-buy for enthusiasts who want the unlocked capabilities of that chip (er, maybe it would be better to say enthusiasts who want to unlock the capabilities of that chip).

Good read, but this is the first time in a while I have seen an Intel chip disappoint so much.
 
Thanks for this review. Based on that I will not need to invest in replacing my i5-4460.
Do we know when to expect the "Tock" from Intel? Next year?
 
"A Core 2 Duo T9600 is bloody slow by today's standards and I would hate having to do my work on such an old system, it would be slow and miserable. Plus who said everyone looking at upgrading to Broadwell would be coming from Haswell or Ivy Bridge? I am sure you have plenty of Penryn/Conroe brothers that are looking to upgrade. "
A T9600 with an SSD is indistinguishable from an Ivy Bridge i5 until you throw some games or rendering work into the mix. So yes, your image of a Core 2 Duo system might be one with a slow HDD, and that's not the point. In fact, most of the speediness of today's systems comes from faster storage solutions. And that was my point. For the vast majority of people, all these annual CPU launches are irrelevant because they don't change the game. They just save you a second here and there.

Could be true if you are only using your PC for browsing... But if you do more than that I doubt a C2D would give you the same user experience as anything you can buy today.
 
Gotta say that I'm fairly disappointed. I know my 3770K isn't the bee's knees anymore, but there really isn't much of a performance increase going from the 3000 series to the 4000 series, and the 5000 series seems like a small step back. It seems like the 3K, 4K, and 5K series are all battling it out for the top spot, and the only winner is me as I can hold off for another generation. There really isn't any reason at all to upgrade.

The 4.5ghz overclock I'm running helps too.
 
Promising? How so?

-Is weaker than the 4790K
-Uses the same or more energy than the 4790K
-Costs more than a standard i7
-Seems to beat the A10-7870K by ~50% while costing 250% as much!!!

FAIL FAIL FAIL
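For what it's worth, that last bullet can be turned into a rough perf-per-dollar figure. A hedged sketch, not review data: it uses the $366 MSRP from the review and an A10-7870K price of ~$146 back-derived from the "250% as much" claim (both figures approximate).

```python
# Approximate prices: 5775C MSRP from the review; A10-7870K price
# back-derived from the poster's "costing 250% as much" claim.
i7_price = 366.0
apu_price = 146.0

# "~50% faster" iGPU performance per the comment, APU as baseline 1.0.
i7_perf, apu_perf = 1.5, 1.0

i7_ppd = i7_perf / i7_price    # performance per dollar
apu_ppd = apu_perf / apu_price
print(f"5775C perf-per-dollar vs A10-7870K: {i7_ppd / apu_ppd:.2f}x")
```

On those rough numbers the 5775C delivers only about 0.6x the iGPU performance per dollar of the APU, which is the poster's point in one line.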
 
"Integrated Iris Pro 6200 graphics steals the only edge AMD's APUs had over Intel."

Am I missing something? You are comparing an Intel processor that costs more than triple the price of the AMD processor. Are the Iris Pro 6200 graphics going to be filtered down to the i3's? If so, then AMD might be in real trouble.


Obviously the Core i7-5775C costs three times more than the most expensive APU. The point is Intel is now focused on improving graphics performance which is why we said AMD should take notice. We weren’t suggesting that the Core i7-5775C would cannibalize APU sales :)

The only reason we have suggested AMD APUs over Intel's offerings for the past 4 years has been their superior graphics performance. If that goes away, and that's how things stand now, it would be very bad for AMD. Obviously the Iris Pro 6200 graphics and future versions of it will filter down to more affordable chips, it is just a matter of time.

Yeah, but there is a reason it isn't in the i3s: Iris Pro with eDRAM is very expensive to produce. Even if they did make an i3 with these graphics, it would likely cost over $200 and have vastly inferior drivers.

Until proven otherwise, I only see Iris Pro graphics as Intel trying to prove they "can" make graphics as good as AMD. But unless they can make it as economically as AMD can, this is nothing to brag about.
 