Intel Haswell CPU lineup leaked online, led by flagship Core i7-4770K

If Intel is slowing development simply because they have little competition from AMD, then they are shooting themselves in the foot by killing the upgrade market.

Sales are not only generated by beating your competitors but also by substantially beating your previous-generation product so that people want to upgrade. With my 2700K @ 5.5GHz, it looks like I won't be giving Intel any more money for at least the next five years at this rate.
 
Whether people care about the IGP is really neither here nor there in this context, and if integrated graphics is something "very few people will care about", then someone should tell AMD they're heading down a cul-de-sac.
I meant this chip in particular: people who buy a 4770K are either gamers who already have a gaming GPU, or people who use the CPU for productivity purposes and will either have a workstation GPU or be fine with a basic IGP. Better IGPs on lower-end CPUs are definitely a good idea though.
 
slh28
Unfortunately or not, all the (4C) CPUs are exactly the same die, with exactly the same IGP. The only thing that separates a 4770K from a low-power i5 is the binning process... so you could say that the IGP is a zero-cost extra. If Intel had to utilize a second wafer line solely for K-series CPUs with a cut-down IGP or none at all, I'm betting that the price would be passed on to the end user... and paying more for less generally doesn't play.
Don't use the IGP myself, but if I ever needed to sell an old discrete graphics card to fund a new one, at least Haswell's HD4600 would make a placeholder card redundant - if it can drive 4K displays, then a paltry 2560x1440 shouldn't represent a problem.
 
Well, this is kind of frustrating for me. I was going to slowly upgrade my rig over the first few months of the new year, starting with a Z77 motherboard. But if this new Haswell chip is killing off that socket, am I better off waiting?

I don't want to spend £1,000 (approx.) on new hardware in 2013 only to have to spend it all again in two years' time...

Also, will this mean DDR4 memory is on the horizon as well?
 
Go ahead and buy one. The i7 920 is still an amazing chip. The 3570K is substantially faster, but no programs today can really utilize that power. The i5 will probably be able to game for at least another half decade. I JUST upgraded from a Core 2 Quad, and only because the motherboard gave out.
 
Same clock speed as the 3770K, same cores/threads, higher TDP... wow, that really is disappointing. I don't know the exact percentage, but very, very few people will care about the improved IGP on this chip.
I don't find a need to upgrade from my current i7-740QM; I'll just upgrade to Haswell since the graphics card in my laptop is lacking. I find it really disappointing that Intel can't even make an 8-core variant... when you see startups creating 50-core ARM chips.
 
slh28,

I am not sure what you were expecting. With little competition from AMD, Intel will maximize their yields to reduce costs and improve profitability. Not ramping up clocks will help to achieve this goal. This is why it's so important that we have a competitive AMD; otherwise we end up with a slowed pace of progress in the CPU space, as we have been seeing since the Core i7 920 days, frankly. There will still be an IPC increase of about 10%. Were you expecting a processor 20-30% faster than the IVB 3770K? There also shouldn't be a material decrease in power consumption, since Haswell uses a more complex GPU and is still manufactured on the same 22nm node as IVB.

Speaking of the original 1st-generation Core i7: if a gamer overclocks one to 3.75GHz+ and plays with a single flagship GPU, it is still not a bottleneck in modern games:

http://www.guru3d.com/articles_pages/far_cry_3_graphics_performance_review_benchmark,7.html

That means for anyone with a Core i7 920 @ 3.8-4.0GHz, there is little reason to upgrade outside of motherboard features and a reduction of power consumption in overclocked states. That's not necessarily a bad thing, as CPUs today last longer than ever. That $ can be spent on a larger/faster SSD or a GPU upgrade.

For anyone rocking a 2500K/2600K/3570K/3770K @ 4.5GHz+, Haswell will obviously be a waste of $ performance-wise. Of course, if one resells the old parts and minimizes the upgrade costs, it's still fun to play with new hardware :)
I disagree; I am running a 1st gen and the only bottleneck I have is my weak dedicated GPU.
 
UNKNOWN9122,

What do you disagree with? This is hard real-world data. Most reviewers test a stock i7 920 @ 2.66GHz against an i7-3770K; of course the latter would beat it in games. That's not how enthusiast gamers on this site use their parts though - many of us overclock. Take a Core i7 920 @ 3.8-4.2GHz and you'd be seriously hard pressed to tell the difference between that and an overclocked i7-3770K if you are only using one GPU. You'd need to start using two GPUs to really push the CPU to its limits. For most games today you are going to be GPU-bottlenecked by far - Hitman: Absolution, Sleeping Dogs, Far Cry 3 - and games like Metro: Last Light and Crysis 3 should be even more GPU-demanding.

I mean, if you need the absolute fastest CPU for MMOs or StarCraft 2 - titles that are poorly threaded and benefit a lot from per-core IPC - then by all means upgrade. I will be upgrading from Sandy Bridge because I like to play with new parts, not because I will feel the difference in gaming speed.

dividebyzero,

"Don't use IGP myself, but if I was to ever need to sell an old discrete graphics card to fund a new one, at least Haswell's HD4600 would make needing a placeholder card redundant - if it can drive 4K displays, then a paltry 2560x1440 shouldn't represent a problem."

Yeah, that's a good point. The problem is Intel is putting the GT2-series GPU into high-end desktop Haswells, with the much faster GT3 being reserved for mobile parts. Other than driving the displays, it's unlikely that Haswell's IGP on the desktop will amount to much. People who play less GPU-demanding titles like Civilization 5, StarCraft 2 or Minecraft are actually more likely to use an APU. It's not like someone with an i7-4770K is going to fire up BF3 or Crysis 3 on integrated graphics.

In those popular titles that many play on laptops, Intel's GPU actually performs the worst!

http://images.anandtech.com/graphs/graph6332/50163.png
http://images.anandtech.com/graphs/graph6332/50165.png
http://images.anandtech.com/graphs/graph6332/50122.png

^ You can see that even if Intel's HD4600 GPU is 2x faster than the HD4000 series, it's still way too slow for modern gaming. Alternatively, you can extrapolate based on this chart:

http://alienbabeltech.com/abt/viewtopic.php?p=41174

For bare-minimum gaming of modern titles like FC3 or Sleeping Dogs, an HD7750 is necessary. That GPU is about 4.4x faster than the HD4000 series.
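
To put those two ratios together - a minimal back-of-the-envelope sketch using only the 2x and 4.4x figures quoted above, not any measured benchmark:

```python
# Rough gap estimate using the ratios cited above (illustrative only).
hd4600_vs_hd4000 = 2.0      # claimed best-case speedup of HD4600 over HD4000
required_vs_hd4000 = 4.4    # HD7750 speedup over HD4000, the "bare minimum" for FC3 etc.

shortfall = required_vs_hd4000 / hd4600_vs_hd4000
print(f"HD4600 would still fall short of HD7750-class performance by ~{shortfall:.1f}x")
# -> HD4600 would still fall short of HD7750-class performance by ~2.2x
```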

Support for 4K TVs on Haswell is pure marketing and has almost no practical application, since a 32-inch 4K LCD monitor for the PC costs $20,000+. Given how high the prices of 30-inch 2560x1600 monitors still are, it will be a miracle if 4K LCD monitors on the PC are affordable even for enthusiasts in the next 5 years. Most likely it will take longer.

http://www.engadget.com/2012/06/05/viewsonic-vp3280-led-4k-monitor-hands-on/

4K is not at all a selling feature for this generation of CPU; maybe for a CPU that's out in 4-5 years it will be.

If someone really wanted a stop-gap budget card to drive their 30 inch 2560x1600 monitor, it's way cheaper to buy a $15 GT210 than to "upgrade to Haswell".
http://www.newegg.com/Product/Product.aspx?Item=N82E16814150640

None of the points you provided are valid enough for upgrading to Haswell from modern i7s, especially from modern SB/IVB CPUs. You cited an example of an i5 that uses 35W, but even if a CPU uses 65W, the break-even on electricity cost would be something like 10-20 years compared with the cash outlay to upgrade.
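
For anyone who wants to sanity-check that break-even claim, here's a rough sketch; the usage hours, electricity price and upgrade cost below are hypothetical placeholders, not figures anyone quoted:

```python
# Break-even estimate for upgrading purely to cut CPU power draw.
# All inputs are assumed placeholder values for illustration.
watts_saved    = 65 - 35   # W difference between the two CPUs discussed above
hours_per_day  = 8         # assumed daily usage (use 24 for an always-on box)
price_per_kwh  = 0.12      # assumed electricity price, $/kWh
upgrade_cost   = 300       # assumed cash outlay for a CPU + motherboard swap

kwh_saved_per_year = watts_saved * hours_per_day * 365 / 1000
dollars_per_year = kwh_saved_per_year * price_per_kwh
print(f"~${dollars_per_year:.2f}/year saved; break-even in ~{upgrade_cost / dollars_per_year:.0f} years")
# At 8h/day that is roughly $10.51/year, i.e. decades to break even;
# even running 24/7 the payback is still on the order of a decade.
```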

Like I said, upgrading to Haswell will be mostly for those who are using very old CPUs like Core 2 Duo/Quad (65nm/45nm) or for gaming enthusiasts/overclockers who like playing with new parts and would have upgraded even if Haswell was only 15% faster.

Other features might be more important, like Thunderbolt for external storage, but even that is a stretch.

Summer 2006: Intel gave us the E6600 at 2.4GHz.
Summer 2007: for the same price we got the Q6600 at 2.4GHz (2x the cores).
Fall 2008: we got the i7 920, which overclocked like mad and offered a real, tangible improvement over the Q6600.

Since then, it's been mostly stagnation on Intel's side unless you ponied up for the i7-3930K.
 
UNKNOWN9122,

What do you disagree with? This is hard real-world data. Most reviewers test a stock i7 920 @ 2.66GHz against an i7-3770K; of course the latter would beat it in games. That's not how enthusiast gamers on this site use their parts though - many of us overclock. Take a Core i7 920 @ 3.8-4.2GHz and you'd be seriously hard pressed to tell the difference between that and an overclocked i7-3770K if you are only using one GPU. You'd need to start using two GPUs to really push the CPU to its limits. For most games today you are going to be GPU-bottlenecked by far - Hitman: Absolution, Sleeping Dogs, Far Cry 3 - and games like Metro: Last Light and Crysis 3 should be even more GPU-demanding.

I mean, if you need the absolute fastest CPU for MMOs or StarCraft 2 - titles that are poorly threaded and benefit a lot from per-core IPC - then by all means upgrade. I will be upgrading from Sandy Bridge because I like to play with new parts, not because I will feel the difference in gaming speed.

hmm. something to think about lol(y)
 
The problem is Intel is putting the GT2-series GPU into high-end desktop Haswells, with the much faster GT3 being reserved for mobile parts. Other than driving the displays, it's unlikely that Haswell's IGP on the desktop will amount to much
1. I think Intel are well aware that for gamers their IGP lags, and will continue to lag, behind discrete solutions, which is precisely why Intel offer Lucidlogix Virtu as an option - and Intel aren't known for their excellence in graphics driver implementation.
2. Enthusiast PC gamers are a niche market at best, and if consoles manage to drag their performance into the twenty-first century that niche becomes smaller. More people are likely to use the IGP for video playback or to take advantage of the onboard encoder than for serious gaming.

People who play less GPU-demanding titles like... [long...long...long rant about a point that no one else is disputing] ...Support for 4K TVs on Haswell is pure marketing
Missed the point by a country mile.
If Haswell can run 4K displays, then anyone who needs display-out capability during discrete card swap-outs (or for troubleshooting discrete graphics hardware/driver issues) has an at-hand solution. As I said before - a zero-cost extra.
None of the points you provided are valid enough for upgrading to Haswell from modern i7s....[lengthy straw man argument]
Please point me to ANY statement ANYWHERE where I made that assertion. In fact, there isn't a single post by anyone in this thread taking that stance.
You cited an example of an i5 that uses 35W, but even if a CPU uses 65W, the break-even on electricity cost would be something like 10-20 years compared with the cash outlay to upgrade
Lateral thinking not your strong suit? Scenario #1: a 35W full-fat desktop CPU >>> passive cooling options >>> potent HTPC/SFF build. Scenario #2: if Intel can fit a 4C desktop CPU + IGP into a 35W envelope, what are the chances of ULV mobile Haswell reaching the 10W target? What would be the bigger selling point: electricity draw from the wall, or the increased battery life from a 10W CPU+IGP?
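
As a rough illustration of why the mobile power envelope matters far more than the wall-socket bill - all numbers here are assumed placeholders, not Haswell specifications:

```python
# Crude battery-life comparison for two hypothetical CPU+IGP power envelopes.
battery_wh       = 50.0   # assumed ultrabook battery capacity, Wh
platform_other_w = 5.0    # assumed screen/storage/wireless draw, W

for cpu_igp_w in (17.0, 10.0):   # today's typical ULV envelope vs. the 10W target
    hours = battery_wh / (cpu_igp_w + platform_other_w)
    print(f"{cpu_igp_w:>4.0f}W CPU+IGP -> ~{hours:.1f}h on battery")
# Under these assumptions, dropping from 17W to 10W stretches the same
# battery from roughly 2.3 hours to 3.3 hours.
```
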
Since then, it's been mostly stagnation on Intel's side unless you ponied up for the i7-3930K.
Personally I blame:
1. The other x86 licence holders for treating the business like an occasional hobby
2. The 90%+ of the tech-buying public who have no interest in what resides inside the hardware they buy
3. Software companies that have little interest in reworking code unless sponsored by an IHV

Intel have a lot of faults, but for all of them, they still offer competitive perf/$ across a vast range of options in a climate where their primary competitor has essentially been AWOL for six years. You are arguing for a quantum leap in computing performance with every Intel generation, so let's have a look at an alternate reality:
Intel release Sandy Bridge/Ivy Bridge/Haswell at the clocks they can readily sustain, rather than the anaemic 3-3.5GHz they launch with - let's say 4-4.5GHz. Intel also sell at the minuscule profit margin that AMD operate on (so, take into account die size, yield, and foundry capacity/amortization). Intel only offer unlocked CPUs (OEMs rely upon locked-down BIOS options). Intel offer 8C/16T HEDT parts (as opposed to server-only) as the top-tier HEDT SKUs, with pricing reduced accordingly down through the product stack. Intel sell chipsets for the same contract price as AMD. Intel remove the PCI-E lane restriction from mainstream CPUs and replicate the same on-die controller found on the server parts.
Taking those factors into consideration, you would see lower-binned Intel CPUs at under ~$30, and a stock 4GHz+ 3770K at well under half its current price... with an Intel socket board costing no more than an AMD socket board for a better feature set (PCI-E 3.0, native USB 3.0 (FM1/2 excepted), RST, etc.).
Question: how long does AMD remain viable in any x86 market?
Maybe you should be careful what you wish for.
 
@DBZ
I no longer believe AMD is a viable competitor for Intel; that place has been taken by ARM, due to the huge demand for 'mobile' SoCs, and the game there is a bit different, i.e. low margin/high volume. Intel has the performance lead by a huge margin and ARM has the power-consumption advantage.

In the coming years, both sides will move towards the middle IMO, until they reach an optimum position where performance is much improved without sacrificing too much in power consumption. Whoever reaches that stage first will be the winner, and keeping in mind Intel's huge advantage in R&D and fabs, I'll put them in a slightly advantageous position. Besides, with a mix of these two, Intel just needs to convey one thing to developers - that 'it is good enough' - and the rest will take care of itself.
 
Intel will be in the box seat while x86 remains the dominant ISA. I'd like to see how the parallelization and server initiatives shake out in terms of industry adoption. Intel have x86 + GPGPU (Xeon Phi), while the ARM analogue would be Nvidia's Maxwell/Project Denver (GTX 800 series) - ARMv8 + GPGPU for HPC, now confirmed for TSMC's 20nm node. The server/pro market - traditionally carrying a lot of inertia - seems to be making a reasonable push towards ARM, although any market with an entrenched x86 ecosystem is going to be an uphill struggle for any vendor going up against Intel.
With Intel's S1200 recently released there seems to be no end of comparisons and crystal-ball gazing... even AMD pretending they're at the grown-ups' table. Interesting times ahead.
 
Looks like I'll still stick with my Z77 upgrade then. I don't like the new socket 1150 at all - it sounds like a step backward - and the Haswell CPU is not a big step up. Ivy Bridge is still the best choice in terms of performance and power efficiency.
 
...
Yup, I totally agree with everything you said; it's just disappointing that Intel are holding back due to the lack of competition. I'm pretty sure this could have been a 6-core/12-thread 4GHz chip overclockable to 5GHz (with the IHS issue sorted)...

I am an AMD fan, but there is no evidence that Intel is slowing down innovation because of the lack of competition from AMD. Price-wise, it is bad news for consumers, but Intel continues to invest in R&D. In fact, at the current time, ARM poses a bigger threat to Intel than AMD does, at least in the mobile-device area. Keep in mind ARM is trying to enter the server market, too.
 
Same clock speed as the 3770K, same cores/threads, higher TDP... wow, that really is disappointing. I don't know the exact percentage, but very, very few people will care about the improved IGP on this chip.

As for me, I really care about the IGP. I do network research, so I need my CPU to be very fast to run my experiments. But for the GPU part, I only occasionally play light games, so I do not want a discrete GPU wasting electricity.
I would never pay for an i7 with a second-rate IGP. Maybe I should turn to AMD, but the best AMD CPUs are too slow to be comparable to Intel's i5s. I really hate Intel for ignoring PC users like this. As soon as there is a way, I will be the first to escape from Intel.
Come on, AMD and ARM.
 