Intel Iris: Haswell graphics get a name, 2x-3x performance of HD 4000

May 2, 2013, 2:00 AM

Intel has disclosed new information about its upcoming Haswell processors, detailing the graphics specs and potential performance improvements over today's integrated HD graphics. Intel is giving the new graphics engine a brand name, "Iris," touting 2x to 3x the performance of HD 4000 graphics.

In the past few months we've come to learn a lot about Haswell. As with Ivy Bridge, efficiency and power consumption have taken priority over raw processing power, and thus Haswell is expected to be only marginally faster than the current generation of Intel Core CPUs. On the graphics front, however, it's a different story.

There will be five different graphics configurations, in order of performance: Iris Pro 5200, Iris 5100, HD 5000, HD 4600/4400/4200 and HD graphics. This may be even more confusing than the current line-up, although for the most part, performance-oriented desktop and laptop CPUs simply ship with HD 4000 (Intel's fastest integrated graphics), while smaller ultrabooks and tablets get the more specialized configurations.

Here's what Intel is expecting from Iris (don't miss our extended note about performance further down below):

Three times the performance of HD 4000 sounds like an amazing feat for a single generational jump. As before, this has the potential to keep eating into the budget discrete GPU market. It's no coincidence AMD and Nvidia have nearly retired their sub-$100 graphics cards: integrated graphics have been able to cope with most typical workloads (HD video playback, etc.), usually only falling short when it comes to gaming.

When we reviewed Ivy Bridge about a year ago, we tested the Core i7-3770K's on-die GPU performance. The results were disappointing to say the least, but if Intel is able to double or even triple performance, it's going to be a very interesting landscape for PC gaming moving forward.

Starcraft 2 running at 1280x800 (medium quality) on HD 4000. More games here.




User Comments: 43

VitalyT VitalyT said:

Long story short, according to Intel, the top-level HD 5200 will be equal in performance to:

- NVidia GTX 540

- HD 6670

Doesn't matter what Intel says in addition, these aren't gamer's cards, not by today's standards.

But if you think those slugs are good for your tasks, then it is for you. Myself, I'm waiting till HD 8970 is announced and bench-marked against GTX 7x, to choose between the two.

mrcavooter mrcavooter said:

Long story short, according to Intel, the top-level HD 5200 will be equal in performance to:

- NVidia GTX 540

- HD 6670

Doesn't matter what Intel says in addition, these aren't gamer's cards, not by today's standards.

But if you think those slugs are good for your tasks, then it is for you. Myself, I'm waiting till HD 8970 is announced and bench-marked against GTX 7x, to choose between the two.

I think this is big news for laptop computing. If you can afford that desktop i7 chances are likely you are spending that much or more on discrete graphics.

GeforcerFX GeforcerFX said:

Good news for Intel, but the problem is they've shown multiple times that they don't know where to place the right products. Will the Pro 5200 be on an i3 or i5 processor? Historically no; more than likely it will only be available on high-end i7s, which are computers that usually end up with discrete graphics anyway. And since Kaveri is shaping up to be a massive jump in GPU performance for AMD (something they already smoke Intel in), and AMD will probably keep their next high-end A10 right around the same price as Trinity's, you will be able to get better on-chip graphics at lower prices from AMD. But until these chips surface for benchmarking, all of this is speculation. As long as the integrated graphics on the Haswell i3s can match or beat the A10 Trinity, then Intel has improved a lot; the problem for them is how much Kaveri is going to move AMD up.

VitalyT VitalyT said:

I think this is big news for laptop computing. If you can afford that desktop i7 chances are likely you are spending that much or more on discrete graphics.

On the other hand, when Haswell comes out, a desktop discrete graphics card that outperforms it will cost about $80, which makes all that Haswell graphics power useless for a desktop user. You could only go for it for a self-built home server.

In the laptop niche though, yes, Haswell is great news altogether, though it will need a better-tuned PSU to handle low-voltage requirements, and some software updates, like the ones MS is flogging with its Windows Blue.

ReederOnTheRun ReederOnTheRun said:

Long story short, according to Intel, the top-level HD 5200 will be equal in performance to:

- NVidia GTX 540

- HD 6670

Doesn't matter what Intel says in addition, these aren't gamer's cards, not by today's standards.

But if you think those slugs are good for your tasks, then it is for you. Myself, I'm waiting till HD 8970 is announced and bench-marked against GTX 7x, to choose between the two.

I think this is big news for laptop computing. If you can afford that desktop i7 chances are likely you are spending that much or more on discrete graphics.

Exactly. I'm pretty sure nobody reading this cares about using the integrated graphics for a desktop gaming system. Even all of the pictures are pictures of laptops. This is geared towards people who want to game on laptops without bothering with discrete graphics.

Fbarnett Fbarnett said:

Long story short, according to Intel, the top-level HD 5200 will be equal in performance to:

- NVidia GTX 540

- HD 6670

Doesn't matter what Intel says in addition, these aren't gamer's cards, not by today's standards.

But if you think those slugs are good for your tasks, then it is for you. Myself, I'm waiting till HD 8970 is announced and bench-marked against GTX 7x, to choose between the two.

I thought I read that 8000 series cards will all be tweaked 7000 series nothing new

Tekkaraiden Tekkaraiden said:

So that would put them within very close performance of AMD's current APU's then? I wonder how much better Kaveri will be when it's released.

VitalyT VitalyT said:

I thought I read that 8000 series cards will all be tweaked 7000 series nothing new

There is a rumor AMD may bump up the number of shaders at the very least, and that the card may see a whole 20% improvement over the 7970. But that's just a rumor. We will see when it is out.

The rumors: [link]

1 person liked this |
Staff
Julio Franco Julio Franco, TechSpot Editor, said:

It's a step in the right direction for the PC platform. Although Intel is thinking long-term, trying to challenge low-powered SoCs that have decent graphics capabilities, picture this: all new laptops will have at least this kind of (decent) 3D graphics performance. It's a much better alternative than what we have today.

Puiu Puiu said:

They can bump the performance in 1 or 2 games or in 3DMark all they want, they still have the worst drivers on the market.

Many games don't work, OpenGL/DirectCompute/OpenCL have many problems and they also drop support really fast (in a gen or 2). The control panel also lacks many features that normally should be supported, not to mention anything a bit more advanced.

This is where they need to work, because I'm pretty sure they can get a lot more out of their hardware if they actually manage to build decent drivers.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

So that would put them within very close performance of AMD's current APU's then? I wonder how much better Kaveri will be when it's released.

I'd bet the disparity will remain roughly the same as the A8 vs Ivy Bridge numbers are now, when comparing Haswell's top of the line version with Kaveri... The amount of jump in graphics power shows that Intel is really putting some effort into the integrated graphics, but it will remain to be seen how the actual performance numbers pan out when benchmarked in real world scenarios, using a wider variety of tools than a few that were probably hand-picked to show the highest graphics gains (common practice for marketing in both AMD and Intel's camps).

Honestly, I'm liking the upward direction for Intel's graphics, but I'm not really all that excited about the end product yet... The probable result of this range of processors and performance numbers will be that the mobile sector (where the additional power would make the most impact) will rarely use the highest level chips to reduce power consumption and costs. So it's debatable whether that max improvement number will even mean anything to the consumer in the end. It will be very interesting to see how the product prices out, and how exactly it is adopted by the OEMs.

JC713 JC713 said:

Long story short, according to Intel, the top-level HD 5200 will be equal in performance to:

- NVidia GTX 540

- HD 6670

Doesn't matter what Intel says in addition, these aren't gamer's cards, not by today's standards.

But if you think those slugs are good for your tasks, then it is for you. Myself, I'm waiting till HD 8970 is announced and bench-marked against GTX 7x, to choose between the two.

GT640*. The GT640 will be better at gaming though because it has more onboard memory.

1 person liked this | cliffordcooley cliffordcooley, TechSpot Paladin, said:

I grow tired of all the down talk from those who need and use dedicated graphics. This is progress, it should be promoted not disgraced.

Guest said:

You can put down their drivers, but every single part of them has been improving and will continue to improve.

Darth Shiv Darth Shiv said:

I grow tired of all the down talk from those who need and use dedicated graphics. This is progress, it should be promoted not disgraced.

I'm definitely an Intel graphics skeptic but mention of 4k x 2k playback support is interesting. But I'll wait until they can prove that.

Claims that it can compete with AMD's APUs are another thing. For gaming, Intel is still the last option for me, and I can't see this changing it. A reason to arc up about their announcement is that they tout it as an onboard chip capable of gaming. The last performance bench I saw of Intel onboard graphics running BF3 @ 1080p at minimum settings was peaking at 20fps (my 15" laptop has a 1080p screen, so this is the level I'm interested in). They might as well not have shown up.

JC713 JC713 said:

I grow tired of all the down talk from those who need and use dedicated graphics. This is progress, it should be promoted not disgraced.

I support it, but it is still no match for discrete GPUs, and it never will be at their level. Yes, this is good news for consumers, but it's no blow to Nvidia or AMD. They may lose a bit of the low-end GPU market, but they will make up for it with their powerhouses. Once again, yes, this is promising, but dedicated GPUs will always rule in terms of performance because of the constantly updated drivers.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

The down talk really gets under my skin. Some must think the level of power given by dedicated graphics can be integrated, and that CPU makers are dragging their feet. Seriously, IGPs have their purpose; if you need more than IGP performance, get dedicated. But please stop bickering about how IGPs underperform dedicated graphics, because I know all of you are intelligent enough to know that is simply the nature of things.

JC713 JC713 said:

True :0 lol. The small package is kind of hard to beat..

3 people like this | dividebyzero dividebyzero, trainee n00b, said:

But if you think those slugs are good for your tasks, then it is for you. Myself, I'm waiting till HD 8970 is announced and bench-marked against GTX 7x, to choose between the two.

Technically, one of those is here now. [link] From what I've heard, AMD's Sea Islands seems to have taken a back seat, probably because it seems largely indistinguishable from the Northern Islands series. They now seem to be talking up the Volcanic Islands series. This via AMD's de facto PR machine:

The next generation of graphics cards -- Volcanic Islands -- is coming this year and shaping up nicely... James Prior, Rage 3D

Not sure at this stage what "coming this year" entails - could be taped out, validation, shipping for revenue, or possibly (but unlikely given they are supposed to use TSMC's 20nm process which won't ramp until later this year) available in the retail channel.

In an effort to make at least a token attempt at getting back on topic with Haswell: OCaholic has news of an early Haswell overclocking attempt at 7 GHz, which confirms Haswell's higher multipliers for "K" SKUs (80x for the 100MHz strap, 64x for the 125MHz strap, 48x for the 166MHz strap).

[Source]
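Those strap/multiplier pairings all imply roughly the same clock ceiling; a quick back-of-the-envelope check (plain arithmetic on the figures quoted above, nothing Haswell-specific):

```python
# Each BCLK strap paired with its quoted maximum multiplier should land
# at roughly the same peak frequency, which is what makes a ~7 GHz
# overclock plausible on any of the straps.
straps = {100: 80, 125: 64, 166: 48}  # BCLK strap in MHz -> max multiplier

for bclk, mult in straps.items():
    print(f"{bclk} MHz x {mult} = {bclk * mult / 1000:.2f} GHz")
```

All three combinations top out right around 8 GHz (the 166MHz strap at 7.97 GHz), so the ceiling is effectively the same regardless of which strap an overclocker uses.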

TS-56336 TS-56336 said:

Both impressive and disappointing: a nice improvement, but I'm vastly disappointed in how restrictive the TDP and chip levels are. I'd love to see a 5200 or 5100 at a much lower TDP. If a Haswell CPU can get down to a 10 watt TDP for a Core i5, I don't think an underclocked 5200 in a 17 watt TDP would be too much to ask.

Basically, a Core i5 is all I need for CPU power. My current Ivy Bridge Core i5 is as fast as I need, for games or Lightroom, etc. The GPU, on the other hand, still struggles with anything beyond the most basic games. If Intel won't do it, I have hope Nvidia will. I'd love to see a dedicated Nvidia mobile card fit into a 10 watt Core i5 13" ultrabook, whatever that would require.

Guest said:

Double the speed half the heat

2 people like this | LukeDJ LukeDJ said:

Double the speed half the heat

If you know how to, go right ahead mate.

I agree with @cliffordcooley here, quit the down talking people. It simply isn't possible with today's technology to cram a mid or high-end GPU onto a CPU, especially when current integrated solutions have nowhere near that power. A 2-3x performance boost (I'll admit some skepticism towards the accuracy of these numbers) is actually pretty great. If discrete GPUs are increasing by, say, 20-30% each generation, and onboard solutions are increasing 100-200%, it won't be long before onboard graphics begin to catch up.

That's pretty damn awesome in my books.
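LukeDJ's catch-up argument can be sketched numerically. The starting gap and per-generation growth rates below are made-up illustrative figures (not benchmarks of any real part):

```python
# Hypothetical catch-up model: integrated graphics growing faster per
# generation than discrete eventually closes any fixed starting gap.
# All numbers here are illustrative assumptions, not measurements.
igp, dgpu = 1.0, 10.0            # assume discrete starts out 10x faster
igp_gain, dgpu_gain = 2.0, 1.25  # 100% vs. 25% improvement per generation

generations = 0
while igp < dgpu:
    igp *= igp_gain
    dgpu *= dgpu_gain
    generations += 1

print(generations)  # -> 5 generations under these assumed rates
```

Even with discrete starting ten times faster, a 2x-per-generation pace closes the gap in a handful of generations, which is the gist of the comment above.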

Guest said:

Intel always exaggerates its GPU performance. The GPU will improve, but not by that much; at best Haswell's GPU will be less than 1.5x of HD 4000, not 2x to 3x.

1 person liked this | mrcavooter mrcavooter said:

On the other hand, when Haswell comes out, a desktop discrete graphics card that outperforms it will cost about $80, which makes all that Haswell graphics power useless for a desktop user. You could only go for it for a self-built home server.

In the laptop niche though, yes, Haswell is great news altogether, though it will need a better-tuned PSU to handle low-voltage requirements, and some software updates, like the ones MS is flogging with its Windows Blue.

Yeah, I have a discrete "professional grade" GPU in my laptop. If I could go back a few months and purchase my laptop again, I would leave it out. It sucks too much power and generates too much heat for a marginal graphics improvement. Not to mention that Nvidia opts to use the integrated graphics most of the time, barring a few programs. So Haswell's performance outlook definitely satisfies the niche I'm in, but I'm only one user. I can say, however, that a lot of casual users will benefit from this, and that is who Intel is targeting. They know that enthusiasts will deck out their laptops if they want, but those that do make up a tiny (irrelevant?) fraction of the pie.

captaincranky captaincranky, TechSpot Addict, said:

Doesn't matter what Intel says in addition, these aren't gamer's cards, not by today's standards.

But if you think those slugs are good for your tasks, then it is for you. Myself, I'm waiting till HD 8970 is announced and bench-marked against GTX 7x, to choose between the two.

For the life of me, I can't figure out why "being a PC gamer", bestows upon the individual such a sense of vastly inflated self esteem...

"I play video games, that's my contribution to society, and y'all better be damned grateful to me for it"....

We lesser beings are happy if an IGP will pump 1080p to two monitors. Then, instead of listening to, (and in the process getting sucked into), the endless, and endlessly dull tirades about how, "my video card can beat your video card", I could watch a movie, and take this up during the slow parts.

Technically, one of those is here now. [link] . From what I've heard, AMD's Sea Islands seems to have now taken a back seat probably because it seems indistinguishable for the most part with the Northern Islands series. They now seem to be talking up the Volcanic Islands series. This via AMD's de facto PR machine:
Well, AMD wins hands down against either Nvidia or Intel in the "coin a powerful name contest".

Which is what my daddy told me was a surefire path to dominance: "if you can't beat their product, out-name their product". Although, I'm still not buying a discrete video "solution" until "H-Bomb Crater Maker" is rolled out.

VitalyT VitalyT said:

Well, you live up to your name nicely, sir!

captaincranky captaincranky, TechSpot Addict, said:

Well, you live up to your name nicely, sir!
I like to contribute, in any way I can, for the greater good of all.....

VitalyT VitalyT said:

I like to contribute, in any way I can, for the greater good of all.....

And the greater good appreciates it!

TheBigFatClown said:

Doesn't matter what Intel says in addition, these aren't gamer's cards, not by today's standards.

But if you think those slugs are good for your tasks, then it is for you. Myself, I'm waiting till HD 8970 is announced and bench-marked against GTX 7x, to choose between the two.

For the life of me, I can't figure out why "being a PC gamer", bestows upon the individual such a sense of vastly inflated self esteem...

"I play video games, that's my contribution to society, and y'all better be damned grateful to me for it"....

We lesser beings are happy if an IGP will pump 1080p to two monitors. Then, instead of listening to, (and in the process getting sucked into), the endless, and endlessly dull tirades about how, "my video card can beat your video card", I could watch a movie, and take this up during the slow parts.

Technically, one of those is here now. [link] . From what I've heard, AMD's Sea Islands seems to have now taken a back seat probably because it seems indistinguishable for the most part with the Northern Islands series. They now seem to be talking up the Volcanic Islands series. This via AMD's de facto PR machine:
Well, AMD wins hands down against either Nvidia or Intel in the "coin a powerful name contest".

Which is what my daddy told me was a surefire path to dominance: "if you can't beat their product, out-name their product". Although, I'm still not buying a discrete video "solution" until "H-Bomb Crater Maker" is rolled out.

Well, Iris is about the gayest name one could come up with for a powerful graphics chipset. Somebody needs to be fired.

captaincranky captaincranky, TechSpot Addict, said:

Well, Iris is about the gayest name one could come up with for a powerful graphics chipset. Somebody needs to be fired.
No, that would be "Mango".....or maybe, "Liberace"......(*)

Note: I'm not saying that somebody shouldn't be fired, or "Iris" isn't plenty gay. It's just that if they put their mind to it, they could have found a gayer name. "Elton John", "Tommy Tunes", "Freddie Mercury", "Rock Hudson", also spring to mind, and the list goes on...

Although, "Nathan Lane", might appeal to home theater buyers who fancy themselves as sophisticates....

(Given Intel's heretofore penchant for naming chipsets after places, "Nathan Lane" would possibly be the best choice, given that it is ambiguously gay, To wit, "Nathan Lane", do they mean the actor, or is that a street name)?

TheBigFatClown said:

No, that would be "Mango".....or maybe, "Liberace"......(*)

Note: I'm not saying that somebody shouldn't be fired, or "Iris" isn't plenty gay. It's just that if they put their mind to it, they could have found a gayer name. "Elton John", "Tommy Tunes", "Freddie Mercury", "Rock Hudson", also spring to mind, and the list goes on...

Although, "Nathan Lane", might appeal to home theater buyers who fancy themselves as sophisticates....

(Given Intel's heretofore penchant for naming chipsets after places, "Nathan Lane" would possibly be the best choice, given that it is ambiguously gay, To wit, "Nathan Lane", do they mean the actor, or is that a street name)?

Look out peeps here comes Iris. "Oooohhhhh scary!!!!". I bought a SandyBridge CPU and an IvyBridge CPU. But I may have to pass up the Haswell and Broadwell chips because of the codenames alone.

I can't let it be known that I have an Iris chip in my possession.

Now, a viper chip? Or a cobra chip? That might be okay. Not an Iris.

captaincranky captaincranky, TechSpot Addict, said:

...[ ]..... I bought a SandyBridge CPU and an IvyBridge CPU....[ ]....
"Sandy Bridge", "Ivy Bridge", if you check your GPS, you'll see they both empty onto "Nathan Lane"....!:oops:

1 person liked this | dividebyzero dividebyzero, trainee n00b, said:

Well, Iris is about the gayest name one could come up with for a powerful graphics chipset. Somebody needs to be fired.

Tell that to SGI. Without IRIS the world wouldn't have an open source API

Look out peeps here comes Iris. "Oooohhhhh scary!!!!". I bought a SandyBridge CPU and an IvyBridge CPU. But I may have to pass up the Haswell and Broadwell chips because of the codenames alone.

Buys hardware solely based upon the products name....sign of a real enthusiast.

mrcavooter mrcavooter said:

Well, Iris is about the gayest name one could come up with for a powerful graphics chipset. Somebody needs to be fired.

Watch out Republicans, this chip has a sexual preference. Is anyone thinking of the children?

captaincranky captaincranky, TechSpot Addict, said:

Watch out Republicans, this chip has a sexual preference. Is anyone thinking of the children?
Judging by recent headlines, I'm pretty sure everybody's thinking that the children better get used to different sexual preferences, the sooner the better....

cliffordcooley cliffordcooley, TechSpot Paladin, said:

Buys hardware solely based upon the products name....sign of a real enthusiast.
I was thinking along the lines of ridiculous criticism. Almost as bad as someone who doesn't use Apple or Google products. Wait, I'm talking about myself again. Ahh well, I still can't see myself ever changing my mind about those two companies. At least I can see my criticism as potentially ridiculous.

captaincranky captaincranky, TechSpot Addict, said:

I was thinking along the lines of ridiculous criticism. Almost as bad as someone who doesn't use Apple or Google products. Wait, I'm talking about myself again. Ahh well, I still can't see myself ever changing my mind about those two companies. At least I can see my criticism as potentially ridiculous.
What's in a name, you ask? You just wait until I get my hands on the upcoming, "Nuclear Holocaust"(*) AMD GPU, my computer will smite your computer so mightily, you and all your relatives will feel it, both living and deceased.

(*) I'm going to win the also perhaps upcoming, "name that video chipset contest", that AMD will sponsor after their top execs read this post.

TheBigFatClown said:

I was thinking along the lines of ridiculous criticism. Almost as bad as someone who doesn't use Apple or Google products. Wait, I'm talking about myself again. Ahh well, I still can't see myself ever changing my mind about those two companies. At least I can see my criticism as potentially ridiculous.
What's in a name, you ask? You just wait until I get my hands on the upcoming, "Nuclear Holocaust"(*) AMD GPU, my computer will smite your computer so mightily, you and all your relatives will feel it, both living and deceased.

(*) I'm going to win the also perhaps upcoming, "name that video chipset contest", that AMD will sponsor after their top execs read this post.

On second thought, I think you guys are right. What's the big deal about a name? I say forget Iris. I said Intel should name the next gen IGP "Gertrude". Oh, now that one gets me excited. I can't wait.

dividebyzero dividebyzero, trainee n00b, said:

I was thinking along the lines of ridiculous criticism. Almost as bad as someone who doesn't use Apple or Google products

I can understand shunning a particular hardware vendor to a degree (business practice, warranty/RMA issues, performance-per-$, implementation, etc.), but when you have already bought two Intel products in succession ("I bought a SandyBridge CPU and an IvyBridge CPU") without denigrating either, only to write off any future purchase based upon what the PR division decides to name the parts....well...

What's in a name, you ask? You just wait until I get my hands on the upcoming, "Nuclear Holocaust"(*) AMD GPU, my computer will smite your computer so mightily, you and all your relatives will feel it, both living and deceased.

Not enough hyperbole. I'd suggest something ending with " [implausible jargon] Edition ", that imbues a false sense of pride and achievement for the hapless fool discerning consumer. It's a pity that [link] has already been scooped up

(*) I'm going to win the also perhaps upcoming, "name that video chipset contest", that AMD will sponsor after their top execs read this post.

Presupposing that the theoretical contest planning survives AMD's "Trial by Committee", I look forward to the announcement and your subsequent win (I hope they frame the IOU for you) sometime around 2025. Stay healthy.

captaincranky captaincranky, TechSpot Addict, said:

Double the speed half the heat
If you know how to, go right ahead mate.
Yeah guest, you go for it. Ask "Kickstarter" to front you the money to build your own fab...

Just be patient, monkeys are going to fly out of a flying pig's butt any day now, and the day after that, your check is in the mail....

VitalyT VitalyT said:

On the latest news, DEA captures a developer who used kickstarter's money to expand his meth lab. The conspirator used BreakingBad as his nickname, while promising a break-through for BlackBerry users. DEA is investigating, whether BlackBerry users are all on drugs or part of a cartel.

P.S. I like them apples

Guest said:

Good God, this is pathetic on Intel's part... merely a 3x increase in performance over a single generation.

Folks, this is probably the biggest performance jump you will ever see, now or in the future. Normally it takes a complete change of technology or architecture to get this kind of increase. That said, Intel will package it with the most expensive CPUs rather than the low and mid range, where it would really make a difference. This is all for the Wintablet market and to fight off ARM and its ecosystem.

Sorta makes Nvidia back into play, wobbly.

SailFish <> FailSale, no, no. Tizen ? FireFox OS, if Dual boot, yes!

captaincranky captaincranky, TechSpot Addict, said:

....[ ].....complete change of technology or architecture to get this kind of increase. That said , intel will package it with the most expensive CPU rather than the low and mid where it would really make a difference....[ ]....
That's a flawed assumption; at the very least, it flies in the face of history thus far.

Intel released the Core i3-3225, a mainstream dual core with Hyper-Threading. The package includes Intel's best (thus far) HD 4000 graphics. Granted, it wasn't among the first of the Ivy Bridge CPUs released, but perhaps that's for the better, as it might indicate that Intel is responding to consumer feedback.

So, the i3-3225 is HD-4000, 22nm, 55 watts TDP! Here, have a look: [link]

Keep in mind I bought this CPU, with a Gigabyte Z77_UD3H (ATX), at just over $220.00 in a Microcenter "steal a bundle" promo!
