Intel's enthusiast-grade Gulftown CPU previewed

August 13, 2009, 2:53 PM
Not long after sharing its 32nm processor plans with the press earlier this year, Intel revealed that it had begun shipping the first Westmere engineering samples to a select group of laptop and desktop PC manufacturers for testing. As is often the case, one tech site seems to have gotten hold of these parts a little early and has run some quick benchmarks for us to look at.


The part in question is a 32nm hexacore chip, known as Gulftown on the roadmaps and scheduled to debut in Q2 2010. If you are having a hard time keeping up with Intel's codenames, Westmere is basically a die shrink of the current 45nm Nehalem family of processors (released as Core i7), and Gulftown will be the desktop variant aimed at the high-end enthusiast segment. Featuring six cores and 12 threads with Hyper-Threading enabled, it packs 12MB of L3 cache to support the additional data load over the QuickPath Interconnect.

It is based solely on socket LGA 1366 and retains compatibility with the X58 chipset that drives all Core i7 motherboards today. Chinese site HKEPC tested a 2.4GHz part in particular and came to some interesting, though not entirely surprising, conclusions. First, the six-core beast runs cooler and draws less power than current Core i7 chips; second, while there is definitely a step up in processing power, most software isn't ready to fully benefit from the additional cores.

As a result, the chip will likely show its worth on specific tasks such as professional image and video editing, but with an expected price between $1,000 and $1,500, this is not necessarily the best investment as far as gaming is concerned. You can find a translated version of the complete report here.
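
To illustrate why the extra cores only pay off for explicitly threaded software, here is a minimal C++ sketch (not part of the HKEPC report; the work size and slicing scheme are purely illustrative) that asks the OS how many hardware threads it exposes — twelve on a Hyper-Threaded six-core Gulftown — and splits a simple summation across that many workers. A program that never spawns worker threads would leave the remaining logical CPUs idle, which is exactly why single-threaded software sees little benefit from a chip like this.

// Minimal sketch: divide a summation across every logical CPU the OS reports.
// On a six-core Gulftown with Hyper-Threading, hardware_concurrency() should report 12.
#include <cstdint>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 1;                      // fallback if the count is unknown

    const std::uint64_t total = 120000000;              // illustrative amount of work
    std::vector<std::uint64_t> partial(workers, 0);     // one slot per worker, no sharing
    std::vector<std::thread> pool;

    for (unsigned t = 0; t < workers; ++t) {
        pool.emplace_back([&, t] {
            std::uint64_t begin = total / workers * t;
            std::uint64_t end   = (t + 1 == workers) ? total : total / workers * (t + 1);
            for (std::uint64_t i = begin; i < end; ++i)
                partial[t] += i;                        // each thread sums only its own slice
        });
    }
    for (auto &th : pool) th.join();                    // wait for all workers to finish

    std::cout << "logical CPUs used: " << workers << ", sum: "
              << std::accumulate(partial.begin(), partial.end(), std::uint64_t(0)) << "\n";
}

Built with something like g++ -std=c++11 -pthread, the sketch scales its work to however many logical CPUs are present; workloads such as video encoding and 3D rendering already parallelize in this fashion, which is why they are expected to make use of six cores long before most games do.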

It's also worth mentioning that unlike the upcoming mainstream desktop and mobile variants, known as Clarkdale and Arrandale, Gulftown will not include an integrated graphics core alongside the 32nm CPU.




User Comments: 12

dustin_ds3000, TechSpot Chancellor, said:

WHOOOOO Intel FTW first consumer six-core processor

9Nails, TechSpot Paladin, said:

"...a 32nm hexacore chip, known as Gulftown..." - Sweet!

"...it holds 12MB of L3 cache..." - That's more in Cache than my first 3 computers had in system RAM!

"...an expected price between $1,000 and $1,500..." - D'oh!

red1776, Omnipotent Ruler of the Universe, said:

1.1v!? ....holy hanna

red1776, Omnipotent Ruler of the Universe, said:

well....she was before....i don't want to talk about it.

snowchick7669 said:

Better behave and not hijack this thread

peas said:

nothing interesting here. move along.

1st it was the MHz battle. That went to its logical conclusion (see Pentium 4 disaster).

Now it's the cores battle. Keep adding cores: the more there are, the more useless they are.

red1776, Omnipotent Ruler of the Universe, said:

to the Luddite in the 4th row

nothing interesting here. move along.

1st it was the MHz battle. That went to its logical conclusion (see Pentium 4 disaster).

Now it's the cores battle. Keep adding cores: the more there are, the more useless they are.

what?

A vast majority of games use two or three cores; in the near future they will use four, possibly more. Check your productivity benchmarks and scores: they benefit greatly from four cores, as do things like rendering, video editing, etc.

Servers are faster and more productive now thanks to six-core processors.

1st it was the MHz battle. That went to its logical conclusion

umm yeah, competition hath wrought faster, more capable and more productive CPUs

While you obviously embrace the ways of the Luddite, you may have noticed that they have had spectacular success since the Pentium 4. Might I suggest an abacus?

.....move along

LinkedKube, TechSpot Project Baby, said:

lol @ abacus. That's a 700+ year retrogress. All good points stated.

Guest said:

I think now it's all about the multi-core battle, which NVIDIA has already won :)

Just think about having 512 cores on a chip working at a lower frequency instead of six cores working at a higher frequency. Whoever knows and has already seen the advantages of CUDA technology will understand what I am talking about.

red1776, Omnipotent Ruler of the Universe, said:

I think now it's all about the multi-core battle, which NVIDIA has already won

Just think about having 512 cores on a chip working at a lower frequency instead of six cores working at a higher frequency. Whoever knows and has already seen the advantages of CUDA technology will understand what I am talking about.

You're not serious, are you??

You have no idea what you are talking about..... that's a GPU, you know... a graphics card. This article is about a six-core CPU. Geezus, do some reading.

Archean, TechSpot Paladin, said:

Yup red, you are right. Besides, Nvidia is only trying to get itself into that position, and I'm not sure it can in the long run, because I don't think Nvidia has the financial muscle to sustain a couple of tough years...... whereas AMD, let alone Intel, has proved it can do just that by hanging on and continuously improving its products. I believe as soon as AMD, and later on Intel (with Larrabee), start producing chips with something like Fusion (provided they can perform well), Nvidia will be in deep s****. Anyways, it's just a calculated guess, that's about it.

