
Microsoft pushes Intel for 16-core Atom processor

By Matthew · 32 replies
Jan 28, 2011
  1. Microsoft has reportedly asked Intel to create a 16-core Atom processor that would be used in the company's servers. Speaking at The Linley Group Data Center Conference, Microsoft engineer Dileep Bhandarkar suggests that Intel's Xeon chips require too much power for their higher speeds and there's a "huge opportunity" to improve energy efficiency by outfitting servers with chips like Intel's Atom and AMD's Bobcat.

    Read the whole story

    R3DP3NGUIN TS Booster Posts: 152   +10

    I'm sure Intel could produce a 16-core Atom, at what price though?
  3. Their servers would run a lot cooler.
  4. MrAnderson

    MrAnderson TS Maniac Posts: 488   +10

    What is the relevance of an HP consumer media server to an IT server product? Can anyone say smoke screen?

    IMHO Intel could produce these low-power multi-core Atom processors with little or no technical problem. However, they probably take issue with this new product's use eating into the profit margins of their Xeon server chip line.

    The "Atomic-Xeon" (if that is what they call the new green mega-core processors) would probably be cheaper to produce (the 16-core aspect aside), given the lower complexity of the cores vs. a normal-sized Xeon core, and of course the savings once it goes into manufacturing full force.

    Before they go full steam on this, Intel will no doubt be looking for ways to shift around their resources to keep their profit margins intact. Moreover, Intel, AMD, and maybe even VIA (who is no doubt in a good position) will have no choice given the growing demand. I've read some articles where some network engineers were beginning to test Atom configurations, to Intel's dismay and of course without their blessing. Nvidia could be hot on their tail in a few years with an ARM offering.

    Competition is coming. The x86-ers know this, but are waiting till the last minute. I'm sure Intel has been testing a configuration of this kind in anticipation of the competition. They did show off a test version of a 46-core wafer a few years ago...

    They can wait till the competition is almost ready to bring the disruption to the server market and drop their own product. This again allows time to shift around product offerings to lower the impact of the evident cannibalization on the Xeon.
  5. Cota

    Cota TS Enthusiast Posts: 513   +8

    @MrAnderson: Exactly, but that's only one side of the page. I mean, of course they can make a 16-core Atom processor, but by the time they get competition in production/sales, they could have gone way further in the development of Atom processors, especially since things have been moving so fast these last few years that we can't get something without calling it crap in less than a year. Remember the DDR4-instead-of-DDR5 jump paradox? xD
  6. I also think governments should regulate companies that make microprocessors, so that they strive to make more efficient products.

    For example, limiting the maximum TDP on ALL CPUs with higher performance, like the Intel Core i family and AMD Phenoms, in the coming years. Something like:

    2011 to 90 watts of TDP,
    2012 to 60 watts of TDP,
    2013 to 40 watts of TDP,
    and so on...

    Intel, AMD, you can do it!

    For a green world
    Thanks to Greenpeace.
  7. ET3D

    ET3D TechSpot Paladin Posts: 1,279   +105

    Interesting idea, guest. It would probably require something a little more complex, though. After all, a 16-core 100W CPU should have a better footprint than a 2-core 40W one. Also, with the integration of graphics onto the CPU, that might need to be counted as well.

    In any case, I think it will go this way anyway. In the server space, companies obviously want this, and at home you already see people moving from desktops to notebooks and on to tablets and phones, making lower-power devices their main computing platform.
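To put rough numbers on the footprint comparison above: here is a minimal sketch with entirely made-up throughput figures, just to show that a higher-TDP chip can still come out ahead on performance per watt, which is why a flat TDP cap is too blunt an instrument.

```python
# Back-of-the-envelope performance-per-watt comparison.
# All throughput numbers here are hypothetical, purely for illustration.

def perf_per_watt(cores, throughput_per_core, tdp_watts):
    """Aggregate throughput (arbitrary work units) per watt of TDP."""
    return cores * throughput_per_core / tdp_watts

# Hypothetical 16-core low-power chip: slow cores, 100 W total.
many_slow = perf_per_watt(cores=16, throughput_per_core=10, tdp_watts=100)

# Hypothetical 2-core chip: much faster cores, 40 W total.
few_fast = perf_per_watt(cores=2, throughput_per_core=25, tdp_watts=40)

print(many_slow, few_fast)  # 1.6 vs 1.25 work units per watt
```

Under these invented numbers, the 16-core 100W part does more work per watt than the 2-core 40W one, even though it would fail a simple per-chip TDP limit.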
  8. ^ That's about the dumbest thing I've read here in the past 2 days. Government regulation of microprocessor manufacturing using TDP as a rule? That would probably stifle innovation more than doing any actual good, though I'm sure tree hugging eco-nuts would rejoice :\
  9. captaincranky

    captaincranky TechSpot Addict Posts: 12,524   +2,312

    Wow, governmental regulation of the TDP of microprocessors. In the big picture, that's getting bogged down in minutiae.

    You're worried whether a CPU draws 10 extra watts of power, while some fool tools down the street in a Hummer @ 8mpg.

    You might consider targeting your fanaticism at something more meaningful.

    Besides, isn't it cheaper to generate electricity than it is to run something with fossil fuel? Doesn't a CPU run on electricity? Have you bought your Chevy "Volt" yet?
  10. MrAnderson

    MrAnderson TS Maniac Posts: 488   +10

    Nah, I don't think they should regulate invention, but they should offer incentives. They can do it in two ways: give incentives to companies to use greener technologies, creating demand for low-power, high-efficiency products... then give incentives to produce those kinds of products. The tablet and mobile computing market is an incentive for all processor manufacturers to create these products. But seeing that there will be a desire to have them in server rooms and data centers will be key. Hey, if someone like Microsoft asks you to do it... it means you will have the business, and others will follow.
  11. Ithryl

    Ithryl TS Rookie Posts: 53

    For no technological progress.
    Thanks to Ecosocialism.
  12. captaincranky

    captaincranky TechSpot Addict Posts: 12,524   +2,312

    This is why we have "guest" posting, to annoy the living crap out of the members. Well, there's the whole bizarre extremist point of view thing also.

    Here, allow me to think out the whole computer / ecology based issue for you, as it's obvious that nobody's taken the time to do so.

    Computer technology, is possibly the only sector of industry where ecological issues are being attended to by corporate greed.

    Left alone, this industry will serve the concept of energy conservation to its own benefit. Shrinking the process size, and public outcry for less electricity consumption, are already producing much more energy-efficient devices.

    Although, some of the "energy efficiency" propaganda is straight-out BS. In many cases, "energy efficient" monitors use less power because they are less bright. Then everybody notes the "low contrast". Of course they have lower contrast; they're less bright.

    However, it's pretty much the "enthusiast" section of the market that's hanging in there, with the 300 watt VGAs, the 150 watt CPUs, and the 1000 watt PSUs.

    And why shouldn't you, you're special, right?

    "How big a power supply would I need if I decide to overclock my computer". "I'm concerned with energy efficiency".
  13. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,223   +163

    Wow, the height of ignorance. You have to be in a trance-like state of denial to make them thar comments. In case you have not noticed, everything is 'going green': the computer and electronics industry, as cap pointed out; the automotive industry, which is getting more out of smaller displacement (and is at or near zero emissions); and, to make you happy, they are all making electric cars... which is a toxic fallacy. We finally have fossil fuel combustion down to, or near, zero emissions... yes, let's replace this with underpowered, unsafe, 'ton-o-battery' toxic vehicles... that sounds like a solid plan! BTW, carbon dioxide is not a pollutant.

    I myself have spent 25 years in the construction and LED/lighting industry, where I am hard pressed to find a product from the foundation up that is not being manufactured in a 'green' fashion. 75 watt light bulbs are being replaced by 9 watt LEDs. The extinction of the incandescent bulb has been set for 2016, and while that is officially being set by the government, it's actually driven by the increased efficiency of industry manufacturing.

    Which brings me to your heroes, Greenpeace. An organization that does nothing but ***** piss, moan, and make a lot of self-serving grand gestures. They do not create anything. When I shop, I am not deciding between the Ford Flex and the 'Greenpeace whaler.' And when I screw in a light bulb, it's a GE LED, not a Greenpeace 9w 'enlightenment'. No, they run around claiming credit and 'being seen' so you and yours can wear your anti-global-warming ribbon and feel good about actually doing nothing.

    Take two carbon credits, you'll feel better in the morning.


    Thank God for giant snowman balloons..."we are saved"
  14. OneArmedScissor

    OneArmedScissor TS Enthusiast Posts: 50   +7

    The big one to keep in mind is that increased efficiency ≠ less power used overall.

    Something that's "more efficient" may actually use more power than what came before it. It just goes faster.

    Everyone always seems to be pulling that crap with laptops. They say the "efficiency" is better, but then the battery life is worse, even though most people don't do anything but surf the internets.
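The efficiency-vs-battery-life point above comes down to simple arithmetic. A minimal sketch, with invented wattages and battery capacities, just to show the shape of it:

```python
# Why "more efficient" doesn't automatically mean longer battery life.
# All wattages and capacities below are invented for illustration.

def battery_hours(battery_wh, avg_draw_w):
    """Runtime in hours for a given battery capacity and average draw."""
    return battery_wh / avg_draw_w

# Older laptop: slower chip, but modest average draw during light web use.
old_laptop = battery_hours(battery_wh=50, avg_draw_w=8)    # 6.25 hours

# Newer laptop: better performance-per-watt under full load, but the
# whole platform idles higher, so plain web surfing drains it faster.
new_laptop = battery_hours(battery_wh=50, avg_draw_w=10)   # 5.0 hours
```

Better performance per watt under load doesn't help if the platform draws more during the light use most people actually do.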
  15. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,223   +163

    Um, yeah, but I thought the point was not to 'pollute' the environment? If it's more efficient and doesn't do that, who cares how much energy we use?
    Besides, I'd point out that you are on the internet right now. Your comment makes my point: the global warming movement is not actually about the planet's climate. It's about social re-engineering.
  16. PanicX

    PanicX TechSpot Ambassador Posts: 669

    This thread is so full of wtf, that I'm afraid asking a legit question is just going to be met by trolls, but what the hell.

    Who would ever consider running a Microsoft Server OS on an Atom processor of any number of cores? Sure, I get that TDP is a buzzword and all, but the point of servers has always been to offload demanding tasks from client machines. Have I missed the memo where Atom processors are effective database or web or email server processors? That if only they had 16 cores, they could perform the same number of calculations as current Xeon processors?

    I've got to be missing something, because the whole idea just seems absurd.
  17. captaincranky

    captaincranky TechSpot Addict Posts: 12,524   +2,312

    Well, the computer industry is pretty much full of WTF and FUD in general.

    I forget where I read it, but somebody was going on about using a room full of Mac Minis as servers. Needless to say, I was confused as to why someone would forego enterprise class parts, in favor of a batch of glorified crap-tops.

    It's also noteworthy that M$ has had a list of major blunders as long as your arm. What's the difference if they make one more?
  18. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,262

    MS playing one competitor off against another.

    Hardly surprising that MS is putting the arm(!) on Intel... Microsoft's Research Group has invested a fair bit of time in Atom-based servers for potential cloud computing/Hotmail use, and given that Intel is already down the multicore road, they probably don't see it as a big ask. Toss the project onto the 22nm process (along with Ivy Bridge and Knights Corner) and it might just come to something.
  19. captaincranky

    captaincranky TechSpot Addict Posts: 12,524   +2,312

    What is the significance of specifying 16 cores? I realize that 16 is 2 to the 4th, but after that, I'm lost.
  20. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,262

    Maybe MS identified a 16-core count as being a sweet spot both for power consumption and, as I understand it, for core-count utilization under OLAP.

    Near enough a total guess on my part. Maybe someone with some detailed server knowledge could help out... cue "Guest" showing up and extolling the virtue of their liquid-helium-cooled data centre/Crysis cruncher!
  21. LOL, by the time you put 16 cores on an Atom processor your power consumption is back up to Xeon levels anyway...
  22. SilverCider

    SilverCider TS Rookie Posts: 71

    It's quite possible! Maybe Intel and M$ have some sort of thing going on and just want to prove that the Atom is capable in 'plenty-core' form, with no real gains other than some bragging rights!
  23. ET3D

    ET3D TechSpot Paladin Posts: 1,279   +105

    @PanicX, Atoms are already used in some servers; they are apparently powerful enough for some uses. Server workloads often consist of multiple virtual machines running simple workloads. A large number of simple cores can provide better performance/power for this case than a smaller number of faster, more complex cores. To quote another article on this news from InfoWorld, "Microsoft's data centers power mostly Web-centric applications like Bing, Hotmail and Windows Live Messenger, as well as hosted versions of business applications such as Sharepoint and Exchange."

    BTW, amazing the amount of "green" hatred on this forum.
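The many-simple-cores argument can be sketched as a toy model under a fixed power budget. The per-core wattages and request rates below are hypothetical; the point is the shape of the trade-off, not the numbers:

```python
# Toy throughput model for highly parallel server workloads (web, mail):
# at a fixed power budget, many slow cores can serve more requests than
# a few fast ones. All per-core figures are hypothetical.

POWER_BUDGET_W = 80

def total_requests_per_sec(watts_per_core, req_per_core):
    """Requests/sec from as many cores as fit in the power budget."""
    cores = POWER_BUDGET_W // watts_per_core
    return cores * req_per_core

# Atom-like cores: 2 W each -> 40 cores fit, 50 req/s apiece.
atom_like = total_requests_per_sec(watts_per_core=2, req_per_core=50)

# Xeon-like cores: 20 W each -> 4 cores fit, 300 req/s apiece.
xeon_like = total_requests_per_sec(watts_per_core=20, req_per_core=300)

print(atom_like, xeon_like)  # 2000 vs 1200 req/s
```

This only holds when the workload parallelizes cleanly across cores; for a single long-running query, the fast-core chip still wins.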
  24. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,223   +163

    That's what I thought, MS was fixing to write software that will make use of 16 cores efficiently......okay I didn't know it was a 'hypercube' :wave:
  25. OneArmedScissor

    OneArmedScissor TS Enthusiast Posts: 50   +7

    Yes, because laptops and phones don't use batteries, servers running 24/7 don't count towards the electricity bill, and pushing CPUs beyond 4 GHz and blowing up motherboards instead of just designing them to work better is a viable alternative.

    What world do you live in? Improving power efficiency has been a major goal in almost every advance made in microprocessors over the last squillion years.
