TechSpot

Apple rejects Arrandale, requests GPU-less alternative?

By Matthew
Dec 7, 2009
  1. According to a rumor published by Bright Side of News (BSN), Apple may have declined to use Intel's forthcoming Calpella mobile platform and Arrandale processor in their default form. Like many of Intel's upcoming CPUs, Arrandale incorporates a graphics core, and this has supposedly caused Apple to refuse the parts. If true, Intel would have to provide custom 32nm processors sans GPU in order to win orders for Cupertino's 2010 Mac refreshes.

    Read the whole story
     
  2. treeski

    treeski TS Guru Posts: 895   +135

    Any particular reason why they would reject a processor with an integrated graphics core?
     
  3. klepto12

    klepto12 TechSpot Paladin Posts: 1,364   +9

    This is weird. I wonder what is going on. This is crazy; Apple can never get anything right. They have to control every aspect of their products.
     
  4. gwailo247

    gwailo247 TechSpot Chancellor Posts: 2,105   +18

    'Cause they want to make people spend money on a graphics card they're going to mark up 200%... I dunno...
     
  5. Tekkaraiden

    Tekkaraiden TS Maniac Posts: 921   +57

    Well, Apple and Nvidia are pretty cozy with one another right now, so that might be part of it. I know the OS X interface is accelerated by graphics hardware, so maybe Intel's integrated graphics are not up to the task?
     
  6. Matthew

    Matthew TechSpot Staff Topic Starter Posts: 6,094   +86 Staff Member

    I assume there's no reason that Apple couldn't use a discrete graphics chip alongside Intel's integrated graphics core -- or at least have it present.

    I guess we'll have to wait until the rumor is addressed.
     
  7. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,286   +232

    My guess is that they are rejecting an integrated core so that they keep their options open for the GPU. If they go with Intel, that's all they've got. No integrated core = more choices.
     
  8. Matthew

    Matthew TechSpot Staff Topic Starter Posts: 6,094   +86 Staff Member

    How so, Vrmithrax?
     
  9. 1. OpenCL. Apple is utilizing GPUs compatible with its OpenCL/Grand Central Dispatch technology; currently that includes Nvidia and AMD (ATI) GPUs. I assume that Intel's latest integrated graphics do not support the OpenCL APIs, so having them in an Apple machine would make no sense.

    2. Power consumption and dissipation. Having this component on board would mean it uses battery power and dissipates heat while adding no advantage to the platform. (Note that the MacBook Pros already have both integrated and dedicated GPUs in one machine.)

    3. Why should Apple or its customers pay for functionality they do not want or plan to use?

    Personally, I love the fact that Apple controls the entire product; that way I get a tightly integrated machine with no extra bits tacked on for no reason. It also means I get the thinnest and lightest form factor for a specific configuration running the best OS out there: efficiency in volume and power consumption.

    If Windows PC users wonder why Apple exerts so much control over the components it uses, check out Apple's brand/consumer rankings and then compare them to the generic PC brands.

    Also, Apple is not obligated to buy anything and everything Intel/nVidia/Ati release.
     
  10. lfg18

    lfg18 TS Rookie Posts: 86

    The only reason that comes to my mind is that they are probably going to add a graphics accelerator as an extra upgrade.
     
  11. lupinnktp

    lupinnktp TS Rookie Posts: 44

    This is weird, eh? And interesting too. I wonder what Apple wants to do with their new line of notebooks. Stick to a conventional CPU + GPU combo, perhaps?
     
     
  12. LightHeart

    LightHeart TS Rookie Posts: 155

    It seems Apple has to be unique, otherwise they would just be a PC :). Intel should just disable the graphics portion and sell Apple the chip.
     
  13. fref

    fref TS Enthusiast Posts: 153

    My guess is they just don't want to pay Intel extra money for a piece of silicon they won't be using. Makes sense for them, but I doubt customers will see these savings on the final price.
     
  14. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,286   +232

    Rather than paying for an integrated GPU, that money can be put into whatever GPU option they wish to use. If ATI has the better GPU options, then they can go that way. Or Nvidia. Remember that Apple is all about quality, and one of the things that many of its customers boast about is excellent graphics capabilities. It would serve Apple's image to use the best graphics possible, and let's face it, Intel's integrated graphics offerings have been underwhelming at best.

    Plus, keeping the GPU separated means they can test and find the pairing that has the highest reliability. Apple always works on tight specs to keep their hardware as trouble-free as possible. And down the line, if a newer and much better GPU is released, Apple can throw together a new model with that product, without having to worry about the CPU side of things.
     
  15. compdata

    compdata TechSpot Paladin Posts: 604

    I wonder if this has anything to do with GPU-accelerated Flash support (I haven't heard about Intel supporting this yet). Maybe it's a similar reason to why Asus is including an Nvidia GPU in its netbooks.
     
  16. Matthew

    Matthew TechSpot Staff Topic Starter Posts: 6,094   +86 Staff Member

    @Vrmithrax: There is still no reason why a discrete graphics chip couldn't be used alongside Arrandale and its integrated graphics core. Plus, just because that core is present doesn't mean they have to utilize it. If it has to do with cost, fair enough, but technically it's possible.
     
  17. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,286   +232

    @Matthew - Yes, I understand they can work concurrently... But why pay more for an integrated unit and then a separate GPU, and then have to deal with making sure the integrated GPU is always defaulting off whenever you update things? This is particularly relevant if they already have a GPU picked out, and never plan on using the Intel GPU, which their refusal of integrated units sort of indicates, don't you think?
     
  18. CrisisDog

    CrisisDog TS Rookie Posts: 55

    Uuuh, because we all know Intel integrated graphics suck, which is why Apple went with the Nvidia ION platform? The only good thing about them is that they're cheap. From a performance standpoint, they are terrible, and I doubt anything will change by putting the GPU core into the processor.
     
  19. Simple reason (even though it's only a rumor at this point):

    If all the other PC platform chips are designed around the unmodified version, that will hurt the folks who want to hack the OS into running on non-Mac platforms - Apple would in essence be getting a proprietary Intel chip for itself. It may not stop OS X hacking completely, but it would definitely hinder the possibility.
     
  20. Are you seriously asking why integrated graphics are not desirable? They are crap. They perform at the bottom of the barrel and simply don't have a place in a higher-end PC.

    Apple should absolutely demand a CPU without an integrated GPU. I would never buy a Mac or any PC with integrated graphics.
     
  21. yangly18

    yangly18 TS Rookie Posts: 242

    Another reason why I will never get an Apple computer.
     
Topic Status:
Not open for further replies.

