Rumor Control: Intel and AMD to design future console GPUs?

By Justin Mann on February 8, 2009, 2:08 PM
Even though next-generation consoles are still years away (or so we are led to believe), a number of rumors have surfaced regarding the graphics processing units they will carry.

Sony may be struggling with the PS3, but that doesn't mean they are giving up on its successor. The latest rumor has the PlayStation 4 using an Intel GPU rather than one from Nvidia. Even though Intel has yet to prove itself a serious contender in hardware-accelerated graphics, Sony has apparently been tempted by "Larrabee," Intel's next-generation graphics architecture, which has been sampling for a short while. Could Sony suddenly be willing to drop IBM's highly touted Cell technology as well? Probably not.

In a similar fashion, it's been rumored that Redmond will continue to use ATI/AMD for its next GPU. The Xbox 360 relies on ATI hardware for graphics, and Microsoft's next console, not expected to see the light of day for at least three more years, will likely use an ATI solution as well.

Initial development for these consoles is already underway, and once blueprints are crafted and development begins there's no going back on major components like GPUs. Nvidia and ATI have done very well in this market for several years, but with brutal competition in embedded graphics, desktop graphics and now consoles, is there enough room for three or four major players in the near future?

User Comments: 4

Julio said:
From the original story: "Probably not" linked to [url]tel-gpu-in-ps4-rumours-525563[/url]. But how many times have companies denied a partnership that was actually taking place or in negotiations nonetheless?
captain828 said:
I'm really inclined to think that no company in their right mind will use an untested piece of hardware in a product that is meant to sell in the millions range. Larrabee is set to debut somewhere in 2010, so developing a console with hardware that is not even finished is a very weird choice and a risky one. Also, I don't see them using an x86 CPU either; they invested too much money in the Cell BE. Just my $.02 ;)
asmilon said:
[b]Originally posted by captain828:[/b] [quote]I'm really inclined that no company in their right mind will use an untested piece of hardware in a product that is meant to sell in the millions range.[/quote] Call me Cell and BluRay :-)