Nvidia x86 processor rumors surface once again

November 4, 2009, 10:34 AM
Nvidia has reportedly begun hiring former engineers from Transmeta and is looking to develop its own x86-compatible processor core in a bid to continue its lucrative chipset business. The now-defunct microprocessor producer specialized in low-power, x86-compatible processors before turning into an intellectual property licensing firm and eventually closing its doors in September 2008.

Similar rumors have surfaced and died several times over the last few years along with speculation about an Nvidia acquisition of VIA. However, Doug Freedman of research house AmTech feels an internally developed solution is more likely and a necessity to preserve revenue from its integrated graphics solutions.

The chip maker is currently caught up in a legal battle with Intel over the scope of a 2004 chipset licensing agreement. This is already limiting its ability to sell chipsets, and it's only expected to get worse for Nvidia once Intel and AMD start integrating graphics functionality onto the CPU die. The company knows this and has been looking into x86 development for some time, saying it is not a question of 'if' but 'when.'




User Comments: 26

red1776, Omnipotent Ruler of the Universe, said:

Is it just me, or does this seem like a losing proposition from the jump? Here comes Cyrix II

Wagan8r said:

The future of the CPU/GPU market is very uncertain, and I would hate to see any of the big players go away, especially Nvidia simply because they don't have any CPUs or CPU/GPU integration. I'm not a big fan of integrating the CPU and GPU into one because that makes upgrading extremely expensive, plus it would eliminate SLI and CrossFire. It also decreases the longevity of your computer, because with every graphics enhancement, you have to upgrade your CPU as well.

poundsmack said:

Transmeta was great in its day, but its technologies and IP are no longer current and better tech has taken its place. While I think its team could make a good CPU with Nvidia funding it and working with them, it seems Nvidia would be at a loss starting from the ground up. It's a shame VIA's processor unit isn't for sale; with Nvidia's vision and VIA's tech, that would be a competitor I would invest in.

MBK said:

wagan8r said:

The future of the CPU/GPU market is very uncertain, and I would hate to see any of the big players go away, especially Nvidia simply because they don't have any CPUs or CPU/GPU integration. I'm not a big fan of integrating the CPU and GPU into one because that makes upgrading extremely expensive, plus it would eliminate SLI and CrossFire. It also decreases the longevity of your computer, because with every graphics enhancement, you have to upgrade your CPU as well.

I think for the reasons you give, integrated CPU/GPU probably won't take off (at least I hope not). Even though my upgrades used to flip-flop between processor and graphics (with the occasional memory boost), it was an affordable way to stay in the loop, and I found it quite fun too.

As for the x86 line, probably not a bad idea. With RAM becoming more expensive in the near future, I guess the need for 64-bit processors will decline slightly, so more powerful 32-bit chips could do well. But, as was said, starting from the beginning may prove to be a bad move.

Clrabbit said:

Nvidia would be better off making their own CPU platform than buying some archaic company that failed the first time.

At least strike a deal with IBM or VIA.

freedomthinker said:

This could turn out to be quite the challenge, but I think nVidia will fail to get the price right: they will overprice it, no one will want to buy it, end of story.

Kibaruk, TechSpot Paladin, said:

More companies in the same industry means more competition, which brings lower prices and better hardware!

Hope it works out for Nvidia.

Guest said:

Having CPU/GPU hybrid chips is definitely the future, as proven by GPGPU applications. And they won't be necessarily more expensive than current technology, it is only a matter of yield rates. If future motherboards have slots where you place these CPU/GPU hybrid chips instead of cpu sockets, there still exists room for SLI/Crossfire. We are so accustomed to having a dedicated CPU that these extreme ideas seem science fiction.

fref said:

As much as I love nVidia, I just can't picture them as a major player in the x86 processor market. I mean, they don't have experience with that sort of product, and with Intel's Core i7 processors, they have quite a challenge in front of them if there is any truth to this article.

Good luck to them, this will be interesting to watch!

JMMD, TechSpot Chancellor, said:

I could see them maybe doing stuff with micro PCs or something HTPC-related, but they're never going to compete with Intel or AMD unless they come up with something neither company has been able to create. I'd prefer they focus on video cards and chipsets.

buttus said:

I can't see Nvidia going into the CPU market when (a) Intel has the lion's share and (b) it is already very competitive between AMD and Intel. Nvidia needs to concentrate on their integrated graphics chipsets and leave the CPUs to AMD and Intel.

Puiu said:

I wonder how many years it will be before we see the unification of the CPU and the GPU? 10-20?

If NVIDIA is going to make a processor, then it will most likely compete with the Nano or Atom.

Regenweald said:

It seems obvious to me that Nvidia is coming to the CPU realm. Intel and AMD are CPU manufacturers looking to meld CPU and GPU. Once performance is acceptable/phenomenal, it is obvious that OEMs will look to the simpler hardware solution for mainstream and multimedia PCs. Whenever that happens, and it IS going to happen, Nvidia will be left in no-man's-land: a GPU company with no market except for enthusiasts.

Take a look at Nvidia's current offerings and their marketing terminology: they are pushing 'parallel processing', 'clusters' and 'supercomputers'. This, and the release of CUDA: why not get programmers accustomed to coding for your processing platform? On their site they have a simple image of a traditional CPU with 4 cores next to their Tesla (which is basically a processor on the PCI bus) with 240 cores. Any layman will think, "why am I not using the 240-core one?"

They create great chipsets and powerful graphics solutions among many other products, so why not take the next logical step? With a processor they'll complete the platform and give the company new viability. The Nvidia x86 processor is inevitable.

Deso said:

This could eliminate the need for normal-size motherboards, and possibly also make them cheaper. Imagine if they can implement RAM + GPU on the CPU die; the only use for a motherboard would be hosting the CPU, sound card, USB ports and that sort of function, and don't forget SATA ports too. Anyway, this could reduce the price and size of motherboards by a lot once it's implemented. But unfortunately the CPU/GPU/RAM die will probably be super expensive.

JieMan said:

What a joke this is... according to one analyst. All I can say is just... why?

The thing this analyst can't fathom is the possibility of NO CPU.

This is what NVIDIA is planning: they want to make low power, low cost solutions for the masses (kinda like what Apple did with the iPhone and iPod), but Apple can't touch NVIDIA's R&D; in fact not many can.

NVIDIA wants to emulate the x86 architecture on the GPU so they can produce a system-on-chip with CUDA cores.

Imagine a netbook with that kind of power. I'm sure it won't do amazing things to start, as there isn't much of a push in OpenCL right now (but it's coming). NVIDIA's future depends on the movement of more and more apps to parallel processing, and with NVIDIA putting more and more CUDA cores in every new generation, it won't be long before they have all the computational power your PC will need.

NVIDIA making an x86 CPU... what a joke!

JieMan said:

Regenweald said:

It seems obvious to me that Nvidia is coming to the CPU realm. Intel and AMD are CPU manufacturers looking to meld CPU and GPU. Once performance is acceptable/phenomenal, it is obvious that OEMs will look to the simpler hardware solution for mainstream and multimedia PCs. Whenever that happens, and it IS going to happen, Nvidia will be left in no-man's-land: a GPU company with no market except for enthusiasts.

Intel and AMD are CPU manufacturers looking to meld CPU and GPU because the long reign of the x86 CPU is coming to an end. They know this, and it is the reason AMD bought ATI and Intel is developing Larrabee. The industry is pushing forward to massively parallel computing architectures.

Take a look at Nvidia's current offerings and their marketing terminology: they are pushing 'parallel processing', 'clusters' and 'supercomputers'. This, and the release of CUDA: why not get programmers accustomed to coding for your processing platform? On their site they have a simple image of a traditional CPU with 4 cores next to their Tesla (which is basically a processor on the PCI bus) with 240 cores. Any layman will think, "why am I not using the 240-core one?"

Get accustomed? They are catering to them.

The thing about the Fermi architecture is that it is not just designed to play games; it is also going to take GPGPU to the next level with the advancements NVIDIA has made: support for CUDA, C++, DirectCompute, DirectX 11, Fortran, OpenCL, OpenGL 3.1 and OpenGL 3.2.

The big thing here is native support for C++ and OpenCL.

They create great chipsets and powerful graphics solutions among many other products, so why not take the next logical step? With a processor they'll complete the platform and give the company new viability. The Nvidia x86 processor is inevitable.

They are taking the logical step, and the only thing inevitable about the x86 architecture is that it will end.

Xclusiveitalian said:

I feel Nvidia should just stick to making video cards. I feel they are of great quality, but making processors? I think it's not their business and they shouldn't go down that road.

ken777 said:

I have a hard time seeing Nvidia as a real competitor to Intel or AMD in the x86 space. Maybe they can focus on low-power, low-end CPU/GPU combos for things like netbooks, but moving further up the line seems like a huge investment in time and effort ($$$). But maybe they have no other choice. Right now, Nvidia is the odd man out. Intel and AMD control their own platforms: CPU, GPU, and chipsets. Seems like Nvidia either has to go down that road too, or start shifting more into other areas (like with Tegra) where they won't be at the mercy of Intel or AMD moves to lock out competition.

Puiu said:

@JieMan massively parallel computing architectures won't become the standard anytime soon (at least for another decade), and until then Nvidia needs all the experience they can get from making a few CPUs, even if they're not successful. I'm just hoping that GPGPU will be widely adopted by the time Win8 is released.

psycholexx said:

Tegra is maybe the most powerful and feature-packed ARM CPU around. If they focus on creating an x86 netbook-like CPU for internet browsing, light office work, and multimedia with a low power footprint, they could seriously affect Intel's market share by creating a winning CPU/chipset combination, or even better, a CPU with an embedded chipset for easy integration and low power. But I don't think they could in any way affect Intel and AMD's medium- and high-performance computing market, at least not for the next 5 years or so...

Vrmithrax, TechSpot Paladin, said:

There have been grumblings of this for a while. They spiked when AMD bought ATi, with industry analysts figuring nVidia might have to do some catching up if the whole CPU/GPU melding direction became lucrative. With AMD pushing the integration envelope, and Intel pushing their own CPU/GPU integrated solutions along, how can anyone complain or doubt why nVidia would want to look at the CPU side of things to remain competitive? Keep in mind, there are massive potentials for this technology in some of the largest growth sectors of the electronics marketplace: console games and mobile computing. If nVidia just keeps doing what they've always done, their niche market will get smaller and smaller as the technology train passes them by.

People seem to be jumping to a conclusion that nVidia is acquiring Transmeta. They'd be getting people who worked for Transmeta, who know the x86 architecture and power savings methods, not the now defunct company. So they wouldn't have the slow outdated clunky transmeta stuff, knowing nVidia it would be shiny and fast.

Something else to consider: nVidia has a big chunk of business in powerful workstations, and parallel processing is becoming exponentially more sought after. It could be that this look at CPU development is tied to creating a much more powerful workstation processing product, which is the area where nVidia makes much larger profit margins anyhow.

ET3D, TechSpot Paladin, said:

JieMan has some good points. First of all, I cringe every time I see the word "analyst" in a news title. Another person's opinion. We have lots of those. (That's why I stopped reading TG Daily. Too many opinions, too little news.)

I also think that emulating x86 on the GPU isn't too far-fetched. On one hand, all x86 CPUs translate x86 instructions to their own internal opcodes. On the other hand, NVIDIA is moving towards more general purpose cores. DX11 cards already have integer math, shifts, bit field extraction: all useful things for general purpose use. While the current architecture bundles too many operations together to fit the CPU paradigm well, it's probably not too much of a departure to put 4 or 8 such pipelines separately on the chip. You'd get a chip with, let's say, 4 in-order 1.5GHz cores and 4 DX11 pipelines (also capable of CPU use when not running graphics). Quite a nice netbook platform there.

Shalimar said:

It could work in theory, but realistically they would face a massive uphill battle to get into this competition and have any chance at all. Real world, though, I doubt this will happen anytime soon, if it ever does. Rumors are 'free', after all.

zaidpirwani said:

Puiu said:

I wonder how many years it will be before we see the unification of the CPU and the GPU? 10-20?

If NVIDIA is going to make a processor, then it will most likely compete with the Nano or Atom.

Not in 10-20 years; it is just a matter of time. The pace at which we are advancing is greater than ever, and now is the perfect time to merge the two onto one die. Both Intel and AMD have successfully made multiple cores; now all that is left is to put a couple of those cores alongside graphics, or just take a chip from Nvidia and try to fit that on a CPU.

We need mobility, we need small form factor, we need efficiency and we need processing and display/graphic power, and this need of the consumer market will eventually drive the giants to fulfill it in time...

waterytowers said:

GPU/CPU integration is upon us, and I can't imagine it will be long before this is a reality. If it means getting a smaller, cooler and more efficient machine, I welcome it. Most of the things people want to do with a PC are not graphically challenging; it will be enthusiasts and gamers that will want a dedicated GPU. I want a standard form factor for PCs that can be attached to the back of a monitor. Upgrading would be changing the motherboard or small chips that can be clipped on and off. Miniaturization should include making smaller card slots for upgrading. It would be nice if laptops of the future had a standard form factor for the different sizes, and we could then keep a chassis we are happy using and just replace the internals. Currently this is hard because components have been shrinking every few years, but the shrinkage will eventually be small enough that this may be a reality.

Staff
Rick, TechSpot Staff, said:

Putting them together doesn't help.

Do we really want GPU + CPU 'unification'? I disagree with this eerily universal consensus which says this is the best way to go.

Ultimately, stuff like CUDA, OpenCL etc. are supplementary technologies. They are basically frameworks/layers, like DirectX and OpenGL, that let developers take advantage of the underutilized hardware in your computer. I really doubt they are intended to replace your 64-bit-extended CISC CPU in any shape or form.

Think about it: it sounds like an upgrade nightmare... but more importantly, we'll lose the specific performance gains of having an actual GPU. This isn't as simple as just slapping a GPU on a CPU and getting the best of both worlds. They have to share the same space... the same physical architecture... I think that's what most people fail to see.

The reason GPUs perform so well for graphics operations is that they have a specific instruction set and a physical architecture focused on just processing graphics. nVidia, ATI etc. employ a minimalist approach, focusing on achieving more through optimization... even with smaller pipelines, minimal branch prediction, smaller caches and lower frequencies than CPUs. This works well for what they do, which is graphics processing.

However, CPUs are there to do *everything* and thus master nothing but versatility. Unlike the "more with less" approach GPU manufacturers have been using, over the years CPU manufacturers have tried to do "more with more". They've developed huge caches, massive pipelines and complex branch prediction. It's less efficient, but it has worked well to meet the needs of compatibility and versatility.

We *are* starting to see CPUs becoming more simplified, and perhaps the gap between CPU and GPU is closing, but I really don't see GPUs and CPUs being combined in any reasonable time frame (decades). In order for this to work, the way software is written and compiled for these platforms would need to be completely redesigned.
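Rick's split between data-parallel work (the kind frameworks like CUDA and OpenCL offload) and inherently serial work (the kind the CPU keeps) can be sketched in plain Python. These are illustrative toy functions, not real GPU code:

```python
# Data-parallel: every element is independent, so thousands of simple
# cores could each compute one element ("one thread per element").
def saxpy(a, xs, ys):
    # y[i] = a*x[i] + y[i] for all i; each i needs no other i's result
    return [a * x + y for x, y in zip(xs, ys)]

# Inherently serial: each step depends on the previous result, so
# extra cores don't help. This is where the versatile CPU stays king.
def running_total(xs):
    total, out = 0, []
    for x in xs:
        total += x
        out.append(total)
    return out

print(saxpy(2.0, [1.0, 2.0], [10.0, 20.0]))  # [12.0, 24.0]
print(running_total([1, 2, 3]))              # [1, 3, 6]
```

The first kind of loop is what GPGPU frameworks accelerate; the second is why they supplement rather than replace the CPU.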
