Nvidia boss: Moore's Law is dead, GPUs will soon replace CPUs

midian182


Nvidia boss Jensen Huang has become the latest expert to declare that Moore’s Law is dead. Speaking at the GPU Technology conference in Beijing, China, the CEO also said that advancements in graphics processors mean GPUs will soon replace CPUs, DigiTimes reports.

For those unfamiliar with the term, Moore’s Law is the name given to an observation made by Intel co-founder Gordon Moore in 1965. He noted that the number of transistors per square inch on a dense integrated circuit doubled every year, and predicted the trend would continue in the future. He later revised this to every two years.
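To put the observation into numbers: a two-year doubling period means transistor density grows by a factor of 2^(years/2). A rough sketch in Python (the starting count and time span below are purely illustrative, not figures from the article):

    # Moore's Law as a back-of-the-envelope projection:
    # density doubles once per doubling period.
    def projected_transistors(start_count, years, doubling_period=2):
        return start_count * 2 ** (years / doubling_period)

    # Hypothetical 1-billion-transistor chip projected 10 years out
    print(projected_transistors(1e9, 10))  # ~3.2e10, i.e. 32x in a decade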

Speaking on the topic of “AI: Trends, Challenges, and Opportunities,” Huang claimed to be the first boss of a major semiconductor company to announce the death of Moore’s Law. He believes GPU computing capability and neural network performance are developing at a faster pace than Moore's observation would predict.

Huang noted that while CPU transistor counts have grown at a rate of about 50 percent annually, performance has only increased by about 10 percent. He added that designers can hardly work out more advanced parallel instruction set architectures for CPUs, and that these processors will therefore soon be replaced by GPUs.
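Those two growth rates compound very differently over time. A quick sketch of the gap (treating the 50 percent and 10 percent figures as flat annual rates, which is an assumption made here purely for illustration):

    # Compound the figures Huang cited over a hypothetical decade.
    years = 10
    transistor_growth = 1.50 ** years   # ~57.7x more transistors
    performance_growth = 1.10 ** years  # ~2.6x more performance
    print(f"Transistors: {transistor_growth:.1f}x, performance: {performance_growth:.1f}x")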

Huang also said that Nvidia's GPUs are the perfect solution for AI-based applications, suggesting he believes GPUs are set to play a larger role in certain aspects of computing, rather than replacing desktop CPUs completely.

Unsurprisingly, Intel disagreed with Huang’s comments. "In my 34 years in the semiconductor industry, I have witnessed the advertised death of Moore’s Law no less than four times. As we progress from 14 nanometer technology to 10 nanometer and plan for 7 nanometer and 5 nanometer and even beyond, our plans are proof that Moore’s Law is alive and well," Intel CEO Brian Krzanich said last year.


 
Yeah, usual business talk. GPUs are great at some types of calculations - those that can be massively parallelized - and they ARE replacing CPUs at these workloads. It's been happening for a couple of years now, so that's nothing new. But it has nothing to do with Moore's law, it's about the architecture of CPUs and GPUs. And that's exactly the same reason why GPUs suck terribly in other types of work - the ones that make CPUs shine. Saying that GPUs will replace CPUs is like saying that trucks will replace cars, because they can transport more stuff. It's true for some specific tasks (that's why we don't see small Fords transporting cargo), but utter nonsense in general.
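To make the truck-versus-car analogy concrete, here is a minimal sketch of the two workload shapes (plain Python/NumPy standing in for a real GPU framework such as CUDA or CuPy): the first is embarrassingly parallel and maps naturally onto thousands of GPU cores, the second is a serial dependency chain that extra cores cannot speed up.

    import numpy as np

    # Embarrassingly parallel: every element is independent, so the work
    # could be spread across thousands of GPU cores (NumPy stands in here).
    a = np.random.rand(1_000_000)
    b = np.random.rand(1_000_000)
    parallel_result = a * b + 1.0

    # Inherently serial: each step depends on the previous result, so extra
    # cores don't help -- this is where a fast CPU core shines.
    x = 0.0
    for value in a[:1000]:
        x = x * 0.5 + value  # dependency chain, cannot be parallelized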
 
Well, Moore's law is tied to time; even after the revision from 18 to 24 months, it pretty much doesn't apply anymore. Nobody said Moore's law was only about shrinking transistors, yet that's the only thing BK talks about.
 
I think the biggest CPU nemesis is mobile SoCs... The day I can play the equivalent of Crysis on my PC using the CPU from a mobile SoC, all we'll need is an Arm CPU and a discrete GPU... Then AMD and Intel will have some real competition.

The A11 Bionic, if it performs as well as claimed, is proof of it. The simple fact that most daily computing tasks can be done on a smartphone/tablet is proof of it.

Still no death to Moore's Law.
 
...except Moore's Law isn't a law at all, either of physics or engineering. Note how the article even admits he "revised it" to two years. Do you think laws of physics that are 50% wrong are actually "laws"? It's basically just a tentative observational suggestion, from a guy who at the time had no idea how tech would actually mature.

As far as decreasing the size, sure it's great, but this technology is still decades behind physics. The electron is still being used erroneously as the electrical quanta, when it's actually the photon itself, some 3.3 million times SMALLER than the electron. So we have a great deal of room to breathe as the tech catches up with modern quantum physics. No, present-day "quantum computers" don't address this, and are still operating at the electron level. It's pathetic.
 
Moore's Law can only die if Nvidia, AMD, and whoever else is in the graphics industry band together and develop their own x64 platform.

There are only 3 problems:

1. GPU manufacturers won't band together.
2. Profits will take a big hit. Even with all the financial muscle Nvidia has, it's bloody expensive to develop a new architecture while staying compatible with everything else and not infringing any patents along the way, with all the patent trolls in the world just waiting and drooling over who to sue next.
3. For AMD especially it would be suicidal, because its CPU division is the most profitable one, while the GPU division delivers sub-standard products (nearly?) generation after generation. I wouldn't be surprised at all if they decided to offload the Radeon business to be somebody else's problem.

For the time being it's the status quo, but CPUs are on their last legs - not much fuel left in the tank on this route. And to be honest, the last thing we need is for CPUs to die. We would end up with a giant GPU monopoly behemoth with total control of the market, which would be simply catastrophic for consumers.
 

I think someone is confused on their quantum mechanics:

http://www.umich.edu/~ners312/CourseLibrary/Dommelen.pdf

Almost every single mention of charge, voltage, current, etc., in both the classical and quantum models (i.e., Maxwell's equations) shows the electron holding the charge, with electrical effects created by the motion of electrons. Photons are emitted or captured as the energy levels of electrons change, but they don't carry the charge/current itself. The only place they seem to talk about a photon or group of photons having any kind of charge is for electromagnetic waves in a complete vacuum... which would not apply to computer circuits.
 
If GPUs will replace CPUs, does this mean Skynet will arise from cryptocurrency rigs in the next Terminator movie?
 
It isn't dead, but I hope it has a good funeral plan in place - the end isn't far away now.

The problem is that Moore's Law has to be modified to account for things like graphene and other future materials that will increase the speed of electronics by 100-1000 times what it is now, and that says nothing of light-based computers that will be millions of times faster than anything we've seen thus far.
 