The end of Moore's law forced YouTube to make its own video chip

Shawn Knight

In context: Partha Ranganathan came to realize about seven years ago that Moore's law was dead. No longer could the Google engineering VP expect chip performance to double roughly every 18 months without major cost increases, and that was a problem considering he helped construct Google's infrastructure spending budget each year. Faced with the prospect of a chip getting only twice as fast every four years, Ranganathan knew Google needed to mix things up.
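
To put that slowdown in perspective, here is a rough back-of-the-envelope comparison (a hypothetical Python sketch, not anything from Google) of how performance compounds over a decade under the old 18-month doubling cadence versus a four-year one.

```python
# Rough illustration of why a slower doubling cadence matters so much.
def growth(years: float, doubling_period: float) -> float:
    """Performance multiplier after `years` if it doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

decade = 10
print(f"18-month doubling over {decade} years: ~{growth(decade, 1.5):.0f}x")
print(f"4-year doubling over {decade} years:   ~{growth(decade, 4.0):.1f}x")
# Roughly 100x versus roughly 6x -- a gap big enough to reshape a
# hardware budget, which is the problem Ranganathan was wrestling with.
```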

Ranganathan and other Google engineers looked at the overall picture and realized transcoding (for YouTube) was consuming a large fraction of compute cycles in its data centers.

The off-the-shelf chips Google was using to run YouTube weren't all that good at specialized tasks like transcoding. YouTube's infrastructure uses transcoding to compress video down to the smallest possible size for your device, while presenting it at the best possible quality.
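
For a sense of what that work looks like in software, here is a minimal transcoding-ladder sketch built on the ffmpeg command-line tool (assumed to be installed); the resolutions, bitrates, and output format are illustrative placeholders, not YouTube's actual pipeline.

```python
# Minimal transcoding-ladder sketch: encode one source video into several
# renditions, the kind of CPU-heavy work a VCU is meant to offload at scale.
# Assumes ffmpeg is on the PATH; all settings here are illustrative only.
import subprocess

RENDITIONS = [
    ("1080p", "1920:1080", "5M"),
    ("720p", "1280:720", "2.5M"),
    ("360p", "640:360", "800k"),
]

def transcode(src: str) -> None:
    base = src.rsplit(".", 1)[0]
    for name, scale, bitrate in RENDITIONS:
        subprocess.run(
            [
                "ffmpeg", "-y", "-i", src,
                "-vf", f"scale={scale}",
                "-c:v", "libvpx-vp9", "-b:v", bitrate,  # VP9 video
                "-c:a", "libopus",                      # Opus audio
                f"{base}_{name}.webm",
            ],
            check=True,
        )

if __name__ == "__main__":
    transcode("input.mp4")  # hypothetical source file
```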

What they needed was an application-specific integrated circuit, or ASIC – a chip designed to do one very specific task as effectively and efficiently as possible. Bitcoin mining rigs, for example, use ASICs designed for that sole purpose.

"The thing that we really want to be able to do is take all of the videos that get uploaded to YouTube and transcode them into every format possible and get the best possible experience," said Scott Silver, VP of engineering at YouTube.

It didn't take long to sell upper management on the idea of ASICs. After a 10-minute meeting with YouTube chief Susan Wojcicki, the company's first video chip project was approved.

Google started deploying its Argos Video Coding Units (VCUs) in 2018, but didn't publicly announce the project until 2021. At the time, Google said the Argos VCUs delivered a performance boost of anywhere from 20 to 33 times compared to traditional server hardware running well-tuned transcoding software.

Google has since flipped the switch on thousands of second-gen Argos chips in servers around the world, and at least two follow-ups are already in the pipeline.

The obvious motive for building your own chip for a specific purpose is cost savings, but that isn't always the driving factor. In many instances, big tech companies are simply looking to create a strategic advantage with custom chips. Consolidation in the chip industry also plays into the equation: in any given category there are now only a couple of chipmakers to choose from, and their general-purpose processors aren't great at specialized tasks.

Also read: The death of general compute

Jonathan Goldberg, principal at D2D Advisory, said what is really at stake is control of the product roadmap, which otherwise rests with the semiconductor companies. "And so they build their own, they control the road maps and they get the strategic advantage that way," Goldberg added.

Argos isn't the only custom chip to come out of Google. In 2016, the company announced its Tensor Processing Unit (TPU), a custom ASIC built to power artificial intelligence applications. Google has since launched more than four generations of TPU chips, which have given it an advantage over the competition in the field of AI. Google also crafted its Pixel 6 series of smartphones around a custom-built Tensor SoC, bringing hardware and software under the same roof for its mobile line.

Image credit: Eyestetix Studio

 
YouTube should make a better streaming card and give it away free to content creators who are about to start YouTube channels. It would be a win-win. Of course, it could be tied to some minimum requirements or an application video the creators have to send in.
 
Yes, just another Google self-promotion advertisement - "look how smart we are". A look at history shows the truth. The "law of diminishing returns" has been known to engineers in Western civilization since at least the days of Leonardo da Vinci. If you are interested, it is a consequence of the universal "law of entropy" - which, as we say, is universal. The law of diminishing returns shows that Moore's law was, and would be, a short-term law, i.e. it would be correct for a short period of time and then it would be incorrect. So to say that it took Ranganathan/Google to figure it out is a lie. From the beginning days of digital computing (early 1900s) engineers understood that in order to design productive computer hardware (the tool), the use of the hardware needed to be determined first (as with all useful tools created by humans). This is where the "law of choice" came about - "choose your software before your hardware" - in the design of computer environments, the tool. Once general-purpose software became a reality (the best examples being MS-DOS and MS BASIC), computer hardware needed to be designed both for specific purposes (specific applications) and for general purposes (general applications not defined at the point of choosing the hardware). CPUs (and their hardware configurations) were then broken into two categories, one called CISC and one called RISC. ASIC is just Google's name for their RISC CPU. All CPUs started out as RISCs until the desire for CISCs came about.
Nothing new here at all, so this press release is totally "fake news". Either Ranganathan (and his fellow engineers) are taking credit for something they did not invent, or they are just bad engineers who should not be employed by any competent company. Which is it, Ranganathan/Google - lying or just dumb? TechSpot: if you want to talk about technology, you need to study at least some of its history to tell whether you are publishing "fake news".
 
True, it was more of a pattern than anything else, probably best described with a logarithmic function...
I'd describe it as an observation of high-school geometry (the relationship between length and area), coupled with the assumption that new process nodes appear in linear time.

One of the first things a physics student learns is that you can model any physical phenomenon with a linear equation -- if you consider a small enough time period. Moore's first version of his law stated that a doubling would occur every 12 months -- the rate at which full process nodes were being adopted. Today, we get a half-node every two years, and the process continues to slow.
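
To illustrate the small-time-window point, here is a quick hypothetical Python sketch comparing exponential doubling with its straight-line (first-order) approximation; the two-year doubling period is just an example value.

```python
# Exponential growth 2**(t/T) versus its linear approximation 1 + t*ln(2)/T.
# The two only agree while t stays well below the doubling period T.
import math

T = 2.0  # assumed doubling period in years (illustrative)

for t in (0.1, 0.5, 1.0, 2.0, 4.0):
    exact = 2 ** (t / T)
    linear = 1 + t * math.log(2) / T
    print(f"t = {t:>3} yr   exact = {exact:5.2f}   linear = {linear:5.2f}")
# The straight line tracks closely at small t and drifts badly after a
# doubling period or two -- the "small enough time period" caveat above.
```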
 
This is interesting, just like Apple laptops ditching Intel chips.

Either Intel is lagging, or the revenue or cost/efficiency numbers didn't add up.
 
Yes, just another Google self-promotion advertisement - "look how smart we are". [...]

If you're going to claim "fake news", at least get your facts right.

ASIC stands for Application-Specific Integrated Circuit, i.e. specialized hardware designed with a specific purpose in mind - in this case, transcoding. It has nothing to do with RISC/CISC at all.
 
MSAs, or Multifunction Support ASICs (Application-Specific Integrated Circuits), have been in use for at least three decades in certain military hardware. They aren't a new thing, just an unknown one to most people.
 
Yes, just another Google self-promotion advertisement - "look how smart we are". [...]

I am sorry, but you obviously lack basic knowledge of the topic.
Please at least get the terminology right and read up on the history of RISC and CISC CPUs, and on what an ASIC actually is:
https://en.m.wikipedia.org/wiki/Application-specific_integrated_circuit
 
Now, what if the ASIC is superscalar? Then we get an Application Specific Superscalar Integrated Circuit, or ASSIC for short.
 