The Rise, Fall and Renaissance of AMD

The unsung hero (and elephant in the room) of the AMD story is Infinity Fabric.

There are many AMD decisions, and some luck, which get the popular credit for this ascendance, but their focus on Fabric is the secret sauce: it moderates chiplets' innate latency weaknesses and makes cost-effective chiplets competitive with monolithic performance.

Unlike that other time AMD had a killer edge, Fabric has been a decade-plus hard slog. It is not imitable in a timely way (it would require an all-new range of processors and complete re-validation), and by the time Intel can legally clone it, they will be enfeebled.

Fabric gives AMD a clear run of at least five years.

In the historical context of the article, the philosophical aspects of Fabric & chiplets are pivotal.

AMD faced oblivion at the time. What hope did such a small, weak company have against such a strong foe? They only had the budget for one shot at a product, yet they needed a range of very different processors, and sophisticated servers were a must: they were lucrative and an incubator for lesser products.

How could one product serve all markets?

The answer, of course, was to build big CPUs using multiple small CPUs as Lego blocks: make one CPU to satisfy multiple markets, then figure out how to group them in multiples with coherent caches (via Fabric).

Even they were surprised by how many advantages this approach had. Necessity proved a very good mother for chiplets.
 
I just posted about the central role of Fabric - the vital interface between the multiple processors in a chiplet and in an array of chiplets.

A point worth treating separately, IMO, is that AMD had an obsession with APUs, given their unique status as both a CPU and GPU producer.

They had made a huge investment buying ATI in 2006, which they were very keen to get more return on. They perhaps put more development funds into APUs than sales justified, in an effort to save face over the ATI purchase.

If they could cheaply combine a CPU and a respectable GPU on one chip and share resources like RAM, they could snooker Intel in the high-volume PC market. They made some great products from 2014 on, which the sheeple avoided.

I.e., much the same Fabric issues had long been on their minds: harmoniously combining a CPU, a GPU, and system resources.

This experience may have sown the idea for the far simpler project of teaming multiple homogeneous Lego-block CPUs, versus combining diverse CPU and GPU processors.
 
The first CPU I remember installing was an AMD 486 DX2, which I still have. From there I upgraded to a K6-2, Athlon, Phenom and now Ryzen 😎
 
Actually, Intel had to sue Nvidia because they never had a license yet lied that they did.

AMD back then decided to re-enter the chipset market permanently and Nvidia simply dropped out.


Read the readers' replies about how people were not happy with Nvidia (which tells me that they were wiser than the rabid cult members we have today).

Lastly, from Wikipedia:

The original nForce chipset was let down by patchy driver support and less than optimal hardware design. Performance of the dual-channel memory controller and "DASP" did not greatly surpass the VIA Technologies KT266A chipset that was usually as fast and cheaper. The optimized parallel ATA driver support was introduced and then withdrawn after hardware incompatibilities showed up, and the much heralded SoundStorm audio was seen to crackle under heavily loaded scenarios. In fact, the ATA driver would remain an issue at least into the life of nForce4 where it was still known to cause problems with some hard drives and optical drives

Link here https://en.m.wikipedia.org/wiki/NForce#Performance_and_problems

Personally, I had a Shuttle XPC with the nForce chipset and didn’t experience any of that.
You could argue nVidia's entry into the chipset market was almost accidental. nVidia developed the nForce for the original Xbox, which was supposed to use an Athlon 700, but at the 11th hour Intel made MS an offer it couldn't refuse and overnight the Xbox design changed to use a 733 MHz Pentium 3. By this stage nVidia had already developed an AMD Socket A compatible chipset, so they decided to sell it as a new Socket A chipset (nForce) competing with VIA KT266/A and SiS 74X products.

The success led to the nForce 2 and 3, but as you said, once AMD bought ATI and Intel sued nVidia, it didn't leave them anywhere to go, and the whole chipset business was low-margin, low-volume anyway, so it was an easy business decision to let the product line fade out.
 
I didn't get an AMD CPU until after the 90s (an Apple IIe, PII and PIII did the job for Carmen Sandiego/Oregon Trail, Panzer General/Steel Panthers/Aces of the Deep, and Close Combat/Quake II/Myth, respectively), and it wasn't until I bought one of those hideous green cyborg Alienwares in like 2003 that I was introduced to an AMD CPU: the Athlon XP 3200+. What a fantastic CPU. Played everything I wanted, including Rome: Total War and Operation Flashpoint: Resistance.
 
I bought a second-hand Asus AMD FX7600 laptop in 2014. Now, 9 years later, it can multiboot Windows 8.1, 10 or 11, or Ubuntu 22.04, and runs fine... not hugely powerful, but a great all-round backup laptop for online shopping, banking, Netflix, YouTube, LibreOffice, VeraCrypt, Google Desktop, Zoom, etc. 😜
 
When AMD bought ATi, they had a goal of heterogeneous computing... and AMD's entire existence has been moving towards this goal since then.

Later this year, you will see an AM5 APU that will have twice the graphics power of an Xbox. AMD knows what this means (taking its time), and knows what their APU will do to the PC gaming industry.... so does NVidia.


Dr. Su is the mastermind behind AMD's Infinity Fabric. It may have been an Ubisoft dev asking Dr. Su about AFR and dual-GPU cards (mind you, at a Ryzen event)... she dismissed the notion due to simple power constraints and smugly said we are well beyond that.


Later this year, we will be able to build a 1080p gaming rig without having to buy a dGPU!
 
No mention of Llano? That was the first APU. It was short-lived (only one generation) and still on the Stars architecture, but its performance was probably higher than Bulldozer's.
You make a good point. I have an old craptop with an A8-3500M under the hood. I managed to get it to play Skyrim at 720p medium. That craptop just has the "Radeon HD 6620G" IGP.
The low-power chips like Bobcat and Puma, although they were slow compared to mainstream CPUs, mopped the floor with the Intel Atoms. This was a big win in my opinion.
It was a big win, but the OEMs still primarily used Atoms in their netbooks. Stupidity and cowardice are clearly still things in the industry.
AMD back then decided to re-enter the chipset market permanently and Nvidia simply dropped out.
Yup. I remember when we said that "nForce" became "nFail". :laughing:

It's too bad that VIA pulled out too because they had some incredible chipsets.

Read the readers' replies about how people were not happy with Nvidia (which tells me that they were wiser than the rabid cult members we have today).
It's not just about the rabid cult members. I'll tell you a true story that just happened less than a week ago:
I was in Canada Computers getting a cheap ($20) 128GB Lexar SSD for my mom's computer. I built her a new PC out of spare parts (FX-8350, R9 Fury, 8GB DDR3) but it was laggy as hell because of the HDD (I also needed a new case for the R9 Fury to fit).

While I was waiting, I saw this young couple looking at an RTX 4080, completely lost. I offered my assistance and asked them what their budget was. They said $1,000 CAD and I said "Well, do you do any professional work?" They said no, it was just for gaming. I asked them "Are you ray-tracing fanatics?" They said no, they didn't even know what RT was. I asked them what the specs of their monitor were and almost cracked up when they said "1080p, 60Hz". I told them that they had no need to spend $1,000 for a 1080p experience. I pointed at the Gigabyte RX 6750 XT that happened to be there and said "That will be an absolute rocket at 1080p." The girl then said "Oh, but I have an Intel CPU!" to which I replied "OK, and?". She said "Isn't that for AMD? It says AMD on the box." and I said "Oh, no... it's not for AMD, it's made by AMD. You can use it just fine. It's not like nVidia has anything to do with Intel." And, to his credit, her boyfriend agreed and told her that everything works with everything else. They thanked me for saving them $400.

At that moment, I realised just what a mistake AMD had made when they dropped the ATi branding because there's no way that what this girl thought is unique. People get this idea that Radeons are only for computers with AMD CPUs. They also have this idea that they need to spend $1,000 for a card to game at 1080p. I'd like to find out who gave them that idea and shoot them.
 
Hmm, I never heard of nor thought about it like that.
Then again, chances are she did some light reading about GPUs, and since the influencers ruling today's media only show Nvidia GPUs in their videos and articles, she simply didn't know that AMD also makes GPUs.

At least you were able to help them; I actually failed at that.
Someone I know bought his kid a 4080 for around US$1,400 so he could beat everyone in Fortnite.
The best part? The kid plays "competitive" meaning 1080p or lower, with graphics settings on low.
Everyone at Microcenter told him that he needed that 4080.
Oh well, Dear Leader Jensen wins again.
 
Hmm, I never heard of nor thought about it like that.
Until that day, neither had I. I'm guessing that this is because back when I worked at Tiger Direct, Radeons were still branded with "ATi".
Then again, chances are she did some light reading about GPUs, and since the influencers ruling today's media only show Nvidia GPUs in their videos and articles, she simply didn't know that AMD also makes GPUs.
Yep. It's like I said, for most people, choosing between Radeon and GeForce is like choosing between GE and Whirlpool. They have absolutely no clue about either of them.
At least you were able to help them; I actually failed at that.
Someone I know bought his kid a 4080 for around US$1,400 so he could beat everyone in Fortnite.
It's not a failure if you don't have the opportunity and it doesn't sound like you had any opportunity to help them. I just got lucky.
The best part? The kid plays "competitive" meaning 1080p or lower, with graphics settings on low.
Everyone at Microcenter told him that he needed that 4080.
They're a bunch of lying a$$holes and he's a dumba$$ for not informing himself before dropping that kind of money on a video card.
Oh well, Dear Leader Jensen wins again.
For once, I can't blame Jensen because this was a case of a numbskull allowing himself to be screwed over by a bunch of crooks because he was too lazy to prevent it.
 