AMD: The Rise, Fall and Future of an Industry Giant

Intel has pushed the Atom series through heavy marketing. It simply sucked: it sucked then, and it sucks now. The $250 Chromebook is superior to the latest Atom chips.

The Core i series is a processing monster, but it does not deliver the kind of graphics performance that AMD can pull off. In the long run, AMD has the upper hand. I don't think Intel's graphics performance, especially on integrated solutions, will be superior to AMD's. And graphics performance is what the end user will be looking for most from now on, especially in small form factors (there are some Mini-ITX FM2 designs that will kick a**).

What really bothers me is that when Intel throws something out that clearly has sub-par performance (such as Atom or the integrated HD Graphics), the reviewers put up all sorts of caveats before showing the (horrible) numbers. When it's AMD, there's absolutely no excuse for low numbers.

You're quite the elaborate troll, pleading that it's mere bias toward Intel or making some other accusation. AMD's integrated graphics are not meant for professional graphics, and neither are Intel's in their current state. Raw CPU power matters more in most benchmarks, and that's where Intel destroys AMD. This has been said a billion times.
 
Graham: I believe that this article is factually incorrect at several points; the NetBurst architecture was used solely in the Pentium 4 and was defined by the double-pumped ALUs present in that design.

It therefore looks a little silly when you say "When the K6 hit shelves in 1997, it represented a viable alternative to the Pentium MMX, and while Intel continued to stumble along with its underwhelming Netburst architecture" - that isn't a true statement. It needs to say that it was a viable alternative to the Pentium Pro architecture.
 
+1 to Per's comment about FPU performance.

Also, I remember the PIII Coppermine held a slight performance advantage over the competing Athlons; however, once AMD improved the Athlon with on-die L2 along with a die shrink, they got the performance advantage. Also, once Intel launched the P4 Northwood (2.4+), it generally outperformed the Athlon XP.

@DBZ
In general, a nice read, keep it up and thanks.
 
AMD has serious problems with producing new products and with cash flow, something that Intel and Nvidia don't have. Nvidia's graphics revenue alone equaled ALL of AMD's, and they had 200 million in profit as well. Bleeding thousands of employees in the tech world is not a good sign.
 
The article is a pretty generic copy and paste, but (other than some minor misleading/mis-information) it is well portrayed and rewritten for the basic reader. One thing I will never forget is the feeling I got when AMD acquired ATI; you just felt like AMD was going all in on something they didn't need to. You don't get anywhere in life without taking risks, and this was no different, so I didn't fault their choice; but they have never recovered, and it's too bad, because AMD's competition and involvement in this industry has been instrumental in the development of various chipset architectures and instruction sets as we know them.
 
Graham: I believe that this article is factually incorrect at several points; the NetBurst architecture was used solely in the Pentium 4 and was defined by the double-pumped ALUs present in that design.
I'm aware that the architecture in use at the start of the time period was P5, which moved through P6 to NetBurst. I hadn't intended to imply that 1997 signified the introduction of that arch, only to note that during the era in which AMD designed its first in-house CPUs through to the K8 (i.e. 1997-2006, up until Conroe's launch) - what I termed "the golden age" - NetBurst was introduced and didn't evolve that much as a design philosophy (process shrinks and speed increases aside).
BTW: the Pentium D also used the NetBurst architecture, and while technically they are P4s in an MCM package, they are a naming convention just as much as Pentium 4 and the Pentium Pro (P6 arch) you mentioned.

@ Per
Since AMD was licensing Intel's designs, I thought it would be implied that Intel would be first to market with any new design - so in effect, AMD would be competing largely against newer models from Intel at every turn, especially during the nearly five years of inactivity caused by the i386 lawsuit, where the eventual Am386 CPUs would be in the marketplace alongside Intel's 486. The article was aimed at AMD in general, not the enthusiast in isolation, which is why I eschewed dissecting the poor FPU performance and adequate ALU performance (and the required explanations/clarifications, as well as an inclusion of Intel's product line and timeline for context)...which would generally also lead to benchmarks, which would in turn lead to Intel's compilers, etc.
I'm not entirely sure how you could condense a 43-year history and some future speculation into the word count constraints required for a generally accessible article - as it was, I had to omit some fairly significant content (chipsets, brand expansion/the Siemens AG joint venture, Spansion - the AMD/Fujitsu joint venture - manufacturing expansion, the x86-64 extension and its licensing to Intel and VIA, the 10h family, most of the bit-slice processor history and implementation - including the Am95C60, the AMD graphics chip from 1987 - graphics post-ATI acquisition, the Globalfoundries spin-off, and a lot more).

Maybe the next article can revolve around a timeline-performance (integer and float)-price-marketing-availability interactive graph - maybe our budding Hemingway, amstech would like to tackle it.
 
"Maybe the next article can revolve around a timeline-performance (integer and float)-price-marketing-availability interactive graph - maybe our budding Hemingway, amstech would like to tackle it."

I appreciate the sentiment but I could never compare to someone with a Mustache that great.
 
Maybe the next article can revolve around a timeline-performance (integer and float)-price-marketing-availability interactive graph - maybe our budding Hemingway, amstech would like to tackle it.

He would just do a "generic copy and paste" like the rest of us ;)
 
I took the same route when building my new system; the Intel i7 architecture has such a high price premium attached.
 
Unfortunately for AMD, they got decked by a combination rather than a single punch. AMD's fabrication capacity lagged badly during a time when they offered a reasonable alternative to Intel's P5/P6/NetBurst CPUs - a lack of capacity and production generally scares away larger contracts.

From what I remember, this might be taken out of context. From my own experience, I recall that Dell stopped selling AMD computers; even Computer Shopper stopped listing AMD machines for sale in its publications. It wasn't until AMD filed suit against Intel that Dell started selling AMD machines again. So of course they wouldn't have enough, because the balance of supply and demand suddenly got thrown off. So I have a hard time considering that AMD's bad management; it's a result of Intel's underhanded business practices suppressing demand for 5 years prior. After the lawsuit was filed, all the major suppliers started selling the machines again. Shortly after that, the Core 2 Duos were released. Intel effectively cheated AMD out of what should have been a highly profitable 5 years. Of all the factors, it was Intel's underhandedness that kept AMD from becoming a household name, which to me is the killing stroke.
 
From what I remember, this might be taken out of context. From my own experience, I recall that Dell stopped selling AMD computers; even Computer Shopper stopped listing AMD machines for sale in its publications. It wasn't until AMD filed suit against Intel that Dell started selling AMD machines again. So of course they wouldn't have enough, because the balance of supply and demand suddenly got thrown off.
The only problem with that scenario is that AMD filed suit in June 2005...and AMD faced CPU shortages from mid-2004, presumably while being shut out of the major markets - cutting prices to boost sales and not having enough fab capacity to sustain the channel. The fab capacity was supposedly constrained because money lenders shied away from providing funds to AMD for manufacturing construction (due to the Intel suit, and presumably not having much faith in AMD's 1991 antitrust suit against Intel), and because AMD did not seek capacity at other foundries.
So I have a hard time considering that AMD's bad management; it's a result of Intel's underhanded business practices suppressing demand for 5 years prior
Maybe a plan B as far as manufacturing capacity? It's not as if the signs weren't fairly obvious before things ground to a halt (this article from 2002, for example). Then, of course, there were a few fingers pointed at Jerry "Mr Flamboyant" Sanders as to how AMD's relationship with Intel soured in the first place:
The hard feelings started when [Jerry] Sanders told the press how much money AMD was making on the exchange agreement. These comments prompted [Andy] Grove to write a note in October 1984 to his senior vice president David House, later entered as AMD exhibit 711 in the case:
And of course, while everyone remembers that AMD won the suit stemming from antitrust litigation ($18m), it should also be remembered that Intel won its suit against AMD ($58m), securing copyright for its microcode and legal precedent for limiting licensed clones as early as 1989 (part of the judgement in the Intel v NEC case) - so maybe a plan B might not have been such a bad idea.
Of course, each to their own. As I noted in the first line of the article: "AMD has long been subject of polarizing debate among technology enthusiasts. The chapters of its history provide ample ammunition for countless discussions..."
Anyhow, thanks for the discussion. Gives a chance to add some additional content to the dialogue.
 
The only problem with that scenario is that AMD filed suit in June 2005...and AMD faced CPU shortages from mid-2004, presumably while being shut out of the major markets - cutting prices to boost sales and not having enough fab capacity to sustain the channel. The fab capacity was supposedly constrained because money lenders shied away from providing funds to AMD for manufacturing construction (due to the Intel suit, and presumably not having much faith in AMD's 1991 antitrust suit against Intel), and because AMD did not seek capacity at other foundries.

This does more to prove my point than anything. AMD worked on the lawsuit for years prior to filing it. I would imagine Intel did what it could to "fix" the problem when it knew it had a problem. Intel f'd over AMD illegally to beat 'em down. That's what it took; Intel had 10 times the amount of cash of its rival, and yet they had to resort to this. AMD has been paying for it ever since.
 
Maybe a plan B as far as manufacturing capacity? It's not as if the signs weren't fairly obvious before things ground to a halt (this article from 2002, for example). Then, of course, there were a few fingers pointed at Jerry "Mr Flamboyant" Sanders as to how AMD's relationship with Intel soured in the first place:
A plan B is out of the question. Do you know it takes a fab machine about 3 months to print a wafer? Interesting fact. Do you also know that each generation of chip needs new fabs to be designed, built, and tested before it goes into production? I think I recall the cost of a new fab being in the billions. Having a plan B is not an option.
 
as far as I know it's the main issue.
Which I find irrelevant when looking at all the other issues. DBZ pointed out that supply was hurting as well as demand. If that's the case, how can you point a finger solely at demand being the main issue? From what I'm reading, AMD was and is hurting regardless of what Intel did. The reason Intel was being blamed is that AMD wanted people to think it was anyone's fault but their own.
 
A plan B is out of the question. Do you know it takes a fab machine about 3 months to print a wafer? Interesting fact.
Note that the article I linked to was dated January 2002; also note that the X-bit article was dated August 2004. Allowing your 3 months (and that would assume the process and lithography masks are directly compatible with the new foundry, and that AMD was aware of the same supply constraint problems no earlier than mainstream tech sites!), that would imply AMD could have alleviated the issue by Nov/Dec 2004 (or mid-2002 if they had indulged in the same strategic analysis that TMF posted).
Your original premise was that Dell's change of heart and ordering caused the shortage. Dell signed contracts in late 2006, yet AMD chip shortages were reported over two years prior to this date, and predicted five years earlier.
 
Which I find irrelevant when looking at all the other issues. DBZ pointed out that supply was hurting as well as demand. If that's the case, how can you point a finger solely at demand being the main issue? From what I'm reading, AMD was and is hurting regardless of what Intel did. The reason Intel was being blamed is that AMD wanted people to think it was anyone's fault but their own.
An earlier post of mine argued that point. AMD's supply issue wasn't a cause but an effect of Intel's underhanded business practices.
 
An earlier post of mine argued that point. AMD's supply issue wasn't a cause but an effect of Intel's underhanded business practices.
How so??

Was Intel paying AMD's supplier to halt production? This would be the only way I see Intel being blamed for AMD's supply shortage that was briefly mentioned in the article.
 
How so??

Was Intel paying AMD's supplier to halt production? This would be the only way I see Intel being blamed for AMD's supply shortage that was briefly mentioned in the article.

Well, I know on the GPU side it takes 5 years to design a chip and build out a fab to get a product to market, and it costs billions of dollars. It takes 3 months for a single fab machine to produce one wafer full of chips. CPUs aren't much different. On the surface AMD was short, but the designing and planning had happened years earlier. AMD projects its expected sales a long time in advance. On the surface it might look like AMD screwed up, but AMD was dealing with a market that was unfairly adjusted. The reality is a different story: AMD was cheated. By the time AMD had the problem sorted out, the die had been cast.
Intel started paying off distributors around 2000, maybe sooner. AMD noticed it fairly soon and had to build a case against Intel, which took almost all of the 5 years. Meanwhile, how do you project a future on something like that? It's all about supply and demand, and the numbers are worked out based on sales projections. All I see is that when AMD was better than Intel, nobody would buy 'em, and by the time the lawsuit was finally out there, the damage was done. Even if AMD had unlimited cash, the time it would take to ramp up production would have been too long and wouldn't have made a difference. AMD only had a short window of time before the Core 2 Duos were out; that window was less than a year. Think about it: if it takes 3 months for a fab to finish what it has started, then get it retooled for another process, you're looking at 6 months before you see any chips. (I would need more specifics in the proper context to validate the writer's opinion on AMD shortages.)
When AMD was finally allowed to sell its CPUs on a fair market, of course the demand went up beyond what they could produce. It's not AMD's fault. Intel paid AMD 1.25 billion in damages, and AMD was allowed to spin off its fabs. Prior to the lawsuit, AMD's 8600 licensing agreement with Intel limited the amount of outsourcing AMD could do at the time; something like 15 to 20 percent could be outsourced. The rest had to be made by AMD.
The writer has little understanding of this period in AMD's history; otherwise he wouldn't be so superficial with the context of some of the facts he's referring to.
 
Note that the article I linked to was dated January 2002; also note that the X-bit article was dated August 2004. Allowing your 3 months (and that would assume the process and lithography masks are directly compatible with the new foundry, and that AMD was aware of the same supply constraint problems no earlier than mainstream tech sites!), that would imply AMD could have alleviated the issue by Nov/Dec 2004 (or mid-2002 if they had indulged in the same strategic analysis that TMF posted).
Your original premise was that Dell's change of heart and ordering caused the shortage. Dell signed contracts in late 2006, yet AMD chip shortages were reported over two years prior to this date, and predicted five years earlier.
So what are we comparing here: the 5 years that Intel illegally paid off distributors, or AMD's supply problems for, say, 1 year? At what point do the illegal actions of Intel cease to be a factor?
 
Prior to the lawsuit, AMD's 8600 licensing agreement with Intel limited the amount of outsourcing AMD could do at the time; something like 15 to 20 percent could be outsourced.
20% is the figure usually associated with the 1995 and 2001 agreements, and usually quoted in relation to the Globalfoundries spin-off.
So: AMD x86 outsourcing allowed: 20%
AMD x86 outsourcing initiated prior to the shortages apparent in 2004: 0%
In fact, it wasn't until two years after the first warning signs that AMD even approached another foundry (Chartered, which started production in May 2006), and even then AMD only used ~7% of the 20% outsourcing allocation it was entitled to under the agreement (one thousand 300mm wafers/month, compared with 30,000 200mm wafers from AMD's Fab 30). Fab 36 was also technically producing wafers, but likely still at the testing/tooling validation stage.

So, even though by (at least) January 2002 it was readily apparent that AMD needed more manufacturing capacity, they did not act until November 2004 - almost three years later - to initiate a second source of production...and even then did not utilize the full quota allowed for under the terms of the agreement.
The rest had to be made by AMD.
So why didn't AMD move to maximize the 20% (or the remaining 13% after Chartered's production) as soon as the ramping estimates had been drawn up, presumably in early 2005? AMD could obviously have benefited from the extra capacity from the get-go.

Seems like an odd coincidence that under Jerry "Real Men Have Fabs" Sanders, the idea of outsourcing never saw the light of day:
Sanders meant that while many chip companies design semiconductors and outsource the manufacturing, AMD enjoyed the relatively rare advantage of owning its factories, known as fabrication plants, or fabs.
...and that it wasn't until Hector took the reins that AMD (sans Sanders!) started softening its stance on outsourcing.
I would need more specifics in the proper context to validate the writer's opinion on AMD shortages
Would you like me to insert a LMGTFY link? There's more than enough evidence on the net. How about processor shortages and constrained capacity in 2001? How about this one?
An article claims AMD is running out of Athlon 64 X2 3800+ dual core processors and Athlon 64 3000+ at a time when it can least afford to....European motherboard firms, talking to the INQ on conditions of anonymity, were rather more blunt about the problem. One described the shortages as due to "bad planning".
The writer has little understanding of this time in AMD's history otherwise he wouldn't be so superficial with the context of some of the facts he's referring to.
Thanks for that. I've used what little understanding I possess to provide some documented facts and some historical links. YW.
EDIT:
So what are we comparing here: the 5 years that Intel illegally paid off distributors, or AMD's supply problems for, say, 1 year?
One year? You're basically looking at 2001-2007 (when Fab 36 came online) at the very least...and that's without taking into account AMD's recent problems with GloFo.
At what point do the illegal actions of Intel cease to be a factor?
Well, I don't remember saying anything of the kind, either now or in the article...or is it an all-or-nothing proposition for you? Adding hyperbole doesn't strengthen your argument.
 
The article completely missed the misstep of trying to integrate ATI's graphics card designs with AMD's CPU designs. The teams had to use the same architectures, which hurt performance.
 