What Happened Last Time AMD Beat Intel?

I've been building PCs long enough to remember the last time AMD beat Intel. It was so long ago that I didn't have any hair on my chest and had only just lost my virginity.

But seriously, those Athlon 64 parts were dominant; it's such a shame that AMD couldn't follow them up. I would say one thing though: AMD went nuts on pricing, far more so than Intel or AMD do today. Over £1,000 for the top-end Athlon 64 FX parts, at a time when £1,000 could buy you a whole solid rig if you went for a more normal CPU. The Pentium 4 Extreme Editions were also heinously priced at the time. Of course most people bought midrange, but today you don't pay £1,000 for the top-end consumer-grade parts.

These prosperous days for AMD came at roughly the same time that ATI fell from the top spot in the GPU space: the Radeon X800 XT didn't match the GeForce 6800 Ultra. Before that came the worst lineup of graphics cards I've ever seen from any company, Nvidia's FX series, which were far worse than ATI's offerings. Since then ATI/AMD have never really beaten Nvidia; the 7970 GHz Edition was about as good as it got. Well, until now, although I still think Nvidia's cards are the better buys because of their ray tracing and DLSS technologies.
 
It was time for something new, and that led to the development of the Core architecture.
To get to Core, Intel went back to the Pentium III and its Tualatin revision.

So... to rephrase: "It was time for something old, so Intel went back to something old."

The P4 architecture was bad from the beginning. Intel couldn't take the MHz crown from AMD, so their innovation with the P4 was to create a less efficient processor that could clock higher; marketing could then pitch it as having more of the one basic figure the general public thought translated to the speed of a CPU. At that point, the marketing department at Intel pretty much took over, until Intel brought back the old with the P3 architecture.
 
People often wonder why AMD ended up paying so much money for ATi. The simple answer is that ATi didn't want to be bought. They basically said, "OK AMD, if you want to buy us out, you're going to pay through the nose!"
ATI was a publicly listed company at the time, so it had no ability to veto a takeover bid or set the purchase price. The board's acquiescence simply changed it from a hostile to a friendly takeover, with no real effect on the price. AMD didn't pay much of a premium over ATI's share value at the time, in any case.
 
Yes, the iPhone's CPUs are faster than anything from the Android camp, but most people don't have the money for stupidly expensive Apple products (and pricing that flatters the buyer's ego).

Have you left your house lately, or at least read the tech news? Maybe you haven't heard, but Samsung's junk is as expensive as (and sometimes even more expensive than) Apple's!
 
All I know is I'm still happy with my i7s. I want a Cezanne in summer '21; if somebody can beat that in single-core, I'll be happy again.

Couldn't care less which team wins; I'm only concerned that I win.

 


Real quick: the X800 was faster than the 6800 in everything but OpenGL, plain and simple. Also, this was 2004. AMD was top dog, the Athlon 64 X2 was still a year away, Socket 939 was just showing up, and the Athlon 64 3400+ and 3500+ were the fastest CPUs on the market.

As far as video cards go, the X800 beat the 6800 Ultra unless it was OpenGL, and of the games at the time only Doom 3 was an OpenGL title that mattered. Neither card was really fast enough for SM3 anyway; look how badly the 6800 did in Far Cry when you enabled HDR. So the lack of SM3 didn't really hurt ATI when it mattered. Then the X1800 XT was faster than the 7800 GTX, to the point that Nvidia had to launch the 7800 GTX 512 as a halo product, and ATI responded with the X1900 XT and X1950 XTX, which were hands-down dominating chips. Nvidia eventually got the 7950 GX2 out, but it relied on a working SLI profile.

I'd say the period from 2002-2006 goes to ATI, and for AMD it's 1999-2006. 2006 is the year that spelled disaster for both. ATI's R600 design was a total flop, and it was ATI's; AMD hadn't had a chance to have much input. Meanwhile Nvidia's G80 did amazingly, which surprised everyone. Core 2 was the same surprise: no one really saw it coming, and plenty said, "Nah, I'll buy an AM2 and wait for Phenom X4 because I know it'll be faster." I was one of them. I bought an AM2 board and an X2 5000 and figured I was set to upgrade to Phenom, which I did, but in gaming I lost to my buddy's E6600, which he'd had already.

Now back on topic, I'll say this: Intel this time doesn't have an older design to backtrack to. Netburst was a failure and they resurrected the P6 core; that isn't an option today. P5 won't compete, and we know that because of Atom, which is really a Pentium Classic on steroids. So Intel doesn't have old designs to tweak; that's what's different this time. In fact, everything since 2006 has been based on, and refined from, a core design first shown off in 1994.
 


Not quite. The Pentium III only existed as we know it because Netburst was delayed. You forget that Netburst was supposed to launch at 500 MHz and be called Pentium III. They were having issues with it, so they quickly back-ported SSE to the Pentium II core and called that Pentium III, and a year and a half later, when they had the bugs worked out, we got the Pentium 4. Intel likely wouldn't have back-ported SSE at all, but AMD's launch of the K6-III worried them. Back then gaming was very niche on computers; 90% of all computer use relied on the ALU, not the FPU, and the K6-III was faster per clock than the Pentium II because of its on-die L2 cache. So Intel responded by back-porting SSE and pushing the older 250 nm node to 600 MHz, which was unstable as hell, but it worked to keep AMD from gaining ground in office computers with the K6-III.
 
There are more reviews available than just TechSpot's!


Er... this talking head you posted, TechSpot, and all the other reviews I have seen also indicate that RDNA 2 is more efficient (better performance per watt) than Ampere... :scratches head:

Jay indicated that his 6900 XT sample was drawing only 250 W as reviewed (vs. the 3080's 320 W TDP)... did you even watch the video? His review is also a bit of an anomaly compared to others.

The comment you responded to was referring to efficiency, and to the ludicrous efficiency claim of the post it was responding to... :scratches head harder:
 
That is so true...
And yet, Nvidia is beating AMD even while using an inferior node.

Even the biggest Nvidia fanboy will agree: the margin by which they are winning shrank far more than most saw coming when Navi 2 launched. We are closer to a meaningful GPU price war than we have been in a very long time. Not saying that's going to happen... if only stock weren't an issue.
 
Have I left my house? I have, though not much: there's a pandemic, did you know?
As for prices: are there inexpensive iPhones? No. Are there inexpensive Android phones? Tons of them. Samsung (and other Android brands) have expensive phones, yes, but those are far from being all of them. For iOS, there are only expensive iPhones.
But thanks for your well-thought-out comment!
 
You can get an iPhone SE new for £350 or $399; it has the same processor as the Pro Max, which is a lot faster than even a Note 20.

Samsung phones currently cost a lot more than iPhones, especially if you include the folding models. They also lose their value far quicker: you pay more for them, and after a year you sell them on for less than their iPhone counterparts.
 
Let's face it: a big part of AMD's current success is the fact that Intel is in deep ****. I mean, they haven't released an actually new design (not rebadged Skylake parts) since 2015. The fact that they still use these parts to compete with AMD five freaking years later says a LOT about how good they were back in the day, and how sorry Intel's situation is.
Sure, AMD is executing nicely these days and steadily improving their performance generation over generation, but let's not forget that they first overtook Skylake's IPC with Zen 2 in 2019, four years after Skylake launched. And they needed three Zen iterations, plus Jim Keller, plus the 7 nm process, the best process there was at the time. So yeah, I think a big part of why AMD is successful today is that Intel is not competing anymore. They are more or less sitting idle because they can't fab sh&t.
IF Intel hadn't had those problems with 10 nm, today's situation would have been totally different. Oh, and don't get me started on Intel's management. They've had a stroke of bad luck with CEOs, starting with Brian K., who was... appalling, and now Bob Swan who, sorry to say it, just doesn't fit. He's a freaking businessman, not a tech addict.
 
Correct me if I am wrong, and I may be: Jim Keller spearheaded the Ryzen project and left AMD right before the launch of Ryzen 1. Since then they have refined a few things, like the interconnect and cache design, but the underlying architecture is the same, with refined clocks.

Keller is no longer at AMD ...
 
Sorry, neither one is very impressive relative to Bloomfield's initial entry in 2008. My i7 920 is still able to hold its own against my 2600X in games, for example, up to GTX 1070 Ti level, the limitation being less the CPU and more the PCIe implementation. Overall performance improvements have been minute over the last 12 years, necessitating more cores rather than significant clock speed increases. Overclocks of 1.14 GHz on air for my i7 920 and 1 GHz on my i7 930, and on an AIO 1.1 GHz on my i7 960 and 1.2 GHz on my 6C/12T i7 980X, are not the kind of OC I can get from my i7 4770K (400 MHz), i7 6850K (900 MHz), or Ryzen 2600X (200 MHz). The exception is my i7 7800X at 4.8 GHz; however, it runs much toastier than Bloomfield or Gulftown. From my foxhole, CPUs have hit a wall, and adding more cores, which aren't always useful, is the sidestep dance.
 

I'd disagree. The combination of an Athlon 64, in an era when 32-bit OSes were still the norm, with a 9700 Pro was destroying Intel and Nvidia at the same time. That 9700 graphics card from ATI blew the competition out of the way. AMD had very good hardware, and in the Netburst era AMD simply switched its CPUs to "PR" ratings equivalent to an Athlon at the same clock speed, and we all knew they were equal to or better than a similarly clocked Intel Pentium 4.

Ah, the good days. I jumped from an AMD 486 DX2 to a Cyrix PR233, later to a K6-2, then a K7 Athlon, a Socket 462 Athlon, an Athlon XP (Palomino), an Athlon 64 X2, a Thuban X6, a Vishera, and now a Ryzen 2700X, and I'm already looking at a new upgrade path such as the Ryzen 5950X.

AMD really came a long way, and they have such serious assets right now. ^^
 
Spot on!

I remember those days.

I had my Athlon 64 and was dying to get an X2, but AMD was actually charging per core!

They charged exactly double the price for an X2 CPU compared to the single-core CPU.

I just watched the reviews of the 6900 XT, shook my head in disbelief, and said to myself, "not again, AMD, not again".

They need to cut prices on all of their new products (Zen 3 and RDNA 2), instead of going greedy like they are now.

Then again, given that TSMC cannot produce anything more right now, AMD is simply gouging the desperate ones.

Hopefully they will cut prices down the line, and soon.
They're playing a very dangerous game though because a lot of their success is also attached to the fact that many people (like me) absolutely despise Intel and nVidia as companies because of their past misdeeds based on corporate greed and an arrogant mindset. Compared to Intel and nVidia, AMD's practices seem almost "saintly" when one considers Intel's criminal activities and nVidia's numerous anti-consumer "D1ck Moves".

I've been accused of being a DAAMiT (AMD/ATi) fanboy in the past, but that has actually never been true. I'm not really someone who buys AMD products so much as someone who refuses to buy Intel or nVidia products because I don't want to support them. AMD is no charitable foundation, but at least they don't break the law or screw over their customers. If AMD changes either of those things, there will be no reason for me to care about the crap that Intel and nVidia pull, and AMD will lose a lot of business because of it.

Gamer Meld was apparently told by AMD directly that in 4-8 weeks there will be a significant number of RX 6000-series cards available AT MSRP. While that is better than nVidia's situation, the RX 6000-series is badly overpriced. This may be the first time in 25 years that I can remember saying that about an ATi product. You know, it's really stupid too, because nVidia screwed up, and instead of capitalising on it and grabbing the market share and mindshare that AMD REALLY needs, they went the greedy route for short-term profits.

This is why no American corporation will ever be as long-lived as Lloyds Bank (I had erroneously referred to it as Lloyd's of London) or the Hudson's Bay Company. They're so focused on short-term profits and stock value that they're completely oblivious to the bigger picture.
 
Funny how Intel's illegal tactics weren't mentioned, like bribing Dell and others not to sell AMD-powered systems.

Hell, some say that they are doing that exact thing today, given how Dell refuses to add Ryzen to their lucrative business lines, like the OptiPlex and Precision lines.
When Intel's criminal activities are ignored, I call it "revisionist history". I don't understand why Sami Haj-Assaad would avoid addressing them. Maybe because it would cause a flame war, or a bunch of whining. I guess it's fair to say that he was trying to be diplomatic.

It's like conveniently forgetting to mention the Holocaust when doing a story on the history of WWII or forgetting to mention slavery when doing a story on the US Civil War. I'm sorry to use such extreme examples because this is completely trivial in comparison but it's the same thing.
 
I just don't understand this position that AMD must be the budget champ, regardless of performance, until the end of time, while NVIDIA and Intel can keep charging however they like.

You misread my comment.

They don't need to be the budget brand, but they are definitely the underdogs and have a very small market share. One sure way to fix that is to price your products a BIT below the market standard, and like it or not, right now that standard is set by Intel and Nvidia.

Then, when you are big and arrogant and have crushed everyone, you can abuse your monopoly by gouging your customers, like both Intel and Nvidia are doing.

AMD is currently capacity constrained; they are selling basically everything they can get out of TSMC. Cutting prices now would do nothing other than reduce revenue and profit, at a time when they need every single dollar they can get to fund R&D to keep ahead of the Intel/Nvidia juggernauts, which are many times AMD's size.

Agreed, see above post.

The other thing: you can't write a reasonable article about "the last time AMD was ahead of Intel" without mentioning Intel's illegal practices to shut AMD out of the lucrative OEM business. That was rife during the P4/early Core era, and it significantly impacted AMD's ability to grow revenue and profit at a time when they were more competitive. I remember that in 2008 you could buy a home server from Dell with a Q6600 that was cheaper than buying just the Q6600 from a store; that is how large the subsidies Intel was giving Dell in exchange for not using AMD processors were. Effectively, people were shucking Dell PCs to get the Intel processors because it was cheaper.

Yep, I mentioned that in my other post. These news sites always "conveniently" ignore that tiny, itsy-bitsy part.

That then took away from AMD's ability to keep up R&D spending and basically guaranteed that Intel would be able to muscle past. Remember, we are talking about a company (Intel) that until recently was making more profit in a quarter than AMD made in revenue in a year; that is how lopsided the market was (and to an extent still is). Getting mad at AMD for trying to increase margins on higher-performing parts in that context seems irrational and short-sighted.

I'm not mad at AMD per se; I simply understand that they have an uphill battle, and raising prices now is not good for them in the long run. Remember, the humans behind the corporations follow a certain loyalty and, most of the time, safe traditions, like buying from the same vendor even when it is now inferior, because we are familiar with them (and if bribe money is included, even better)!
 
All news sites hide that fact and are ignoring the obvious repeat of it.

For example, try to find a Dell OptiPlex or Precision computer with a Ryzen or Threadripper CPU.

Those are Dell's most profitable lines and, of course, Intel is the only one there. I wonder why...?

Speaking of the horrible Holocaust, nobody ever mentions the fact that Hitler asked the other nations to take the Jewish population and all of them, except the Dominican Republic, refused.

https://en.wikipedia.org/wiki/Évian_Conference

 

Same here.

I am not an AMD fanboy just because; I am one because I hate all the illegal crap that both Intel and Nvidia have done to others, and to me as a customer.

For example, back when ATI/AMD had better GPUs than Nvidia but PhysX was a thing, people were buying ATI/AMD cards to use as their primary and a cheaper Nvidia card just to run PhysX.

Well, the d1cks at Nvidia started disabling their own cards if an ATI/AMD card was installed!

Or look at RT and DLSS: Nvidia is simply trying to do the same thing, lock you to their cards and screw the market.

The sad reality is that, these days, the young ones who have more money than brains or moral backbone only have personal and immediate satisfaction as their priority and forget the greater good.
 
👍This is how it's supposed to be 😁
Thanks for the recap. I feel like this piece has formally prepared me for one of the best showdowns in history.
I wasn't in the right place for the first episode (I missed it), but for this next episode I am all eyes and ears.
Thanks again.
 
Thanks for reading!
 
I think Intel's Willow Cove on 10 nm would be very competitive if brought to the desktop side with more cores, as most people don't care about desktop power consumption; all they care about is performance. The same can't be said for the laptop/mobile segment, though.
 
You must understand that in both the CPU and (especially) the GPU market, AMD is either completely unknown or is considered the "off-brand". This is mostly because of Intel's ill-gotten dominance certainly, but it's also because AMD doesn't advertise much, if at all. The name AMD just isn't recognised outside of the gamer and enthusiast circles and while gamers and enthusiasts aren't a small market, we're still tiny compared to commercial, industrial and general-use markets. Dell doesn't make its money selling computers to consumers, it makes its money selling computers to offices and those PCs are almost exclusively Intel-based.

The GPU side is even worse, because nVidia is the name that non-enthusiasts recognise. AMD made the rather stupid choice to remove the ATi branding from the Radeon GPUs, even though non-enthusiasts also recognised the ATi name.

AMD has to be the budget choice until they get at least close to parity with Intel and nVidia, because market share equals mindshare. Right now, they are nowhere close to that. Even with Ryzen's fantastic growth in the Steam survey, they only have about a quarter of the Steam market. In the overall market, I'd say that AMD has maybe 5%, because again, pretty much all of the "brand-in-a-box" computers used in the commercial sector are Intel-based.

This is also true of the mobile market, in which AMD has only recently (within the last year or so) started to become competitive. I would estimate that there are far more people who have never heard of AMD than there are people who have used AMD. To most people, AMD is the "off-brand" or "generic" version of Intel. Enthusiasts know that this isn't true, but you'll still come across some gamers who think they're "experts" and subscribe to this fallacy.

Just think of how many people don't realise that drive space and memory aren't the same thing. When I worked at Tiger Direct, I used to wince every time I heard someone say they had 250GB of memory. The general public's level of computer knowledge is about the same as our level of knowledge about washers and dryers. Who's going to spend more for a Haier over a Whirlpool? Not many. This is why Hisense bought Sharp: they knew people would pay more for a TV that said "Sharp" on it than for the same TV saying "Hisense". They wanted the name.
I completely agree about Intel's illegal OEM practices. It's like talking about WWII and not mentioning the Holocaust, or talking about the US Civil War without mentioning slavery. Let's call it what it is: revisionist history.
Let me start by saying that the last Intel CPU I bought was a Core 2 Duo. After that I've had a Phenom II X4 940, Phenom II X4 965, A8-3500M, FX-8350, R7 1700, R5 3500U and R5 3600X. The last nVidia video card I bought was a Palit GeForce 8500 GT. After that, I've had twin XFX HD 4870 1GBs, an ASUS HD 5450 (HTPC), an XFX HD 6450 (also HTPC), twin Gigabyte Windforce HD 7970s, twin Sapphire R9 Fury Nitros and now an XFX RX 5700 XT THICC III. Technically, I have a GeForce GTX 1050 Mobile in my current craptop, but it wasn't any more expensive than other craptops without it, and the main GPU in my craptop is a Vega. It's easy to see that I'm no fan of Intel or nVidia.

Having said all that, the point that Intel's tactics guaranteed it would muscle past isn't completely valid, because AMD still exists. The ATi side has only been non-competitive at the high end relatively recently, over the last four years or so, and that was because AMD threw all of their R&D budget at the CPU side; they weren't worried about the GPU side. They had beaten nVidia several times before, and in 2015 two of the three most powerful cards in the world were Radeons. ATi had been a perennial contender against nVidia, but AMD had not been competitive with Intel since the Phenom II. ATi essentially kept AMD alive between 2013 and 2017 because their GPUs weren't completely outmatched the way their CPUs were. This turned out to be a smart move because it resulted in Zen (Ryzen, Threadripper, EPYC).

To do this, however, they had to abandon the high end of the GPU market for a generation or two. It wasn't that big of a deal because they had Polaris, a great mainstream gaming GPU at a great price. Polaris was going to be AMD's bid to gain market share and mindshare from nVidia through sheer volume. AMD knew that Polaris would be a winner against the GTX 1060, and they had made sure to produce a good number of them.

Then disaster struck in the form of the crypto-mining craze. ATi's GCN architecture was a hybrid compute/gaming design with far superior hash rates compared to nVidia cards, especially in Ethereum, which was popular at the time. Polaris was also extremely power-efficient, and the cards were very nicely priced, so miners began snapping them up as fast as AMD could make them. This had a two-sided effect for AMD. On the one hand, they were making profit hand over fist, but they weren't clawing back any consumer market share, because the miners were grabbing all of the Polaris cards and ignoring the nVidia equivalent, the aforementioned GTX 1060, which compared to even an RX 470 sucked at Ethereum mining.

The ultimate result was the exact opposite of what AMD wanted. Sure, they made a crap-tonne of money, but gamers didn't get the exposure to the AMD name in GPUs that they were hoping for. In fact, the gaming market has never been as lopsided towards nVidia in their history, and that is because people who wanted mainstream gaming cards had no choice but to buy the GTX 1060: there were no Polaris cards to be had, and the prices on those that remained had been inflated by the extreme demand from Ethereum miners. If you were to take the GTX 1060 out of the Steam survey, nVidia wouldn't look so dominant.

The mining craze did serious damage to AMD despite the profits they made, because of the number of gamers who are now familiar with GeForce instead of Radeon. That matters more in the long term than a few dollars from miners.
 