Opinion: The shifting semiconductor sands

Bob O'Donnell

A hot potato: There was a time—and it wasn’t really that long ago—when if you asked anyone who tracked the chip business about Intel, they probably would have said they were invincible. After all, they owned 99% of the datacenter CPU market, close to 90% of the PC CPU market and even made ambitious acquisitions in other “alternative” architectures, such as FPGAs (Field Programmable Gate Arrays) and dedicated AI processors. On top of that, they had a decades-long history of impeccable execution and industry-leading innovations in the process of semiconductor manufacturing.

And then, last Thursday hit.

During what was otherwise a stellar second quarter earnings report, with impressive revenues and growth numbers across the board, the company acknowledged that their already delayed transition to 7nm process technology for future generation CPUs was delayed for another six months.

Now, arguably, that really shouldn’t be that big of a deal. After all, this is ridiculously complex technology. The company said they knew what the problem was and, therefore, had a clear path to fixing it. They also certainly wouldn’t be the first major tech company to face some technical challenges that caused delays in the release of eagerly awaited new products.

"There appears to be some loss of faith in Intel’s previously irreproachable reputation for delivering what they said, when they said they would do it"

But the market didn’t see it that way, and subsequently, Intel stock has lost nearly 20% of its value in the last week. To be fair, this is also a stock market that over the last few months has shown absolutely no sense of rationality, so you have to take any dramatic stock price moves in the current environment with a very large grain of salt.

Fundamentally, however, there appears to be some loss of faith in Intel’s previously irreproachable reputation for delivering what they said, when they said they would do it. While some view the most recent news, as well as the forthcoming and likely related departure of chief engineering officer Murthy Renduchintala, as the primary catalyst for this perspective, you could make the argument that the problem started earlier.

In the case of dedicated AI accelerators, for example, Intel made a large investment in Nervana and put Nervana’s main execs in charge of their dedicated AI investments back in 2016. Then, shortly after they released their first Nervana chips to customers, they essentially abandoned all that work to purchase Habana Labs for $2 billion late last year and moved in a different direction. Obviously, cutting edge technologies like AI accelerators can certainly shift quickly, and, in this case, Intel clearly recognized that they needed to make an aggressive move. However, it certainly raised some questions.

At the same time, there are also several other very interesting developments in the semiconductor market that appear to be driving some fundamental shifts in how people (and investors) are viewing it. One, of course, is a hugely reinvigorated AMD—a fact that’s been reflected in the company’s impressive growth and even more impressive stock price run over the last several years (as well as the nice boost it received last week as a result of Intel’s news).

"AMD has been shaking up and enlivening the previously static CPU market and that it will continue to do so for many years to come"

To their enormous credit, AMD’s CEO Lisa Su, CTO Mark Papermaster and team have done a remarkable job in turning a company that some felt was headed for extinction just a few years back, into a formidable competitor and an important force in the chip industry overall. You could argue (and many have) that, from a market valuation perspective, the company has received more credit than its sales numbers reflect. However, there’s no question that AMD has been shaking up and enlivening the previously static CPU market and that it will continue to do so for many years to come.

In addition, there’s been a great deal of momentum recently towards Arm-based CPUs in both datacenters and PCs. Apple’s recent announcement that it will switch from Intel to its own Arm-based CPU designs in future Macs, for example, highlights some of the high-level changes that are happening in the CPU market.

Despite all this bad news for Intel, it is important to keep everything in perspective. Intel is still by far the largest CPU manufacturer in the world and will be for some time to come. The company will certainly be facing a more competitive marketplace than it has had to worry about for a very long time, but it’s undoubtedly up to the task. Also, in the long run, good competition will inevitably be better for all of us.

As a long-time Intel follower who essentially learned most everything about the importance of process technology from Intel (they’ve done a fantastic job of educating analysts and press about these issues for a very long time), I have to admit that it’s somewhat shocking to see Intel in this state. At the same time, it’s also important to remember that not all numbers in the semiconductor process game are created equal. While it’s certainly up for debate, Intel has argued for years that its 7nm process is closer to what other vendors call 5nm.

"It is clear that Intel has slipped from its mantle of invincibility and will need to reprove itself to the industry and market at large"

Regardless of the numbers, however, it is clear that Intel has slipped from its mantle of invincibility and will need to reprove itself to the industry and market at large. The fact that the company has already discussed working with third-party foundries on advanced process nodes for some of its upcoming chips (including its widely anticipated new GPU) is a testament to that.

In the Intel of old, that decision would have probably been unthinkable. But we are in a new era, and despite these short-term concerns, it is encouraging to see Intel’s CEO Bob Swan willing to admit the challenges they have and take some aggressive actions to address them.

The sands beneath the semiconductor market are clearly shifting, and it’s going to be very interesting to see how things look over time.

Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter. This article was originally published on Tech.pinions.


 
I think the whole x86 architecture is slowly becoming redundant.

All the powerful calculations, for AI, data centers and general science are shifting toward nVidia-based systems, and other custom AI accelerators. And many specific tasks are taken by the quantum platform.

The rest of the market does not require powerful computing, and finds that ARM is more than sufficient for everyday work and gaming. It won't be long before we see a good gaming rig that uses ARM as the main platform, with nVidia video cards. Apple will be among the first companies to introduce such products next year, although they are just as likely to opt for AMD graphics yet again.
 
If Intel hadn’t suffered delays and problems with 10nm then things wouldn’t be what they are today. AMD would not have made much impact with Ryzen and we probably wouldn’t be paying more than $300 for a mainstream consumer grade CPU and it would have 6 cores max. Ok I don’t know about that last part about price and core count but suffice to say I couldn’t see Intel reaching $750 like AMD did for any of their mainstream consumer grade parts.

But Intel aren't down and out; their 14nm stuff is competing with AMD's 7nm stuff, which is impressive in its own right. They just need to get their foundries in order.

As for the enterprise space, where I work, I can't see anything but more growth for Intel. And this is where they make all their money. Ever noticed how some of Intel's server chips cost $20k or whatever each? That's because Intel have a shortage and prioritise enterprise customers. So they will still sell that part individually as a piece of silicon, but not without a heinous markup, to the point where hardly anyone will buy it, and if someone does then Intel still profits. But that gives you an idea of how much money Intel makes with their enterprise solutions.
 
If this had been the first time Intel encountered manufacturing problems after repeatedly claiming everything was fine, you‘d have a valid point. But this has repeated several times over the last few years....14 nm (did not matter back then because Intel was ahead of the competition back then), then with 10nm and rinse and repeat with 7nm. All the while TSMC executes flawlessly.

As for AMD, everyone was expecting Intel to come out with a new architecture that would crush it right away but that never materialized and it was instead more of the same, just tweaked.

Intel‘s problem is not just AMD and ARM (against which it failed miserably in the mobile market) but also nVidia (replacing Intel CPU with their GPU in datacenters) and also TSMC. This is why their old „incentive money“ strategy does not work this time.

So, for investors you had a company that was ahead in terms of technology, interfaces, process...that is now behind in all of these areas.

If Intel hadn't suffered delays and problems with 10nm then things wouldn't be what they are today. AMD would not have made much impact with Ryzen and we probably wouldn't be paying more than $300 for a mainstream consumer grade CPU and it would have 6 cores max. Ok I don't know about that last part about price and core count but suffice to say I couldn't see Intel reaching $750 like AMD did for any of their mainstream consumer grade parts.

That's not the right way to look at it imho. Intel had >4 core CPUs but they cost a pretty penny. A Core i9 was never a $300 CPU and you should compare them to Ryzen 9. A $750 Ryzen 9 3950X beat Intel's Skylake X CPUs costing up to what... $2,000?

Just look at the ten core i7-6950x that had a release price of $1700 four years ago.
 
If this had been the first time Intel encountered manufacturing problems after repeatedly claiming everything was fine, you‘d have a valid point. But this has repeated several times over the last few years....14 nm (did not matter back then because Intel was ahead of the competition back then), then with 10nm and rinse and repeat with 7nm. All the while TSMC executes flawlessly.

As for AMD, everyone was expecting Intel to come out with a new architecture that would crush it right away but that never materialized and it was instead more of the same, just tweaked.

Intel‘s problem is not just AMD and ARM (against which it failed miserably in the mobile market) but also nVidia (replacing Intel CPU with their GPU in datacenters) and also TSMC. This is why their old „incentive money“ strategy does not work this time.

So, for investors you had a company that was ahead in terms of technology, interfaces, process...that is now behind in all of these areas.



That's not the right way to look at it imho. Intel had >4 core CPUs but they cost a pretty penny. A Core i9 was never a $300 CPU and you should compare them to Ryzen 9. A $750 Ryzen 9 3950X beat Intel's Skylake X CPUs costing up to what... $2,000?
Core i9 is just a name for the top end of the stack. It goes in consumer grade boards and targets a consumer grade audience.

Whilst I don't think we would have seen a 10-core so soon, I don't think the 6- or 8-core parts were a million miles away when Ryzen launched. I mean, the 8700K launched just months after Ryzen, which means it was already years in development when Ryzen launched.

And yes, Skylake X, or rather Intel's "prosumer" set, was very expensive, but it's still considerably cheaper than AMD's TR lineup, which has just broken the record for the most expensive CPU on sale (TR 3990X for $4k). While you do get more performance per dollar on these parts, AMD are at the same time pushing market prices higher. I'm not condemning this, it's clever. AMD will make more money on each unit sold. I'm not one of those people who condemn a company for trying to make as much money as possible. Intel have also matched prices on consumer grade stuff, so they are getting involved as well.

But as someone who purchased the fastest consumer grade chip available in 2014 (i7 4790K) with an ROG mobo and fancy RAM for a total of £470, I'm now looking at spending more than double that to get the top-end Intel or AMD part, an ROG motherboard and fancy RAM.
 
TR 3990x is a 64 core / 128 thread CPU....if you had told anyone in 2016 that you could buy such a beast for $ 4000 they would most likely have laughed at your vivid imagination.

But you can look at the 24-core Threadripper 3960X (and the entire platform), launch price $1,400, and compare that to previous Intel HEDT CPUs and what they offered for the same money.

Funny that you should mention the i7-4790k. Just compare it to the i7-7700k released two and a half years later. See a big difference between the two ?
 
TR 3990x is a 64 core / 128 thread CPU....if you had told anyone in 2016 that you could buy such a beast for $ 4000 they would most likely have laughed at your vivid imagination.

But you can look at the 24C Threadripper 3960X (and the entire platform), launch price $ 1400 and compare that to previous Intel hedt CPU and what they offered for the same money.
Yes, I do agree that it offers more performance. But I would also say that if you think like that, then you'll never expect the price of tech to come down. If you had told someone in 2010 that they could get a 16-core chip for the price of Intel's 2016 prosumer lineup, they would also be quite shocked.

The prices have increased across the board since Ryzen came along. You get more for your money but you still pay more overall. I'm not saying Ryzen is bad, it's definitely the better buy. But now a midrange CPU is the cost of a flagship.

If Intel came out with a 256 core CPU next year for $8000 then you would surely praise them for making each core so cheap? Personally I’d rather they just made a 128 core CPU for the same money or less.
 
But as someone who purchased the fastest consumer grade chip available in 2014 (i7 4790K) with an ROG mobo and fancy RAM for a total of £470. I’m now looking at spending more than double that to get the top end Intel or AMD part, an ROG motherboard and fancy RAM.

This is the overall market's response to consumerism. Adding new higher tier parts on top of existing ones has been a strategy for decades now. What passed for high end consumer 5-6 years ago is now solidly mid-tier (though still notably faster), with much higher performance parts added on over the top to create the new top tier.

IMO lamenting that there are now top end parts with more than 4x the performance for ~2.2x the price (4790K vs. 3950X list prices) is wasted as that's just market economics. The real issue was that there was a market for those types of parts 5 years ago but Intel wouldn't make them at a decent market price and AMD couldn't make them at all. All that potential profit was left in the pockets of consumers. Now the market has finally caught up and can offer higher tier parts at affordable prices that clearly lots of people are willing to pay.
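For what it's worth, the back-of-envelope math behind that "4x the performance for ~2.2x the price" line is easy to sanity-check. A minimal sketch, assuming the widely reported launch list prices ($339 for the i7-4790K, $749 for the Ryzen 9 3950X) and taking the ~4x multi-threaded figure from the post above as a given rather than a measured benchmark:

```python
# Back-of-envelope price/performance check for the 4790K vs. 3950X comparison.
# MSRPs are the commonly cited launch list prices; the 4x multi-threaded
# performance figure is the assumption made above, not a benchmark result.

parts = {
    "Core i7-4790K (2014)": {"msrp_usd": 339, "relative_mt_perf": 1.0},
    "Ryzen 9 3950X (2019)": {"msrp_usd": 749, "relative_mt_perf": 4.0},  # assumed
}

old, new = parts["Core i7-4790K (2014)"], parts["Ryzen 9 3950X (2019)"]

price_ratio = new["msrp_usd"] / old["msrp_usd"]                  # ~2.21x
perf_ratio = new["relative_mt_perf"] / old["relative_mt_perf"]   # 4.0x (assumed)

print(f"Price ratio:       {price_ratio:.2f}x")
print(f"Performance ratio: {perf_ratio:.2f}x")
print(f"Perf per dollar:   {perf_ratio / price_ratio:.2f}x better")  # ~1.8x
```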
 
This is the overall market's response to consumerism. Adding new higher tier parts on top of existing ones has been a strategy for decades now. It seems you're lamenting that what passed for high end consumer 5-6 years ago is now solidly mid-tier (though still notably faster), with much higher performance parts added on over the top to create the new top tier.

IMO lamenting that there are now top end parts with more than 4x the performance for ~2.2x the price (4790K vs. 3950X list prices) is wasted as that's just market economics. The real issue was that there was a market for those types of parts 5 years ago but Intel wouldn't make them at a decent market price and AMD couldn't make them at all. All that potential profit was left in the pockets of consumers. Now the market has finally caught up and can offer higher tier parts at affordable prices that clearly lots of people are willing to pay.
I don't think it's a bad thing. And I understand the demand; in fact, I think Intel and AMD are still undervaluing the market. An example is the RTX 2080 Ti selling out immediately on launch; Nvidia could have easily sold that card for more than they did. It's annoying when it comes to reaching for the wallet, but more and more of us will begin to settle for midrange parts rather than the best of the best, or rather the enthusiast parts. All this being said, my other hobby is scuba diving and this enthusiast PC gaming stuff is all still considerably cheaper than that!

To me, there are some similarities between AMD CPUs and Nvidia GPUs: we were shocked when the 1080 Ti came in at $650, just like we were shocked when we saw the 3950X for $750. I believe we will see a >$1000 Ryzen CPU from AMD soon, maybe some kind of 24- or 32-core monster. The last time AMD dominated in the mainstream CPU space they began releasing >$1000 Athlon 64 FX parts.
 
If Intel came out with a 256 core CPU next year for $8000 then you would surely praise them for making each core so cheap? Personally I’d rather they just made a 128 core CPU for the same money or less.
You can still get quad-core CPUs. Take the Ryzen 3 3300X: it's faster than an i7-7700K, has much more advanced platform features and more CPU PCIe lanes, and its MSRP is $120 including an HSF. The 7700K was over $300. So see, you get the same for a lot less. And in the 3300X's case, you even have an upgrade path.

I can also buy a 2C4T Athlon 3000G for €47 (Mindfactory) and it probably compares very favorably to the core i3-7100.

I have to add that I got my 2700x new on sale for €150 including taxes, shipping and Borderlands 3. It was already last gen, but I don‘t think I would have been able to get the previous gen‘s top of the line CPU for that price a few years ago.
 
You can still get quad core CPU. Take Ryzen 3300x - it‘s faster than an i7-7700k, has much more advanced platform features and more CPU PCIe lanes and MSRP is $ 120 including an HSF. The 7700k was over $300. So see, you get the same for a lot less. And in the 3300x‘s case, you even have an upgrade path.

I can also buy a 2C4T Athlon 3000G for €47 (Mindfactory) and it probably compares very favorably to the core i3-7100.
I at no point said you don’t get more for less. You do get more for less. But they have sold the extra performance they have at a premium. The average spend per user has gone up.
 
All the while TSMC executes flawlessly.
Only recently though - you don't have to go too far back in their history to see that some of their older process nodes, especially those used by AMD for GPUs, were nothing special. Things only picked up for them with the 16FF node and its multiple versions - even their 10FF node was only heavily used by Apple. It really turned around for them with N7, probably because it wasn't aiming for the kind of density that Intel and Samsung targeted with their latest nodes.
 
If Intel hadn’t suffered delays and problems with 10nm then things wouldn’t be what they are today. AMD would not have made much impact with Ryzen and we probably wouldn’t be paying more than $300 for a mainstream consumer grade CPU and it would have 6 cores max. Ok I don’t know about that last part about price and core count but suffice to say I couldn’t see Intel reaching $750 like AMD did for any of their mainstream consumer grade parts.

But Intel aren’t down and out, their 14nm stuff is competing with AMDs 7nm stuff which is impressive in its own right. They just need to get their foundries in order.

As for the enterprise space, where I work. I can’t see anything but more growth for Intel. And this is where they make all their money. Ever noticed how some of Intel’s server chips cost $20k or whatever each? That’s because Intel have a shortage and prioritise enterprise customers. So they will still sell that part individually as a piece of silicon but not without a heinous markup to the point where no one will buy it and if someone does then Intel still profit. But that gives you an idea of how much money Intel makes with their enterprise solutions.
It seems that you are doing another cheap and lame commercial for Intel, which is quite embarrassing. If you believe what you wrote, go ahead; for the rest of us, who live in the real world, Intel is not competitive anymore and won't be for at least 2-3 years. In fact, Intel is so "competitive" that they had to farm out their upcoming GPU and some of their CPUs to TSMC. And if you believe that Intel is too big to fall, just take note that Samsung is becoming bigger, and so is TSMC.
 
I don’t think it’s a bad thing. And I understand the demand, in fact I think Intel and AMD are still undervaluing the market, an example is the RTX 2080ti selling out immediately on launch, Nvidia could have easily sold that card for more than they did.

Good point. Yeah, both AMD and Intel could probably sell for higher than they do now but they are still in competition with each other so that limits their top sell price somewhat. Nvidia has zero competition for the 2070 Super and higher so their top end part can still easily go for $1200.

It’s annoying when it comes to reaching for the wallet but more and more of us will begin to settle with midrange parts rather than the best of the best, or rather the enthusiast parts. This all being said, my other hobby is scuba diving and this enthusiast pc gaming stuff is all still considerably cheaper than that!

Heh, agreed as I've spent far more on astronomy equipment than computers.

To me, there are some similarities between AMD CPUs and Nvidia GPUs, we were shocked when the 1080ti came in at $650, just like we were shocked when we saw the 3950X for $750. I believe we will see a >$1000 Ryzen CPU from AMD soon, maybe like some kind of 24 or 32 threaded monster. The last time AMD dominated in the mainstream CPU space they begun releasing >$1000 Athlon64 FX parts.

I don't think we'll see 24-32 core parts from AMD in the consumer space soon as it's pretty clear that the 12c24t 3900X is actually a (slightly) better gamer than the 16c32t 3950X. That leaves the 3950X and any higher core parts exclusively for content creation and ego builds (ie: a small market).

Maybe the architecture of the Zen3 desktop CPUs will address that and we'll see those higher core count parts but I'll bet the increased core counts won't happen until at least Zen4 with a new socket and DDR5's bandwidth.

Now, if 16c32t Zen3 is significantly faster than Zen2 16c32t and games well? Yeah $1000.
 
But Intel aren’t down and out, their 14nm stuff is competing with AMDs 7nm stuff which is impressive in its own right. They just need to get their foundries in order.

I wouldn't call being a generation late, with 35% less performance than AMD's flagship 3950X and higher power consumption, competing. In gaming performance, yes, but if you are willing to cherry-pick a single category and call that competing, then Vega was in fact competitive too, due to its insane compute performance.
 
When AMD came out with the third generation of Ryzen chips, that were very nearly the equals of the current generation of Intel chips in per-core performance, but which offered more cores at the same price... an interview with Forrest Norrod of AMD revealed that those chips were designed to compete with Intel's anticipated next generation of chips - on 10nm, and with AVX-512 support. Had those Intel chips been ready then, while still competitive, AMD's Ryzen chips would not have been as clear-cut a choice; the situation would have been similar to that in the previous two generations of Ryzen.
But although Intel is poised to introduce its first 10nm desktop chips, they will come with a big-little style architecture that means they have to disable AVX-512 on the big cores (it's there on the die!), thus losing the one big advantage they have over AMD.
So AMD designed a chip to compete with an anticipated next-generation Intel part, ended up with one that is only roughly equivalent to Intel's existing generation (but they lucked out), and now Intel, to get more energy efficiency than 10nm alone will bring, tosses its big advantage over AMD overboard. Are those companies even trying to compete?
Obviously they are in some ways, since they're both improving their CPUs in many ways at a fast clip, but I keep having these facepalm moments.
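As an aside, if you ever want to check whether the chip you're actually running on exposes AVX-512, the quickest way on Linux is to look at the kernel-reported feature flags. A minimal sketch, assuming a Linux box (the /proc/cpuinfo path and flag names don't exist on Windows or macOS); the has_avx512f helper is just an illustrative name:

```python
# Minimal AVX-512 Foundation check on Linux: if the OS/firmware disables
# AVX-512 (or the cores simply lack it), the avx512f flag won't be listed.

def has_avx512f(cpuinfo_path: str = "/proc/cpuinfo") -> bool:
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                return "avx512f" in line.split()
    return False

if __name__ == "__main__":
    print("AVX-512F exposed:", has_avx512f())
```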
 
ShadowBoxer - the only reason you might pay more overall is the quality of the add-ons: 32GB of fast memory, an energy-efficient PSU with a great design, superior cases, a 1TB M.2 drive, motherboards, etc. The thing is, we should now be getting to a stage where a lot of this stuff can be recycled with new chips. I always spend NZ$2000-2500 on a new build (ignoring inflation). I used to buy Intel i5 chips, e.g. the 2500K and 3570K; the i9s were often over $1000 - why would you buy that junk? They needed more cooling for not much day-to-day difference, unless for work purposes. The last chip I bought is a 3700X, which is not the i5 equivalent (the 3600 is). The main increases are in motherboards and graphics cards, but you can see where some of that increase has gone: more and better stuff included. TBF, the bottom of the market always had some great options: get a case with a PSU included, add a $50 chip, a $60 motherboard, cheap memory and a cheap card, or a chip with an integrated GPU. It made a Windows OEM licence seem expensive.
 
If Intel hadn’t suffered delays and problems with 10nm then things wouldn’t be what they are today. AMD would not have made much impact with Ryzen and we probably wouldn’t be paying more than $300 for a mainstream consumer grade CPU and it would have 6 cores max. Ok I don’t know about that last part about price and core count but suffice to say I couldn’t see Intel reaching $750 like AMD did for any of their mainstream consumer grade parts.

But Intel aren’t down and out, their 14nm stuff is competing with AMDs 7nm stuff which is impressive in its own right. They just need to get their foundries in order.

As for the enterprise space, where I work. I can’t see anything but more growth for Intel. And this is where they make all their money. Ever noticed how some of Intel’s server chips cost $20k or whatever each? That’s because Intel have a shortage and prioritise enterprise customers. So they will still sell that part individually as a piece of silicon but not without a heinous markup to the point where no one will buy it and if someone does then Intel still profit. But that gives you an idea of how much money Intel makes with their enterprise solutions.
Ohhhhh!!! They just need to get their foundries in order. No kidding... They have been fixing their manufacturing processes for years and they have solved sh*t. Intel was the king of foundries; it was years ahead of anything else. When they started having problems, mainly with their 10nm process, there were no worries: it would take years for the competition to catch up. Guess what, now Intel is behind and has to play catch-up.
 
I think the whole x86 architecture is slowly becoming redundant.

All the powerful calculations, for AI, data centers and general science are shifting toward nVidia-based systems, and other custom AI accelerators. And many specific tasks are taken by the quantum platform.

The rest of the market does not not require powerful computing, and finds that ARM is more than sufficient for every day work and gaming. It won't be long before we see a good gaming rig that uses ARM as the main platform, with nVidia video cards. Apple will be among the first companies to introduce such products next year, although they are just as likely to opt for AMD graphics yet again.
It'll be a cold day in hell before Micros$it lets that happen, with its DirectX clawed deep into developers' APIs.
Not that I'm not supportive of ARM; believe me, I would love to have my whole library on Linux rather than on Windows 10.
 
I think the whole x86 architecture is slowly becoming redundant.

All the powerful calculations, for AI, data centers and general science are shifting toward nVidia-based systems, and other custom AI accelerators. And many specific tasks are taken by the quantum platform.

The rest of the market does not not require powerful computing, and finds that ARM is more than sufficient for every day work and gaming. It won't be long before we see a good gaming rig that uses ARM as the main platform, with nVidia video cards. Apple will be among the first companies to introduce such products next year, although they are just as likely to opt for AMD graphics yet again.
It'll be a long time before ARM is even remotely viable as a gaming PC platform, simply because the single-threaded performance is still very weak, and even if they manage to sort that out, there's an enormous legacy of x86 games, and emulating a different instruction set at any sort of reasonable speed is extremely difficult.
 
Let's hope history doesn't repeat itself, with AMD becoming complacent from lack of competition on their CPU front. They might have already lost some incentive with Intel's recent news of wheelspinning on 14nm for the next 12 months. Why spend danger money on development when there's no immediate threat from the competition, and the global economy slowing down?
 
Ohhhhh!!! They just need to get their foundries in order. No kidding... They have been fixing their manufacturing processes for years and they have solved sh*t. Intel was the king of foundries; it was years ahead of anything else. When they started having problems, mainly with their 10nm process, there were no worries: it would take years for the competition to catch up. Guess what, now Intel is behind and has to play catch-up.

This.

Looking back, Intel was sometimes two nodes ahead of their competition and this was a big advantage for them. Yes, the architecture was a plus, but being able to clock a chip higher, have it use less energy and produce more was a great plus.

Even Ryzen 2000 was still behind Intel node wise. How well Intel managed to optimize both their process and architecture on 14nm +++++++ is impressive but at some point even a super Beetle can no longer compete with more modern competitors.

Note: I actually think Intel is in a similar position to the one VW was in a long time ago. They optimized their Beetle a lot (just compare a 70s 1303 Super Beetle to a mid-60s standard Beetle) and introduced other air-cooled derivatives, but at some point they could no longer compete with that architecture, and the Super Beetle became too expensive compared to competing FWD water-cooled compacts.

Note: These competing compacts were not necessarily faster than a Beetle (and the Super had a modern suspension) in the 70s, but a lot more versatile with more room to grow / build on.
 
Let's hope history doesn't repeat itself, with AMD becoming complacent from lack of competition on their CPU front. They might have already lost some incentive with Intel's recent news of wheelspinning on 14nm for the next 12 months. Why spend danger money on development when there's no immediate threat from the competition, and the global economy slowing down?
Imho, complacency was not AMD's problem in the original Athlon days, but rather lack of access to large-volume sales channels (OEMs, large retailers).

Not selling enough = lack of resources to invest in R&D and fab modernization.
 
I like that Intel is under pressure, primarily in HEDT and enterprise though; gaming not so much. I'm sure they will be back, they have the money.

Being Fabless like AMD is paying off right now.

I love my 9900K at 5.2 GHz for high fps and high refresh rate gaming and emulation at home. I would not replace it with anything else on the market really, it is delivering top notch performance in these workloads.

Intel 14nm > GloFo 12nm
Intel 10nm = TSMC 7nm
Intel 7nm = TSMC 5nm
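Those rough equivalences line up with the peak transistor densities that have been reported publicly, though the exact numbers vary by source and by cell library. A quick sketch using approximate, publicly cited figures (treat them as ballpark assumptions, not spec-sheet values; Intel's 7nm density hadn't been published at the time, so it's left out):

```python
# Approximate peak logic densities in MTr/mm^2 (million transistors per square
# millimetre), as publicly reported around 2019-2020. Ballpark figures only:
# achievable density depends heavily on the cell library and the actual design.

density_mtr_mm2 = {
    "GloFo 12LP":  36.7,
    "Intel 14nm":  37.5,
    "TSMC N7":     91.2,
    "Intel 10nm": 100.8,
    "TSMC N5":    171.3,
}

baseline = density_mtr_mm2["Intel 14nm"]
for node, mtr in sorted(density_mtr_mm2.items(), key=lambda kv: kv[1]):
    print(f"{node:<11}{mtr:7.1f} MTr/mm^2  ({mtr / baseline:.2f}x Intel 14nm)")
```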

At work I just got a Thinkpad with AMD 4000 series tho, could not care less which brand my work PC uses and it's working fine. Mostly coding and doing remote desktop sessions, I'd probably be fine with an ARM chip here.

Not sure why some people hate AMD or Intel. Seriously. If Intel begins to suck in the coming years, AMD will simply do the exact same thing Intel has been doing for the past 10 years. It's business 101. AMD is not a charity organisation and neither is Intel; they both want to make money, but AMD is still trying to gain market share, so they are still somewhat trying to satisfy buyers.

Remember when Zen 3 was not supposed to work on 300 and 400 series boards? AMD listened. For now.
 