Opinion: Intel chip advancements show they're up for a competitive challenge

Bob O'Donnell

Why it matters: If you follow developments in the semiconductor industry, it would be easy to see how some people might think Intel is in serious trouble. After all, several signs have not been promising. However, just when you begin to think that Intel’s slipping, the company has seemingly returned to its roots in semiconductor manufacturing to deliver what looks to be very promising improvements in basic transistor technology that should translate into competitive products in a relatively short window.

First, the company announced strong second quarter earnings, but its stock tanked when it revealed yet another problem and delay for its 7nm manufacturing process. Second, and related to that, Intel recently lost its crown as the most valuable semiconductor company—a spot it held for decades—to Nvidia. In fact, as of the market’s closing on August 17, the hard-charging Nvidia surpassed a $300 billion market capitalization, versus a $208 billion capitalization for Intel.

Finally, from a CPU performance perspective, it’s been AMD (which crossed over the $100 billion market cap recently and briefly) that has been handily beating Intel nearly across the board on PC desktop, mobile, and even server benchmarks. For a company that has traditionally dominated in all these areas, this has been quite a change.

However, just when you begin to think that Intel’s slipping, the company can surprise you. And so it did, both at Architecture Day and at this week’s Hot Chips conference. The Architecture Day event held last week offered a comprehensive look at a number of innovations that Intel is working on across process technology, chip packaging, core chip architectures, high-speed chiplet-to-chiplet interconnect, security and much more.

Importantly, while some of these developments fall into the more theoretical realm, the vast majority are going to bring practical, real-world performance improvements that should translate into offerings that will be highly competitive with AMD’s latest.


Intel returned to its roots in semiconductor manufacturing to deliver what looks to be very promising improvements in basic transistor technology. Until just a year or two ago, Intel was able to use its long history of expertise in chip making to its clear advantage, bringing out regular improvements that other chip makers would often take 18-24 months to catch up to. As process delays and other manufacturing challenges started to pile up, however, it started to look like the company’s in-house manufacturing was becoming more of an albatross around its neck.

With the debut of 10nm SuperFin technology, the company is clearly reminding the rest of the industry that it has a number of different tricks up its sleeve when it comes to core semiconductor capabilities. More importantly, these advances can still be used to make significant progress in areas that companies who don’t make their own chips can’t get access to until their manufacturing partners develop them as well.

By leveraging a new metal capacitor technology in conjunction with the company’s enhanced FinFET transistors, Intel claims that the 10nm SuperFin technology enables it to drive higher currents, reduce resistance, and thereby significantly improve performance over traditional transistor designs. In practical terms, the first Intel part to use the new technology (a CPU codenamed Tiger Lake set to be unveiled in early September) is expected to have a performance boost that the company claims is the largest “intranode” (on the same process size) improvement it has ever had.

In fact, Intel says it would be “comparable to a full-node transition”, which is a pretty big statement—though it’s also meant to somewhat counter the continued delays in getting down to the 7nm node. Real-world performance remains to be seen, but initial commentary and analysis from those who thoroughly understand the details of chip architectures are promising.

At the same time, the company also recognizes that, as transistors get smaller and smaller and process technologies get harder and harder, it makes sense to separate, or “disaggregate” chip design advances from manufacturing improvements, so that the two elements can move forward on separate timelines. Combining this with the fact that Intel recently openly discussed the idea of using outside chip manufacturing plants, or foundries, to build some of its parts highlights a very practical, arguably even more mature approach to the situation. It also reflects a very different attitude for Intel—one that should help the company progress forward on advanced chip designs, regardless of where (or by whom) they’re ultimately built.

Intel also provided a great deal more information about its Xe GPU architecture, with a particular focus on how it will be able to scale from better integrated graphics—the first iteration will also be part of Tiger Lake—to several different types of discrete GPUs targeted at different markets. Its first standalone GPU, codenamed DG1, is expected later this year, as is a new server-focused GPU. Both of these chips, as well as the integrated GPU, will be based on what they’re calling the Xe-LP (low power) architecture. The company also unveiled a new Xe-HP (high performance) GPU architecture that’s destined for other discrete chips focused on more advanced datacenter applications, such as AI acceleration, as well as a gaming-optimized part that will support ray tracing.

Both at Hot Chips and at Architecture Day, the company also talked a great deal about its software efforts, particularly its OneAPI architecture, which is designed to make the process of programming for any type of non-CPU accelerator (from GPUs, to FPGAs to other AI-focused accelerators and beyond) much easier to do. This is clearly a monumental goal and a challenging task. However, the company seems to be making solid progress and highlighted how software enhancements alone can often lead to significant performance improvements.

The bottom line is that it’s clearly way too early to count Intel out. If nothing else, the significantly more competitive environment in which it finds itself seems to have inspired it to be doing some of its best work in years. Final details are still to come, but the good news is that the company is clearly up for competitive challenges, and regardless of what vendors you choose to support, we will all benefit in the end.

Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on Twitter.


 
I agree that it is too early to count Intel out. Given their resources, it would be rather shocking if they did not manage to catch up. We are talking about the big incumbent after all.

But thinking that Intel quickly becoming competitive again would be good for the market is, imho, a misconception.

Ideally, Intel should only become competitive again once it can no longer influence what OEMs do with competitor-based systems (e.g. how they market them, what options you get, which GPU they come with...), so that it has to compete on technical merit rather than on MDF (market development funds). For this to happen, Intel needs to stay behind for quite a while longer.

I think most people don't want to see a repeat of the original Athlon days, when Intel kept AMD out of markets long enough to catch up with the Core series and squash AMD.
 
They are way too big... to counter the inertia... Intel is done... they are not agile and it is going to kill them.
 
By the way, Bob Swan himself said that he is not looking for CPU dominance anymore; he's looking at the TAM, and this is a clear indication that Intel doesn't plan to take back its pseudo-monopoly in the CPU market. Zen 3 is going to be the death blow.
 
Intel's all words and paper products.

Meanwhile, they have been on 14nm, now 14nm++++++++++++++++++++++, since last decade and last time I checked, paper products' market value is $0.
 
They are way too big... to counter the inertia... Intel is done... they are not agile and it is going to kill them.

They may have corporate culture problems if I remember a previous Techspot(?) article, but their reputation and market influence/domination are both way too grand for them to face any severe problems. They're the biggest in this market, and even if they keep being slow or making mistakes it will stay that way for the coming decade, I reckon.
Many individuals and businesses switch brands very slowly. Only if the differences in quality between AMD and Intel products become big will Intel start to sink. And keeping up isn't that hard if you're as rich, resource-rich (chip plants and everything) and big as Intel.

I don't see how some tech enthusiasts got it into their heads that Intel is the underdog in some important way. They're not, and they won't be.
 
Though I agree with the basic sentiment of the article (Intel shouldn't be written off just yet), I couldn't shake that "I'm trying to explain my school report" feeling while reading it.

Anyway, it seems that Intel is finally, finally getting the hint and getting back to what should have been the focus for the last 5 years: improving their architecture (which hasn't really changed since the 7xxx series). The manufacturing process is a different story, and a smaller node only MAY bring better results (contrary to popular belief, it is far from guaranteed). Frankly, I was so fed up with them endlessly milking the existing microarchitecture and complaining about delays in the manufacturing process transition... so yeah, good, we may finally see some real developments, hopefully translating into tangible real-world benefits.

'bout goddam' time.. :)
 
Intel's improved fin technology will reduce electrical resistance. This means faster speeds will be possible even at 10 or 14nm. Having a solid 3D wire (printed up as well) with no flaws is required when they eventually go below 7nm. It's a smart way to stay in the game. Still, without lower prices they can bite me, or something similar but not so family friendly.
 
What Intel needs is innovative products, not marketing slides.

Sadly for them, marketing slides, buying benchmarks, and MDF worked too well for too long. You'd think they'd have learned their lesson after the "Atom for Mobile" fiasco.

I don't mean to be ironic or sarcastic, but being able to force their way by size and money has hurt them long term, as it made them complacent and arrogant.
 
From my point of view in my profession, Intel appears to be focusing on the future, which promises huge growth in the enterprise space. More and more cloud computing is needed, and Intel wants to be the company that provides the solution. They already sell a lot in this area, and I think their GPU programme is a huge part of it. I personally think that in the future most of our computing will be done in the cloud, and our personal devices will serve as mere portals to those services. Intel should be a player when that becomes a common reality; they missed out on the rise of mobile, and I don't think they want to miss out on the next big thing.

As for Intel’s chips themselves today. I really wouldn’t be surprised if Intel drop something competitive again soon. They clearly have a lot going on, the company is vast. Who knows what projects are in the works.
 
In the earlier generations of Ryzen, while AMD was competitive again, it was still behind Intel, because Intel was including a later generation of SSE or AVX on its chips that let it offer twice the floating-point muscle - a disparity that had also existed during the Bulldozer years.
The Ryzen 3000-series chips were, according to a statement by a highly-placed AMD executive, intended to compete with Intel's expected 10nm chips, and, at the time, it was believed those chips would bring AVX-512 to the consumer space. So AMD, although it again doubled vector width in that generation, was still resigned to being a generation behind.
Thus, I've been waiting all this time for the other shoe to drop. Now, Intel's 10nm desktop chips will have a big.LITTLE style architecture that will preclude them from offering AVX-512.
However, AVX-512 apparently does have thermal issues that mean that the decisions by both AMD and Intel to forego what appears to be a very valuable asset in their intense competition could well be much more reasonable than they had appeared to me.
So I've been spending the last year waiting for Intel to stomp on AMD, sending it back, not to the Bulldozer-era wilderness, but to the situation that existed in the earlier Ryzen generations. It hasn't happened. That doesn't mean, though, that Intel won't get back in the game, and back in the game in a big way.
But while AMD is going to be in for a fight, so is Intel.
 
They are way too big... to counter the inertia... Intel is done... they are not agile and it is going to kill them.

Considering how many markets Intel is in and how much revenue they make per quarter, I highly doubt that.

Intel can afford to let AMD take the x86 crown for a while and still pull in record profits.

The market is much larger than the x86 desktop space.
 
By the way, Bob Swan himself said that he is not looking for CPU dominance anymore; he's looking at the TAM, and this is a clear indication that Intel doesn't plan to take back its pseudo-monopoly in the CPU market. Zen 3 is going to be the death blow.
It does sound like a white flag.
 
I recently upgraded from a 4790K to a Ryzen 7 3700X, and the only things that make a difference are the doubled core count and a bit of an IPC increase: the difference between 486 points on the Intel side (modest OC at 4.6 GHz) and 510 on the AMD side (OC at 4.35 GHz).
And that's what, about a 5% difference between a CPU that's ~6 years old and an AMD part that's 1 year old?
I'm not throwing rocks at anybody, and sure, that's a nice boost AMD can't complain about, but I'd appreciate it if "fanbois" weren't misleading other normal people about how big the difference is.
What I find cool about AMD, for now at least, is that they seem to favour parallelization, as I've seen some workloads distribute better among the CPU cores, but this comparison is a bit outdated because I don't own a 10th-gen chip to make a proper comparison.

The other thing now is power consumption: AMD is using 7nm lithography, which allowed them to cram more cores into the CPU die because those cores produce less heat. That's why AMD is a bit ahead, so please stop the drama and all that crap.
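For what it's worth, the gap described above can be checked with quick arithmetic. This sketch takes the post's own scores and clocks at face value (they are not independently verified); normalizing per GHz shows the clock-adjusted difference is noticeably larger than the raw ~5%:

```python
# Back-of-the-envelope comparison using the scores quoted in the post.
# All figures (scores and clocks) are the poster's own, not verified.
intel_score, intel_ghz = 486, 4.6    # 4790K, modest OC (as quoted)
amd_score, amd_ghz = 510, 4.35       # Ryzen 7 3700X, OC (as quoted)

# Raw score difference -- the ~5% the post mentions
raw_gain = (amd_score - intel_score) / intel_score * 100

# Per-GHz ("IPC-style") difference, normalizing for the clock gap
intel_per_ghz = intel_score / intel_ghz
amd_per_ghz = amd_score / amd_ghz
per_ghz_gain = (amd_per_ghz - intel_per_ghz) / intel_per_ghz * 100

print(f"raw score gain: {raw_gain:.1f}%")      # ~4.9%
print(f"per-GHz gain:   {per_ghz_gain:.1f}%")  # ~11.0%
```

So even on the post's own numbers, the per-clock gap is roughly double the headline 5% once the AMD chip's lower clock is accounted for.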
 
I recently upgraded from a 4790K to a Ryzen 7 3700X, and the only things that make a difference are the doubled core count and a bit of an IPC increase: the difference between 486 points on the Intel side (modest OC at 4.6 GHz) and 510 on the AMD side (OC at 4.35 GHz).
And that's what, about a 5% difference between a CPU that's ~6 years old and an AMD part that's 1 year old?
I'm not throwing rocks at anybody, and sure, that's a nice boost AMD can't complain about, but I'd appreciate it if "fanbois" weren't misleading other normal people about how big the difference is.
What I find cool about AMD, for now at least, is that they seem to favour parallelization, as I've seen some workloads distribute better among the CPU cores, but this comparison is a bit outdated because I don't own a 10th-gen chip to make a proper comparison.

The other thing now is power consumption: AMD is using 7nm lithography, which allowed them to cram more cores into the CPU die because those cores produce less heat. That's why AMD is a bit ahead, so please stop the drama and all that crap.
Firstly, Intel didn't advance much further in terms of IPC after Haswell, either. Small gains, yes. They only made small improvements and found ways to cram in more cores. They have now mastered the 14nm process after so many years of working on the same node.

And secondly, AMD chips are optimized for multi-core workloads. Comparing single-core IPC is one side of the story (still a valid point, but it does not represent all aspects).

In my opinion, Intel chips are still surprisingly competitive when we factor in the old process tech.
 
Firstly, Intel didn't advance much further in terms of IPC after Haswell, either. Small gains, yes. They only made small improvements and found ways to cram in more cores. They have now mastered the 14nm process after so many years of working on the same node.

And secondly, AMD chips are optimized for multi-core workloads. Comparing single-core IPC is one side of the story (still a valid point, but it does not represent all aspects).

In my opinion, Intel chips are still surprisingly competitive when we factor in the old process tech.

@Alex105's data is false.

A 4790K fetches 411 Cinebench R20 pts at 4.4 GHz single-threaded, and my own 4770K fetches 403 pts in the same benchmark, single-threaded, at 4.33 GHz.

There is no way in hell that a 4790K fetches 486 pts in Cinebench R20 single-threaded at 4.6 GHz. My best educated guess would be around 420 pts, while a Ryzen 5 3600XT, as you can see, fetches 511.

See for yourself:

https://www.cpu-monkey.com/en/cpu_benchmark-cinebench_r20_single_core-9
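As a rough sanity check on that estimate, the quoted 411 pts @ 4.4 GHz result can be scaled linearly with clock speed. Linear scaling is an optimistic assumption (single-thread scores grow sub-linearly with frequency because memory latency doesn't improve with core clock), so it gives an upper bound:

```python
# Estimate a 4790K single-thread Cinebench R20 score at 4.6 GHz by
# scaling the quoted 411 pts @ 4.4 GHz result linearly with clock.
# This is optimistic: real scaling is sub-linear, so the actual
# score would land somewhat lower than this estimate.
base_score, base_ghz = 411, 4.4   # figure quoted in the post
target_ghz = 4.6

estimate = base_score * target_ghz / base_ghz
print(f"linear-scaling estimate: {estimate:.0f} pts")  # ~430 pts
```

Even this optimistic upper bound lands around 430 pts, well short of the claimed 486, which is the crux of the objection.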
 
The thing now is the power consumption, the fact is that AMD are using a 7nm lithography that allowed them to cram more cores into the CPU die because they produce less heat. That's why AMD are a bit ahead, so please stop the drama and all that crap.
7nm lithography helps, yes, but that is not why AMD could cram more cores into their dies (just think of the first Ryzen generation: it was produced on 14nm too, yet a 1700 or 1800X had twice the cores of the then-king 7700K). It comes down more to the architecture itself (the chiplet design, Infinity Fabric, a different bus approach, different pipeline structure and branch prediction, that sort of thing). Cramming more cores into the same area is actually very much a double-edged sword: the actual silicon used is smaller, but so is the area available to transfer heat, so cooling can be a challenge, and heat is the enemy of efficient operation. Long gone are the days when moving to a smaller node automatically meant significant gains: these days it is just one factor, of many, to play with (just look at the GPU market, where Turing (Nvidia 20xx) is still produced on 12nm and is more energy efficient than the 7nm Navi (5xxx) GPUs from AMD).
 
Given Intel's 3 to 5 percent improvements per generation for the last 10-plus years, it's about time they made an actual effort to justify their pricing. Having said that, I'm pleased to see them fighting back, so that regardless of which CPU I choose I get a decent uplift in performance.
 
The proof, as they say, is in the pudding. I agree that it's too early to count Intel out, but the reasons for that are almost wholly unrelated to any marketing slides or presentations they put out. Let's not spend too much time gushing over the advancements they (or anyone else) make until we can see the final product.
 
I agree that Intel cannot be counted out...just yet.
Their last 14nm part was a good product. Just not the best.
OTOH, their 10nm Ice Lake uarch is underwhelming, to say the least, compared to AMD's.
The next year or two will tell if the marketing has been BS or not.

AMD is being hampered by the lack of mobile chip output from TSMC and the reluctance of third-party manufacturers to make the halo mobile products the chips deserve... otherwise the bloodbath could be much worse for Intel. As it is, AMD is gaining all-important mindshare across the whole spectrum.
 
I agree that we will not see Intel bow out as a result of their missteps. They have a good record of making solid processors, and cutting-edge fabs compared to competitors. However, I feel Intel won't get to return to those relaxing days when there was no strong competition, which I feel will be better for consumers. For too long, Intel restricted consumers' access to more cores at an affordable price; I recall a 6-core processor from Intel carrying a steep premium over the 4-core parts back in those days.
 