Opinion: Intel spices up PC market with Arc GPU launch

Bob O'Donnell

Highly anticipated: The tech industry and tech products are excellent examples of the principle that competition is good for consumers, with vigorous rivalries almost always leading to important new capabilities, lower prices, and just better products. With that thought in mind, it will be interesting to see exactly how Intel's return to the discrete GPU market will impact PCs, as well as the progress and evolution of both AMD and Nvidia's next offerings.

Intel officially announced today the Arc A series, which is targeted at laptops. A version of Arc for desktops has been promised for the second quarter, while a workstation version is due by the third quarter of the year. Add to that the preview Intel gave last year of its impressive-looking GPU architecture (codenamed Ponte Vecchio) for datacenter, HPC (high-performance computing), and supercomputer applications, and it's clear that the company is getting serious about graphics and GPU acceleration.

To be sure, Intel has made previous (unsuccessful) runs at building its own high-quality graphics engines. However, this time the environment and the company's technology are both very different. First, while gaming remains an absolutely critical application for GPUs, it is no longer the only one. Content creation applications, such as video editing, that are very dependent on things like media encoding and decoding, have become mainstream requirements.

This is especially true given the influence of YouTube, Twitch, and social media. Similarly, the use of multiple high-resolution displays has also become significantly more common. Most importantly, we're starting to see the rise of AI-powered applications on PCs, and virtually all of them are leveraging GPUs to accelerate their performance.

From photo and video editing to game physics and rendering, the rise of PC native AI-enhanced applications puts Intel's Arc in a very different competitive light given how much focus the company placed on accelerated AI features in its new chip.

Real success in the market will only be achieved if Arc delivers a great gaming experience. However, because of the many other capabilities and applications that a modern GPU architecture can enable, I don't believe it's essential that Intel start out with the best gaming experience.

Realistically, that would be extremely difficult for Intel to pull off at once, given the huge amount of time and investment that AMD and Nvidia have put into their own architectures over the last few decades. But as long as the Arc GPUs keep things close on the gaming side, I believe the market will be interested in hearing what else they have to offer. Plus, Intel is already starting to talk about its third generation of these chips -- codenamed "Celestial" -- and hinting that they could start to compete for the GPU gaming crown, clearly suggesting the company is in it for the long haul regardless.

Technologically, Intel is bringing several interesting offerings to this first generation of Arc GPUs (codenamed "Alchemist"). The underlying Xe HPG core architecture features hardware accelerated ray tracing support, up to eight render slices, sixteen 256-bit vector engines and sixteen 1,024-bit matrix engines. Taken together, in the entry-level Arc 3 series chips, Intel claims these features translate into 2x gaming performance over the company's Iris Xe integrated graphics and 2.4x raw performance for creative applications.

For AI acceleration, Intel's newly architected XMX matrix engine supports the acceleration of a host of different word sizes and types (from INT2 through BF16) and offers up to 16x the number of operations per clock for INT8 inferencing. One key application that benefits from this is Xe Super Sampling (XeSS), an AI-powered game resolution upscaling technology that's conceptually similar to AMD's FSR and Nvidia's DLSS.
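The basic idea behind INT8 inferencing can be sketched in plain Python: quantize float32 weights and activations to 8-bit integers, do the multiply-accumulate in integer arithmetic, then rescale. This is a generic, minimal illustration of the technique only, not Intel's XMX implementation; the function and values here are hypothetical.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor quantization of float32 values to int8."""
    scale = np.abs(x).max() / 127.0          # map largest magnitude to 127
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

# A toy "inference" step: y = W @ x, computed with int8 inputs.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8)).astype(np.float32)
x = rng.standard_normal(8).astype(np.float32)

qW, sW = quantize_int8(W)
qx, sx = quantize_int8(x)

# Accumulate in int32 (as matrix engines typically do), then rescale.
y_int8 = (qW.astype(np.int32) @ qx.astype(np.int32)) * (sW * sx)
y_fp32 = W @ x

# The int8 result closely approximates the float32 one.
print(np.max(np.abs(y_int8 - y_fp32)))
```

The speedup on real hardware comes from packing many narrow integer operations into each wide matrix-engine instruction; the arithmetic itself is as simple as the sketch above.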

One of Arc's unique features is that it's the first to support hardware encoding/decoding of the AV1 codec. It also features a new smoothing technology Intel calls Smooth Sync to reduce screen tearing when the frame rate output by the GPU doesn't match the laptop display's refresh rate. On the display side, Arc's Xe display engine supports up to four separate 4K 120 Hz monitors over either HDMI 2.0b or DisplayPort 1.4a connections.

Intel has also put a great deal of thought into integration with its own 12th-gen Alder Lake CPUs, specifically through a set of technologies the company calls Deep Link. Dynamic Power Share allows the system to automatically shift power between the CPU and GPU to optimize performance for different types of workloads -- a capability similar to AMD's SmartShift. Hyper Encode and Hyper Compute allow the simultaneous shared use of both the discrete GPU and the CPU's integrated GPU to accelerate either encoding or other types of computing applications, ensuring that no silicon goes to waste.
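The concept behind shifting power between CPU and GPU can be illustrated with a toy budget-splitting function. This is a hypothetical sketch of the general idea only, not Intel's actual Dynamic Power Share algorithm; the function name, utilization inputs, and wattage figures are all invented for illustration.

```python
def split_power_budget(total_watts, cpu_utilization, gpu_utilization,
                       floor_watts=10.0):
    """Divide a shared package power budget proportionally to demand,
    with a per-unit floor so neither unit is starved entirely."""
    demand = cpu_utilization + gpu_utilization
    if demand == 0:
        return total_watts / 2, total_watts / 2
    sharable = total_watts - 2 * floor_watts
    cpu_w = floor_watts + sharable * (cpu_utilization / demand)
    gpu_w = floor_watts + sharable * (gpu_utilization / demand)
    return cpu_w, gpu_w

# A GPU-bound game: most of the shared budget flows to the GPU.
cpu_w, gpu_w = split_power_budget(60.0, cpu_utilization=0.3,
                                  gpu_utilization=0.9)
print(f"CPU: {cpu_w:.1f} W, GPU: {gpu_w:.1f} W")  # → CPU: 20.0 W, GPU: 40.0 W
```

The real mechanism works at the firmware/driver level with telemetry far richer than a single utilization number, but the principle is the same: the total budget is fixed, and the split follows the bottleneck.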

Intel will be offering Arc chips through a wide variety of partners, with Arc 3-equipped laptops expected to start at $899. Both Arc 5 and Arc 7 notebooks will be coming out later this summer. The company's initial launch partner is Samsung with the Galaxy Book2 Pro, which will include the Arc A370M alongside a 12th-gen Core i7 CPU. While Samsung may not be an obvious first choice, the company has been aggressively improving its PC product line over the last few years, and Intel clearly sees it as an important player in the PC market whose presence it wants to help grow.

The return of Intel to the discrete GPU market is bound to generate a great deal of attention and scrutiny. Initial gaming benchmarks against discrete mobile offerings from AMD and Nvidia will face tough competition, and it will be interesting to see how Arc-based laptops compare against M1-based Macs in creative applications.

However, if Intel can pull together a reasonable story across all these different areas -- and highlight the AI-focused advantages it offers -- then Arc could be off to a good start. Regardless, there's no doubt it will spice up competition in the GPU market, and that's something from which we'll all benefit.


 
This doesn't seem to take into account the very real, even likely, possibility of atrocious driver and software support at launch: they're almost guaranteed to botch it the first time around, imo. But even being far more generous, it should still carry a "DO NOT GET THIS AT LAUNCH" warning that's far louder than the usual warnings for any GPU launch.
 
This doesn't seem to take into account the very real, even likely, possibility of atrocious driver and software support at launch: they're almost guaranteed to botch it the first time around, imo. But even being far more generous, it should still carry a "DO NOT GET THIS AT LAUNCH" warning that's far louder than the usual warnings for any GPU launch.
Possibly. But the bar for software support is low when compared to Radeon. Intel have far more resources than AMD for this so it wouldn’t surprise me if Arc ends up with better drivers than Radeon cards get.
 
Possibly. But the bar for software support is low when compared to Radeon. Intel have far more resources than AMD for this so it wouldn’t surprise me if Arc ends up with better drivers than Radeon cards get.

Seeing as how AMD helped a lot with Arc, I wouldn't hold out much hope :/
 
This doesn't seem to take into account the very real, even likely, possibility of atrocious driver and software support at launch: they're almost guaranteed to botch it the first time around, imo. But even being far more generous, it should still carry a "DO NOT GET THIS AT LAUNCH" warning that's far louder than the usual warnings for any GPU launch.

I would be pleasantly surprised if Intel hit the ground running on quality & performance, but you never know; they're not entirely new to this. Remember, they've been writing drivers for their onboard GPUs for decades. Even if there are hiccups, Intel are in a good position to challenge the market incumbents on price & volume. If they've established a design, manufacturing, and distribution pipeline, it's a very good start.

GPU prices are skydiving atm, I doubt it's any coincidence that Intel is entering the market.
 
I am confused - is this an official press release ?

Regardless of whether it is, I will always dismiss anything from Intel and wait for reviews, because they have lied about the numbers over and over and never bothered to show any remorse.
 
Well written.
Some people will hate it because it's not negative enough. Those people aren't techies.

My final judgement will come after the reviews. Intel has a giant software team and I think they can pull this off. I agree, they def don't need to beat AMD and Nvidia in performance right out of the gate. Even AMD took a couple years off following Polaris, so to knock Intel for "coming up short" wouldn't be wise.
 
This doesn't seem to take into account the very real, even likely, possibility of atrocious driver and software support at launch: they're almost guaranteed to botch it the first time around, imo. But even being far more generous, it should still carry a "DO NOT GET THIS AT LAUNCH" warning that's far louder than the usual warnings for any GPU launch.
You talk like NV and AMD weren't atrocious.
 
Thank you Intel for piling on in terms of the TSMC shortage by refusing to use your own foundries to make GPUs.

Thank you Intel for keeping AMD and Nvidia prices inflated by refusing to compete in the serious discrete space for decades.

Sure is swell that AMD already competes directly against PC gamers by starving itself of wafer that could be used to make PC gaming GPUs (which instead go to the console parasitism) — and has a serious incentive to sandbag in the PC gaming GPU space to enable Nvidia to set price points higher and higher. Sure is swell that Intel is helping AMD by further reducing the number of TSMC wafers going toward PC gaming GPUs.

Thanks Intel!
 
Good. We have been in need of a good selection and a competitive market for a long time now.
We will see a shift from a near monopoly to something close to a true duopoly.

What we won't see is regulation adequate to enhance competition enough to give gamers the value the market could offer them.

Counting AMD's contribution to the PC gaming GPU market is hardly a matter of the simple whole number 1. The company competes directly against PC gaming via the 'console' parasitism. That means PCs in disguise with redundant, incompatible walled software gardens. It means AMD allocates wafers to 'consoles' at the expense of PC gamers, enabling Nvidia to drive prices higher (which AMD follows). It gives AMD a strong incentive to sandbag, such as offering only low-to-midrange products like Polaris with any real market justification for gamers, and overpriced, substandard cards above that. The release of competitive parts, for the first time in a very, very long time, conveniently coincided with the shortage. Corporations tend to be very keen on planning to maximize profit.

Intel likes fat margins and hasn't felt PC gaming is worth bothering with for decades. It's using TSMC supply to hinder AMD's ability to compete, driving prices higher. Had Intel's goal been to provide decent value and relief for gamers, it would have used its own foundries to produce GPUs that would have been available quite some time ago. Letting Nvidia set the prices, though, enables Intel to extract maximum revenue from its TSMC-made parts by squeezing AMD. If, for instance, every GPU Intel produces 'for gamers' could have been an AMD PC gaming GPU, there may not be much of a price reduction for PC gaming buyers. If Intel's squeeze threatens AMD's margins too much, it might put more of its wafers toward producing 'console' GPUs instead. If Intel manages to get enough of TSMC's allocation, AMD will be forced to meet the 'console' companies' minimum contractual requirements — reducing the number of PC gaming chips created regardless of the current goals of AMD's execs.

What PC gaming needs, desperately, is a GPU-producing corporation that somehow will have its mission ensured to be about delivering value to PC gamers first, not some distant priority that comes and goes or which is mostly empty words. What PC gaming needs less are companies like AMD that directly compete against PC gaming. I don't know how a corporation can be structured to ensure it remains tied to its founding goal, though. Money chasing seems to be all that counts, enabling companies to morph into whatever the money masters of the moment choose. I doubt the non-profit system is well-crafted enough to handle a GPU company that will be able to compete adequately. If that were the case, it would threaten the hegemony of standard amoral business — where 'money talks' in rabid gibberish.

Perhaps the solution is simply to reverse the long trend of government-favored consolidation. How to get that, though, when regulatory capture is omnipresent?
 
We will see a shift from a near monopoly to something close to a true duopoly.

What we won't see is regulation adequate to enhance competition enough to give gamers the value the market could offer them.
They are killing the thing that feeds them. People who have seen these ridiculous prices for a long time are now realizing it is simply pointless to be in PC gaming.
Maybe they think that with high demand from miners they don't need their old customers, PC gamers. I don't wish for GPU makers to be right on this.
 
At Wccftech there are already pictures of the desktop model. Those too have only HDMI 2.0b. The driver suite looks nicely organized, offering overclocking features, and the card is sleek, but it seems a little on the small side to cool a load of over 200 W. Things don't look too dandy at this point, but the Deep Link features are certainly something to expect great performance benefits from.
By the way, it is still possible that custom cards will support HDMI 2.1 by using a dedicated "DisplayPort to HDMI converter chip," as Hardware Unboxed states in their video about the laptop models.
 
Possibly. But the bar for software support is low when compared to Radeon. Intel have far more resources than AMD for this so it wouldn’t surprise me if Arc ends up with better drivers than Radeon cards get.
Open your own stand-up comedy club, pls.
Intel has been making GPU drivers for about 15(?) years and they are still just a piece of ...manure.
 
I would be pleasantly surprised if Intel hit the ground running on quality & performance, but you never know; they're not entirely new to this. Remember, they've been writing drivers for their onboard GPUs for decades. Even if there are hiccups, Intel are in a good position to challenge the market incumbents on price & volume. If they've established a design, manufacturing, and distribution pipeline, it's a very good start.

GPU prices are skydiving atm, I doubt it's any coincidence that Intel is entering the market.
You talk as if Intel's previous decades of GPU drivers in any way make us more confident in their product. It's quite the opposite.
 
They are killing the thing that feeds them. People who have seen these ridiculous prices for a long time are now realizing it is simply pointless to be in PC gaming.
Maybe they think that with high demand from miners they don't need their old customers, PC gamers. I don't wish for GPU makers to be right on this.
Swing and a miss there bud.

"In October 2019, RTX 2000 series components made up 4.77% of Steam’s video card category, with the Turing products launching in September 2018. However, looking at this year’s results, RTX 3000 GPUs account for 5.56%, a 0.79% lead over the previous generation within the same timeframe."

https://www.pcgamesn.com/steam/hard...RTX 2000,generation within the same timeframe.

What nvidia and AMD ACTUALLY found was that gamers, being the eternal consoomer, were willing to pay much higher prices for their product.
 
Possibly. But the bar for software support is low when compared to Radeon. Intel have far more resources than AMD for this so it wouldn’t surprise me if Arc ends up with better drivers than Radeon cards get.

While AMD has vastly improved their drivers, yes, they're still not close to where they should be. And while better than AMD's, Nvidia's aren't where they should be either, IMHO. I have a tendency to go Nvidia but have also used many AMD cards, and neither has ever given me totally trouble-free drivers. I've been using computers for gaming since the Win 95 days, so yeah... I've seen quite a lot of them over the years.

All this is to say that writing GPU drivers has to be problematic, since both long-established developers still seem to have more misses than hits. So while Intel has a lot of resources to throw at the problem, that doesn't by any means make it a certainty that they'll have no issues. Fact is, remembering dealing with cards like the Rage Pro and TNTs and all their driver problems, I think it's safe to say that Intel has a better chance of missing the mark for the first few generations than not.
 
Intel has more resources and deeper pockets to invest in a team that produces drivers, but that does not mean that they will do that, or that they even have the experience to do it. I've run into issues with their UHD graphics drivers in the past, and driver issues with their hot-selling WiFi cards are also not uncommon. Assuming they poached a few experienced folks to refine their software for GPUs, optimisation in games is still not something you can achieve overnight or in the near term. So without jumping to conclusions, I think we can wait and see how this performs in independent reviews. I don't have high hopes, and I feel Intel's launch timing is also quite off, given that GPU prices and availability are improving. And Q2 is very close to the launch date of newer AMD and Nvidia GPUs. If sentiment turns negative due to poor review results, then this is going to flop really hard unless they offer it at a price that is extremely competitive.
 