Intel Arc A380 GPU disappoints in early benchmarks, but there's still hope

nanoguy

The big picture: Intel recently confirmed that its desktop Arc GPUs are going to break cover in Q2 2022, and also promised to deliver millions of them to gamers later this year. Early OpenCL benchmarks don't seem that promising, but there's still hope that Intel's discrete GPUs could win some gamers' hearts in what is otherwise a heated market, where prices and availability are the biggest factors in making a purchasing decision.

Intel’s Arc (Alchemist) GPUs are still shrouded in mystery, and the expectations around them have been tempered by the ongoing chip shortage and rumors of slow progress on driver development. The company says we’ll see the first models in laptops starting next month, but outside of a few odd leaks and the occasional small teaser from Raja Koduri, details about the upcoming GPUs have been scarce.

Thanks to a review done by SiSoftware on pre-release hardware, we now have a slightly better idea of what to expect, at least when it comes to the entry-level "A380" graphics card in the Alchemist lineup. Spec-wise, the A380 apparently has 128 execution units for a total of 1,024 shaders, paired with six gigabytes of GDDR6 operating at 14 Gbps. Additionally, there are 16 Xe matrix units (XMX) that can be used for Intel's XeSS upscaling solution.

It’s also worth noting that SiSoftware only used OpenCL benchmarks in its testing, so the results may not be indicative of gaming performance. Despite having a 96-bit memory bus, Intel’s A380 card was slightly slower than AMD’s Radeon RX 6500 XT, which has an even more modest 64-bit memory bus. It was also significantly slower than Nvidia’s entry-level RTX 3050 and GTX 1660 Ti graphics cards.
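The bus-width comparison is easier to read as raw bandwidth. Here's a quick back-of-envelope sketch using the figures above; note that the RX 6500 XT's 18 Gbps GDDR6 data rate is our assumption, not something from the SiSoftware review:

```python
# Peak memory bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Arc A380 as reported: 96-bit bus, 14 Gbps GDDR6
a380 = bandwidth_gbps(96, 14.0)       # 168.0 GB/s

# RX 6500 XT: 64-bit bus, 18 Gbps GDDR6 (data rate assumed)
rx6500xt = bandwidth_gbps(64, 18.0)   # 144.0 GB/s

print(f"A380: {a380:.0f} GB/s, RX 6500 XT: {rx6500xt:.0f} GB/s")
```

Under those assumptions the A380 would actually have the bandwidth advantage, which makes its lower OpenCL scores look like a compute or driver limitation rather than a memory one.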

These results don’t look good for Intel’s much-awaited discrete GPUs, save for the power draw, which is rated at 75 watts — significantly lower than the 107-watt TGP of the RX 6500 XT and the 120-watt TGP of the GTX 1660 Ti. It’s possible Intel will market the A380 for small form-factor PCs, where low heat and quiet operation might offset the modest performance.

That said, the “A500” series and “A700” series GPUs are expected to pack a bigger punch, and SiSoftware has yet to test these. The A700 series will have 512 execution units paired with 16 gigabytes of GDDR6 over a 256-bit bus, and leaked benchmarks so far suggest it could perform close to a GeForce RTX 3070 Ti. The mid-range A500 series is expected to be comparable to an RTX 3060 or even an RTX 3060 Ti.

Intel says its first Arc desktop graphics cards will land sometime in Q2 this year, so it won’t be long before we can see them in action. Behind the scenes, the company is also poaching AMD veterans to work on its ambition of becoming a proper third player in the discrete GPU space.

Masthead credit: Moore's Law is Dead

 
I mean, sure, they could always win on price alone. Honestly, right now I wouldn't be too picky: if they can deliver at least 3050 levels of performance and decent driver stability (and that would need to be FAR BETTER than what we've seen from Intel HD and Iris Pro graphics in the past, though I'm usually assured this will be the case even if I'm skeptical), I really don't care if the power draw of the Arc A380 is double that of the 3050.

All they have to do is beat Nvidia and AMD on price and availability, even if their first cards, which were supposed to compete at a much higher tier, can't do so.

Historically, Intel isn't exactly known for bargain deals, but they priced the 12100 and 12400 Alder Lake chips reasonably, so maybe they could lower their prices to get a strong start while their tech lags behind.
 
Here's what I bet will happen:

The performance will be competitive, but drivers will be horrendously, horribly bad. There will be a stretch where the cards sit on shelves because the majority simply aren't aware any company other than Nvidia makes GPUs at all. Seriously, ask the "normies" about GPUs and they'll only say Nvidia: they don't even know AMD exists. It'll be extra hard for Intel, because you'd have to be paying attention to know that Intel is trying to make dGPUs AND you'd have to be open to the idea that they won't be just like its very poor integrated GPUs.
Intel is going to have to do a tremendous marketing push on all fronts focusing on availability, price, and most importantly PERFORMANCE over the Nvidia alternative.
An alternate scenario is that every card is bought immediately and even with bad drivers, it's sold out and highly valued.
 
Looking at power draw alone, I don't think Intel developed a 75 W part to dethrone a 3090. If it's price competitive with other 75 W parts, or even faster than cards like the 6500 XT, I see no issue with what's going on so far.

 
Where's Patty boy and his hubris now? Most consumers won't care if these just end up in laptops, but as a desktop card this isn't looking good, given how pathetic the 6500 XT is.
 
As usual, I'll wait for trusted third-party benchmarks. If it's a tad slower than the 6500 XT while staying under 75 watts, it's already won by a mile in my book; I can't use GPUs over 75 watts in my media PC without replacing it.

There's also the driver issue. Intel doesn't have experience with gaming drivers. I'm guessing it will take at least a few months for the big performance issues to be ironed out.
 