Intel reveals full specs for upcoming Arc Alchemist desktop graphics cards

midian182

What just happened? Intel says its more powerful Arc Alchemist A700 cards are arriving very soon, though we still don't know precisely when that will be. With the launch seemingly just around the corner, Chipzilla has now published specifications of the entire A-series lineup.

We already know about Intel's entry-level A380—read Steve Walton's review of the underwhelming desktop graphics card here—so most of the interest will be on the higher-end A770 and A750.

For the flagship A770, Intel has confirmed it will use the full-fat ACM-G10 GPU, which packs 32 Xe cores, runs at a 2,100 MHz clock speed, has a 225W total board power, comes with 8GB or 16GB of GDDR6, and offers a maximum memory bandwidth of 560 GB/s. Intel recently said that AIB partner cards would offer both memory sizes, while its Limited Edition (reference design) will only be available as a 16GB model.

Next in the A-series hierarchy is the A750. The card uses a cut-down version of the ACM-G10 GPU featuring 28 Xe cores, a 2,050 MHz clock speed, 8GB of GDDR6, and 512 GB/s of total bandwidth. As with the A770, the TBP is rated at 225W.
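
As an aside for anyone checking the bandwidth figures (and the GB/s versus Gb/s distinction): peak bandwidth is simply the per-pin memory speed multiplied by the bus width, divided by eight to convert bits to bytes. The sketch below assumes a 256-bit bus with 17.5 Gbps GDDR6 on the A770 16GB and 16 Gbps on the A750; neither value is stated above, so treat them as assumptions.

```python
# Hedged sanity check of the quoted bandwidth figures.
# Assumed (not stated above): 256-bit bus, 17.5 Gbps GDDR6 on the A770 16GB,
# 16 Gbps GDDR6 on the A750.

def memory_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin speed (Gbit/s) * bus width (bits) / 8 bits per byte."""
    return pin_speed_gbps * bus_width_bits / 8

print(memory_bandwidth_gbs(17.5, 256))  # 560.0 -> matches the A770 16GB figure
print(memory_bandwidth_gbs(16.0, 256))  # 512.0 -> matches the A750 figure
```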

Intel has been positioning the Arc 7 A770 as a card that can outperform the RTX 3060 in 1080p ray tracing benchmarks, suggesting its performance will be closer to the RTX 3060 Ti, leaving the A750 as more of a rival for the non-Ti RTX 3060.

Next up is the A580, which has 24 Xe cores, a 1,700 MHz clock speed, 8GB of GDDR6 memory, and up to 512 GB/s of bandwidth. This presumably cheaper option will sit between the A380 and the A750.

Intel said it would launch its own branded cards on "day 1" in key countries. The important detail it still hasn't revealed is pricing. You can currently find an Arc A380 for about $140. As for the higher-end cards, the cheapest RTX 3060 on Newegg is $349, while the lowest-priced RTX 3060 Ti is $449, so Intel could be looking to price the A750 and A770 below their Nvidia competitors. We should find out for sure any day now.


 
Just for comparison, Intel's XMX Engines are equivalent to Nvidia's Tensor Cores (i.e. they're FP16/INT8/INT4/INT2 matrix FMA units). Each XMX Engine can do 128 FP16 ops, 256 INT8 ops, or 512 INT4/INT2 ops per clock; each Tensor Core (in the consumer Ampere architecture) does the same number of ops per clock as an XMX Engine, although Tensor Cores also support sparse matrix ops for higher throughput.

The Xe Cores are equivalent to Nvidia's SMs -- one Xe core has 16 Vector Engines and 16 XMX Engines, with the former giving up to 128 FP32 ops per clock per Xe core. Once again, that's the same as Nvidia's SMs. So the A770 has, for its given clock speed, a peak theoretical FP32 throughput of 17.2 TFLOPS (counting an FMA as two ops) and 34.4 TFLOPS for FP16. An RTX 3060 is 12.7 TFLOPS for both; a 3070 is 20.4 TFLOPS.

AMD's RDNA 2 Compute Units do 128 FP32 or 256 FP16 vector ops per clock, plus 512 INT8 or 1,024 INT4 ops per clock for matrix work (no separate units -- the shader cores do everything). So a Radeon RX 6800, for example, peaks at 16.1 TFLOPS for FP32 and 32.3 TFLOPS for FP16.
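
For anyone wanting to reproduce those peak numbers, the arithmetic is just units × FP32 lanes per unit × 2 (an FMA counts as a multiply plus an add) × clock speed. This is only a sketch: the RTX 3060's 28 SMs and ~1.78 GHz boost clock, and the RX 6800's ~2.1 GHz boost clock, are assumed values not taken from the article or the comments above.

```python
# Minimal sketch of the peak-throughput arithmetic, counting an FMA as 2 ops.
# SM/CU counts and clocks for the Nvidia and AMD cards are assumptions.

def peak_fp32_tflops(units: int, fp32_lanes_per_unit: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: units * lanes * 2 ops (FMA) * clock (GHz), in TFLOPS."""
    return units * fp32_lanes_per_unit * 2 * clock_ghz / 1000

print(peak_fp32_tflops(32, 128, 2.10))    # Arc A770 (32 Xe cores):            ~17.2 TFLOPS
print(peak_fp32_tflops(28, 128, 2.05))    # Arc A750 (28 Xe cores):            ~14.7 TFLOPS
print(peak_fp32_tflops(28, 128, 1.777))   # RTX 3060 (28 SMs, assumed clock):  ~12.7 TFLOPS
print(peak_fp32_tflops(60, 64, 2.10))     # RX 6800 (60 CUs, assumed clock):   ~16.1 TFLOPS
```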

With 16MB of L2 cache as well, the A770 looks great for compute work, even if it ends up sucking for gaming.
 
So the primary purpose the whole dedicated GPU project was started for is actually its Achilles' heel.

Oh, Intel.
Being a gamer doesn't have to entail myopia as well. Intel returned to the GPU market because they rightly realized that GPU growth will far outpace CPU growth over the next two decades. And the fastest-growing segment of the GPU market is compute, not gaming.

Furthermore, while these cards may not blow the doors off Nvidia's flagship products, they will almost certainly be priced competitively in the low- and mid-range markets. With a little more driver maturation, gamers can and will buy them.
 
The graphics clearly show Gb/s but you quote GB/s in your text. So which is it? Case sensitivity matters.
 
Intel has cancelled its entire discrete graphics card line, Moore's Law Is Dead has reported. Alchemist and Battlemage are dead, and everything is being shut down. Losses have been way too high, and they figure they can never catch up to AMD and Nvidia in performance.

Funny, eh? For weeks now they've been telling everyone just the opposite: that they have a winner of a graphics card and they are going to stick it out through thick and thin. And now it is all gone ... 🤷‍♂️🤨
 
What I'm getting from this is that choosing a new GPU is about to become a matter of how good driver support for your use case will be.
Competition is good, so let there be more of it.
 
Bad news possibly incoming. According to Moore's Law Is Dead, Intel is getting out of discrete GPUs and Battlemage is effectively dead. His leaks have been pretty accurate, so I am inclined to trust this.

Edit: Ninja'd by Todd Sauve
 