Intel offers first look at its Arc Limited Edition desktop graphics card

nanoguy

In brief: As the dust settles on the semi-paper launch of Intel's mobile Arc GPUs, there's still a lot of mystery around the desktop counterparts. Intel has started teasing them again, but it will be interesting to see if the company can nail the performance and price aspects of these cards in time for a summer release.

This week, Intel announced its new Arc A-series dedicated GPUs for laptops, adding more competition for Nvidia and AMD. For now, only the entry-level Arc 3 series is ready to hit the market, while heavier hitters like the Arc 5 and Arc 7 series, which have more graphics cores, VRAM, and ray tracing units, are set for a summer release. We'll have to wait and see how they perform in the real world, but Intel is now officially the third player in the discrete GPU space, even if it's off to a slow start.

That said, many gamers are eagerly awaiting the desktop Arc offerings. Intel says these are also slated for this summer, and it's already teasing what appears to be the flagship of the lineup. The new card is supposedly called Intel Arc Limited Edition Graphics, and we now have a first look at the official design in a rather cinematic rendering.

As you can see from the video above, Intel went with a clean and minimalistic aesthetic for this model. In an era when flagship cards tend to be gargantuan, this looks to be a surprisingly slim, two-slot graphics card with a standard dual-axial fan cooling system. That means it will exhaust heat into the PC case rather than out the back, but by the looks of it this won't be a space heater on par with AMD's RX 6900 XT or Nvidia's RTX 3090 Ti.

Speaking of heat, if this model integrates the full ACM-G10 die with 32 Xe cores and 16 gigabytes of GDDR6 memory, it will likely run inside a 175-225 watt power envelope. The 50-second video doesn't show any signs of external power connectors, yet the PCIe slot alone only supplies 75 watts, so we can only assume the render isn't representative of the final product.
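As a rough sanity check on that claim, here's a minimal sketch of the connector math, assuming the standard PCIe power budgets (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin). The 175-225 W envelope is our estimate, not a confirmed spec.

```python
# Rough estimate of the external power connectors a card would need,
# based on standard PCIe power budgets. The board power figures are
# speculation for the full ACM-G10 die, not official specs.

PCIE_SLOT_W = 75    # max power the PCIe x16 slot itself provides
SIX_PIN_W = 75      # max power per 6-pin PCIe connector
EIGHT_PIN_W = 150   # max power per 8-pin PCIe connector

def connectors_needed(board_power_w: float) -> str:
    """Return the smallest standard connector combo covering board_power_w."""
    external = board_power_w - PCIE_SLOT_W
    if external <= 0:
        return "slot power only"
    if external <= SIX_PIN_W:
        return "1x 6-pin"
    if external <= EIGHT_PIN_W:
        return "1x 8-pin"
    if external <= EIGHT_PIN_W + SIX_PIN_W:
        return "1x 8-pin + 1x 6-pin"
    return "2x 8-pin (or more)"

for tbp in (175, 225):
    print(f"{tbp} W board power -> {connectors_needed(tbp)}")
# Both resolve to a single 8-pin, with 225 W sitting right at the
# spec limit, so a shipping card would almost certainly need one.
```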

Also visible in the video are three DisplayPorts and one HDMI port, which we're hoping will support the HDMI 2.1 specification. Intel chose to omit this feature from its mobile Arc GPUs, instead asking OEMs to implement it via an external chip that converts DisplayPort signals to HDMI 2.1.

Intel's desktop Arc GPUs may also feature an AV1 hardware encoding block like their mobile counterparts, which could set them apart from the competition when it comes to media engine capabilities for content creators. The company had a rather disappointing launch for its mobile Arc GPUs, so we're hoping the desktop versions will make up for it with more exciting specs and a competitive price point.
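If the desktop cards do ship with hardware AV1 encode, content creators would most likely reach it through FFmpeg's Quick Sync path. Below is a minimal, hypothetical sketch assuming an FFmpeg build with Intel oneVPL/QSV support and a driver that exposes the av1_qsv encoder; the file names and bitrate are placeholders.

```python
# Minimal sketch: invoking FFmpeg's Intel QSV AV1 encoder from Python.
# Assumes an FFmpeg build with oneVPL/QSV support and an Arc GPU whose
# driver exposes hardware AV1 encode; paths and bitrate are placeholders.
import subprocess

def encode_av1_qsv(src: str, dst: str, bitrate: str = "8M") -> None:
    """Transcode src to AV1 using Intel's hardware encoder via FFmpeg."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,            # input file
            "-c:v", "av1_qsv",    # Intel Quick Sync AV1 hardware encoder
            "-b:v", bitrate,      # target video bitrate
            "-c:a", "copy",       # pass audio through untouched
            dst,
        ],
        check=True,
    )

if __name__ == "__main__":
    encode_av1_qsv("input.mp4", "output_av1.mkv")
```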


 
In the mobile space, Intel's only direct GPU competitor is Nvidia right now. The same will go for OEM desktops.

One nitpick:

but by the looks of it this won't be a space heater on par with AMD's RX 6900 XT or Nvidia's RTX 3090 Ti.

Really, a space heater like the 6900 XT (304 W in gaming) or the 3090 Ti (469 W in gaming)? Yes, those two are definitely in the same power consumption league.

Thing is: whether Arc's power consumption is good or not depends on its performance. If it performs around the 3070 level, 225 W on TSMC 6nm is not great; at a 3080 performance level it would be great; below 3070 performance it would actually be bad.
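To make that concrete, here's a minimal perf-per-watt sketch. Only the RTX 3070's 220 W TDP is an official figure; the relative performance numbers are placeholders for the three scenarios described above, not measurements.

```python
# Perf-per-watt sketch for the three scenarios above. Relative
# performance is normalized to RTX 3070 = 1.00 and is purely
# illustrative; Arc's real numbers were unknown at the time.

REF_PERF, REF_TDP_W = 1.00, 220  # RTX 3070 baseline (official TDP)
ARC_TBP_W = 225                  # speculated Arc board power

scenarios = {
    "3080-class": 1.30,   # ~30% over a 3070 (rough assumption)
    "3070-class": 1.00,
    "below 3070": 0.85,   # arbitrary shortfall for illustration
}

ref_ppw = REF_PERF / REF_TDP_W
for name, rel_perf in scenarios.items():
    arc_ppw = rel_perf / ARC_TBP_W
    verdict = "better" if arc_ppw > ref_ppw else "worse"
    print(f"Arc {name}: {arc_ppw:.4f} perf/W -> "
          f"{verdict} than a 3070's {ref_ppw:.4f}")
```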
 
Nobody cares at this point. GPU prices have gone down, ETH is scheduled to go PoS, and next-gen cards from Nvidia will probably arrive in a few months.
 
Nobody cares at this point. GPU prices have gone down, ETH is scheduled to go PoS, and next-gen cards from Nvidia will probably arrive in a few months.
GPU prices haven't gone down. They dipped a small amount, but mostly for entry-level and mid-range cards. And even the cards that went down in price cannot be purchased, because they're perpetually out of stock.

Secondly, Ethereum going "proof of stake" is a pipe dream. They've been saying that for a long, long time, so we'll have to go with the old saying: I'll believe it when I see it.

Lastly, you say nobody cares? What you mean is that YOU don't care, and of course YOU are the only one that matters, right? Hahaha.
 
GPU prices haven't gone down. They dipped a small amount, but mostly for entry-level and mid-range cards. And even the cards that went down in price cannot be purchased, because they're perpetually out of stock.

Secondly, Ethereum going "proof of stake" is a pipe dream. They've been saying that for a long, long time, so we'll have to go with the old saying: I'll believe it when I see it.

Lastly, you say nobody cares? What you mean is that YOU don't care, and of course YOU are the only one that matters, right? Hahaha.
RX 6600: ~370 EUR
RTX 3080: ~1,050 EUR
These prices are for ready-to-ship cards (in stock).

Prices are down on both mid-range and high-end cards, and the trend is still downward.

 
In the mobile space, Intel's only direct GPU competitor is Nvidia right now. The same will go for OEM desktops.

One nitpick:

but by the looks of it this won't be a space heater on par with AMD's RX 6900 XT or Nvidia's RTX 3090 Ti.

Really, a space heater like the 6900 XT (304 W in gaming) or the 3090 Ti (469 W in gaming)? Yes, those two are definitely in the same power consumption league.

Thing is: whether Arc's power consumption is good or not depends on its performance. If it performs around the 3070 level, 225 W on TSMC 6nm is not great; at a 3080 performance level it would be great; below 3070 performance it would actually be bad.

Funny though, the 6900 XT can peak at 619-watt spikes, proof of which you can see in this link: https://www.techpowerup.com/review/msi-geforce-rtx-3090-ti-suprim-x/38.html

The 6900 XT is not in the same league as the 3090 Ti anyway. These cards are made for 4K/UHD or higher, but even the 3080 Ti beats the 6900 XT at 4K+.
Hell, even the 3080 10GB beats the 6900 XT at 4K in most games... and DLSS is a game-changer at high resolutions; FSR is not as good, but it works on Nvidia cards too.

AMD's low bus width and bandwidth are probably the reason. AMD went cheap this time - 256-bit and GDDR6 on high-end cards - yet still charges a premium... and this is why Nvidia sits at more than 85% dGPU market share now.
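For context, peak memory bandwidth is just bus width times effective data rate. Here's a minimal sketch of that math with the published memory specs of the cards discussed here; the Arc line is an assumption, since Intel hasn't confirmed the desktop memory configuration.

```python
# Memory bandwidth = (bus width / 8) * effective data rate per pin.
# Specs below are the published figures for the Nvidia/AMD cards;
# the Arc entry is a placeholder assuming a 256-bit GDDR6 setup.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RX 6900 XT (256-bit GDDR6 @ 16 Gbps)":   (256, 16.0),
    "RTX 3080 (320-bit GDDR6X @ 19 Gbps)":    (320, 19.0),
    "RTX 3090 Ti (384-bit GDDR6X @ 21 Gbps)": (384, 21.0),
    "Arc flagship? (256-bit GDDR6, assumed)": (256, 16.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# 6900 XT: 512 GB/s vs 3090 Ti: 1008 GB/s - roughly double, though
# the 6900 XT also leans on its 128 MB Infinity Cache to compensate.
```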

Soon Nvidia will be pumping out the 4000 series on TSMC 5nm while still churning out Ampere on Samsung's 8nm node. A smart decision, and a big problem for AMD. Also, Intel is coming for AMD's prime segment: the low to mid-range. AMD had better wake up and return to good-value cards instead.
 
I just hope Arc will deliver decent performance at decent prices. I don't think anybody expects their high-end solution to get close to Nvidia's and AMD's best offerings. Intel's pricing should be aggressive and eat up market share fast; they should probably even sell at a small loss to capture it quickly.
 
Competition in the marketplace is a good thing. Will be interesting to see how these new graphics processors perform.
 
All you plebs do know crypto is controlled by Nvidia and AMD.

PoS is never going to happen.

The GPU prices are set the way they are to unload all the old cards now that crypto buyers are saturated. When new ones are released, crypto will fire back up again and new GPU prices will be 2-3k... mark it.
 
Having read the 3090 Ti review, the

but by the looks of it this won't be a space heater on par with AMD's RX 6900 XT or Nvidia's RTX 3090 Ti.

statement looks even more out of place considering the power consumption graph.

[Chart: power consumption results from the RTX 3090 Ti review]
One of the two is not correct.
 
I was talking about sustained power and not millisecond peaks.
Peaks are what kill PSUs or make systems reboot, and the 6900 XT has the highest peaks.
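Here's a minimal sketch of why those millisecond spikes matter for PSU sizing, using the 304 W sustained and 619 W peak figures quoted in this thread; the 250 W rest-of-system draw is a simplifying assumption, and good PSUs tolerate brief excursions above their rating, so treat this as a rough rule of thumb.

```python
# Why millisecond spikes matter for PSU sizing: a PSU that covers
# sustained draw comfortably can still trip its over-current
# protection on transients. GPU figures are from the thread's
# linked review; the 250 W rest-of-system estimate is an assumption.

GPU_SUSTAINED_W = 304   # RX 6900 XT, average gaming draw
GPU_SPIKE_W = 619       # RX 6900 XT, millisecond transient peak
REST_OF_SYSTEM_W = 250  # CPU, board, drives, fans (assumed)

for psu_rating in (650, 750, 850, 1000):
    sustained_load = (GPU_SUSTAINED_W + REST_OF_SYSTEM_W) / psu_rating
    spike_load = (GPU_SPIKE_W + REST_OF_SYSTEM_W) / psu_rating
    risky = "risk of OCP shutdown" if spike_load > 1.0 else "ok"
    print(f"{psu_rating} W PSU: {sustained_load:.0%} sustained, "
          f"{spike_load:.0%} at spike -> {risky}")
```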

GDDR6X is also one of the reasons Nvidia uses more power, and it's what brings them solid performance at high resolutions. AMD's performance at 4K/UHD or higher is really not that good because of the 256-bit bus and GDDR6 memory.

The 3070 and 3070 Ti perform almost the same, but the 3070 uses 220 watts and the Ti uses 300 watts. GDDR6X is the reason.

AMD's power usage is generally lower because of a superior node - TSMC 7nm - but they lost on performance anyway...

So what will happen when Nvidia goes back to TSMC with the 4000 series, bringing a superior architecture to a leading node with a huge increase in clock speeds? AMD had better wake up. Let's hope AMD's MCM approach doesn't fail hard, because this approach can introduce all kinds of issues - will we see microstutter, incompatibility, etc.? The cards might work "fine" in DX12, but what about DX9, 10, and 11...
 