Deal alert: Intel Arc graphics cards from ASRock receive attractive price cuts

Shawn Knight

Bottom line: AIB partner ASRock has slashed pricing on some of Intel's Arc graphics cards, suddenly making them far more attractive to GPU shoppers looking for a deal. Given the cuts and Intel's evolving drivers, maybe it's time to consider an Arc card?

Over on Newegg, the ASRock Phantom Arc A770 8GB is down to $269.99 after a $50 instant price cut, and the ASRock Challenger Arc A380 6GB commands just $119.99 after instant savings of $20. That apparently makes the A380 the most affordable AV1-capable solution on the market with 6GB of memory.

Intel recently lowered the price of its A750 to $250, but ASRock is taking it a step further. The ASRock Challenger Arc A750 8GB can be had for $239.99, besting Intel's price by $10.

Also read: Intel Arc GPU Re-Review: New Drivers, New Performance?

The A770 at $270 is only $30 more expensive than the A750 at $240, and is also very attractive compared to Nvidia's far more expensive GeForce RTX 3060. Some believe additional price cuts could also be in the cards.

Our own Steve Walton recently revisited Arc GPU performance following Intel's release of new drivers that it claims deliver major performance improvements. Across a dozen games, the Arc A770 with the newer drivers performed about nine percent faster on average than it did at launch at 1080p, and eight percent faster at 1440p.

Wondering what else you can currently pick up for around the same price? The ASRock Radeon RX 6600 8GB is priced at $224.99 over on Newegg while the ASRock Phantom Radeon RX 6600 XT 8GB can be taken home for $274.99. Sticking with ASRock, the Challenger Radeon RX 6650 XT 8GB is $289.99. On the more affordable end, a Radeon RX 6400 is going to set you back about $140 on average.

Are you considering any of ASRock's Intel Arc cards at their new, lower price points, or do you think additional price cuts are still to come in the not-too-distant future?


 
I wouldn't buy the A770 from them, considering they nerfed the VRAM in half for no reason. Dead card, imo. 8 gigs of VRAM is dead even at 1080p with max graphics in a fair few titles. The new RE4 remake is a good example of this, let alone many other titles. 8 gigs isn't enough anymore.
 
Nobody in their right mind would buy this thing when you have a 6650 XT for the same price... those Arc GPUs will stay on the shelves, only bought by totally uninformed people, or those who like to pay to be beta testers.
 
Lel :) Deal alert? The price going down by $50 while everyone else's prices are going up only tells you NOT to buy it :)
 
Nobody in their right mind would buy this thing when you have a 6650 XT for the same price... those Arc GPUs will stay on the shelves, only bought by totally uninformed people, or those who like to pay to be beta testers.
AMD may as well be beta too. Still can't play a YouTube video in Chromium-based browsers with the latest drivers (on the 5700 XT at least) without flickering, and you have to change the ANGLE backend to DX9 or OpenGL. AMD sucks so much they can't even manage proper YouTube video playback, and they're a trash company that's not supporting older cards properly.
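(For anyone who wants to try that workaround: the ANGLE backend can usually be forced with Chromium's --use-angle command-line switch, or via the corresponding entry in chrome://flags. A rough sketch below -- the browser path is just an assumption for a default Windows Chrome install.)

```python
# Rough sketch: launch a Chromium-based browser with a forced ANGLE backend.
# The executable path below is an assumption -- point it at your own install.
import subprocess

browser = r"C:\Program Files\Google\Chrome\Application\chrome.exe"  # assumed path
subprocess.Popen([browser, "--use-angle=gl"])  # or "--use-angle=d3d9" / "--use-angle=d3d11"
```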
 
AMD may as well be beta too. Still can't play a YouTube video in Chromium-based browsers with the latest drivers (on the 5700 XT at least) without flickering, and you have to change the ANGLE backend to DX9 or OpenGL. AMD sucks so much they can't even manage proper YouTube video playback, and they're a trash company that's not supporting older cards properly.
I had a similar problem with a 5700 XT and it ended up being a bad DisplayPort cable.
 
AMD may as well be beta too. Still can't play a YouTube video in Chromium-based browsers with the latest drivers (on the 5700 XT at least) without flickering, and you have to change the ANGLE backend to DX9 or OpenGL. AMD sucks so much they can't even manage proper YouTube video playback, and they're a trash company that's not supporting older cards properly.
I've been using ATI/AMD hardware since the ATI 3D Rage Pro from 1998 (which still works, btw) and have NEVER had any of those ATI/AMD driver issues everybody seems to have... and neither have my customers... maybe I'm weird or extremely lucky... To be fair, the only notable problem I had, and that some of the customers I build machines for had, was the 5700 XT black screens, which were "patched" by messing with Windows MPO... Every problem with AMD seems to take on enormous proportions, but the Nvidia ones seem to disappear really quickly from people's minds... because I've had at least the same number of problems with NV GPUs...
 
When Intel starts making the GPUs themselves, they will be able to sell them dirt cheap and eat up market share at the low to mid-end. As long as TSMC makes the chips, they won't, because TSMC rips off its customers, including AMD. TSMC knows that AMD has no other option and has raised prices year after year after year. This is why AMD CPU and GPU prices went up. TSMC wants a bigger and bigger cut, and Apple always owns the best process at TSMC. Thankfully, Intel 20A and 18A are looking good with EUV, and the first chips will come out in 2024 on those nodes.
 
When Intel starts making the GPUs themselves, they will be able to sell them dirt cheap and eat up market share at the low to mid-end. As long as TSMC makes the chips, they won't, because TSMC rips off its customers, including AMD. TSMC knows that AMD has no other option and has raised prices year after year after year. This is why AMD CPU and GPU prices went up. TSMC wants a bigger and bigger cut, and Apple always owns the best process at TSMC. Thankfully, Intel 20A and 18A are looking good with EUV, and the first chips will come out in 2024 on those nodes.
Not for a good number of years, though. Intel will still be using TSMC for dGPU and GPU tile fabrication for a while yet.

The 20A/18A fabs will be used for Intel's primary money earners: client, datacenters, and AI. Product lines in those sectors will take the majority share, if not all, of the foundries' output. The operating margins on low to mid-end graphics cards are very small, and Intel doesn't have sufficient dGPU market share to shift enough inventory to offset this.

Besides, Intel's AXG sector, which includes Arc graphics cards, is ridiculously unprofitable -- it posted an operating loss of $1.7 billion in FY2022 against sector revenue of $0.84 billion. That's worse than the year before, when they didn't have any graphics cards in the general consumer market. The likes of 20A and 18A are never going to be cheap enough, nor have the yields/output required, to turn that around.
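(To put that in perspective, here's a quick back-of-the-envelope check of the figures quoted above -- just a sketch of the arithmetic, not official accounting.)

```python
# Back-of-the-envelope check of the FY2022 AXG figures quoted above.
revenue_bn = 0.84          # reported sector revenue, in $ billions
operating_loss_bn = 1.7    # reported operating loss, in $ billions

implied_costs_bn = revenue_bn + operating_loss_bn
print(f"Implied operating costs: ~${implied_costs_bn:.2f}bn")
print(f"Spent per $1 of revenue: ~${implied_costs_bn / revenue_bn:.2f}")
# -> roughly $2.54bn in costs, i.e. about $3 going out for every $1 coming in.
```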
 
I've been using ATI/AMD hardware since the ATI 3D Rage Pro from 1998 (which still works, btw) and have NEVER had any of those ATI/AMD driver issues everybody seems to have... and neither have my customers... maybe I'm weird or extremely lucky... To be fair, the only notable problem I had, and that some of the customers I build machines for had, was the 5700 XT black screens, which were "patched" by messing with Windows MPO... Every problem with AMD seems to take on enormous proportions, but the Nvidia ones seem to disappear really quickly from people's minds... because I've had at least the same number of problems with NV GPUs...
Probably because when Nvidia has a problem, they don't wait years to fix it, nor does it take a media circus to force them to admit it.

AMD, on the other hand... well, let's see, we've had the frametime bug, the black screen bug (which STILL isn't fully fixed), the RDNA downclocking bug, just to name a few. Oh, and don't forget RDNA 2 being left out in the cold for drivers for nearly 3 months until people complained.

If you're gonna charge a premium price for a "premium brand", as Lisa Su called it, then you need to give it premium support. AMD still hasn't figured that last part out yet.
Not for a good number of years, though. Intel will still be using TSMC for dGPU and GPU tile fabrication for a while yet.

The 20A/18A fabs will be used for Intel's primary money earners: client, datacenters, and AI. Product lines in those sectors will take the majority share, if not all, of the foundries' output. The operating margins on low to mid-end graphics cards are very small, and Intel doesn't have sufficient dGPU market share to shift enough inventory to offset this.

Besides, Intel's AXG sector, which includes Arc graphics cards, is ridiculously unprofitable -- it posted an operating loss of $1.7 billion in FY2022 against sector revenue of $0.84 billion. That's worse than the year before, when they didn't have any graphics cards in the general consumer market. The likes of 20A and 18A are never going to be cheap enough, nor have the yields/output required, to turn that around.
To be fair, look at how much money AMD lost straightening their shat out from 2010-2016. Building new GPU designs from scratch is an expensive business.
I wouldn't buy the A770 from them, considering they nerfed the VRAM in half for no reason. Dead card, imo. 8 gigs of VRAM is dead even at 1080p with max graphics in a fair few titles. The new RE4 remake is a good example of this, let alone many other titles. 8 gigs isn't enough anymore.
I guess we'll just ignore that the 10GB 3080 has had issues with precisely ONE (1) game at 1440p/4K. Somehow, I doubt a 3050/3060-tier card is going to struggle with 8GB at 1080p. Especially given that no reviews of the A750 have seen issues with it at 1440p. How strange.....
 
Not for a good number of years, though. Intel will still be using TSMC for dGPU and GPU tile fabrication for a while yet.

The 20A/18A fabs will be used for Intel's primary money earners: client, datacenters, and AI. Product lines in those sectors will take the majority share, if not all, of the foundries' output. The operating margins on low to mid-end graphics cards are very small, and Intel doesn't have sufficient dGPU market share to shift enough inventory to offset this.

Besides, Intel's AXG sector, which includes Arc graphics cards, is ridiculously unprofitable -- it posted an operating loss of $1.7 billion in FY2022 against sector revenue of $0.84 billion. That's worse than the year before, when they didn't have any graphics cards in the general consumer market. The likes of 20A and 18A are never going to be cheap enough, nor have the yields/output required, to turn that around.
Maybe not, but it's subject to change without notice ;)

Intel never expected dGPUs to become profitable in the first year or two. It's a completely new market for them.

I don't see Intel becoming relevant in this market unless they make the chips themselves, because using TSMC pushes the prices up too high and they can't compete (unless they improve performance to deliver solid mid-end or somewhat high-end performance). Low to mid-end GPU sales account for like 75% of the market, if not more...
 
My ASRock 6800 XT Challenger D is excellent. Built like a tank and whisper quiet.

A hell of a lot better than the garbage XFX was putting out.
ASRock generally puts out good stuff. Maybe he was unlucky one time and now hates ASRock; that's the typical buyer reaction.
 
trash company that's not supporting older cards properly.
Interesting. I distinctly remember AMD giving their older cards ReBAR support, and giving FSR even to their competitors.
Meanwhile, Nvidia refused to add ReBAR to even RTX 20 series cards and restricted DLSS 3 to the 40 series exclusively.
Oh, and earlier they restricted RTX Voice to RTX cards, until modders hacked it to work properly on GTX cards too.
So much for AMD's competitor supporting older cards.
Probably because when Nvidia has a problem, they don't wait years to fix it, nor does it take a media circus to force them to admit it.
AMD, on the other hand... well, let's see, we've had the frametime bug, the black screen bug (which STILL isn't fully fixed), the RDNA downclocking bug, just to name a few. Oh, and don't forget RDNA 2 being left out in the cold for drivers for nearly 3 months until people complained.
Nvidia may not wait for years, but they sure do wait months, don't they?
As if Nvidia doesn't have black screen issues or flickering bugs present in their driver release notes going back months. And for some reason there is very little media coverage when Nvidia does have an issue, like they just had with the 10% CPU usage bug.
But if AMD has an issue, then the media screams from the rooftops and naysayers say it proves how bad AMD drivers are.
 
Meanwhile, Nvidia refused to add ReBAR to even RTX 20 series cards and restricted DLSS 3 to the 40 series exclusively.
The ReBAR situation is worse than that, as Nvidia only enables it in specific games -- there's no option to have it active for all applications (well, not via normal means).

But to be fair to Nvidia regarding DLSS 3, the system uses the optical flow accelerator that's only in Ada Lovelace chips (which compares sequential frames and generates a motion vector field from them). Frame generation could be done without the OFA but it wouldn't look anywhere near as good. One could compensate for this by doing the same routine using compute shaders, but the performance hit would probably notably eat into the fps gained using DLSS 3.
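(For anyone curious what "compares sequential frames and generates a motion vector field" actually means in practice, here is a deliberately naive sketch of flow-based frame generation in plain NumPy -- forward-warping only, no hole filling, and nothing like DLSS 3's real pipeline, which also uses game engine motion vectors and a trained network. The function name and toy example are mine, purely for illustration.)

```python
import numpy as np

# Naive illustration: warp the previous frame halfway along a per-pixel
# motion vector field to fabricate an in-between frame.
def generate_midpoint_frame(prev_frame: np.ndarray, flow: np.ndarray) -> np.ndarray:
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Push each pixel half of the way toward where the flow says it is headed.
    xs_mid = np.clip(np.rint(xs + 0.5 * flow[..., 0]).astype(int), 0, w - 1)
    ys_mid = np.clip(np.rint(ys + 0.5 * flow[..., 1]).astype(int), 0, h - 1)
    mid = np.zeros_like(prev_frame)
    mid[ys_mid, xs_mid] = prev_frame[ys, xs]  # forward splat; holes/overlaps ignored
    return mid

# Toy usage: a 4x4 "image" whose pixels all move two pixels to the right.
frame = np.arange(16, dtype=float).reshape(4, 4)
flow = np.zeros((4, 4, 2))
flow[..., 0] = 2.0  # dx = 2, dy = 0
print(generate_midpoint_frame(frame, flow))  # content shifted right by one pixel
```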
 
The ReBAR situation is worse than that, as Nvidia only enables it in specific games -- there's no option to have it active for all applications (well, not via normal means).

But to be fair to Nvidia regarding DLSS 3, the system uses the optical flow accelerator that's only in Ada Lovelace chips (which compares sequential frames and generates a motion vector field from them). Frame generation could be done without the OFA but it wouldn't look anywhere near as good. One could compensate for this by doing the same routine using compute shaders, but the performance hit would probably notably eat into the fps gained using DLSS 3.
Yes, I know Nvidia uses a whitelist approach for ReBAR on the 30 and 40 series. It still sucks as a 20 series owner that I'm missing out on potentially double-digit performance gains in some games because Nvidia always likes to restrict features to push their newest series. This makes me question whether I should buy Nvidia in the future, if this is how they treat those who don't buy their newest series every time.

And yes, I know what Nvidia says about the OFA, but considering how much they've lied in the past (RTX Voice being a good example) and that they have not provided any numbers or media (screenshots/video), I can't trust what they say about the 20 series OFA. As far as I know, the 20 series does have it, but according to Nvidia it's not as capable. Plus, TVs have had something similar for a lot longer on much weaker ARM hardware.
 
Yes, I know Nvidia uses a whitelist approach for ReBAR on the 30 and 40 series. It still sucks as a 20 series owner that I'm missing out on potentially double-digit performance gains in some games because Nvidia always likes to restrict features to push their newest series. This makes me question whether I should buy Nvidia in the future, if this is how they treat those who don't buy their newest series every time.
To be honest, it doesn't seem to make a huge difference in the games that Nvidia does enable it for. At best, I measured a 10% uplift with a 4070 Ti, but the average increase was around 2%.

And yes, I know what Nvidia says about the OFA, but considering how much they've lied in the past (RTX Voice being a good example) and that they have not provided any numbers or media (screenshots/video), I can't trust what they say about the 20 series OFA. As far as I know, the 20 series does have it, but according to Nvidia it's not as capable.
This web document details the differences between the three OFA versions. Naturally one has to take such information at face value, but given that the figures are for developers and the like, it's probably accurate. The performance difference between the Turing/Ampere and Ada OFAs is pretty substantial, but I suspect it's the lack of small grid and ROI functionality in Turing (not to mention the weaker tensor performance) that prevents Turing GPUs from having DLSS frame generation. Ampere cards, though, are being unfairly penalized by not having it.

Anyway, best to stick to the news topic.
 