Intel Arc A770 & A750 Review

It may temper your thoughts on the A770 when one realizes that, in terms of hardware specifications and theoretical performance figures alone, Intel's card should be competing at the top of the list of cards in this review.

It's not a small chip -- at 21.7 billion transistors, it's around 25% larger than the GA104 (3070 Ti) and Navi 22 (6700 XT); it has more TMUs and ROPs than any of the others, and the same memory bandwidth as the 6800 XT. Yes, it doesn't have anything like Infinity Cache, but it does have 5.85 MB of L1 and 16 MB of L2 cache -- the latter is four times the GA104's, and the former is a tiny bit more.
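A quick sanity check on those size claims, using the publicly listed transistor counts and L2 sizes (back-of-the-envelope only):

```python
# Public die specs: ACM-G10 21.7B transistors, GA104 17.4B, Navi 22 17.2B;
# L2: 16 MB on ACM-G10 vs 4 MB on GA104.
acm_g10, ga104, navi22 = 21.7, 17.4, 17.2
print(f"vs GA104:   {acm_g10 / ga104 - 1:+.0%}")   # ~ +25%
print(f"vs Navi 22: {acm_g10 / navi22 - 1:+.0%}")  # ~ +26%
print(f"L2 cache:   {16 // 4}x the GA104's 4 MB")
```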

The Xe cores can handle concurrent FP/INT threads, just like the others; each render slice (four Xe cores) has its own dedicated thread management engine, triangle setup unit, rasterizer, and z-buffer processor. The A770 only has 4 render slices (roughly equivalent to Nvidia's GPCs, whereas the GA104 has 6 of them), so it's possibly a little lacking in those areas.
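On paper, though, raw shader throughput still favors the A770 over its price rivals. A rough sketch, assuming the commonly quoted ALU counts and boost clocks (2 FLOPs per ALU per clock via FMA):

```python
def fp32_tflops(alus: int, boost_ghz: float) -> float:
    # 2 FLOPs per ALU per clock (fused multiply-add)
    return alus * 2 * boost_ghz / 1000

for name, alus, ghz in [("Arc A770", 4096, 2.10),
                        ("RTX 3060 Ti", 4864, 1.665),
                        ("RX 6700 XT", 2560, 2.581)]:
    print(f"{name:12s} {fp32_tflops(alus, ghz):5.1f} TFLOPS")
# Arc A770     17.2 TFLOPS
# RTX 3060 Ti  16.2 TFLOPS
# RX 6700 XT   13.2 TFLOPS
```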

But even so, it shouldn't be as bad as this.

Am disappoint.
 
Before mocking Intel, just remember that architectures evolve and drivers mature slowly. Intel has the cash to burn and if they persist for a decade then the result will be really exciting. Right now, it's pretty good for a first gen but not something that most of us will buy.
Yep, we do understand all that, BUT these GPUs are overpriced! So I don't know how they will stay in the GPU business for the next 10 years when they sell overpriced and unreliable (mostly due to bad drivers) technology. If most of us won't buy, who will? :)
 
Unless you are chasing maximum FPS, those numbers are perfectly playable in most games. And the drivers "should" mature over time. Is it wrong that I still kind of want an A770?
They'd better be, when they charge you $350.

The problem with "good enough" is that there are clearance cards from others that truly own it:

This costs 10% more money for 30% higher performance!

https://www.newegg.com/asrock-radeon-rx-6700-xt-rx6700xt-cld-12g/p/N82E16814930056

Or how about this reference 3060 Ti for the same price?

https://www.bestbuy.com/site/nvidia...-card-steel-and-black/6439402.p?skuId=6439402

To truly justify this Arc purchase, you have to ignore sizeable discounts (just because the rest of the market takes a while to adjust doesn't mean you can't find good cards today!). You also lose 25% of that performance if your system doesn't support Resizable BAR:

[Chart: relative performance, 1920x1080]


Oh, and only a fool would look at the 750 and say "good value", as the poor memory compression will make newer games act like NVIDIA's 4 GB cards of the past (think of the 1650 Super's falloff in newer games). It's probably around Maxwell's compression level (so Ada has nearly twice the effective memory compression of Arc).
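To put those claims side by side, a back-of-the-envelope sketch (the figures are the ones asserted in this thread, normalized to A770 = 100, with a hypothetical $385 for the clearance 6700 XT; not measured data):

```python
# Claims from this thread: ~25% loss without ReBAR, and a 6700 XT at
# ~10% more money for ~30% more performance.
cards = {
    "Arc A770 (ReBAR on)":    (350, 100),
    "Arc A770 (no ReBAR)":    (350, 75),
    "RX 6700 XT (clearance)": (385, 130),
}
for name, (price, perf) in cards.items():
    print(f"{name:24s} {perf / price * 100:4.1f} perf per $100")
```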
 
Before mocking Intel, just remember that architectures evolve and drivers mature slowly. Intel has the cash to burn and if they persist for a decade then the result will be really exciting. Right now, it's pretty good for a first gen but not something that most of us will buy.

Where I am, 6600XT is ~500 eur, ~400 eur used. If Arc were to show up even at 400 eur, new, people would buy them. Knowing retailers here, Arc 750, if they import any, will cost over 600 eur.

What I am saying is: It all depends on location where one is.
 
They'd better be, when they charge you $350.

The problem with "good enough" is that there are clearance cards from others that truly own it:

This costs 10% more money for 30% higher performance!

https://www.newegg.com/asrock-radeon-rx-6700-xt-rx6700xt-cld-12g/p/N82E16814930056

Or how about this reference 3060 Ti for the same price?

https://www.bestbuy.com/site/nvidia...-card-steel-and-black/6439402.p?skuId=6439402

To truly justify this Arc purchase, you have to ignore sizeable discounts (just because the rest of the market takes a while to adjust doesn't mean you can't find good cards today!). You also lose 25% of that performance if your system doesn't support Resizable BAR:

[Chart: relative performance, 1920x1080]


Oh, and only a fool would look at the 750 and say "good value", as the poor memory compression will make newer games act like NVIDIA's 4 GB cards of the past (think of the 1650 Super's falloff in newer games). It's probably around Maxwell's compression level (so Ada has nearly twice the effective memory compression of Arc).

Read my post after... I'd also like every person who paid outrageous prices for graphics cards over the last two years to justify it. Maybe price isn't the major factor for some.
 
Read my post after... I'd also like every person who paid outrageous prices for graphics cards over the last two years to justify it. Maybe price isn't the major factor for some.
THE PROBLEM HERE IS THIS: NOBODY IS BUYING GPUs TODAY AT ANYWHERE NEAR MSRP. TELL ME WHY THE ARC is somehow exempt from that rule?

You also ignore the 25% performance hit the cards will take on any system without working BAR (dropping them more into the 3050 / RX 6600 price category)

 
THE PROBLEM HERE IS THIS": NOBODY IS BUYING GPUs tioday AT ANYWHERE NEAR mrsrp - TELL ME WHY THE ARC is somehow exempt from that rule?

You al;so ignor the 25% pperformance hit the cards will take on any system without working BAR

Because it's new tech from a third party and something to play with on a motherboard with Resizable BAR while a slew of new drivers roll out this fall. Sounds fun to me. Oh, I'm willing to wait for the Black Friday deals, if any. I suspect some scalping will go on if stock is limited because, well, they are scalpers. If that's not good enough for you, so be it. :)
 
Seems like the only thing Intel needs to do right now is face the poor sales on these, simply write this generation off, and make something really impressive out of the 2nd gen.

Hell, maybe even ram those RTX 4000 cards with solid price/perf. Who knows...
 
If the Time Spy and FurMark results are anything to go by, then there could be a lot of untapped potential in these cards if/when the drivers do the 'fine wine' thing.

Eposvox also ran some DaVinci Resolve and Photoshop/Lightroom benchmarks, and these looked competitive.

All in all, a great result for a v1.0 release. Personally I’ll wait a few months to see how it pans out in the real world.
 
Not too bad, especially given the price, but I'm a bit disappointed they couldn't match the 6700 XT/3060 Ti with the A770. Maybe drivers will improve performance, as it's early days. Of course, these scores will soon look lame against the 7600 XT and 4060.

Well, as they say, Alchemist is a testing ground, but Battlemage needs to step it up a lot, as it will face off against RDNA 4 and Blackwell, let alone Lovelace and RDNA 3.
 
There are a lot of 8th/9th-gen Intel boards that support ReBAR with a BIOS update. I had an MSI Bazooka that did. I was using ReBAR with an i7-9700K and RTX 3080 before I did a platform upgrade. Intel did not bother to mention this, I guess because it was not officially supported at launch and not every manufacturer updated their boards to allow it. I would imagine very few pre-built manufacturers bothered to get that update out, though. However, if you built your own 8th/9th-gen PC, you should check and see if a ReBAR update was released.
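If you're on Linux, here's one rough way to check whether ReBAR is actually active: compare the GPU's largest BAR against the usual 256 MB legacy aperture. This is just a sketch, assuming standard sysfs paths and a discrete GPU exposed with the 0x0300 VGA class code:

```python
from pathlib import Path

def largest_bar_mb(dev: Path) -> int:
    """Each line of the sysfs 'resource' file is: start end flags (hex)."""
    sizes = []
    for line in (dev / "resource").read_text().splitlines():
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:  # skip unimplemented BARs (all zeros)
            sizes.append((end - start + 1) // (1024 * 1024))
    return max(sizes, default=0)

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    if (dev / "class").read_text().startswith("0x0300"):  # VGA display class
        mb = largest_bar_mb(dev)
        status = "ReBAR likely active" if mb > 256 else "probably the 256 MB legacy aperture"
        print(f"{dev.name}: largest BAR = {mb} MB ({status})")
```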
This is what I mean, though: it's all a bit complicated, really, for a mid-range card.
 
Read my post after...I'd also like every person who paid outrageous prices for graphics cards the last two years to justify it. Maybe price isn't the major factor for some.

I paid $1,000 for my 3080 FTW3 at the end of August 2021.
Since then I've put in 1,000 hours across Control, RDR2, CP2077, Doom Eternal, and Back 4 Blood collectively, all at 4K; all are games that take advantage of the RT and/or Tensor cores, which is why I wanted it to begin with... pretty 4K gaming.
Equate a dollar to an hour of play time and I've easily got my money's worth out of those games alone, never mind anything else I play.
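Spelled out, using the figures from the post above:

```python
# A $1,000 card and ~1,000 hours played across those titles.
price, hours = 1_000, 1_000
print(f"${price / hours:.2f} per hour of gaming so far")  # $1.00, and falling
```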
 
I paid $1,000 for my 3080 FTW3 at the end of August 2021.
Since then I've put in 1,000 hours across Control, RDR2, CP2077, Doom Eternal, and Back 4 Blood collectively, all at 4K; all are games that take advantage of the RT and/or Tensor cores, which is why I wanted it to begin with... pretty 4K gaming.
Equate a dollar to an hour of play time and I've easily got my money's worth out of those games alone, never mind anything else I play.

Yes, that was my point. Regardless of price, you can use it for a long time, so what's the big deal? Some people just nitpick over cost-per-FPS performance.
 
If the Time Spy and FurMark results are anything to go by, then there could be a lot of untapped potential in these cards if/when the drivers do the 'fine wine' thing.

Eposvox also ran some DaVinci Resolve and Photoshop/Lightroom benchmarks, and these looked competitive.

All in all, a great result for a v1.0 release. Personally I’ll wait a few months to see how it pans out in the real world.
I don't trust Intel to "fine wine" anything. It's not like they haven't been making GPU drivers for years already, so they should already know what they're doing.

Quite frankly, I'm thoroughly enjoying watching Intel fall flat on its face.
 
70 FPS min doesn't cut it for a competitive game, but it's not terrible for single player.
That's very true, and the usability of the Arc cards isn't the issue. The issue is that Radeons are significantly more potent at the same or lower price points. It means that Intel Arc isn't necessarily a bad architecture, it's just one that's not worth buying.

I feel much the same way about GeForce cards. They're not bad products, they're just not worth it compared to the competition.
 