Intel Arc A380 gaming performance disappoints in early review

Except that Intel is a behemoth that, as of the 1080Ti's release, had a market cap 20x higher than AMD's, with 100x higher earnings and 40x higher revenue. NVIDIA also had something like 80% discrete GPU market share at the time (and 90% of the profits), yet a nearly broke AMD was able to go from that to competitive/leading against two massive companies, both many times its size (at the time).

You would think that with those sorts of resources, Intel could have done a little better than produce a GPU series that is two years late and can't compete with the bottom of the range of a soon-to-be-superseded GPU generation.

Personally, I find that pretty easy to make fun of.
You forgot to add all the dirty cr@p they have pulled against AMD, especially how they bribe tech sites, 'Tubers, "tech reviewers," developers, etc. to push their brands and products down the throats of today's weak-minded sheep, hence creating the crazy legion of rabid fanbois we now have.

Example: how many here are already set on buying a 4090Ti, regardless of price and power consumption, and, better yet, without even one confirmed benchmark to justify the purchase?

Yeah, that's how bad we have it.
 
Adrian: "begs the question" does not mean "raises the question". It actually means almost the opposite.

Why it matters: if we misuse "begs the question" long enough, the English language will lose this unique and compact expression.
Can't say I'd miss it, but it looks good to me. I've heard both used in the same context my whole life. I personally don't see the need to spend any more time on it than I already have.

I want someone to go after the people who don't know the difference between "lose" and "loose". Those people are English serial killers!

Btw, have you seen the words they've added to the dictionary in the last decade alone? Go after them!
 
Lovely!

The massacre is complete; let's move on to the other one.


LOL! The pictures are mixed up; NVIDIA and AMD somehow got switched! It's AMD vs. NVIDIA, suggesting AMD can take them. Not a fanboi either way, couldn't care less; just gimme the best for the cheapest. Currently AMD loses, 'cos they're far from the best.
 
Finally, a benchmarked Arc!

For all the bashing in the comments, it's not too bad...

...if it had launched on time.

At this point it's unclear if Arc will fully launch before Nvidia's and AMD's next gen is out to truly embarrass it.

Talk about a missed opportunity, too. Intel could have launched something worse than this to acclaim if it had gotten it out during the GPU shortages.
 
So even though it is CHEAPER than the parts it's meant to compete against, it still gets bad reviews? It isn't THAT much slower, and in places it's a good bit faster. The FPS per dollar is greater, and this is a LOW-END product, advertised as such.

OMG, SUCH BAD PERFORMANCE. Seriously, this is silly. A few percentage points for a few hundred less, and people are hating on this card.

Intel never had a chance if this is how the market is treating them. Let's see their high-end offerings and the price-to-performance they bring.
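To be concrete, the price-to-performance being argued here is just a ratio. A minimal sketch in C, with made-up placeholder prices and frame rates (not real benchmark numbers), purely to illustrate the metric:

```c
#include <stdio.h>

/* FPS-per-dollar comparison. Prices and frame rates below are
   hypothetical placeholders, not measured results. */
typedef struct {
    const char *name;
    double price_usd;
    double avg_fps;
} Card;

int main(void) {
    Card cards[] = {
        { "hypothetical budget card A", 140.0, 60.0 },
        { "hypothetical budget card B", 170.0, 68.0 },
    };
    for (int i = 0; i < 2; i++) {
        printf("%-27s %.3f FPS/$  (%.2f $/FPS)\n",
               cards[i].name,
               cards[i].avg_fps / cards[i].price_usd,
               cards[i].price_usd / cards[i].avg_fps);
    }
    return 0;
}
```

On those made-up numbers, card A wins on FPS per dollar even though card B is faster in absolute terms, which is the shape of the argument being made for the A380.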

Going by Videocardz' latest article, it's actually more expensive than the 6400

In terms of pricing, the A380 can now be found for 1399 to 1499 CNY, reports @Greymon55. That's more than the official MSRP of 1030 CNY and still more than AMD RX 6400 graphics cards.

Also worth noting that the Gunnir model (that was benchmarked) uses considerably more power than the 6400 and 1650.
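For scale, using only the figures in the quote above, the street prices work out to a hefty markup over MSRP; a quick check:

```c
#include <stdio.h>

/* Markup implied by the street prices quoted above:
   1399-1499 CNY street vs. the official 1030 CNY MSRP. */
int main(void) {
    const double msrp = 1030.0;
    const double low = 1399.0, high = 1499.0;
    printf("markup over MSRP: %.0f%% to %.0f%%\n",
           (low / msrp - 1.0) * 100.0,    /* ~36% */
           (high / msrp - 1.0) * 100.0);  /* ~46% */
    return 0;
}
```

So roughly a third to nearly half again over the official price, on top of already costing more than the RX 6400.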
 
Except that Intel is a behemoth that, as of the 1080Ti's release, had a market cap 20x higher than AMD's, with 100x higher earnings and 40x higher revenue. NVIDIA also had something like 80% discrete GPU market share at the time (and 90% of the profits), yet a nearly broke AMD was able to go from that to competitive/leading against two massive companies, both many times its size (at the time).

You would think that with those sorts of resources, Intel could have done a little better than produce a GPU series that is two years late and can't compete with the bottom of the range of a soon-to-be-superseded GPU generation.

Personally, I find that pretty easy to make fun of.

All of this is accurate, but I'd like to go further and explore why Intel will probably never be able to work with game developers and engineers to improve their software stack.

And that's because it's just not in their culture. Several decades ago, they were able to start with a few successful products simply because they were among the first widely available computer CPUs (think all the way back to the 186, 286, 386, etc.). It didn't go without speed bumps, as the early AMD efforts with the K6 and later the Athlon came really close to reverting a lot of that advantage.

But Intel's corporate culture quickly grew into much of what we also saw from close collaborator Microsoft during that same late-90s/early-00s era, which was just monopolistic practices: why compete on performance and value when you can insinuate yourself directly, through backroom deals, with the people who buy the most PCs, namely other corporations? Especially on the server side, they quickly established attractive, very effective long-term deals that locked in Intel as an institution.

This means Intel very rarely needs to compete very hard, and it's why AMD is able to make some comebacks, the current one being the strongest so far. But the reason Intel has the space to regroup and come back with more compelling products is that they still have those very large deals all around the enterprise and data center world, built on the reputation they earned back in the 90s.

So the corporation might hire a 'serious' team of engineers under Raja and such, and fully commit to entering the GPU market to disrupt it, but Nvidia has been able to establish a very solid lead in the server compute world in much the same way Intel established its own dominance many years before. So Intel doesn't have the same advantage, and honestly, they don't have a good company culture for working with other companies: they're simply not used to considering and valuing the input of their partners when it comes time to optimize drivers, because they never had to do that to get where they are now, as top dog of the server and enterprise world (a position that is declining, though "rapidly" only by the super-slow renovation rate of data centers). They just don't have a culture of working with software engineers outside their own company, or the level of dedication it takes to actually optimize drivers.

That's why I think they're just not a good company to work with overall: they're very inflexible, always want absolute control over the products they sell, and don't react to business partners' and customers' demands. They have to be literally dragged forward by companies far more flexible about what they deliver, like AMD on the server side, and Nvidia on the GPU side, which, controlling as it is, is willing to put down a ton of money not just into optimizing drivers but into making sure game devs are heavily invested in working closely with Nvidia via endless sponsorship deals. Would you rather work with an Intel engineering team that probably ignores most of your white papers and requests for feedback when you can't get their new GPU to perform as it should, or with the guys who, after just a few public nods, show up with a briefcase full of money, fancy new tech for your game engine, and a deal to heavily market the fact that they're partnering with you for extra features?

Who do you think will be more invested in working with you to optimize drivers and fix issues in your game engine? (If my speculation turns out to be accurate, which I think there's a good chance it will.)
 
I know a lot of people with pretty basic PCs who would gladly take one of these for $150 just to be able to do a little more gaming, even with the eye candy on lower settings.

Not everyone has to spend $$$$ to game. They just want to game, that's it.
 
Sorry, but that's a lie.

AMD GPUs were and are cheaper and somewhat easier to buy, yet the nvdrones ignored that and instead provided Nvidia more lube for their master Jensen.

If that were true, I would have picked up an AMD GPU over Nvidia's offering. I would have settled for a 6700XT, but when the 6700XT averaged $1k or higher, that was just a stupid way to waste your money, considering the 3070 was going for around $800 and the 3080s were just over the $1k mark. It was better to spend the same amount or slightly more on a 3080 for a better card, or less on a 3070 for a card that performs similarly to the 6700XT.

AMD GPUs were certainly easier to find; my local MicroCenter stores had inventory, but due to the extreme prices these cards held, they rarely moved. The AMD cards that did move were the 6800s: they'd hit the shelves (few and far between, from what I saw) priced only slightly higher than the 6700XT, and they flew off the shelves.

Next were the 6800XT cards; they were extremely overpriced, usually within around $200 of the 6900XT. Spending around $1600 on a 6800XT with similar performance to a 3080 you could find for around $1100-1200... it's just stupid to spend an extra $300-400 on the AMD card. Folks would either opt for a 3080, if they could find one in that $1100-1200 range, or just skip the 6800XT and spend closer to $2k on the 6900XT.

Clearly, this was pricing in the States when cards were harder to come by - the pricing could certainly be different in other places around the world, but for me, you got more performance for your money going with Nvidia.
 
Glad to see Intel finally has a product and we can see some real benchmarks. Underwhelming, but this is a new product; they'll have bugs to work out and drivers to improve upon.

We now officially have another series of products for the AMD crowd to post rage and hate against, so that's always fun...

Wonder how quickly Intel will have drivers that are on par with AMD's in quality and stability. It only took AMD the better part of a decade to get there, so Intel will be there in less than a year, no doubt.

Looking forward to more benchmarks for their higher end products, not because I expect them to compete toe to toe with the likes of nVidia, but because it's good to see competition in a market that has long been dominated by a single player.
 
Clearly, this was pricing in the States when cards were harder to come by - the pricing could certainly be different in other places around the world, but for me, you got more performance for your money going with Nvidia.

More performance, more features, better support, etc.? You go nVidia.

If you have a rage boner against nVidia and don't care for any of the above, then you go AMD. :joy:
 
It's only a disappointment to anyone who was delusional enough to think that Intel could just catch up to ATi and nVidia that quickly. Intel doesn't have the decades of experience that you'd find from their competitors.

If suddenly S3 jumped back into the game with some new version of their Chrome GPU, would you expect them to be competitive? Of course not, because they haven't produced gaming-grade GPUs in literally DECADES. Intel is no different. Besides, when Intel (or nVidia for that matter) fails, I'm not disappointed, I'm elated! I have no sympathy for those two scummy companies. :laughing:
 
If that were true, I would have picked up an AMD GPU over Nvidia's offering.

Read my post again, I clearly said a bit easier, not that they were always available. Do note, I was able to get a 6900 XT at MSRP at amd.com with little effort, but that was before they implemented a queue; yet I never received a notification from EVGA (since I had a 970 from them and they were giving priority to the cult members first).
I would have settled for a 6700XT, but when the 6700XT averaged $1k or higher, that was just a stupid way to waste your money, considering the 3070 was going for around $800 and the 3080s were just over the $1k mark. It was better to spend the same amount or slightly more on a 3080 for a better card, or less on a 3070 for a card that performs similarly to the 6700XT.

If a 6700 XT was $1k, there's no way in hell a 3070 was selling for less, knowing how the nvdrones would pay regardless.
my local MicroCenter stores had inventory, but due to the extreme prices

I lost all respect for MC, since they either got bribed by Nvidia or simply decided of their own volition to shaft AMD.

I went several times and the story was the same: they put ridiculous prices on AMD cards (even the open-box ones), and then the salespeople would double down and tell you, "You don't want that, you want an Nvidia GPU." Which, strangely, was priced a tiny bit lower than the AMD ones.
That was in their Yonkers, NY store.

Really bizarre.
 
I don't see why people honestly care how the low-end card performs. Who here is honestly going to buy it? Anyone...?

I am definitely interested in sub-$200 GPUs - you know, the ones that middle-class families can actually afford. High-end cards grab the headlines, but Nvidia especially seems to have forsaken the budget range. I would love to update my daughter's homework PC to a step above its integrated graphics, but she does not need a $300 RTX 3050 to occasionally play Fortnite on her Pentium G4560.
But I also want the best performance for my scarce dollar. I will be keeping an eye on these and their driver development.
 
Read my post again, I clearly said a bit easier, not that they were always available. Do note, I was able to get a 6900 XT at MSRP at amd.com with little effort, but that was before they implemented a queue; yet I never received a notification from EVGA (since I had a 970 from them and they were giving priority to the cult members first).


If a 6700 XT was $1k, there's no way in hell a 3070 was selling for less, knowing how the nvdrones would pay regardless.


I lost all respect for MC, since they either got bribed by Nvidia or simply decided of their own volition to shaft AMD.

I went several times and the story was the same: they put ridiculous prices on AMD cards (even the open-box ones), and then the salespeople would double down and tell you, "You don't want that, you want an Nvidia GPU." Which, strangely, was priced a tiny bit lower than the AMD ones.
That was in their Yonkers, NY store.

Really bizarre.

They were, but they didn't sit on the shelves because of it. AMD cards were way, way, way overpriced, whereas Nvidia cards were only way, way overpriced.

6700XT cards sat for weeks, even months, at the local stores because they were $1k+. The 3070s were around $800-900, but they flew off the shelves because they were generally around $150 cheaper than the 6700XT, and you got a little better performance from them than from the 6700XT.

The only way to get a decently (if you can call it that) priced 3070 was to get to the store early enough to wait in line and hope you got picked to purchase a GPU before they were all sold.

When I happened into my local Micro Center last November, I was looking to pick up a CPU/MB/RAM, and it was around 3pm when I got there. I walked into the store and saw a few guys standing in line by the DIY section; I thought they were waiting for help picking a CPU, so I stood in line. After about 5 minutes, one of the employees came up to me and asked what GPU I wanted: they had some 3090s, a couple of 3080s, some 3060Tis, and some 3060s left. All the 3070s were taken. I asked about prices: the 3070s were around $900 (the cheapest 6700XT model was still priced over $1k at the time), the 3080s were almost $1200, and the lowest-priced 3060Ti was $480, so I purchased a 3060Ti for close to MSRP and ended up with a GPU/MB/CPU/RAM for a build.

I do recall there was a time someone with Micro Center blasted AMD for having inferior GPUs (you can read about it in this story: https://www.tomshardware.com/news/micro-center-slams-amd-gpus-ceo-issues-apology ) and the CEO had to issue an apology over it. It wasn't just Micro Center that had outlandish prices on AMD cards; similar prices surrounded AMD cards at other companies, such as Newegg, BestBuy, B&H Photo and so on... they all had AMD cards priced above Nvidia.
 
Can't say I'd miss it, but it looks good to me. I've heard both used in the same context my whole life. I personally don't see the need to spend any more time on it than I already have.

I want someone to go after the people who don't know the difference between "lose" and "loose". Those people are English serial killers!

Btw, have you seen the words they've added to the dictionary in the last decade alone? Go after them!
I see "loose" instead of "lose" used 99% of the time. Can`t agree more. I`m tired of pointing that out.
 
AMD cards were way, way, way overpriced, whereas Nvidia cards were only way, way overpriced.
Then that matches my experience at MC.
I do recall there was a time someone with Micro Center blasted AMD for having inferior GPUs (you can read about it in this story: https://www.tomshardware.com/news/micro-center-slams-amd-gpus-ceo-issues-apology ) and the CEO had to issue an apology over it. It wasn't just Micro Center that had outlandish prices on AMD cards; similar prices surrounded AMD cards at other companies, such as Newegg, BestBuy, B&H Photo and so on... they all had AMD cards priced above Nvidia.
Like I said, I don't know the reason behind it; it sounds like money exchanged hands and AMD got shafted, same as what Intel did (still does?) with Dell and others back then.

Those are reasons why I despise Intel and Nvidia.
 
There is an interesting article over at TechPowerUp about this. Apparently, the Arc A380 performs even worse with an AMD CPU than it does with an Intel CPU.

Intel surely knows how to win and keep customers. 🤣
Actually, I can see them doing this on purpose, so their loyal fans would believe the lie that "it works better with our CPUs!"

They did that back then with a compiler.

I'm telling you, they and Nvidia would do things like this (screw their own customers) just to keep you locked in.
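For anyone unfamiliar with the compiler episode: the accusation was that Intel's compiler picked code paths from the CPUID vendor string rather than from the actual feature flags, so non-Intel CPUs got the slow path even when they supported the same instructions. A minimal C sketch of that kind of vendor-based dispatch (illustrative only, not Intel's actual code; requires an x86 machine and GCC or Clang for <cpuid.h>):

```c
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* CPUID leaf 0 returns the vendor string in EBX, EDX, ECX. */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* Vendor-based dispatch: any non-Intel CPU takes the generic
       path, even one with identical SIMD support. */
    if (strcmp(vendor, "GenuineIntel") == 0)
        puts("dispatch: optimized SIMD path");
    else
        puts("dispatch: generic fallback path");
    return 0;
}
```

The vendor-neutral way to dispatch is to test the feature bits CPUID actually reports (SSE2, AVX, etc.) instead of the vendor string.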
 
Actually, I can see them doing this on purpose, so their loyal fans would believe the lie that "it works better with our CPUs!"

They did that back then with a compiler.

I'm telling you, they and Nvidia would do things like this (screw their own customers) just to keep you locked in.
I agree. It reminds me of that cooler Intel was hiding under the table while doing a demo a few years back. They got caught then, and they got caught now. I guess the trouble is that the average PC customer would have no knowledge of Intel's shenanigans and wouldn't realize they're being fleeced; worse yet, such a customer might not even care.

I have not visited TPU in a while and just happened to visit today, when the article was prominently featured. Not, of course, that I would consider buying this card.
 
Is it really that Intel can't build drivers, or is it maybe that until now there has been no reason for anyone to prioritize how their game performs on them?

Put another way, if we had, say, six months ago swapped Intel's and Nvidia's entire driver teams, do we really think these charts would look any different? Heck, some of the games being benchmarked were probably developed and shipped without the studio ever having seen a single Intel dGPU.

My point is less to defend Intel's team, which I know nothing about, and more to point out the obvious: the established installed base is probably a major, if not the dominant, factor here.
 
Is it really that Intel can't build drivers, or is it maybe that until now there has been no reason for anyone to prioritize how their game performs on them?

Put another way, if we had, say, six months ago swapped Intel's and Nvidia's entire driver teams, do we really think these charts would look any different? Heck, some of the games being benchmarked were probably developed and shipped without the studio ever having seen a single Intel dGPU.

My point is less to defend Intel's team, which I know nothing about, and more to point out the obvious: the established installed base is probably a major, if not the dominant, factor here.
Intel has stated that they are still working on [mobile] drivers before they can begin a wider rollout.

Their dGPU drivers can't be much better than that currently. They need more time, and they can have it.

Why would you think AMD or Nvidia could familiarize themselves with an architecture they didn't design and do it better than Intel??? 😂
 