Intel's Arc A750 outperforms the RTX 3060, according to Intel

Daniel Sims

Posts: 674   +27
Staff
Why it matters: As Intel rolls out its freshman line of dedicated graphics cards, the first benchmarks are emerging. The company's latest official test tries to make the case for the A750 against Nvidia's midrange GPU, but many questions remain as both companies and AMD prepare to launch new cards in the second half of this year.

This week, Intel published a brief benchmark of its upcoming Arc A750 graphics card, claiming it compares favorably with Nvidia's RTX 3060 in a few popular games. Previous Arc benchmarks, both official and unofficial, have been mixed thus far.

The primary title Intel tested in its video (above) is Cyberpunk 2077, but it also showed comparisons for Control, F1 2021, Borderlands 3, and Fortnite. All the comparisons show the A750 performing about 15 percent better than the 3060. Intel's card manages around 60 frames per second in Borderlands, Control, and Cyberpunk at 1440p using the high graphics preset. F1 and Fortnite run well north of 120fps.

However, the comparison doesn't mention image reconstruction -- another area where Intel wants to compete with Nvidia. All the tested games except Borderlands support Nvidia's DLSS, which would likely give the 3060 much higher framerates. Currently, none are on the list of games planning to feature Intel's XeSS. Only Cyberpunk and F1 support AMD FSR.

The other Arc GPUs are also set to compete with Nvidia's entry-level and mainstream cards, but some tests have been disappointing. Official A730M and A770M benchmarks put them close to the RTX 3050 Ti and 3060. The A770 is supposed to match the 3070 Ti, but tests in April showed markedly worse results, possibly owing to incomplete Intel drivers. Last month, an early review of the budget A380 was even more dismal.

While Intel's graphics cards are still only available in China and South Korea, the company maintains that it plans to launch them in other markets later this summer. Nvidia and AMD will each launch a new generation of GPUs later this year, which will likely further outperform Intel's products.


yRaz

Posts: 4,817   +6,027
For a while I was worried about price/performance, but now the market is getting flooded with used mining cards, and the Chinese cards haven't even hit the US market yet.

I estimate mining cards will hit 25% of MSRP and at that price I'm more than willing to take the risk.

But going back to Intel, even if they sell their cards at a loss, power consumption is a big issue. They can sell a card with 3080 performance, but even if it's half the cost and uses twice as much power, are you really saving money when you have to buy a new PSU?

I wish Intel the best, I really do, but they don't even have cards for sale and they're entering the market at a bad time.
 

emmzo

Posts: 766   +1,184
Oh, Intel, this is going to be hilarious when real benchmarks are done. I bet the pricing will match the best of AMD and Nvidia, or higher. See you in 3 to 5 years; until then I'll be enjoying the sh1tshow.
 

McMurdeR

Posts: 615   +849
Gamers Nexus did a few hands-on vids and a first review of the A380. It's going to do pretty well in new and recent titles, and will struggle hard in older games until Intel get optimisation profiles into their drivers. Consistency will be an issue for a couple of years, but in time it will be resolved so long as Intel remain committed. None of this is a big shock.
 

takaozo

Posts: 426   +654
What happened with Raja? Why isn't he in front of the ARC press releases anymore?
Did they fire him?

Or are they just afraid he'll drop the card from his hand again?
 

Irata

Posts: 2,221   +3,857
I'll wait until the HUB/TS and TPU reviews. And then probably a little longer...

But the more competition the better, so I hope Intel can pull a rabbit out of their hat, at least by the second or third generation.
Absolutely agree, but let's see if games with RT on are included as standard in the overall benchmark suite.
 

paul1122

Posts: 249   +265
This isn't competition that's good for the consumer; this is Intel lying to you to steal your money. 15% of my dollar says they'll try to convince everyone that the cards they're selling will be easy to buy. That's it.
The StarFighter i740 all over again, just worse this time around.
 

Hooda Thunkett

Posts: 20   +23
I'll wait until the HUB/TS and TPU reviews. And then probably a little longer...

But the more competition the better, so I hope Intel can pull a rabbit out of their hat, at least by the second or third generation.
I'm not yet convinced that Intel has the attention span to get to a competitive third generation. Too many of their non-x86 products in the past seemed to get deprioritized before they had a chance to be well tuned, and they're going to have to work extra long and hard to prove to me that they're serious about graphics this time.
In short, I wouldn't even consider buying their new graphics cards until they hit their third competitive generation.
 

veLa

Posts: 1,176   +850
I'm sure that it's going to be "fine" if you just temper your expectations.

It's a first-generation product, and I, for one, welcome Intel to the discrete graphics market and hope they provide some friendly competition to the other boys on the block.
 

Eldritch

Posts: 499   +910
The claim may be exaggerated, but reaching near-3060 performance in the first gen is really impressive. Let's see if they can improve a bit in the next gen.
 

Tams80

Posts: 130   +88
What happened with Raja? Why isn't he in front of the ARC press releases anymore?
Did they fire him?

Or are they just afraid he'll drop the card from his hand again?

Likely because he is not a nice person and all that stems from that.
 

Faelan

Posts: 183   +240
I play mostly older and niche games, including flight simulators, so it'll probably be many years before I'll even consider Intel. Heck, this is the first gen where I've honestly considered AMD. If it's true that Nvidia is going to delay their launch, I may just pick up my first AMD card this year and give it a 14-day test run. If performance turns out to be great at 4K while consuming less than 400W max and I don't run into any major driver issues, then it's goodbye Nvidia. If not, 14-day no-question/no-fee return policy FTW.
 

Ludak021

Posts: 754   +572
Oh, Intel, this is going to be hilarious when real benchmarks are done. I bet the pricing will match the best of AMD and Nvidia, or higher. See you in 3 to 5 years; until then I'll be enjoying the sh1tshow.

You seem to be cheering for $150 cards costing $500 permanently... Edit: let me put it another way: the 3060, had it come out 10 years ago, would have cost under $200, since it's a lower-midrange card.
 

lripplinger

Posts: 382   +172
I'm interested in ARC after seeing the Gamers Nexus videos. What piqued my interest was how ARC will work with Intel's integrated graphics on their CPUs in future implementations. It could be really cool to offload some work to the IGP, leaving ARC to do the heavier rendering loads, or whatnot.
 

BadThad

Posts: 1,223   +1,491
Please, just flood the market already! Competitors' MSRPs are off the charts. AMD and Nvidia need a wake-up call to CUT their ridiculous MSRPs, especially on the low to mid range. In the past 2 years we've seen $150 cards with MSRPs of $500. ENOUGH!