More benchmarks of Intel's Arc A770 desktop GPU surface online

nanoguy

Posts: 1,237   +24
Staff member
In context: Intel's Arc A770 is just one step down from the Arc Limited Edition desktop graphics card, and gamers are hoping it will be powerful enough to rival Nvidia's RTX 3070 Ti and AMD's RX 6700 XT. Early benchmarks suggest the Intel part will be weaker than either of the two, but there's still hope since Team Blue appears to be burning the midnight oil to get the drivers right before the much-awaited summer launch.

Intel's mobile Arc A-series GPU launch was largely a paper launch, with only a couple of laptop models available on the South Korean market as of this writing. The company managed to brew some enthusiasm amid a troubled GPU market, but it appears the drivers for Team Blue's dedicated graphics solutions are far from ready, with some features having a significant impact on overall performance.

This has delayed the launch of the desktop Arc graphics cards by months, but that hasn't stopped models like the A770 from making an appearance online. Although this is supposed to be one of the higher-end Alchemist GPUs, early OpenCL benchmarks suggest it will more likely compete with mainstream GPUs like AMD's RX 6600 XT and Nvidia's RTX 3050.

The A770 GPU has also surfaced in Puget Bench's database with a new benchmark result for DaVinci Resolve workloads. Interestingly, the test system uses an older Intel Core i5-9600K CPU, and the motherboard designation suggests this is an internal development kit. There are two scores of 39 and 45 points, suggesting the Arc A770 is capable of roughly half the performance of an RTX 3070.

These results should be taken with a grain of salt, but there is one interesting detail that suggests Intel is hard at work trying to improve the drivers for Arc GPUs. The driver installed on the test system is designated as version 30.0.101.1723, newer than what's publicly available for Arc mobile GPUs as well as Intel Xe integrated graphics.

Intel has the difficult task of working on game-specific optimizations for a large number of PC titles that already run well on AMD and Nvidia hardware. The company says it will prioritize the top 100 most popular games for the desktop Arc GPU launch and then expand the list of certified titles from there.

If early benchmarks are any indication, Intel will also have to price its Arc GPUs aggressively to spur adoption.

 

Achaios

Posts: 389   +1,079
They say the A770 scored half of what a 3070 did, so if we map that over to 3DMark Fire Strike results and assume a 1:1 ratio, the A770 would land around the same score as a 1070 Ti in Fire Strike.

Not bad but nowhere near the 3070 or even 2070.
 

Dimitriid

Posts: 2,216   +4,268
With much worse driver support and probably much worse power consumption levels as well, Intel would need to really undercut AMD and Nvidia if they want anyone to even consider them: if this is 3050 performance with crap drivers and double the power but it sells for like USD 180, then maybe they'd have something compelling.

But I must reiterate that I consider this effort mostly wasted on gamers, and I think they'll quickly shift their main focus to supporting workstation and AI/ML loads (and crypto too) instead.
 

emmzo

Posts: 638   +836
By summer or late summer, when Intel is supposed to come up with discrete cards, Nvidia's and AMD's will hit MSRP or very close to it, so I wish them the best of luck. I think they'll have to sell at a loss for a few years before they can mature and become a real challenge.
 

yRaz

Posts: 4,526   +5,394
Dimitriid said: "With much worse driver support and probably much worse power consumption levels as well, Intel would need to really undercut AMD and Nvidia if they want anyone to even consider them..."
What do you base the power consumption numbers on? Also, high power consumption means that miners will likely not be interested in it.
 

OortCloud

Posts: 771   +744
This was never going to be straightforward. They have to catch up with twenty years of hardware and driver development. It would be like Nvidia building a high-end CPU. I just hope they knew this going in and are willing to release these cards at competitive prices based on the performance they manage to extract from them at launch. They can then iterate on drivers and hardware. The problem is exacerbated by the really nasty driver software Nvidia and AMD produce, which essentially tailors the driver on a game-by-game basis to get the most out of the hardware.

Eventually you would expect a company of Intel's size to succeed if they poach the right people. Just have to hope they have the stomach for a long-term investment like this...
 

Vanderlinde

Posts: 148   +102
Raja has done it again.

No, in all seriousness: it's a failed product, simple as that. The months of delay were simply due to disappointing outcomes with its chips, clocks, and power consumption. Driver fixes aren't going to magically bring a 50% FPS uplift.