Intel Arc GPU Re-Review: New Drivers, New Performance?

For sure, Intel Arc has been an easy target playing against the big boys. But I, for one, think it's nice to see an arguably viable third option from Intel to keep Nvidia and AMD on their toes, especially considering their recent release concerns. Yes, they're charging too much, but this is first-gen. If they continue putting in as much effort as they did to get from zero to almost hero, we may see some serious competition next-gen. They already appear to have beaten both the RX 6600 and the RTX 3060 in this price class.
 
I think Intel has done a great job straightening out their drivers, and now their big focus should be on improving the unoptimized "safe" path that games default to, so they don't need to optimize for every title to hit 6650-level performance.

The A770 also still seems to bottleneck at 1080p.
 
Great job, Intel.

Should only help lower pricing overall, as long as they can keep up with Nvidia's features over time.

Otherwise, Nvidia is going to remain the most expensive, as their cards are doing more per game (DLSS and RTX).

Def recommending an A750 for a friend's first PC. Trying to get our hands on one for around $200.
 
So, judging from what the few people possessing Arcs say regarding Intel's graphics drivers, the A770 and A750 are decent GTX 1080 Ti counterparts if you want something new. Is that true?
 
Nice to see reviews on these cards.
In the cost per frame charts, I think you need to change the "Higher is better" to "Lower is better".
 
FYI, you can get a 6600 XT for $259.99 and a 6750 XT for $371.99:
Save: $30.00 (10%)
ASRock Challenger D Radeon RX 6600 XT 8GB GDDR6 PCI Express 4.0 Video Card RX6600XT CLD 8GO
https://www.newegg.com/asrock-radeon-rx-6600-xt-rx6600xt-cld-8g/p/N82E16814930063

and the 6750 XT at $429.99
+ $38 off w/ promo code VGAEXCMSJZ376, limited offer
$409.99 after $20.00 rebate card
MSI Mech Radeon RX 6750 XT 12GB GDDR6 PCI Express 4.0 Video Card RX 6750 XT MECH 2X 12G OC
 
For sure, Intel Arc has been an easy target playing against the big boys. But I, for one, think it's nice to see an arguably viable third option from Intel to keep Nvidia and AMD on their toes, especially considering their recent release concerns. Yes, they're charging too much, but this is first-gen. If they continue putting in as much effort as they did to get from zero to almost hero, we may see some serious competition next-gen. They already appear to have beaten both the RX 6600 and the RTX 3060 in this price class.
I agree, but is Intel keeping Nvidia and AMD on their toes? It seems they are distancing themselves from Intel with the new generation in terms of price, and delaying the mid- to low-end tiers so as not to directly compete with Intel. If you ask me, they are in cahoots with each other to prevent flooding the market and causing a crash in GPU prices, which we all thought would have happened by now. Thoughts?
 
Value is improving. Just not there yet. Still waiting on another 30% drop. Content with GTX 1050 Ti (net cost $125 from NewEgg many years ago).
 
The A770 would be a pretty good deal at $300, but even at $350 I would buy it over a 3060. Still, it's mostly a 1080p card unless you use upscaling, where 1440p is good. But now game developers have to support three upscaling techniques. This all sucks; we should have an open-source, hardware-agnostic approach. Oh wait, that describes FSR, which is open source and can run on any hardware, but it can't leverage, say, tensor cores on Nvidia. If AMD released an FSR 3.0 that could leverage tensor cores, developers could stop worrying about DLSS.
 
Intel cards still have 3 issues, and I'm not counting drivers:
1. High power usage: 44 W at idle and 250 W under load
2. The price is 400 euros
3. Hard to tear down and reassemble because of the glue used

For this price you can find better cards.
 
Am I reading the results correctly? To test Intel's claims for DX9 performance improvements, you ran 11 DX12 titles and one DX9 title? LOL

And then the overall blended graph showing improvement is basically entirely skewed by the results of the DX9 title. Take that away and it's probably only a 1-2% improvement.
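
To illustrate the point with purely made-up numbers (a rough sketch of how a single outlier can dominate a blended average, not the article's actual data):

```python
# Hypothetical per-game gains: eleven DX12 titles improving ~1-2%
# and one DX9 outlier with a huge jump.
dx12_gains = [0.01, 0.02, 0.01, 0.015, 0.01, 0.02, 0.01, 0.015, 0.01, 0.02, 0.01]
dx9_gain = 0.90  # assumed 90% gain for the single DX9 title

blended = (sum(dx12_gains) + dx9_gain) / (len(dx12_gains) + 1)
without_outlier = sum(dx12_gains) / len(dx12_gains)

print(f"Blended average gain:       {blended:.1%}")          # ~8.8%
print(f"Average without DX9 title:  {without_outlier:.1%}")  # ~1.4%
```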
 
Come on! We don't need another AMD without AMD/ATI's decades in the GPU market. Who would spend on a first-generation Intel GPU, with absolutely no guarantee for the future, for about the same fps per dollar?

We need SERIOUS price cuts and a return to fair prices.
 
Am I reading it wrong, or are they saying that F1 only has a 9% drop in fps from 1080p to 1440p while also showing it going from 106 fps to 76 fps? I'm thinking that would be more like a 30% hit.
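
A quick sanity check of that, using the 106 fps and 76 fps figures from the charts (just a sketch):

```python
fps_1080p = 106  # figure shown at 1080p
fps_1440p = 76   # figure shown at 1440p

drop = 1 - fps_1440p / fps_1080p
print(f"Drop from 1080p to 1440p: {drop:.1%}")  # ~28.3%, not 9%
```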
 
Nice to see reviews on these cards.
In the cost per frame charts, I think you need to change the "Higher is better" to "Lower is better".
Sorry about that - should be fixed now.

Am I reading it wrong, or are they saying that F1 only has a 9% drop in fps from 1080p to 1440p while also showing it going from 106 fps to 76 fps? I'm thinking that would be more like a 30% hit.
And apologies for that one too!

Am I reading the results correctly? To test Intel's claims for DX9 performance improvements, you ran 11 DX12 titles and one DX9 title? LOL
Intel has been claiming performance improvements for DX9, DX11, and DX12 titles, as clearly shown in the third image in the article.


Besides, it's not like the article specifically states that we were only looking at DX9 performance improvements -- it was an overall look, comparing the launch driver used in the first review to the latest one available, at the time of testing.
 
Intel has been claiming performance improvements for DX9, DX11, and DX12 titles, as clearly shown in the third image in the article.


Besides, it's not like the article specifically states that we were only looking at DX9 performance improvements -- it was an overall look, comparing the launch driver used in the first review to the latest one available, at the time of testing.
Dollar normalized appears to indicate they’re factoring in the price drop, not just the performance increase. That might explain why there was minimal improvement in some of the games you tested that were included in the presentation.
 
Dollar normalized appears to indicate they’re factoring in the price drop, not just the performance increase. That might explain why there was minimal improvement in some of the games you tested that were included in the presentation.
That's certainly a possibility. Taking Steve's A750 overall figures of 97 fps from the launch review and 105 fps from the recent quick run-through of the same benchmarks, factoring in both the performance increase and price drop would result in a 26% increase on Intel's chart. However, that does include CS:GO -- my estimates of what the increase would be like without that title put the chart gain at roughly 18%. Despite the lack of scale on the chart, Intel's increase looks closer to 16-18%, which is on par with the above.

The only problem is that when one draws a horizontal line from the top of the Tiny Tina's Wonderlands bar, it's clear that every other game Intel tested has an apparent gain in fps, not just in fps per dollar from the price drop. Yes, Intel only said "some games get a modest improvement of a few percent with better stability being the focus" but a 10% increase in Far Cry 6 isn't modest, and the chart does suggest that there is some performance improvement.
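
As a rough sketch of that dollar-normalized math, assuming a $289 launch price and a reduced $249 price for the A750 (the prices are an assumption here, not figures taken from the article):

```python
# fps figures are the ones quoted above; the prices are assumed
# ($289 launch MSRP, $249 after the cut) and may not match exactly
# what Intel used for its "dollar normalized" chart.
launch_fps, launch_price = 97, 289
new_fps, new_price = 105, 249

launch_value = launch_fps / launch_price  # fps per dollar at launch
new_value = new_fps / new_price           # fps per dollar now

gain = new_value / launch_value - 1
print(f"Dollar-normalized gain: {gain:.0%}")  # ~26%
```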
 
Come on! We don't need another AMD without AMD/ATI's decades in the GPU market. Who would spend on a first-generation Intel GPU, with absolutely no guarantee for the future, for about the same fps per dollar?

We need SERIOUS price cuts and a return to fair prices.
Well, it's a good thing we are not getting another AMD, as Intel has been far more responsive in actually fixing issues than AMD has ever been. They've already shown more dedication and that they will listen to their consumers.

Intel is also the reason the 6600 and 6650 are at decent prices now. It would not have happened if the A750/A770 hadn't come out and managed to hit AMD's market share in just under a year.
 
In Europe, the RX 6650 XT is 300 euros, VAT included. That is 33% lower than the Arc A770, with 2-3% better performance as well. That would look really great if it weren't 2-year-old tech with that low 128-bit bus.
 
Intel has been claiming performance improvements for DX9, DX11, and DX12 titles, as clearly shown in the third image in the article.


Besides, it's not like the article specifically states that we were only looking at DX9 performance improvements -- it was an overall look, comparing the launch driver used in the first review to the latest one available, at the time of testing.

You're right, they are showing a small increase in DX12 performance. It's interesting that in almost all of your tests the performance between launch and latest drivers was identical or within a frame or two. Does that surprise you? Do you think Intel is lying about their efforts?

Of course, there wasn't a DX12 performance problem, so even if those didn't improve (potential dishonesty aside) it's less of a concern for gamers. It was primarily a DX9 problem, and your one test validated their claims on those improvements.
 
Does that surprise you? Do you think Intel is lying about their efforts?
Not surprised at all, but I don't believe for one moment that Intel is falsifying results. The situation is no different from when different reviewers get rather different fps in the same games, despite using very similar hardware. It comes down to matters such as whether an in-game benchmark is being used or actual gameplay, and in the case of the latter, exactly where and in what scenario the gameplay is recorded can make a huge difference.

That said, all of the top vendors cherry-pick benchmark results to show products in the best possible light, or at the very least, that's what is attempted. Steve's Far Cry 6 figures, for example, suggest a bigger improvement than Intel's chart does, but that's not necessarily indicative of the entire game. It is, though, a valid relative assessment, because all GPUs in any review are measured in the same situation (or as similar as it is possible to achieve).
 
I agree, but is Intel keeping Nvidia and AMD on their toes? It seems they are distancing themselves from Intel with the new generation in terms of price, and delaying the mid- to low-end tiers so as not to directly compete with Intel. If you ask me, they are in cahoots with each other to prevent flooding the market and causing a crash in GPU prices, which we all thought would have happened by now. Thoughts?
AMD doesn't want to hurt its console business by making lower-end PC gaming cost-effective. Nvidia... I suppose its executives think mindshare alone will sell its low-end cards. It may have to adjust that strategy a bit if Intel becomes more serious about price-to-performance. Since this is Intel we're talking about, I'm not going to hold my breath, especially as long as it has to order from TSMC.
 