New Intel Arc drivers bring some massive improvements to DirectX 9 performance

AlphaX

What just happened? One of Intel Arc's most significant flaws is its performance in games that use the DirectX 9 API, where some competing cards can outperform it by over 100 percent. This week, however, Intel released a driver update that it says delivers "up to 1.8x" higher frame rates in DX9 titles.

In October, Intel released the second and third entries in the "A-series" of its Arc line of graphics cards, the A750 and the A770. Our review of both products showed they had promise, but that they could only reach their potential if Intel put in the effort to make some much-needed improvements, the largest of which concerned DirectX 9 (DX9) performance.

Shortly before launch, Intel announced that Arc would not feature native DX9 support and would instead emulate the API by translating DX9 calls to DirectX 12 (DX12). Intel expected this approach would let it focus on the cards' DX12 support without sacrificing notable performance in DX9 titles. Unfortunately for Intel, that was not the case.
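For readers wondering what "emulating" an API means in practice, here is a deliberately simplified, hypothetical C++ sketch of the idea: a legacy D3D9-style call is not executed natively but is recorded into a D3D12-style deferred command list and submitted later in a batch. The types below are stand-ins invented for illustration, not real DirectX interfaces.

#include <cstdio>
#include <functional>
#include <vector>

// Stand-in for a D3D12-style deferred command list.
struct CommandList {
    std::vector<std::function<void()>> commands;
    void Execute() {
        for (auto &cmd : commands) cmd();   // submit the recorded work
        commands.clear();
    }
};

// Stand-in for the D3D9 interface a legacy game calls into.
struct LegacyDevice9 {
    CommandList &list;
    // The game believes Clear() happens immediately; the layer records it.
    void Clear(float r, float g, float b) {
        list.commands.push_back([=] {
            std::printf("clear render target to (%.1f, %.1f, %.1f)\n", r, g, b);
        });
    }
};

int main() {
    CommandList list;
    LegacyDevice9 device{list};
    device.Clear(0.0f, 0.0f, 0.0f); // D3D9-style call, deferred under the hood
    list.Execute();                 // translated work runs as a batch
    return 0;
}

The translation itself is cheap; the performance risk comes from how well the layer batches and schedules the recorded work, which is exactly the kind of thing driver updates can improve.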

When we tested the A750 and A770 in Counter-Strike: Global Offensive (CS:GO), the most popular game on Steam and, as it happens, a DX9 title, the performance was simply jarring. While every other card in the review hit the 350–360 FPS range, the Arc graphics cards only managed a rather embarrassing 146 FPS average.

Intel did warn buyers that the initial launch of Arc might not always be smooth sailing, but losing 200 FPS is unacceptable. Since then, Intel's in-house developers have been working hard on new drivers to improve performance across all games, especially those using DX9. This week, Intel released Arc Driver 3953, which Intel says can boost frame rates by "up to 1.8x."

While frame rates may not improve significantly in some games, such as Guild Wars 2 or Payday 2, others come close to or reach the claimed 1.8x increase. League of Legends, Stellaris, and StarCraft 2 all show solid gains, with the first improving by nearly 40%, but the real winner in these results is CS:GO.

Intel's tests on CS:GO showed the game's average frame rate rising from 177 FPS to 318 FPS with the new drivers, roughly the full 1.8x improvement, which brings it closer to the FPS range the other cards managed when we tested them in October. It is worth noting that Intel tested at 1080p with high settings, whereas we used medium settings. Intel saw the same gains at 1440p, a resolution we also covered in our review.

Overall, these driver improvements are a great sign. More competition in the graphics card market is a big deal for consumers, and it is encouraging to see Intel continuously improving Arc to make it a viable third option for buyers.

You can grab the new drivers from our downloads section.


 
I'm an honest guy, and I will eat my own words. Intel is really committed, and I can't believe it. Nvidia should be worried. I do understand Nvidia is doing very well in the business sector, though. At the same time, Intel should be doing well with the amount of money and muscle it has, so no pity from me. They have joined at a very easy time, and we need better choices when it comes to gaming.
 
I'm an honest guy, and I will eat my own words. Intel is really committed, and I can't believe it. Nvidia should be worried. I do understand Nvidia is doing very well in the business sector, though. At the same time, Intel should be doing well with the amount of money and muscle it has, so no pity from me. They have joined at a very easy time, and we need better choices when it comes to gaming.

For me, it will take 1-2+ years of good driver updates before I would even consider an Intel GPU. Nvidia and AMD don't have anything to worry about yet.
 
For me, it will take 1-2+ years of good driver updates before I would even consider an Intel GPU. Nvidia and AMD don't have anything to worry about yet.
Same here. I will be watching how well old games and future ones play, and if AMD gives me a reason to jump ship I will, but I doubt it. Maybe I will re-examine my options in 3-5 or so years; I usually buy a new card around that time frame.
 
And just a few months ago the early headlines and reactions were...
"DOA! DOA! DOA!"
"Slower than a 3060!? lol"
"Poor DX9 support?!"

There is little passion and optimism in the tech community. It wasn't always like this. Everyone would rather predict failures and attack companies. It's gross, boring and annoying.
 
Intel is a superpower. I expect them to make an A790 with twice the transistors and power of the A770, and twice the VRAM as well (32 GB; we are in the era of big neural network models), sell it for half the price (~$150, and yes, it would still be profitable), and drop all the other models entirely. How can game developers do their work efficiently with so many different GPU models on the market? In the long run, this is toxic for the market.

Why pay for marketing to enter a new market when the product can do the convincing?
 
Intel's latest move to take back the #1 spot ..... and we the consumers will benefit .... I like it!
 
And just a few months ago the early headlines and reactions were...
"DOA! DOA! DOA!"
"Slower than a 3060!? lol"
"Poor DX9 support?!"

There is little passion and optimism in the tech community. It wasn't always like this. Everyone would rather predict failures and attack companies. It's gross, boring and annoying.

I think there's just a lot of anti-Intel sentiment online. No matter what Intel does, some find a way to spin the situation into a negative. Raptor Lake and Alder Lake provide healthy competition against Zen 3 and Zen 4. When Rocket Lake couldn't keep up with Zen 3, the 5950X was $800 and routinely sold out. Then Alder Lake came out and completely turned the situation around, and Raptor Lake added even more grunt, to the point where AMD felt the heat and priced the 7950X at $699 at launch. Then, with the strong market reaction to Raptor Lake and competitive pricing from Intel, AMD dropped the price of the 7950X to $549 during Black Friday. That is unprecedented in the recent pandemic years.

Yet many ridicule Raptor Lake, saying it's a power-hungry beast that needs a nuclear reactor to power it, etc., not realizing that stock board voltages are too high, driving unnecessary power consumption and heat under load, and that under light loads the chip only consumes 15-20 W.

Intel announces it's on track with future nodes like Intel 4, Intel 3, and Intel 20A, and some people say it's BS or corporate spin, or they hope Intel is delayed because AMD needs to remain on top. Or they say Intel pays companies like Dell and HP not to use AMD, and that Intel is evil incarnate.

If another company had launched Arc, say a startup or even AMD, the reaction wouldn't have been so negative, in my humble opinion.
 
Alchemist has been a disaster IMO, but I hope they persist, get Battlemage out on time, and target RDNA 3.5 and Lovelace+ levels of performance, as it'll be up against RDNA 4 and Blackwell. God, we need competition in GPUs more than ever, especially now that Nvidia has officially lost the plot under the money-grubbing Huang.
 
It's time to test the Arc GPUs again with the latest drivers.

The TechSpot review used the launch drivers for the Arc A770 and A750.
 
And just a few months ago the early headlines and reactions were...
"DOA! DOA! DOA!"
"Slower than a 3060!? lol"
"Poor DX9 support?!"

There is little passion and optimism in the tech community. It wasn't always like this. Everyone would rather predict failures and attack companies. It's gross, boring and annoying.
Nope, it's not. They came out with an unfinished product, and they deserved some a$$ kicking, just as much as they deserve a thumbs up when doing something right. Competition is good, but it's not my concern to cheer them on through every failure. The way I see it, the more demanding people are, the better the end product will be, so "negativity", in a way, drives competition to a higher level. It's not like a multi-billion-dollar company like Intel is going to say: "buhuhu, so much negativity hurt our feelings, we'd better shelve this project." They don't need our emotional support, they need our money, so it's up to them to get better and earn it.
 
And just a few months ago the early headlines and reactions were...
"DOA! DOA! DOA!"
"Slower than a 3060!? lol"
"Poor DX9 support?!"
You're overselling the "everyone's a hater" thing a little too much. Many of us DO want a third player in the GPU market, especially at the low end where AMD / Nvidia offerings are dire, but people bashed Intel for poor DX9 support because of benchmarks like this in literally the single most widely played title on Steam. The reason for that abysmal performance (where the £400 A770 is getting savaged by a £240 RX 6600) is the rather stupid design decision to remove hardware acceleration for DX9 for literally no gain at all. No die-space or monetary savings were made (if anything, having to go back and hand-optimise support per title in the driver probably cost them more), which is why AMD and Nvidia haven't removed it. Years-old AMD 2200G-series APUs and even Intel HD 630 Kaby Lake-era iGPUs have a hardware capability that a £400 Arc dGPU lacks. What's wrong with this picture?

I hope Intel GPUs do succeed, but "pay £400 for a GPU that needs software emulation yet still loses to a competing GPU half its price" was not the cleverest way of entering the low-end market (where people typically play a wider mix of titles, old and new alike), and a lot of the criticism at the time was based on that rather than blind "Intel hate".

Edit: Even with these drivers, some gains are said to be due to title-specific optimisations. It's great that the six titles mentioned in the charts run faster, but many of the thousands of other DX9 games (link) released from 2002 through to AAA titles well into the 2010s, e.g., Dishonored (2012), may not have improved anywhere near as much. I'd certainly like to see a far wider range of games benchmarked, and not just for raw performance; things like ruling out the glitches that often come with API wrapping matter too.
 
It's just not easy to make a graphics card at all while competing with the two giants, team Red and team Green. They have excellent driver experience, excellent hardware, and pretty much two decades of experience in GPUs. You don't compete with that in just one or two generations. It will take a few, and at best they might actually accomplish something.

I personally would not buy an Intel graphics card. That is more personal, over the business practices it used against AMD. The fines Intel received for that were minimal, because people were led to believe for years that Intel, due to its higher clock speed (Pentium 4), should be faster than an AMD chip... right?

Or how HP, Compaq, and all those brands were almost forced to sell only Intel products, while AMD had the better chip overall. Sigh. Let's not forget Intel's recent move with its Sapphire Rapids CPUs, where you can "unlock" certain parts of the hardware using a software license. I mean, what are we doing? Products are no longer ours, it seems.
 
The reason for that abysmal performance is the rather stupid design decision to remove hardware acceleration for DX9 for literally no gain at all. No die-space or monetary savings were made (if anything, having to go back and hand-optimise support per title in the driver probably cost them more), which is why AMD and Nvidia haven't removed it.
AMD's and Nvidia's GPUs don't contain hardware just for D3D9 applications. The same internals get used no matter which API the application was coded with. D3D9 came out before the days of unified shader architectures, and those first GPUs were structured in such a way that their SIMD units were best suited to those types of operations and data formats.

Those first unified shader architectures were SIMD16 (ATI) or SIMD8 (Nvidia), before moving to SIMD64 (ATI) or SIMD16 (Nvidia); they're now SIMD32/64 (AMD) or SIMD16/32 (Nvidia). Intel stuck with SIMD4 units all the way through until it reached Gen 12, which is still only SIMD8 (as is Alchemist).

Intel chose a D3D9-to-D3D12 mapping layer for apps creating D3D9 devices because the Alchemist architecture works best when loaded with lots of thread-heavy workgroups, something D3D12 handles far better than any previous API version. That didn't matter much in the original D3D9 era, because game performance was heavily dominated by fill rate, not compute performance. Specifically, Intel uses two layers (D3D9On12 and DXVK), with the most appropriate one pre-selected for each game in the drivers.
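For the curious, the D3D9On12 path is a public Windows component, and an application can even opt into it explicitly. Below is a minimal sketch of that entry point, Direct3DCreate9On12 (declared in d3d9.h on recent Windows 10/11 SDKs); on Arc, the driver makes an equivalent choice transparently, so games don't need to do this themselves.

// Minimal sketch: explicitly requesting the D3D9On12 mapping layer.
// Requires a Windows 10 2004+ SDK; link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main() {
    // Ask the runtime to back the D3D9 device with D3D12. Leaving the
    // D3D12 device/queue fields zeroed lets the runtime create its own.
    D3D9ON12_ARGS args = {};
    args.Enable9On12 = TRUE;

    IDirect3D9 *d3d9 = Direct3DCreate9On12(D3D_SDK_VERSION, &args, 1);
    if (!d3d9) {
        std::printf("D3D9On12 is not available on this system\n");
        return 1;
    }

    // Report which adapter the translated device would sit on.
    D3DADAPTER_IDENTIFIER9 id = {};
    if (SUCCEEDED(d3d9->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        std::printf("Adapter: %s\n", id.Description);

    d3d9->Release();
    return 0;
}

DXVK, the second layer mentioned, is a separate open-source project that translates D3D9 to Vulkan instead; which of the two a given title gets routed through is a per-game choice baked into Intel's driver.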
 
I personally would not buy an Intel graphics card. That is more personal, over the business practices it used against AMD. The fines Intel received for that were minimal, because people were led to believe for years that Intel, due to its higher clock speed (Pentium 4), should be faster than an AMD chip... right?
I hate to compare this to politics, but "truth" is a relative term, unfortunately coalescing with any individual's belief system. A unique and salient example of this is the "stolen election" claims of 2020. People will believe what they were told to believe, which doesn't always coincide with the truth.

Where the typical AMD fan leaves off in this saga is how "evil Intel deceived people into believing that clock speed (Intel) is superior to dual-core performance". Need I remind you that AMD "deceptively" marketed its CPUs with model numbers coinciding with the apparent single-core performance they exhibited? OK, that's "self defense", and I get it. But that's where the story conveniently ends.

When Intel finally got the hang of dual-core performance, it released the Core 2 Duo E6300. It was good enough, as were its successors, to just about bankrupt AMD, leaving them in Intel's wake for the better part of a decade. So, "AMD is back with a vengeance". "Hoo rah", as they say in the Marine Corps.

Intel seems to have rested on its laurels, and should be castigated for that, true.
I personally would not buy an Intel graphics card. That is more personal, over the business practices it used against AMD. The fines Intel received for that were minimal, because people were led to believe for years that Intel, due to its higher clock speed (Pentium 4), should be faster than an AMD chip... right?
The art of spending money shouldn't depend on raw emotion. Again, this finds a parallel in politics. When someone doesn't vote on "principle", it leaves the result to chance, that chance being the person you least wanted in office winning the race.

In the GPU field, that leaves Nvidia in charge. Granted, the RTX 4090 has no equal as of today. Still, there are less lofty goals to be pursued with AMD, and hopefully soon with Intel as well. It depends on how badly your "want" turns into "need", which ultimately can morph into "greed", plain and simple.

From this non-gamer's perspective, building gaming computers is naught but "hot rodding for nerds".

The trap here is when your computer's capabilities exceed your own, and you have nobody to blame for losing a game but yourself.
 
And just a few months ago the early headlines and reactions were...
"DOA! DOA! DOA!"
"Slower than a 3060!? lol"
"Poor DX9 support?!"

There is little passion and optimism in the tech community. It wasn't always like this. Everyone would rather predict failures and attack companies. It's gross, boring and annoying.
And when you get legitimately excited about a new and interesting product, you're declared a "fanboy". The internet sucks.
 
You're overselling the "everyone's a hater" thing a little too much. Many of us DO want a third player in the GPU market, especially at the low end where AMD / Nvidia offerings are dire, but people bashed Intel for poor DX9 support because of benchmarks like this in literally the single most widely played title on Steam. The reason for that abysmal performance (where the £400 A770 is getting savaged by a £240 RX 6600) is the rather stupid design decision to remove hardware acceleration for DX9 for literally no gain at all. No die-space or monetary savings were made (if anything, having to go back and hand-optimise support per title in the driver probably cost them more), which is why AMD and Nvidia haven't removed it. Years-old AMD 2200G-series APUs and even Intel HD 630 Kaby Lake-era iGPUs have a hardware capability that a £400 Arc dGPU lacks. What's wrong with this picture?

I hope Intel GPUs do succeed, but "pay £400 for a GPU that needs software emulation yet still loses to a competing GPU half its price" was not the cleverest way of entering the low-end market (where people typically play a wider mix of titles, old and new alike), and a lot of the criticism at the time was based on that rather than blind "Intel hate".

Edit: Even with these drivers, some gains are said to be due to title-specific optimisations. It's great that the six titles mentioned in the charts run faster, but many of the thousands of other DX9 games (link) released from 2002 through to AAA titles well into the 2010s, e.g., Dishonored (2012), may not have improved anywhere near as much. I'd certainly like to see a far wider range of games benchmarked, and not just for raw performance; things like ruling out the glitches that often come with API wrapping matter too.
Everyone? No. I was quite specific about who I was referring to. I even put quotation marks around actual examples....
 
I usually rely on facts. In the context of this news piece, the facts are as follows:
- the massive performance boost stems from a massive initial performance deficit; simply put, the baseline was set low
- Intel claims something, but there are no independent tests yet
- not a single user in the community has said they own an Intel A7x0 board and are happy with these updated drivers
- there's no evidence that these new drivers are stable, or that there's no regression in other games (i.e., the drivers became better, I didn't say they're 'good').
 
I usually rely on facts. In the context of this news piece, the facts are as follows:
- the massive performance boost stems from a massive initial performance deficit; simply put, the baseline was set low
- Intel claims something, but there are no independent tests yet
- not a single user in the community has said they own an Intel A7x0 board and are happy with these updated drivers
- there's no evidence that these new drivers are stable, or that there's no regression in other games (i.e., the drivers became better, I didn't say they're 'good').
Steve (Hardware Unboxed) has Arc A750 and A770 GPUs. I'm sure we'll see a test in the near future.
 