Intel Arc A350M GPU sees significant performance boost in games with Dynamic Power Share...

nanoguy

Bottom line: As is usually the case with the first generation of any product, Intel's Arc A-series GPUs seem to be suffering from driver issues that keep them from showing their true potential. It's still early days for Team Blue's dedicated GPUs, but the company will have to get the drivers right: most gamers will be reluctant to adopt a new solution over mature offerings from Nvidia and AMD.

Intel's laptop Arc GPUs were announced at the end of March, but they've only recently started trickling down to consumers in South Korea. Perhaps even more disappointing is the fact that only a few laptops from Samsung and HP are equipped with Team Blue's dedicated GPUs, and even there we're only talking about entry-level models like the Arc A350M and Arc A370M.

Some have speculated that Intel wanted to hit the promised Q1 2022 launch date while stalling retail availability as much as possible so that engineers could perfect the software side of things. According to early adopters like South Korean YouTuber BullsLab, who have taken a closer look at the new GPUs, this is a likely explanation for Intel's rather subdued Arc A-series launch.

To put things in context, leaked benchmarks involving these entry-level Arc 3-series GPUs have long suggested Intel's parts are a work in progress, with performance only comparable to low-end Nvidia and AMD counterparts. Even AMD took a jab at the Arc A370M GPU by posting benchmarks online that suggested the Intel part was far from being able to keep up with the Radeon 6500M despite packing more transistor logic.

Intel seems to have focused on the performance-per-watt with its mobile Arc A-series GPUs, with things like Dynamic Tuning Technology (DTT) to manage the power-sharing between the CPU and the GPU as well as a different approach to adjusting clock speeds based on the workload and the power envelope. In the case of Intel Arc, this technology is called Dynamic Power Share, but it is essentially the same as Nvidia's Dynamic Boost and AMD's SmartShift.
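Conceptually, all three technologies arbitrate a shared power budget between the CPU and GPU. A minimal toy model of such an arbiter is sketched below; the allocation rule, the 50 W budget, and the 10 W CPU floor are hypothetical illustrations, not Intel's actual algorithm.

```python
def allocate_power(total_budget_w, cpu_demand_w, gpu_demand_w, cpu_floor_w=10.0):
    """Toy shared-power arbiter: grant the CPU its demand (down to a
    guaranteed floor) after reserving the GPU's demand, then hand the
    GPU whatever remains of the budget. All numbers are hypothetical."""
    cpu_grant = min(cpu_demand_w, max(cpu_floor_w, total_budget_w - gpu_demand_w))
    gpu_grant = min(gpu_demand_w, total_budget_w - cpu_grant)
    return cpu_grant, gpu_grant

# A GPU-bound game under a hypothetical 50 W shared budget: the GPU wants
# its full 30 W envelope and the CPU wants 28 W, so the arbiter trims the
# CPU to 20 W to keep the combined draw inside the budget.
cpu_w, gpu_w = allocate_power(total_budget_w=50.0, cpu_demand_w=28.0, gpu_demand_w=30.0)
print(cpu_w, gpu_w)  # 20.0 30.0
```

With a larger budget (or the arbiter disabled), both components simply run at their demand, which is consistent with BullsLab's observation of ~30 W on the GPU and ~28 W on the CPU once DTT was turned off.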

The problem is that it doesn't seem to work properly with the current drivers, and disabling it can yield some rather significant performance gains. BullsLab looked at performance in six games and noticed that turning DTT off allowed GPU usage to climb from around 50 percent to over 90 percent and clock speeds to rise from less than 2,000 MHz to around 2,200 MHz. The CPU was also able to boost higher, and overall performance immediately improved anywhere from 60 to 100 percent on average.

What is even more interesting is that even with DTT or Dynamic Power Share disabled, the GPU still ran within its 30-watt thermal envelope, and the CPU power usage only climbed to around 28 watts. This may be a case of aggressive throttling to keep the thermals of the relatively slim Galaxy Book2 Pro in check, but we won't know for sure until these units land in the hands of more independent reviewers.

BullsLab also noted that stuttering was present in most of the games they tested, so we can only hope Intel irons out these issues with driver updates over the coming months. The mid-range Arc 5-series and high-end Arc 7-series GPUs are expected to arrive in early summer with better specs and higher power envelopes, so it will be interesting to see how they perform against AMD and Nvidia offerings.


 
Well, this should surprise nobody. I actually think this is a rare case where Intel probably already has enough stock to release cards to more laptops, or even the desktop GPU market, but is likely holding back and beta testing in Korea to get a bit more maturity out of their drivers before a large Western release.

It's honestly not looking good. Hopefully their ML and other workstation and data center workloads work out better for them, because I seriously doubt they have enough legs to even get to a second-generation consumer product.
 
It was always going to be about the drivers, and as expected it's a mess currently. I wouldn't touch a first-gen product with a 100' pole, and Intel's driver support for their iGPUs over the years has been pathetic.

Also, these parts will soon have to be compared against Lovelace and RDNA3, so Intel better get its drivers sorted real fast.
 

Totally agree. Any serious enthusiast should leave the growing pains to less demanding users, IMHO. As for drivers, as I mentioned in a different thread, performance expectations for high- to mid-level GPUs and CPU iGPUs are totally different. The latter simply don't stress the hardware to the same level on average, so the drivers don't need to be as tight. Intel is entering a totally different market here, and as both AMD's and Nvidia's occasional (or not so occasional) struggles with driver issues show, getting the software right is the real battle; the hardware is relatively easy in comparison.
 