Intel is reportedly disabling the AVX-512 instruction set on Alder Lake CPUs

jsilva

In brief: Intel's 12th-gen Core desktop processors never officially supported AVX-512, but workarounds exist to enable the instruction set. New firmware releases might render those methods useless, however, as Intel is reportedly disabling AVX-512 on Alder Lake CPUs at the BIOS level.

When Intel introduced its 12th-gen Core "Alder Lake" desktop processors, the chipmaker never said they would support the AVX-512 instruction set. That didn't stop users from bypassing what turned out to be a soft lock: AVX-512 could be enabled on the P-cores via the BIOS by disabling the E-cores.

Those running their current BIOS can still use the workaround, but newer firmware may close it off down the line. A new report claims Intel will block AVX-512 support at the BIOS level, meaning that disabling the CPU's E-cores to enable AVX-512 won't work anymore.

With AVX-512 gone, 12th-gen Core processors will most likely fall back on the AVX2 instruction set. Unlike AVX-512, AVX2 is capped by Intel at a maximum frequency of 5.1 GHz regardless of power limits, BIOS settings, and thermal headroom, which limits its peak performance. Intel never explained this cap, but we believe it may be an attempt to prevent hardware degradation.

The reason for disabling the AVX-512 instruction set is still unclear, but a couple of explanations come to mind. The first is efficiency, as AVX-512 consumes more power than other AVX instruction sets. In specific use cases the extra power consumption translates into noticeable performance gains, but those gains are rare enough that we believe most users won't be affected.

Another possibility is that Intel wants to push those who benefit from AVX-512 toward its workstation and server CPUs. The instruction set is still barely used in the mainstream market, but it is much more broadly adopted in productivity and enterprise applications.

Also Read: Intel 12th-Gen Core Alder Lake Architectural Benchmark

Intel's removal of AVX-512 support wasn't enough to completely block the instruction set on Alder Lake desktop processors. By injecting microcode from older BIOS releases into a newer release with AVX-512 disabled, users have created custom BIOSes that still allow the instruction set to be used. As with any custom BIOS, the usual risks apply, including bricking the motherboard. Install it at your own risk!


 
(Honest question) Are any of the commonly measured benchmarks dependent on AVX-512? Because if they are well, lots of implications there.
 
There are a few specific benchmarks that target AVX-512 but they're run specifically for that reason, to test that capability. None of the "regular" CPU benchmarks we see like all the rendering and media encoding, nor the CPU-heavy game benchmarks seem to use AVX-512.
 
(Honest question) Are any of the commonly measured benchmarks dependent on AVX-512? Because if they are well, lots of implications there.
A few benchmarks support AVX-512, but the large majority do not. Disabling AVX-512 has a near-zero effect on desktop benchmarks.

Edit: a bit late but yes, agreed with Lew Zealand.
 
I've no idea what AVX-512 is good for. What type of apps use this instruction set?
 
I've no idea what AVX-512 is good for. What type of apps use this instruction set?
Software that runs heavy parallel calculations, especially floating-point ones, with low latency. Very much simplified, the AVX units in a CPU do much the same thing as modern GPUs (parallel workloads), but since the AVX units sit inside the CPU they have very low latency. GPU calculations need to be fed over the PCIe bus, which is quite slow.

The main problem with heavy AVX use is its very high power consumption. Consumer software also rarely has much use for AVX instructions.

It seems Linus Torvalds will get what he wanted: https://www.phoronix.com/scan.php?page=news_item&px=Linus-Torvalds-On-AVX-512
 
AVX-512 is used for plugins in audio workstation software: things like compressors, distortion effects, and so on. Some actually require it, so it will be interesting to see what those developers do to mitigate this.

Latency in audio breaks the ability to record, so these types of instructions get heavy use. I'd imagine it's exactly the same (plugins included) for video as well.

I've no idea what good is this avx512 for, what type of apps use this instruction set?
 
Side note: games have used AVX in the past, but never AVX-512. The last one I remember using it was Assassin's Creed Odyssey. Great game.
 
There are a few specific benchmarks that target AVX-512 but they're run specifically for that reason, to test that capability. None of the "regular" CPU benchmarks we see like all the rendering and media encoding, nor the CPU-heavy game benchmarks seem to use AVX-512.
Don't we have at least CPU-Z, AIDA64, PassMark and Geekbench? Afaik, Time Spy Extreme also supports AVX-512, so there are a few that are commonly used.
 
Could this perhaps be related to the 'P-core only' ADL releases (12600 and below)?

Technically, there's no reason AVX-512 shouldn't work perfectly fine on the budget Alder Lake CPUs, since they lack E-cores, but I guess it would look bad if budget models were better than higher-end ones.
 
The RPCS3 emulator benefited a lot from AVX-512, though on Alder Lake you had to disable the E-cores to use it anyway.
 
Don't we have at least CPU-Z, AIDA64, PassMark and Geekbench? Afaik, Time Spy Extreme also supports AVX-512, so there are a few that are commonly used.

I should have specified "application benchmarks," that is, benchmarks that represent something an actual program does, like Blender or Cinebench. In theory some people could use these to do actual work. The ones above are purely synthetic calculation benchmarks, which are fine for specific tests.

Time Spy Extreme could stand in for a game benchmark, but are there any actual games that use AVX-512? I'd rather see CPU-intensive game tests like CP2077, BF2042, Watch Dogs, etc. to see actual CPU effects on gaming.
 
Time Spy Extreme could stand in for a game benchmark, but are there any actual games that use AVX-512? I'd rather see CPU-intensive game tests like CP2077, BF2042, Watch Dogs, etc. to see actual CPU effects on gaming.
The main reason games rarely use AVX heavily is the high power consumption, which usually means lower clock speeds. And lower clock speeds mean less single-thread performance, and perhaps much less multi-thread performance too.

There are also CPUs around that slow down to "AVX clocks" if even a single AVX2 instruction is used. Like this: https://www.agner.org/optimize/blog/read.php?I=415

No wonder many game developers don't want to take the risk with heavy AVX use. If a game doesn't support AVX-512, it's a waste of time to test it with and without AVX-512 enabled on the CPU; the difference is near zero.

Edit: a quick search doesn't suggest any of those games support AVX-512, as expected.
 