Intel appears to have pulled a fast one on benchmark tests for its Xeon processors

emorphy

Doh! It's not unheard of for a company to play fast and loose with benchmark tests, but the fallout can be embarrassing when it gets caught. Intel now finds itself in that situation, as SPEC reports that the chipmaker allegedly used a custom-designed compiler across thousands of its published benchmark results.

When third parties benchmark IT products, everyone assumes that the tests are conducted on a level playing field – and that the results shown have not been subject to sleight of hand. The occasional discovery that this was not the case can be maddening, considering the weight the industry and purchasing companies put on benchmarks.

Prepare to be maddened.

The Standard Performance Evaluation Corporation (SPEC) has amended around 2,600 records covering SPEC CPU 2017 results for Intel CPUs with a note indicating that the results for Intel's Xeon processors – the older ones primarily – were invalid. An investigation stemming from a SPEC internal audit found that Intel had used compilers that performed transformations with narrow applicability, meaning the results were unlikely to be anything end users could expect to experience.
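To make "transformations with narrow applicability" concrete, here is a purely hypothetical toy sketch of what such a benchmark-targeted optimization could look like. Every name and pattern below is invented for illustration; none of it reflects the internals of Intel's oneAPI compiler or the actual xalancbmk code:

```python
# Toy illustration of a "narrow applicability" compiler transformation --
# the kind SPEC's run rules prohibit. Entirely hypothetical.

def toy_optimize(source: str) -> str:
    """Apply a rewrite only when the code matches one specific
    (made-up) benchmark kernel, leaving general code untouched."""
    # Fingerprint of a hot loop from a hypothetical benchmark.
    benchmark_fingerprint = "for i in range(n): total += lookup[keys[i]]"
    if benchmark_fingerprint in source:
        # Substitute a specialized routine that only helps this exact
        # pattern -- great on the benchmark, useless everywhere else.
        return source.replace(
            benchmark_fingerprint,
            "total += sum(lookup[k] for k in keys[:n])",
        )
    return source  # ordinary workloads see no change

generic = "for i in range(n): total += data[i] * 2"
targeted = "for i in range(n): total += lookup[keys[i]]"

print(toy_optimize(generic) == generic)    # real-world code: untouched
print(toy_optimize(targeted) == targeted)  # benchmark code: rewritten
```

The point of the sketch is that such a pass inflates the score of the one matched benchmark without delivering any speedup an end user would see on their own code, which is exactly why SPEC invalidated the results.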

Intel used a compiler – specifically the oneAPI DPC++/C++ compiler – optimized for one specific SPEC CPU 2017 benchmark, 523.xalancbmk_r / 623.xalancbmk_s, rather than for the kinds of workloads the test is meant to represent. To put it bluntly, Intel cheated.

Michael Larabel at Phoronix believes that Intel's specially tuned compiler could have inflated the score of the affected benchmark by around 9 percent, and the overall SPECint rate result by around 4 percent.

ServeTheHome said that the 4th Gen Intel Xeon Sapphire Rapids results appear to have been the most affected, based on several spot checks of SPEC's amended records. It noted that the optimizations were present in the 2022 compiler release but absent from the latest version (2023.2.3), which was generally used for the recent 5th Gen Intel Xeon Emerald Rapids launch.

This deception is hardly the first time a company's integrity around benchmark results has been questioned. Tom's Hardware points out some of the more infamous cases, such as when Nvidia allegedly performed a driver-side optimization to boost the performance of its GPUs in 3DMark 2003. Seven years later, in 2010, Nvidia accused AMD of using a different image quality setting to boost benchmark results against comparable GeForce cards. Mobile chip suppliers Qualcomm, Samsung, and MediaTek supposedly faked Android performance results in 2020.


 
They were too much in a hurry to release the next generation and had too much work to do, so much so that they didn't have the time for custom compilers :)
 
Ladies, ladies, don't get your panties all in a bunch.

Just read the bottom line of any spec sheet, and all deceits will be revealed.

"Specifications subject to change without notice."

It's a technological parody on this line from the Bible, "forgive us our trespasses". And as for manufacturers, "forgiving those who trespass against them"; c'mon, nobody here has ever sent anything back that they broke, but claimed it was a "defect in materials and workmanship"?
 