Leaked benchmarks show the Snapdragon 8cx can rival Intel's i5-8250U

midian182

Something to look forward to: We know Windows 10 on ARM laptops boast several benefits over machines sporting Intel and AMD processors, but performance hasn’t been one of them. With Qualcomm’s Snapdragon 8cx, however, that’s starting to change, as shown in leaked benchmarks.

While Snapdragon-powered PCs have brought incredible battery life of over 20 hours, “always-on, always-connected” features, fanless designs, and super-fast boot times, their performance compared to traditional laptops has been poor.

Back in December, Qualcomm announced the Snapdragon 8cx, the company's third compute chipset and the first 7nm platform built for PCs. It’s the largest processor Qualcomm has ever made and features eight custom cores in its Kryo 495 CPU. These four high-performance Cortex-A76 cores and four low-power Cortex-A55 cores are all clocked higher than those in the Snapdragon 850.

Qualcomm later showed off benchmarks comparing the 7W Snapdragon 8cx against Intel’s 15W Core i5-8250U mobile processor using 3DMark and PCMark, with the 8cx ahead of Intel’s chip in the majority of tests.

Now, Geekbench numbers have backed up these reports, showing a Snapdragon 8cx clocked at 2.84GHz gaining a single-core score of 3,327 and a multi-core score of 11,154. This puts it within touching distance of the i5-8250U, which has a 3,659 single-core score and an 11,192 multi-core score.
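
For a rough sense of how close those leaked numbers are, here is a quick back-of-the-envelope comparison in Python (a minimal sketch using the scores from the leaked listings above, which have not been independently verified):

# Rough comparison of the leaked Geekbench 4 scores quoted above.
# The figures come from the leaked listings and are not independently verified.
scores = {
    "Snapdragon 8cx": {"single": 3327, "multi": 11154},
    "Core i5-8250U":  {"single": 3659, "multi": 11192},
}

def deficit(a, b):
    """Return how far score a trails score b, as a percentage."""
    return (b - a) / b * 100

sd, i5 = scores["Snapdragon 8cx"], scores["Core i5-8250U"]
print(f"Single-core deficit: {deficit(sd['single'], i5['single']):.1f}%")  # ~9.1%
print(f"Multi-core deficit:  {deficit(sd['multi'], i5['multi']):.1f}%")    # ~0.3%

In other words, the 8cx trails by roughly nine percent in single-core and by well under one percent in multi-core, at roughly half the rated TDP.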

At the same Computex event where it announced the benchmarks, Qualcomm revealed it is partnering with Lenovo on the first 5G-enabled always-connected laptop, currently known by the code name Project Limitless. It features a Snapdragon 8cx SoC and a second-generation Snapdragon X55 5G modem alongside Cat 22 LTE.

Last month, Samsung revealed that the successor to its Galaxy Book 2, the Galaxy Book S, would be powered by the Snapdragon 8cx and offer up to 23 hours of continuous video playback on a single charge. Expect this laptop and other 8cx-powered machines to arrive in the run-up to the holidays.


 
Good work Qualcomm, but isn't the iPad Pro already at 17000?

Oh, and Primate Labs just changed the methodology, so expect a more modest score when you see it on Geekbench 5.
 
Hurray, it just about matches a budget mobile processor from two years ago... maybe an apples-to-apples comparison would shed a bit more light?
 
Is it directly comparable though? Phone benchmarks have always felt wanky to me.
 
It's a shame they're going to try to run Windows on it. Of course it will be fine if you stick to apps from the Windows Store, but who does that? How do Steam games perform on it? x86 emulation has always been what kills Windows on Snapdragon.
 
@midian182 I know "continues" passes spellcheck in this sentence.. "the Galaxy Book S, would be powered by the Snapdragon 8cx and offer up to 23 hours of continues video playback on a single charge", but I think (IMHO), you meant to write "continuous".
 
Different OS on each... and vastly different hardware as well... which is why I'm not really sure why these comparisons are being taken seriously - they're clearly for marketing purposes only.
It seems you are yet another person who does not understand benchmarking. It is perfectly practical to benchmark across different hardware devices and different OSes; it simply has to be calibrated against a known and measurable index. Time is a good one.

While I have some issues with the absolute magnitude of Primate Labs' benchmarking calibration between OSes, those issues are small and reasonable, and clearly not "marketing".

The purpose of a benchmark is not to declare an "absolute winner"; it is to measure different hardware's relative performance on known and measurable tasks. Thus an A12X Bionic (with a GB4 score of 17000) is significantly faster than another SoC (with a GB4 score of 11000) when running tasks similar to those chosen in the benchmarking suite.
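
If it helps, the arithmetic behind that is nothing more than a ratio of scores; a minimal sketch in Python, using the round GB4 figures quoted above:

# Relative performance on a benchmark workload is just the ratio of scores.
# The numbers below are the approximate GB4 figures mentioned above.
a12x_gb4 = 17000       # A12X Bionic, approximate score
other_soc_gb4 = 11000  # the other SoC, approximate score

ratio = a12x_gb4 / other_soc_gb4
print(f"A12X is roughly {ratio:.2f}x faster on the GB4 workload")  # ~1.55x
# This says nothing about an "absolute winner" -- only how the two chips
# compare on the specific tasks the benchmark suite happens to run.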
 
Oh I understand it... but as many would tell you, iOS vs. Android and Windows can never be fully compared, as there are no equivalent software comparisons to make - apps run quite differently on iOS compared to Windows/Android.

And even if you could... so what? If you’re interested in running an app/program on iOS, you are locked into that ecosystem anyway.

The original article has Qualcomm comparing its SoC to the Intel i5... at least those will eventually be running the same programs... of course, it’s still flawed, since an old budget mobile CPU doesn’t really serve the same purposes as the new Snapdragon...
 
This would be better served running an always-on Linux machine. If you want to run Windows apps, there's Wine.
 
We've known for quite some time that software runs differently between systems, so it's not an apples-to-apples comparison. If it were Android vs. Android then yes, it works (all the time, 90% of the time), but comparing iOS, Android, and Windows across vastly different hardware and software is not easy at all. Benchmarks have been really unreliable on mobile.
You just can't answer this question: does 10k points on iOS equal 10k points on Windows?

Synthetic tests are very wanky on mobile. Even testing regular apps is not perfect, since one app might be better optimized than another for a certain OS. Which do you choose to test?
 