Microsoft is leveraging Intel CPUs to help detect crypto-mining malware

Shawn Knight

The big picture: Cybercriminals are locked in a constant game of cat and mouse with security experts, one that goes a little something like this: criminals discover a new attack vector and start exploiting it, only for the good guys to bolster their defenses and thwart the attack. With the old strategy no longer effective, the bad guys set out in search of a new angle or product to exploit. Rinse and repeat.

We’re seeing the cycle repeat itself once again, this time with attackers hot on the trail of one of tech’s latest trends, cryptocurrency.

Rather than injecting compromised systems with annoying viruses or credential-stealing malware, some hackers now covertly install crypto-mining software on their targets. This type of malware hijacks a system's processing power, and the electricity that feeds it, to mine cryptocurrency on the victim's dime. Other strains instead hunt for and steal cryptocurrency wallets stored on infected systems.

It's a growing problem, too. According to security software company Avira, coinminer malware attacks increased 53 percent in the fourth quarter of 2020 compared to the previous quarter.

In an effort to better protect users from such attacks, Microsoft is partnering with Intel. Specifically, Microsoft Defender for Endpoint will leverage Intel's Threat Detection Technology (TDT), which applies machine learning to low-level CPU telemetry to detect the execution fingerprint of coin-mining code, even when the malware has been tweaked to fly under the radar.
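
Neither company has published its model details, but the general idea, a classifier trained on hardware telemetry such as performance-counter samples, can be sketched roughly as follows. Everything here (the feature set, the synthetic data, and the choice of a random-forest classifier) is an illustrative assumption, not Intel TDT or Defender code:

```python
# Illustrative sketch only -- not Intel TDT or Microsoft Defender code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-window telemetry features:
# [instructions_per_cycle, cache_miss_rate, integer_op_ratio, branch_miss_rate]
def benign_windows(n):
    return rng.normal([1.2, 0.05, 0.40, 0.03], 0.10, size=(n, 4))

def miner_windows(n):
    # Hashing loops tend to be integer-heavy and extremely repetitive.
    return rng.normal([2.0, 0.02, 0.85, 0.01], 0.10, size=(n, 4))

X = np.vstack([benign_windows(500), miner_windows(500)])
y = np.array([0] * 500 + [1] * 500)  # 0 = benign, 1 = mining-like

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# In a real pipeline, a high score on live telemetry would raise a signal
# to the endpoint-protection layer rather than print to the console.
suspect = miner_windows(1)
print("mining likelihood:", clf.predict_proba(suspect)[0, 1])
```

The appeal of working from CPU telemetry rather than file signatures is that the behavioral fingerprint of a mining loop is hard to disguise, even when the binary itself has been repacked or obfuscated.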

Intel said its threat detection doesn’t create a performance hit, as TDT can offload performance-intensive workloads to the integrated graphics controller, ensuring the CPU is ready for other tasks.

The new threat detection capabilities are native to Intel Core and vPro platforms, and with nearly a billion Intel TDT-capable PCs on the market today, this could be a serious blow to malicious actors.

Image credit: Yevhen Vitte


 
The paragraph stating that Intel claims no performance hit sounds a little suspect to me. What if someone is playing a video game and they're already using all their GPU cores?
Nothing is free in this world. On a laptop this will consume more power to use those GPU cores, will it not?
 