Why it matters: WhatsApp is used by more than 2 billion people, and its end-to-end encryption makes it hard for Facebook to filter misleading content. Instead, the company has chosen to limit users' ability to forward viral messages on the platform, an approach that appears to strike a good balance between letting people continue sharing information and preventing the wrong kind of information from reaching a large audience.

Earlier this month, Facebook introduced stricter limits on WhatsApp message forwarding in an effort to curb the spread of Covid-19 misinformation and fake news. Specifically, it made it impossible to share viral messages with more than one chat at a time, in the hopes that users will be less inclined to forward them if doing so requires more work.

The approach appears to be working: WhatsApp reported a 70 percent reduction in the number of highly forwarded messages on the platform globally. By "highly forwarded," it means any message that has already been forwarded more than five times.

Facebook first introduced such safeguards in 2018, when it capped the forwarding of viral messages at 20 chats at a time, and later at just five, which translated into a 25 percent reduction in their spread on the platform.

Not all forwarding is bad, but because of end-to-end encryption the social giant has no way of knowing exactly what its users share in chats. Given the scale of WhatsApp – over two billion monthly users as of this writing – and a 40 percent increase in overall activity, Facebook has chosen a sensible solution that appears to work.