In context: Facebook has a history of mishaps stemming from poor control over the information disseminated on its platforms. Now that WhatsApp uses end-to-end encryption, the company faces a new problem: it can't see what people forward to each other in chats, allowing misinformation to go viral and fly under the radar.
Messaging apps with the sheer scale of Messenger and WhatsApp are the platforms of choice for malicious actors looking to spread misinformation about Covid-19. Messages can be forwarded and spread like wildfire, persuading people to treat conspiracy theories and medical quackery as fact.
To some, these messages are good for a laugh, but for others they fuel outrage and irresponsible behavior. In the UK, several 5G towers were set on fire as a direct result of people falling prey to misinformation. And with an estimated 40+ percent increase in WhatsApp, Facebook, and Instagram usage since the start of the coronavirus pandemic, there's a greater risk of tragic events like those that happened in India in 2019.
Facebook today introduced a more restrictive policy on message forwarding in a bid to limit people's ability to spread fake news and conspiracy theories. Under the new rule, once you receive a message that has been forwarded more than five times, you'll only be able to forward it to one chat at a time.
You'll be able to recognize these messages quite easily, as they'll be marked with a double arrow icon. If you see such a message, the best thing you can do for yourself and everyone else is to fact-check it before sending it to others. You can also use a bot made by the World Health Organization to get easy access to information that has been carefully vetted by healthcare professionals.
As Facebook notes, the new policy won't stop people from spreading viral messages entirely. Because messages are end-to-end encrypted, the company can't implement an outright ban on specific content.
The current decision to restrict everyone's ability to forward messages strikes a balance between allowing people to connect privately and adding enough friction to prevent harmful content from becoming viral as quickly as before.
Before 2018, WhatsApp users were able to forward a message to 250 groups at once. That year, Facebook reduced the limit to 20, and then further reduced it to five in 2019. The company says last year's restriction alone resulted in a 25 percent reduction in message forwarding globally.