Facebook has a fake news problem, and it’s enlisting the help of several fact-checking groups to stop bogus stories from spreading on the social network. The company laid out its plan in a blog post today, after facing criticism over the role that intentionally deceptive stories shared in the News Feed (and on other websites) may have played in the outcome of this year’s U.S. presidential election. Fake news and conspiracy theories were also blamed for a man’s armed assault on a pizza restaurant in Washington, D.C.

As part of the initiative, Facebook will ask users to report any story they think is dubious by clicking a button at the top right of the post. It will also use its algorithms to look for signs of bogus stories gaining traction.

If a story is flagged enough times by the community or by Facebook’s software, a group of third-party fact-checking organizations that includes Snopes, PolitiFact, ABC News, FactCheck.org and the Associated Press will be able to vet the story and report back to Facebook through a special website built exclusively for them. Stories these organizations confirm to be fake will still be shareable, but a “disputed” banner will be attached to them in the News Feed, and users who still want to share a disputed story will get a prompt asking them to confirm one last time.

According to Facebook, the approach is a way to combat “the worst of the worst, the clear hoaxes spread by spammers for their own gain” without having to become the arbiter of truth themselves. “We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully,” said Adam Mosseri, who leads product management for the News Feed.

Facebook also says it will prevent publishers that use spoof domains posing as legitimate websites from buying ads on its platform to generate traffic.