Facebook's first "Hard Question" explains how it is using AI to fight terrorism

midian182


With almost 2 billion people around the world now using Facebook, its influence on society has never been greater. Yesterday, the social network announced plans to be more open about how it deals with some of the more controversial issues it faces, such as fake news, how it uses people’s data, and online terrorism.

The seven “Hard Questions” Facebook asks are:

  • How should platforms approach keeping terrorists from spreading propaganda online?
  • After a person dies, what should happen to their online identity?
  • How aggressively should social media companies monitor and remove controversial posts and images from their platforms? Who gets to decide what’s controversial, especially in a global community with a multitude of cultural norms?
  • Who gets to define what’s false news — and what’s simply controversial political speech?
  • Is social media good for democracy?
  • How can we use data for everyone’s benefit, without undermining people’s trust?
  • How should young internet users be introduced to new ways to express themselves in a safe environment?

The first in the series of blog posts examines how Facebook is dealing with the terrorist material appearing on its site – a question that has once again been raised in the wake of the London and Manchester attacks.

While it’s long been accused of not doing enough to deal with the problem, Monika Bickert, director of global policy management, and Brian Fishman, counterterrorism policy manager, wrote: "We remove terrorists and posts that support terrorism whenever we become aware of them. When we receive reports of potential terrorism posts, we review those reports urgently and with scrutiny."

The company said it has a team of over 150 people “exclusively or primarily focused on countering terrorism.”

Facebook has recently started using artificial intelligence to stop the spread of extremist material. The AI is used in areas such as image matching, which compares uploaded photos and videos to content already identified as terrorist material, and text analysis, which can detect posts praising or supporting terrorism. Facebook wants to expand these methods to other apps it owns, including Instagram and WhatsApp.
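For a sense of what that kind of pipeline might look like at its simplest, here is a minimal, hypothetical Python sketch, not Facebook's actual system: it fingerprints an upload with an exact-match hash and screens post text against a watch list. The hash value and watch terms are placeholders, and production systems reportedly rely on perceptual hashing and machine-learned classifiers rather than exact matches and keywords.

import hashlib

# Placeholder fingerprints of content already judged to violate policy (illustrative only).
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

# Placeholder watch list standing in for the text-analysis step; a real system
# would use a trained classifier rather than keyword matching.
WATCH_TERMS = ("join the fight", "martyrdom operation")

def fingerprint(file_bytes):
    """Return a SHA-256 fingerprint of an uploaded photo or video."""
    return hashlib.sha256(file_bytes).hexdigest()

def matches_known_content(file_bytes):
    """Check an upload's fingerprint against previously removed material."""
    return fingerprint(file_bytes) in KNOWN_BAD_HASHES

def flags_text(post_text):
    """Crude stand-in for detecting posts that praise or support terrorism."""
    lowered = post_text.lower()
    return any(term in lowered for term in WATCH_TERMS)

print(matches_known_content(b"example image bytes"))  # False for this dummy payload
print(flags_text("an ordinary status update"))        # False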

Mark Zuckerberg's company is also partnering with others in its fight against terror. It has joined with Microsoft, Twitter, and YouTube for a shared database of unique digital fingerprints, and works alongside government and intergovernmental agencies around the world.
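The shared database builds on the same fingerprint idea, pooled across companies. The sketch below is purely illustrative, assuming a simple in-memory store; the names and interface are invented and bear no relation to the consortium's real service.

# Hypothetical shared store of fingerprints contributed by several platforms.
shared_db = {}

def contribute(platform, content_hash):
    """A partner records the fingerprint of content it has already removed."""
    shared_db[content_hash] = platform

def already_flagged(content_hash):
    """Any partner can check an upload's fingerprint before hosting it."""
    return content_hash in shared_db

contribute("ExamplePlatformA", "abc123")
print(already_flagged("abc123"))  # True: another platform flagged it first
print(already_flagged("def456"))  # False: not in the shared database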

Facebook hasn’t been very open about how it deals with these issues in the past; Bickert said the recent attacks are part of the reason why it has now chosen to do so. "We're talking about this because we are seeing this technology really start to become an important part of how we try to find this content," she said.


 
How should platforms approach keeping terrorists from spreading propaganda online?

Dox the posters.

After a person dies, what should happen to their online identity?

Give the credentials to next of kin.

How aggressively should social media companies monitor and remove controversial posts and images from their platforms?

They shouldn't.

Who gets to decide what’s controversial, especially in a global community with a multitude of cultural norms?

Precisely why they shouldn't.

Who gets to define what’s false news — and what’s simply controversial political speech?

The people consuming it.

Is social media good for democracy?

Yes.

How can we use data for everyone’s benefit, without undermining people’s trust?

You can't. You can mitigate this by giving users access to that data and the ability to erase it.

How should young internet users be introduced to new ways to express themselves in a safe environment?

Effective parenting.
 
the social network announced plans to be more open about how it deals with some of the more controversial issues it faces, such as fake news, how it uses people’s data, and online terrorism.

The best solution that solves all those problems at once, including promoting stupidity, laziness and general retardation... is to shut the Facebook down once and for all.
 
The best solution that solves all those problems at once, including promoting stupidity, laziness and general retardation... is to shut the Facebook down once and for all.

This is a most effective solution.
 
Now that FB has muttered something unintelligible about combatting terrorism online, I don't see them doing a much better job answering the other 6 questions. Typical FB, they talk a lot yet manage to say nothing.
 
"Is social media good for democracy?"
No. It has been shown time and time again that all it does is create 'communities' where people 'like' people who share their views, which results in a polarization of views with no middle ground. All people at either extreme seem to have to say to the 'other side' is what *****s they are. This phenomenon excludes meaningful dialogue and compromise.
 