Facebook on Tuesday published the internal guidelines its community operations team uses to police content shared on the social network. The tech titan has shared its community standards policy with users for years, but this marks the first time Facebook has published the guidelines it uses to enforce those standards.

Monika Bickert, Vice President of Global Policy Management at Facebook, said the company decided to post its internal guidelines for two reasons. For starters, the guidelines will help people understand where Facebook draws the line on nuanced issues. Furthermore, Facebook believes that sharing the guidelines will make it easier for everyone to provide feedback, which will help the social network improve them over time.

The guidelines are broken down into six main categories - Violence and Criminal Behavior, Safety, Objectionable Content, Integrity and Authenticity, Respecting Intellectual Property and Content-related Requests - each with its own sub-categories. The guidelines are quite extensive, so for the sake of brevity we won't be detailing each category here, although you're welcome to click through to each section using the links above.

Bickert also revealed that, for the first time, Facebook is giving users the right to appeal decisions. This means that if you don't agree with Facebook removing an individual post, or feel the company has made a mistake, you can ask for a second opinion.

The right to appeal is important. Facebook has more than 7,500 content reviewers working around the clock alongside advanced AI to identify posts, pictures and other types of content that violate Facebook's community standards. That may sound like a lot of people, but when you consider that the social network has more than two billion active users, it's clear that Facebook's resources are spread incredibly thin and mistakes can easily happen.