Facebook has made plenty of mistakes in the past, but that hasn't stopped it from wading into another controversy. Over the weekend, the social network sent a survey to some users that asked whether men who request sexual pictures from children should be allowed on the site.

Guardian digital editor Jonathan Haynes was one of the users who received the survey, which contained the question: "There are a wide range of topics and behaviors that appear on Facebook. In thinking about an ideal world where you could set Facebook's policies, how would you handle the following: a private message in which an adult man asks a 14 year old girl for sexual pictures."

Response options included "this content should not be allowed on Facebook, and no one should be able to see it" and "this content should be allowed on Facebook, and I would not mind seeing it."

A follow-up question asked who should decide the rules for policing such content. The options included Facebook deciding on its own, Facebook with advice from external experts, external experts on their own, or Facebook users by voting.

The platform has been criticised both for the questions themselves and for omitting any option in which it contacts law enforcement or child protection services to report the illegal act.

Facebook said the survey was designed to gather feedback on its community standards and the types of content users would find most concerning on the platform. It nevertheless admitted the questions were a "mistake."

"We run surveys to understand how the community thinks about how we set policies. But this kind of activity is and will always be completely unacceptable on FB. We regularly work with authorities if identified. It shouldn't have been part of this survey. That was a mistake," said Facebook's vice president of product, Guy Rosen.

In a statement to The Guardian, Facebook said: "We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey. We have prohibited child grooming on Facebook since our earliest days; we have no intention of changing this and we regularly work with the police to ensure that anyone found acting in such a way is brought to justice."

As part of an investigation into pedophiles by the BBC in 2016, the UK broadcaster used Facebook's report button to flag 100 images that appeared to break the company's content guidelines. Just 18 of these were removed, and after asking the reporters to send in examples of the material, Facebook reported the journalists to the UK's National Crime Agency for complying with that request.