Facebook asked some users if they thought the company should allow posts from child sexual predators and violent extremists, then reversed course and pulled the surveys after they were spotted by a media outlet.
"We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey," the company said in a statement emailed to CNBC.
"We have prohibited child grooming on Facebook since our earliest days; we have no intention of changing this and we regularly work with the police to ensure that anyone found acting in such a way is brought to justice," the statement said.
According to screenshots of the survey published online by The Guardian, one of the survey questions read:
"In thinking about an ideal world where you could set Facebook's policies, how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures."
The response choices to that question were:
- This content should be allowed on Facebook and I would not mind seeing it.
- This content should be allowed on Facebook but I don't want to see it.
- This content should not be allowed on Facebook and no one should be able to see it.
- I have no preference on this topic.
Facebook vice president of product Guy Rosen later said in a tweet that those questions "shouldn't have been part of this survey. That was a mistake."
The decision to send out the surveys, then pull them once they were discovered, is the latest in a series of recent flip-flops by the company.
Last month, for example, Facebook blocked the account of an Ethiopian political activist who was documenting unrest in that country, then restored it and apologised after his supporters protested on social media, in a campaign that included spamming Mark Zuckerberg's Valentine's Day post with messages of support.
This article was first published on CNBC.