Facebook is introducing a “request review” option for posts removed under its community guidelines. After its massive security breach, Facebook has been under constant pressure to improve content moderation around the globe. In this regard, the company has recently published a set of guidelines describing its policies for determining whether posts should be removed from the service.
Alongside these guidelines, the company is also introducing a new appeals process that allows users to request a review if they believe their post was removed unfairly.
The guidelines are an expanded version of a leaked copy of Facebook’s content moderation rules that The Guardian published last May. Now, almost a year later, Facebook is making a fuller set of those guidelines available to the public.
The community standards run 27 pages and cover a wide range of topics, including bullying, self-harm, violent threats, and nudity, among many others.
“These are issues in the real world,” Monika Bickert, head of global policy management at Facebook, said in an interview with reporters.
“The community we have using Facebook and other large social media mirrors the community we have in the real world. So we’re realistic about that. The vast majority of people who come to Facebook come for very good reasons. But we know there will always be people who will try to post abusive content or engage in abusive behavior. This is our way of saying these things are not tolerated. Report them to us, and we’ll remove them,” she added.
The guidelines apply in every country in which Facebook operates, and they have been translated into more than 40 languages.
If you violate one of the company’s community standards and your post is taken down, you’ll be notified on Facebook with an option to “request review.”
Facebook will review your request within 24 hours, and if it decides it made a mistake, it will restore the post and notify you.
For the most part, the guidelines also apply to other Facebook services, such as Instagram, though there are some differences; for example, you don’t need to use your real name on Instagram.
According to the company, the guidelines were developed with the help of a “couple hundred” experts and advocacy groups from around the world. As the guidelines evolve, and they will evolve, Bickert said, they will be updated in every language at the same time.
However, the underlying policies haven’t changed, Bickert said, even though the company is now offering more detail about how decisions are made. “What’s changing is the level of explanation about how we apply those policies,” Bickert said.
Recently, a report in The New York Times connected hate speech on Facebook to murders in Indonesia, India, and Mexico.
In response to that report, Facebook said it will double its 10,000-person safety and security team by the end of this year. The company also plans to update the guidelines regularly as new threats emerge.
By making the guidelines public, the company hopes to learn from users’ feedback, Bickert said.
“I think we’re going to learn from that feedback; this is not a self-congratulatory exercise. This is an exercise of saying, here’s where we draw the lines, and we understand that people in the world may see these issues differently. And we want to hear about that, so we can build that into our process,” she said.