Facebook Apologizes For Uneven Content Moderation That Allows Hate Speech To Fester

Facebook has policies in place that are intended to curb bad behavior, including hate speech. Yet horrible things get posted on the social network (and other places on the web, such as Twitter), sometimes seemingly faster than moderators can remove them. Part of the problem is that the battle never ends, but even so, there is room for improvement. Facebook knows this, and has admitted to (and apologized for) making mistakes when it comes to hate speech.

"We’re sorry for the mistakes we have made—they do not reflect the community we want to help build," Facebook Vice President Justin Osofsky said in a statement. "We must do better."


Osofsky's apology on behalf of Facebook came in response to ProPublica presenting the social network with dozens of examples of hate speech and the inconsistent way in which they were moderated. The media outlet asked Facebook to explain its decisions on 49 items, each of which was sent in by Facebook users who claimed the social network had made a mistake in its judgment, mostly by not removing certain posts.

In one example, there was a photo with the caption, "The only good Muslim is a f***ing dead one." A Facebook user reported the post as hate speech, but an automated reply determined that the photo was allowed to stand.

"We looked over the photo, and though it doesn't go against one of our specific Community Standards, we understand that it may still be offensive to you and others," the reply stated.

In another example, a single-line post saying, "Death to the Muslims," without an accompanying photo, was removed after users reported it to Facebook. It is that kind of inconsistency that Facebook was asked to explain, and for which it ultimately apologized.

Of the 49 examples that were submitted, Facebook admitted that its reviewers made the wrong call on 22 of them, and that those posts should have been removed. It defended its rulings on 19 others. Facebook said another six violated its rules but were never actually judged because they were not flagged correctly. As for the remaining two, Facebook said it didn't have enough information to respond.

We don't envy the position Facebook is in. Trying to police online posts is not only a thankless task, it's also a tricky one. Facebook has every right to enforce whatever rules it wants, but if it wants to remain a popular platform, it has to find a balance between taking out the trash and not doling out heavy-handed censorship.

As it stands, Osofsky says Facebook deletes around 66,000 posts reported as hate speech per day. However, he cautions that not everything that is potentially offensive qualifies as hate speech.

"Our policies allow content that may be controversial and at times even distasteful, but it does not cross the line into hate speech," he said. "This may include criticism of public figures, religions, professions, and political ideologies."

Going forward, Facebook plans to grow its safety and security team, which includes content reviewers, to 20,000 people next year. The company hopes that by doing so it will be better equipped to enforce its rules, and to do so more consistently than it does now.