Facebook Adds 3,000 Moderators To Screen Violent Live Videos And Hate Speech

Over the last several months, Facebook has been working aggressively to tackle some important issues with the service, some of them mere annoyances, others far more severe. On one hand, for example, the company aims to eradicate "fake news" from our feeds; on the other, it's putting measures in place to help detect suicidal users.

Another major issue Facebook is focusing more effort on is live video used to broadcast malicious or upsetting content. This news follows an incident last month in which an elderly man was murdered and video of the killing was posted to the service. That video remained up for a couple of hours, and by the time it was pulled, it had already spread to other parts of the internet.

Years ago, it might have been hard to imagine that someone would broadcast a suicide or murder live on the internet, but it's an unfortunate reality we've seen played out multiple times over the years. When it does happen, that footage shouldn't linger on the service for hours; it should be pulled as soon as possible.

To help make that notion a reality, Facebook will bolster its moderation staff by 3,000 people throughout 2017. That's impressive in its own right, but considering those hires represent roughly 66% growth for the existing 4,500-person community operations team, Facebook is definitely putting its money where its mouth is.

In his Facebook post, company chief Mark Zuckerberg says these moderators won't just improve the speed at which truly upsetting videos are taken down; they'll also be monitoring for hate speech and child exploitation.

Overall, this is a great move for Facebook, albeit an overdue one.