Facebook Moderators’ Identities Mistakenly Exposed To Suspected Terrorists

Every job has its risks, but very few of us expect to land on a terrorist’s hit list. It was recently revealed that the information of over 1,000 Facebook content moderators was leaked thanks to a security flaw, and the profiles of six of those moderators in particular were considered “high priority” risks.

Although Facebook uses algorithms to monitor most of its content, the company still employs people to comb through content related to terrorist propaganda, extreme violence, and child pornography. Last year, Facebook discovered that “the names of certain people who work for Facebook to enforce our policies could have been viewed by a specific set of Group admins within their admin activity log.” Over one thousand content moderators across twenty-two departments were affected, but one forty-person unit in Dublin dedicated to counter-terrorism was particularly at risk.


Of the forty employees in Dublin, six were considered “high priority” risks, and their information was reportedly exposed for about a month. Even so, the label was more of a precaution than a confirmed threat. According to Facebook, there was no evidence that the six employees were in danger of retaliation, and its investigation found that the moderators’ profiles had thankfully never been viewed by the suspected group admins.

One of the six moderators, however, fled to Eastern Europe after the flaw was discovered. The Iraqi-born Irish citizen had left Iraq after his father was kidnapped and beaten and his uncle was executed by ISIS. Contrary to Facebook’s findings, he claims that his moderator profile was viewed by accounts tied to ISIS, Hezbollah, and the Kurdistan Workers’ Party. The moderator told Facebook’s head of global investigations, Craig D’Souza, “I’m not waiting for a pipe bomb to be mailed to my address until Facebook does something about it.”


Facebook immediately adjusted its policies to prevent such a bug from happening again. The company noted, “As soon as we learned about this issue, we fixed it and began a thorough investigation to learn as much as possible about what happened.” Facebook also offered the affected moderators home security systems, transport to and from work, and counseling, and it pledged to do better in the future to protect its employees.