By: Denise Simon | Founders Code
- Apple threatened to remove Facebook from its App Store after a report about an online slave market.
- The BBC in 2019 reported that human traffickers were using Facebook’s services to sell domestic workers.

Apple threatened to kick Facebook off its App Store after a 2019 BBC report detailed how human traffickers were using Facebook to sell victims, according to The Wall Street Journal.

The paper viewed company documents showing that a Facebook investigation team was tracking a human trafficking market in the Middle East whose organizers were using Facebook’s services. What appeared to be employment agencies were advertising domestic workers whom they could supply against their will, per the Journal.

The BBC published a sweeping undercover investigation of the practice, prompting Apple to threaten to remove Facebook from its store, the paper said.

An internal memo showed that Facebook was aware of the practice even before then. A Facebook researcher wrote in a report dated 2019, “was this issue known to Facebook before BBC inquiry and Apple escalation?,” per the Journal. Underneath the question reads, “Yes. Throughout 2018 and H1 2019 we conducted the global Understanding Exercise in order to fully understand how domestic servitude manifests on our platform across its entire life cycle: recruitment, facilitation, and exploitation.”

Apple and Facebook did not immediately respond to requests for comment.

The Wall Street Journal on Thursday also reported that Facebook’s AI content moderators cannot detect most languages used on the platform, a capability the company needs if it is going to monitor content in the foreign markets where it has expanded.
The dozens of internal Facebook documents obtained by the outlet showed that employees have repeatedly raised concerns about how the social media giant is being used in countries across the globe, and that Facebook has failed to respond properly to these issues.
Some of the documents reportedly showed that Facebook employees raised concerns about human trafficking organizations in the Middle East that used Facebook to attract women. Other documents showed Facebook employees alerting their higher-ups of groups involved in organ selling and pornography.
The news outlet reported that while some of the groups and pages flagged by employees have been taken down, dozens of others remain active on the social media site.
Another document detailed a Facebook employee’s investigation into a Mexican drug cartel that was active on the social media site. The employee, who was a former police officer, was able to identify the Jalisco New Generation Cartel’s network of accounts on both Facebook and Instagram, which is owned by Facebook.
The employee wrote in the report that his team had found Facebook messages between cartel recruiters and potential recruits “about being seriously beaten or killed by the cartel if they try to leave the training camp.”
The documents reportedly showed that the cartel was open about its criminal activity, with several pages on the social media site showing “gold-plated guns and bloody crime scenes.”

The Wall Street Journal reported that even after the employee recommended Facebook increase its enforcement against the groups, documents showed that Facebook did not completely remove the cartel from its site and instead said that it removed content tied to the group. Just nine days after the employee’s report, his team found a new Instagram account tied to the cartel, which included several violent posts.
Many of the documents apparently showed employees raising concerns about how the social media giant was being used in developing countries, such as militant groups in Ethiopia using Facebook to promote violence against minority groups.
Brian Boland, a former Facebook vice president, told the Wall Street Journal that the company treats these issues in developing countries as “simply the cost of doing business.”
“There is very rarely a significant, concerted effort to invest in fixing those areas,” Boland said.
In a statement sent to Newsweek, a Facebook spokesperson said: “In countries at risk for conflict and violence, we have a comprehensive strategy, including relying on global teams with native speakers covering over 50 languages, educational resources, and partnerships with local experts and third-party fact-checkers to keep people safe.”
In a series of tweets on Thursday, Facebook spokesman Andy Stone wrote, “As the Wall Street Journal itself makes clear, we have a team of experts who help us uncover patterns of harmful behavior so we can disrupt it. We’ve got arguably more experts and resources dedicated to this work than any other consumer technology company in the world.”