False or misleading posts on Facebook and Instagram are being blamed for enabling human traffickers who recruit African women to be ferried to Saudi Arabia and other countries with promises of visas and jobs.
Instead, the women are held captive, denied access to food and forced to perform sex acts in massage parlors, according to internal company investigations reported recently in the Wall Street Journal.
Evidence of the human trafficking was uncovered by Facebook’s own team of human exploitation investigators, but until recently the social media platform lacked a protocol for dealing with job ads for domestic servitude. As a result, dangerous ads were allowed to remain on the site.
In a blockbuster Sept. 17 article titled “Facebook’s Staff Flags Criminals, But Company Often Fails to Act,” the WSJ cited “shocking behavior seen on FB posts… in clear violation of Facebook’s rules.” Internal Facebook documents obtained by the paper showed alarms being raised about how FB’s platforms were being used in some developing countries where its user base is huge and expanding.
Allegations in the story drew a strong rebuke from the social media giant, which called them “deliberate mischaracterizations” that attributed “false motives to FB’s leadership and employees.”
The WSJ article profiled a young Kenyan freelance writer who had applied for a job she saw on Facebook that promised free airfare and visas – even though Facebook had banned employment ads touting free travel and visa expenses.
Titled “Cleaners needed in Saudi Arabia,” the ad touted a $300 monthly wage, enticing the young writer to meet the recruiter at the Nairobi airport. There she learned the salary would be 10% less than promised and that, once she was hired, only the employer could terminate the contract. If she wanted out, she would lose her visa and remain in Saudi Arabia illegally.
She attempted to back out, only to learn that her contract had already been sold to an employer and that she would have to repay the employer if she quit.
Without the money to repay, she was flown to Riyadh, Saudi Arabia’s capital. There she worked in a home from 5 a.m. until dusk, cleaning, sleeping in a storage room, and enduring a boss who berated her and called her a dog.
After two months, she escaped to a deportation center where she met other Kenyan women, including one with marks from chains on her wrists and ankles.
Yet 18 months after the abuses were discovered, Facebook still hadn’t implemented systems to find and remove the trafficking posts. Finally, in 2019, it launched a sweep that found more than 300,000 instances of potential violations; more than 1,000 accounts were disabled.
But human trafficking posts continued to pop up, and Facebook allegedly delayed a project meant to improve its understanding of human trafficking.
A memo cited in the report read: “We know we don’t want to accept profit from human exploitation. How do we want to calculate these numbers and what do we want to do with this money?”
The Kenyan victim, meanwhile, said she has been warning other people about the risks of getting trafficked and would like to see FB work harder. “I think something should be done about this,” she said, “so that nobody just goes in blindly.”