Chinese short video-sharing app TikTok has acknowledged that its moderators deliberately suppressed content produced by disabled users, in a bid to prevent those users from becoming victims of bullying. Facing criticism, TikTok conceded that its approach had been flawed, the BBC reported on Tuesday, adding that the measure was exposed by the German digital rights news site Netzpolitik.
Disability rights campaigners termed the strategy “bizarre”.
A leaked extract from TikTok’s rulebook gave examples of what its moderators were instructed to look out for: disabled people, those with Down’s syndrome and autism, people with facial disfigurements, and people with other “facial problems” such as a birthmark or squint.
Such users were “susceptible to bullying or harassment based on their physical or mental condition”, according to the rulebook.
The moderators were instructed to restrict viewership of affected users’ videos to the country where they were uploaded, according to an unnamed TikTok source quoted by Netzpolitik.
The moderators were told to prevent the clips of vulnerable users from appearing in the app’s main video feed once they had reached between 6,000 and 10,000 views, the report said.
A spokesperson for TikTok admitted that the company had made the wrong choice, the BBC reported.
“Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy,” the spokesperson was quoted as saying.
“This was never designed to be a long-term solution, and while the intention was good, it became clear that the approach was wrong,” the spokesperson told the BBC.