According to media reports, the social video site TikTok apparently limited the reach of videos by users with disabilities, including people with facial disfigurements and Down syndrome. Netzpolitik.org reports, based on exchanges with company insiders, that the policy was designed to protect users at risk of being bullied. In practice, however, it amounted to a form of discrimination – and the problem was compounded by the fact that moderators had to make quick judgments about users' physical and psychological characteristics.
TikTok's popularity depends heavily on content promoted through its recommendation algorithm, but sources cited by Netzpolitik.org say TikTok deliberately throttled the reach of users it considered likely targets of bullying. One "risk" category meant that a user's videos would appear only in the country where they were uploaded. Another label, called "Auto R," prevented videos from appearing in other users' "For You" feeds once they reached a certain number of views. That label could be applied to individual videos, but it was reportedly the default for entire groups of so-called "special users."
"Auto R" appears to have applied to a wide range of people. It covered users with disabilities like those mentioned above, but Netzpolitik.org notes that TikTok's rules also limited the reach of users it described as "fat and self-confident," as well as LGBT users and autistic users — in other words, people who may be particularly vulnerable to online abuse. As the article points out, however, such categories are difficult to judge from profile data or a video alone. And although the policy was meant to prevent bullying, it did so by punishing the potential victims.
A TikTok spokesperson characterized the rules as an early and flawed attempt to address bullying on the platform, telling Netzpolitik.org that they were never intended as a long-term solution.