A coalition of children’s privacy advocates has filed a complaint against TikTok with the U.S. Federal Trade Commission, saying the video app violated an agreement to protect children on its platform, The New York Times reported. TikTok paid a $5.7 million fine to the FTC in February 2019 for allegedly violating the Children’s Online Privacy Protection Act (COPPA) with an earlier version of the app, then called Musical.ly, which allowed users under the age of 13 to register without parental consent. Under the terms of the settlement, TikTok also agreed to delete all videos previously uploaded by users under the age of 13.
But the coalition, led by the Center for Digital Democracy and the Campaign for a Commercial-Free Childhood, found that videos uploaded by children under the age of 13 in 2016 still appeared on the app, and that the company was not doing enough to obtain parental consent from new users.
“TikTok has failed to make reasonable efforts to ensure that a child’s parent is directly notified of its practices with regard to collecting, using, or disclosing personal information,” the complaint says. “In fact, TikTok does not contact a child’s parents at any point, or even ask for a parent’s contact information. Therefore, TikTok does not obtain verifiable parental consent prior to the collection, use, or disclosure of children’s personal information, as required by the consent order and the COPPA Rule.”
Even its service designed for children under the age of 13 has problems, the complaint says: TikTok for Younger Users, which restricts what users can post, simply “encourages children to lie about their age.” According to the complaint, children who sign up for TikTok for Younger Users can delete the account and then re-register for a standard TikTok account on the same device simply by changing their date of birth.
This is not a problem unique to TikTok, which is not the only platform that has struggled with children’s content. In September 2019, the FTC fined YouTube $170 million for allegedly collecting children’s data in violation of COPPA. YouTube subsequently introduced a new labeling system for creators of content aimed at children.
“Congress empowered the FTC to ensure online protections for children, but this is yet another case of a digital giant deliberately breaking the law,” Jeff Chester, executive director of the Center for Digital Democracy, said in a statement. “The FTC’s failure to ensure that TikTok protects the privacy of millions of children, including through its use of predictive AI apps, is another reason to question whether the agency can be trusted to effectively enforce laws protecting children’s data.”
By the time it was folded into TikTok in August 2018, Musical.ly had 100 million monthly active users. TikTok now has more than 500 million users worldwide, many of them children. In April, the analytics platform Sensor Tower reported that TikTok had been downloaded more than 2 billion times worldwide.
Also in April, TikTok introduced a new feature called Family Pairing, which allows parents to link their children’s accounts to their own, letting them disable direct messages, turn on restricted content mode, set screen time limits, and more.
“We take privacy seriously and are committed to helping ensure that TikTok continues to be a safe and entertaining community for our users,” a TikTok spokesperson said in an email to The Verge.