Naked Security

US parents file class action against TikTok over children’s privacy

Collecting children's data without their guardians' consent is illegal under COPPA and already earned TikTok a huge fine.

Parents and legal guardians in the US have their knives out for the insanely popular, oft-investigated social video app TikTok: they’ve filed a class action suit alleging that its parent company, ByteDance, is illegally collecting young users’ data in violation of child privacy law.

Named in the suit are Sherri LeShore and Laura Lopez, the mothers of two teenage girls who are also parties to the class action but who, because they’re minors, are identified only by their initials: T.K. and A.S.

Both of the girls were under the age of 13 when they started using the app, known at the time as Musical.ly. But neither LeShore nor Lopez was asked for verifiable parental consent, they claim: a violation of the Children’s Online Privacy Protection Act (COPPA), the nation’s strictest child privacy law.

COPPA applies to any site or service that collects children’s personally identifiable information (PII), which TikTok does: users handed over their email addresses, phone numbers, usernames, first and last names, profile pictures, and short bios in which they could choose to mention their age. For a while, between December 2015 and October 2016, TikTok was also hoovering up users’ geolocation data, letting the app pinpoint where they were.

Musical.ly (bought by ByteDance in 2017 and merged with the TikTok app in 2018) had all of that PII set to public view by default. That meant a child’s profile bio, username, picture, and videos could be seen by other users – including adults and, potentially, child predators. Even if a user switched their profile to private, their profile picture and bio remained public, meaning that other users – adults and predators among them – could still send them direct messages, replete with colorful, cartoonish icons: animals, smiley faces, cars, trucks, hearts, that kind of thing.

In fact, there have been reports of adults posing as minors and messaging children, sometimes asking them for nude photos.

TikTok’s legal entanglements

We’ve heard all of this before: in February, the US hit TikTok with the biggest-ever fine for violating COPPA. In July, the UK launched an investigation to see if these same issues constitute a violation of the General Data Protection Regulation (GDPR).

But COPPA is only one flavor of TikTok’s legal morass.

TikTok’s Beijing-based parent company, ByteDance, plans to put its US division at arm’s length, separating the company to hopefully mollify US politicians who think it could be a national security risk.

If the separation happens, it will be just the latest attempt by ByteDance to prove that TikTok isn’t under China’s thumb. It hasn’t done a particularly good job of that, given its track record of censoring content that would displease the Chinese government – for example, banning a US teen who used TikTok to decry China’s detention camps for Uighur Muslims (a decision it later reversed).

TikTok admitted this week that yes, it did censor some videos – but for good, anti-bullying reasons, it said. That explanation sits uneasily with leaked documents from a few months ago, which showed that TikTok previously instructed moderators to follow guidelines that led them to hide videos flouting Beijing’s doctrine.

As far as the recent class action lawsuit over COPPA violations goes, there’s a chance that it won’t go far. TikTok told The Verge that it’s trying to work things out with the involved parties:

TikTok was made aware of the allegations in the complaint some time ago, and although we disagree with much of what is alleged in the complaint, we have been working with the parties involved to reach a resolution of the issues. That resolution should be announced soon.
