Video streaming app TikTok has agreed to pay a $5.7 million fine for allegedly collecting the names, email addresses, pictures and locations of children younger than 13 without parental consent – a violation of the US’s Children’s Online Privacy Protection Act (COPPA).
It’s the largest civil penalty ever obtained in a children’s privacy case, the Federal Trade Commission (FTC) said when it announced the settlement on Thursday.
TikTok, based in Los Angeles, merged with Musical.ly in 2018. The Musical.ly app let users create short videos of themselves lip-syncing or dancing to music, share those videos with other users, and interact with them by commenting on their videos and sending direct messages.
80 million US downloads
TikTok is both massively popular and considered to be addictive. It originally launched in China in 2016, where it was known as Douyin (literally: “vibrating sound”). A year later, it hit the international market with its new name, TikTok.
At least one Chinese doctor specializing in addiction has warned that young people are so hooked on social media approval that they’ve been risking their lives to garner likes with their 15-second Douyin clips, which have featured things like dancing in front of a moving bus or trying to flip a child 180 degrees… and then dropping her.
In April 2018, Douyin launched an anti-addiction system that reminded users to rest after using the app for more than 1.5 hours. When the South China Morning Post asked TikTok if it would adopt a similar system, it didn’t reply.
As of June 2018, TikTok said it had 500 million monthly active users worldwide and 150 million daily active users in China. As CNBC reported, it was the world’s most downloaded app on Apple’s App Store in the first half of 2018, with an estimated 104 million downloads – more than YouTube, WhatsApp or Instagram managed over the same period. It’s had 80 million downloads in the US.
Privacy on TikTok has raised concerns before. In particular, given the app’s popularity with children and teens all over the world, there have been worries about exposure to sexual predators. Those worries were highlighted last month, when Barnardo’s, a major children’s charity in the UK, found that children as young as eight are being sexually exploited via social media.
The problem with live streaming video apps such as TikTok is that besides being extremely popular, they’re also very hard to moderate. YouTube, for one, has recently been grappling with a major advertiser backlash against lewd comments being added to videos featuring minors. But an app like TikTok has an added level of moderation difficulty, given that it features real-time comments posted directly to the person streaming: a situation that’s ripe for exploitation.
User accounts public by default
Unfortunately, Musical.ly set the stage for exploitation, given how much information it collected. According to the FTC’s complaint, Musical.ly required users to provide an email address, phone number, username, first and last name, a short biography, and a profile picture. Because accounts were public by default, other users could see a child’s profile bio, username, picture, and videos.
Users could change their default setting from public to private, but even so, users’ profile pictures and bios remained public, and other users could still send them direct messages, according to the complaint. There have, in fact, been reports of adults using the app to troll for minors to have sex with.
Beyond that, until October 2016, there was a stalker-friendly feature: the app allowed users to view other users within a 50-mile radius.
What parents might not have realized about Musical.ly is that it wasn’t just a lip-syncing/dancing app: users could search specific code words that led to hundreds of videos of teenagers stripping or hurting themselves.
Musical.ly allegedly knew it had under-13 users
According to the FTC’s complaint, the operators of Musical.ly knew that they had a “significant” percentage of users younger than 13. They also received thousands of complaints from parents that their children under 13 had created Musical.ly accounts.
The FTC alleges that Musical.ly operators failed to notify parents that they were collecting and using those kids’ personal information, that they never got parental consent before doing so, and that they failed to delete the kids’ information at parents’ request.
Besides the fine, the settlement also requires the app’s operators to comply with COPPA going forward and to remove all videos made by children under the age of 13.