Video content app TikTok has been fined $5.7 million by the US Federal Trade Commission (FTC) for violating the Children’s Online Privacy Protection Act (COPPA), after complaints that it illegally collected personal information from children under 13 without parental consent.
The complaint alleged that TikTok, then known as Musical.ly, was aware that a significant percentage of its users were younger than 13. The FTC had begun looking into the app when it was still Musical.ly, and said that 65 million (or 32.5%) of Musical.ly’s 200 million global downloads were in the US. COPPA requires a company to get parental consent before collecting personal information from children under 13.
The complaint: violation of children’s privacy online
The complaint alleged that TikTok violated COPPA by failing to notify parents that Musical.ly was collecting personal information from kids under 13, to obtain parental consent before doing so, and to delete that personal data at parents’ request.
- Users had to provide personal information such as an email address, phone number, username, first and last name, a short biography, and a profile picture to register for the app
- User accounts were public by default; a child’s bio, username, picture, and videos posted could be seen by everyone
- Even though the app offered an option to switch to a ‘private’ account, users’ profile pictures and bios remained public, and anyone could still send them direct messages
- There had been public reports of adults trying to contact children via Musical.ly
- Until October 2016, the app even allowed users to view other users within a 50-mile radius
TikTok will now verify its users’ age
The FTC ruling will change how the app functions for children under 13. TikTok said that it will now require all users to verify their age. Both new and existing users under 13 will be directed to a “limited, separate app” with increased privacy protections. The separate app will restrict them from sharing personal information and publishing videos.
The restricted, separate app was designed in line with the FTC’s guidance for mixed-audience apps. Kids under 13 cannot share videos, comment on videos, message other users, or maintain a profile or followers.
- TikTok will take down all videos posted by kids under 13, as required by the FTC settlement
- An app update will notify new and existing users of the age distinction; kids will need to enter their birthdate in order to be directed to the restricted app
Facebook and a similar complaint to the FTC
Child rights advocates and other consumer groups in the US are urging the FTC to investigate whether Facebook violated federal laws by allegedly tricking kids into spending their parents’ money on online games. The complaint alleges that Facebook enabled “friendly fraud” by encouraging game developers to let kids spend their parents’ money without consent.
- Facebook settled an earlier lawsuit regarding this in 2016 and updated its policy to address game purchases by minors
- Advocacy groups argue that the policy update wasn’t sufficient and that the FTC must look into whether Facebook violated the FTC Act (which prohibits unfair and deceptive acts) and possibly COPPA
Also read: Facebook paid teens to install an app that collected their data: report
