TikTok has announced changes to its app to better protect underage users by limiting their public visibility and giving them more control over who can see and comment on their videos. TikTok’s head of US safety Eric Han, head of privacy in Europe Elaine Fox, and global minor safety policy lead Tracy Elizabeth announced the updates on Wednesday.
The short-video app will set the accounts of users aged 13 to 15 to “private” by default, limiting who can interact with these accounts. Only people the user has approved as followers can view their videos. Users in this age group can also choose to either disallow comments on their videos or let only their Friends comment; the “everyone” option will be removed. “Suggest your account to others” will be off by default for this age group, and their videos can no longer be downloaded. Further, the Duet and Stitch features, which let users repost and respond to another user’s video, will be available only to users aged 16 and above.
The company is also changing default settings across the app for users under 18. Unless they actively change the settings, 16- and 17-year-olds will have downloads disabled by default. In addition, Duet and Stitch videos will, by default, be viewable only by Friends.
The changes were announced for the US, UK, France, Germany, Japan, Korea, Italy, Spain, Russia, Australia, Malaysia, Singapore, Ireland, Brazil, Latin America, the Netherlands, the Philippines, and Turkey; essentially, all geographies where TikTok operates except Thailand, India (where it is banned), Vietnam, Indonesia, and Taiwan.
TikTok already does not permit direct messaging to accounts belonging to users below 16, and does not permit sending virtual gifts to users below 18. Notably, TikTok’s new policy changes distinguish between different age groups among minors, who have different needs from the internet; these age groups also vary in how they understand privacy and potential harms.
We know there is no finish line when it comes to protecting users and their privacy, and our investment in this important area won’t stop here. We’ll continue to evolve our policies, work closely with regulators and experts in minor safety, and invest in our technology and teams so that TikTok remains a safe place for everyone to express their creativity — TikTok
The short-video company has also announced a partnership with media company Common Sense Networks for additional guidance on the appropriateness of content for users under 13.
Privacy by default, removing the ability for strangers to comment on children’s videos and restricting downloads are important steps in the fight against the grooming, sexual exploitation and abuse of children online. – Iain Drennan, Executive Director, WeProtect Global Alliance
Protections for children came in after FTC penalty
In February 2019, TikTok was slapped with a $5.7 million penalty by the Federal Trade Commission for violating children’s privacy laws in the US. The Commission’s probe had begun when TikTok was still Musical.ly, prior to the ByteDance acquisition, following complaints that it had collected personal data of children below 13 without parental consent. After the ruling, TikTok began verifying users’ ages; those below 13 are directed to a “limited, separate app” with increased privacy tools. Kids below 13 cannot share videos, comment on videos, message other users, or maintain a profile or followers.
Read more:
- After US, UK investigates TikTok over how it handles children’s personal data
- Break end-to-end encryption to trace child porn distributors, make ISPs liable: Recommendations from Rajya Sabha Committee
- TikTok Fined $5.7M By The US FTC For Violating Kids’ Privacy
I cover health, policy issues such as intermediary liability, data governance, internet shutdowns, and more. Hit me up for tips.
