
TikTok’s handling of children’s data invites scrutiny from EU regulators

The popular video-sharing platform has caught the attention of regulators who will investigate its compliance with the GDPR.

TikTok has found itself on the EU’s radar due to the app’s policies for underage users and its data transfers to China. Ireland’s Data Protection Commission, the EU’s lead data privacy regulator for the company, launched two investigations into TikTok’s compliance with the European Union’s General Data Protection Regulation (GDPR) on September 14, the regulator announced on its website.

The first inquiry will examine TikTok’s processing of children’s data, and the second will look into the app’s transfer of personal data to China. If TikTok is found to be in violation of the GDPR in either inquiry, the regulator can impose a fine of up to 4% of the app’s global revenue.

An investigation from the EU into TikTok’s compliance with the GDPR could nudge regulators across the world to look into the app’s data privacy standards.

Children’s data: What the EU wants to find out and what the GDPR says

In the first TikTok inquiry, Ireland’s Data Protection Commission will look into three areas, according to a press release from the commission’s official website:

  1. TikTok’s design and default settings for the processing of personal data of users under age 18
  2. Age verification measures implemented by the app for users under age 13
  3. Compliance with the GDPR’s transparency obligations when processing personal data of users under age 18

Article 8 of the GDPR stipulates that companies can process children’s personal data only after prior, explicit consent is obtained. Here are the specific rules that companies processing such data need to adhere to:

  • Age Limit: Processing of personal data is only lawful for users who are 16 years or older. To process the data of younger children, the service provider has to seek consent for such processing from a parent or guardian.
  • Verification of Consent: In the case of users younger than 16, service providers must make reasonable efforts to verify that consent is given by a parent or guardian.
  • Nature of Consent: Service providers need to seek consent from young users (or from guardians of users who are below 16) that is specific, free, and explicit as well as informed and unambiguous.

Is TikTok’s data transfer to China in line with the GDPR?

Here are the key clauses of the GDPR that will be relevant in the inquiry into whether TikTok is transferring users’ personal data to China:

  • Secure vs Unsecure Countries: Under the GDPR, secure countries are those which the EU has determined, through an adequacy decision, to have suitable levels of data protection. China is not on the GDPR’s list of secure countries.
  • For Unsecure Countries: Data controllers have to ensure that data is sufficiently protected by the recipient, typically through standard contractual clauses (SCCs).
  • Consent Requirement: If a controller wants to transfer data to an unsecure country without an SCC, it requires the free and explicit consent of users to do so.

MediaNama has reached out to TikTok for comment regarding its data transfer protocols and will update the report when we receive a response.

Steps TikTok has taken for underage users so far

Against the backdrop of increasing criticism of the app’s failure to institute child-safe policies, the company has made a slew of changes this year to default settings and features available to under-18 users:

  • Direct Messages: 16- and 17-year-old users have limited direct messaging on their accounts, TikTok announced in August. Access to direct messaging for users below 16 was entirely revoked in April 2020.
  • Notifications: The app sends no notifications to under-16 users after 9 pm, and 16- and 17-year-old users get notifications only until 10 pm, the company announced in August.
  • Account Privacy: In January, TikTok made all under-16 accounts private by default. The app’s ‘suggest your account to others’ feature was also turned off by default for under-16 users.
  • Video Downloads: Videos created by under-16 users cannot be downloaded from the TikTok app, the company announced in January. Permission to download videos is set to ‘off’ by default for 16- and 17-year-old users, who can choose to enable it.

What India’s data protection bill says about children’s data

The draft Personal Data Protection (PDP) Bill, 2019, defines guardian data fiduciaries (GDFs) as entities that:

  1. Operate commercial websites or online services directed at children, or
  2. Process large volumes of personal data of children.

What are the responsibilities of such GDFs under the draft Personal Data Protection Bill?

  • GDFs are prohibited from “profiling, tracking or behaviourally monitoring or direct targeted advertising at, children”. They cannot process children’s data in a way that causes “significant harm” to the child.
  • GDFs are supposed to verify the age of their users, and obtain consent from their guardian or parents if the user is under 18.
  • Failure to adhere to the provisions can attract a fine of Rs. 15 crore or 4% of the company’s global turnover.

In a MediaNama discussion on children’s data and the Personal Data Protection Bill, we discuss how these fiduciaries will comply with this complex mandate. In another discussion, we also consider whether there should be a blanket age of consent for using online services.


Written By

Figuring out subscriptions and growth at MediaNama. Email: nishant@medianama.com


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
