TikTok's €345M Children's Privacy Fine

Ireland's DPC issues landmark penalty for children's data protection violations - Key lessons for India's DPDPA

Case Overview

Violation Details

  • Company: TikTok Technology Limited (Ireland)
  • Fine Amount: €345 million
  • Date: September 2023
  • Regulator: Irish Data Protection Commission (DPC)
  • Period: July 31, 2020 to December 31, 2020

Key Violations

  • Processing children's data without a valid legal basis
  • Failure to implement privacy-protective default settings
  • Inadequate transparency in data processing
  • Age verification system deficiencies
  • Public account defaults for users aged 13-17

Detailed Violation Analysis

Public-by-Default Profiles

TikTok defaulted the accounts of 13- to 17-year-olds to "public," exposing children's content to any TikTok user globally.

GDPR Article Violated:

Article 25 - Data Protection by Design and Default
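The privacy-by-default obligation can be made concrete in code. The sketch below is purely illustrative (the `Account` type, field names, and `create_account` function are hypothetical, not TikTok's actual implementation): every privacy-relevant setting starts in its most protective state for minors.

```python
from dataclasses import dataclass

# Hypothetical sketch of privacy by default (GDPR Art. 25).
# Account fields and defaults are illustrative assumptions.

@dataclass
class Account:
    age: int
    is_public: bool
    comments_enabled: bool
    downloads_enabled: bool

def create_account(age: int) -> Account:
    """Create an account with the most protective settings for minors."""
    minor = age < 18
    return Account(
        age=age,
        is_public=not minor,         # minors default to private profiles
        comments_enabled=not minor,  # comments from strangers off by default
        downloads_enabled=not minor, # video downloads off by default
    )
```

The point of encoding this at account creation, rather than in a settings screen, is that a protective default then applies even to users who never open the settings.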

Age Verification Failure

Inadequate age verification mechanisms allowed children under 13 to create accounts despite the platform's 13+ age requirement.

GDPR Article Violated:

Article 8 - Conditions Applicable to a Child's Consent

Legal Basis Issues

TikTok failed to establish proper legal basis for processing children's personal data beyond what was necessary for service provision.

GDPR Article Violated:

Article 6 - Lawfulness of Processing

Transparency Failures

Privacy information was not presented in an age-appropriate, clear, and understandable manner for children and parents.

GDPR Article Violated:

Article 12 - Transparent Information

Data Minimization

TikTok processed more personal data than necessary, including location data and device identifiers for advertising purposes.

GDPR Article Violated:

Article 5 - Principles of Processing
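Data minimization can be enforced mechanically with a field allow-list. The sketch below is a hypothetical illustration (the field names and `minimize` helper are assumptions, not a documented API): child accounts are restricted to fields essential for the service, dropping advertising-oriented data such as precise location and device identifiers.

```python
# Hypothetical field-level data minimization (GDPR Art. 5(1)(c)).
# Field names are illustrative assumptions.

ESSENTIAL_FIELDS = {"username", "password_hash", "date_of_birth", "content"}
ADULT_ONLY_FIELDS = {"precise_location", "device_id", "ad_profile"}

def minimize(payload: dict, is_child: bool) -> dict:
    """Drop any field not on the allow-list for this user category."""
    allowed = ESSENTIAL_FIELDS if is_child else ESSENTIAL_FIELDS | ADULT_ONLY_FIELDS
    return {k: v for k, v in payload.items() if k in allowed}
```

An allow-list (rather than a block-list) is the safer design here: any new field a developer adds is excluded for children until someone deliberately justifies collecting it.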

Best Interests Assessment

TikTok failed to conduct proper assessments of children's best interests when designing features and default settings.

GDPR Requirement:

Child Data Protection Impact Assessment

Irish DPC's Enforcement Action

Penalty Calculation

  • Base fine: €300M
  • Aggravating factors: €45M
  • Total administrative fine: €345M

Note: This represents roughly 1.7% of TikTok's global annual revenue, well below the GDPR maximum of 4% of global annual turnover.

Corrective Measures

  • Default privacy settings: all under-18 accounts must default to private
  • Age verification enhancement: implement robust age verification mechanisms
  • Transparency improvements: age-appropriate privacy information and controls
  • Data minimization: limit data collection to essential purposes only

Critical Lessons for India's DPDPA Implementation

Key takeaways for protecting children's digital rights under DPDPA

1. Privacy by Default for Children

TikTok's €345M fine demonstrates that platforms must implement the highest privacy settings by default for children, regardless of business model preferences.

DPDPA Application:

Section 9 of DPDPA requires enhanced protection for children. Indian platforms should default to maximum privacy settings for users under 18, with clear parental controls.

2. Robust Age Verification Systems

The failure to properly verify ages and prevent under-13 users from accessing the platform contributed significantly to the penalty amount.

DPDPA Application:

Indian platforms must implement technically feasible age verification methods, potentially including document verification or biometric age estimation for child-directed services.

3. Child-Friendly Transparency

Privacy information must be age-appropriate, using simple language, visual aids, and interactive elements that children can understand.

DPDPA Application:

Indian platforms should develop child-specific privacy notices using icons, videos, and simple language appropriate for different age groups (8-12, 13-15, and 16-17).
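Routing users to the right notice format could be a simple age-band dispatch. This is a hypothetical sketch (the variant names and `notice_variant` function are illustrative assumptions; the bands follow the text, with the overlap at 16 resolved into 13-15 and 16-17):

```python
def notice_variant(age: int) -> str:
    """Pick a privacy-notice style by age band (bands are illustrative)."""
    if age < 13:
        return "icons_and_video"   # 8-12: icons, short videos, very simple text
    if age < 16:
        return "simple_language"   # 13-15: plain-language notice
    if age < 18:
        return "plain_summary"     # 16-17: condensed summary plus full text
    return "full_notice"           # adults: full legal notice
```

Keeping the dispatch in one function makes it auditable: a regulator or reviewer can see exactly which age sees which notice.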

4. Best Interests Assessment Framework

Platforms must conduct formal assessments of how their features and default settings impact children's rights, safety, and well-being.

DPDPA Application:

Indian platforms should establish Children's Rights Impact Assessment (CRIA) processes, consulting with child development experts and privacy advocates.

DPDPA Children's Privacy Implementation Guide

Practical steps for Indian platforms to protect children's data rights

Technical Safeguards

Age Gate Implementation

Multi-factor age verification including date of birth, document verification, and behavioral analysis

Default Privacy Settings

Maximum privacy controls enabled by default for all users under 18

Parental Controls

Comprehensive parental dashboards with activity monitoring and control capabilities
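One building block of such parental controls is an approval gate on sensitive actions. The sketch below is a hypothetical illustration (the action names and `allowed` helper are assumptions): a minor can take a gated action only with verified parental approval, while non-gated actions and adult accounts pass through.

```python
# Hypothetical parental-approval gate for sensitive account actions.
# Action names are illustrative assumptions.

PARENT_GATED_ACTIONS = {"go_public", "enable_dms", "livestream"}

def allowed(action: str, age: int, parent_approved: bool) -> bool:
    """Minors need verified parental approval for gated actions."""
    if age >= 18 or action not in PARENT_GATED_ACTIONS:
        return True
    return parent_approved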

Compliance Framework

Data Minimization

Collect only essential data required for service functionality, no profiling for children

Regular Auditing

Quarterly children's privacy compliance audits with external verification

Rights Exercise

Simplified mechanisms for children and parents to exercise data rights