Ireland's DPC issues landmark €345M penalty for children's data protection violations - Key lessons for India's DPDPA
TikTok defaulted the accounts of 13-17-year-olds to "public," exposing children's content to any TikTok user globally.
GDPR Article Violated: Article 25 - Data Protection by Design and by Default
Inadequate age verification mechanisms allowed children under 13 to create accounts despite the platform's 13+ age requirement.
GDPR Article Violated: Article 8 - Conditions for Children's Consent
TikTok failed to establish proper legal basis for processing children's personal data beyond what was necessary for service provision.
GDPR Article Violated: Article 6 - Lawfulness of Processing
Privacy information was not presented in an age-appropriate, clear, and understandable manner for children and parents.
GDPR Article Violated: Article 12 - Transparent Information
TikTok processed more personal data than necessary, including location data and device identifiers for advertising purposes.
GDPR Article Violated: Article 5 - Principles of Processing
TikTok failed to conduct proper assessments of children's best interests when designing features and default settings.
GDPR Requirement: Child Data Protection Impact Assessment
Note: This represents 1.7% of TikTok's global annual revenue, well below the GDPR maximum of 4% of annual worldwide turnover.
Key takeaways for protecting children's digital rights under DPDPA
TikTok's €345M fine demonstrates that platforms must implement the highest privacy settings by default for children, regardless of business model preferences.
Section 9 of the DPDPA requires enhanced protection for children's data. Indian platforms should default to maximum privacy settings for users under 18 and provide clear parental controls.
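A minimal sketch of what privacy-by-default for minors could look like in code. All names here (AccountSettings, default_settings_for, the specific toggles) are illustrative assumptions, not any platform's real API; the point is that the most restrictive configuration is the starting state for under-18 accounts, not an opt-in.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Hypothetical per-account privacy toggles."""
    profile_public: bool
    direct_messages_enabled: bool
    targeted_ads_enabled: bool
    discoverable_in_search: bool

def default_settings_for(age: int) -> AccountSettings:
    """Return default settings; minors get the most restrictive set by default."""
    if age < 18:
        return AccountSettings(
            profile_public=False,         # private account by default
            direct_messages_enabled=False,
            targeted_ads_enabled=False,   # no behavioural advertising for children
            discoverable_in_search=False,
        )
    # Adults may start with open defaults (subject to their own consent choices).
    return AccountSettings(True, True, True, True)
```

The key design choice is that the age check happens at account creation, before any content is published, rather than relying on the child or parent to later discover and tighten the settings.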
The failure to properly verify ages and prevent under-13 users from accessing the platform contributed significantly to the penalty amount.
Indian platforms must implement technically feasible age verification methods, potentially including document verification or biometric age estimation for child-directed services.
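One building block of any such scheme is a correct age computation from a declared date of birth; on its own it is not verification (it must be combined with the stronger signals mentioned above), but a subtle off-by-one bug here would admit users who have not yet had their 13th birthday. A sketch, with hypothetical function names:

```python
from datetime import date
from typing import Optional

def age_from_dob(dob: date, today: date) -> int:
    """Completed years of age, accounting for whether this year's birthday has passed."""
    years = today.year - dob.year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1  # birthday has not occurred yet this year
    return years

def passes_age_gate(dob: date, minimum_age: int = 13,
                    today: Optional[date] = None) -> bool:
    """Declared-age check; real deployments would corroborate with other signals."""
    today = today or date.today()
    return age_from_dob(dob, today) >= minimum_age
```

A declared date of birth is trivially falsified, which is why the DPC decision and the takeaway above point toward layered verification rather than a single self-declaration.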
Privacy information must be age-appropriate, using simple language, visual aids, and interactive elements that children can understand.
Indian platforms should develop child-specific privacy notices using icons, videos, and simple language appropriate for different age groups (8-12, 13-15, and 16-17).
Platforms must conduct formal assessments of how their features and default settings impact children's rights, safety, and well-being.
Indian platforms should establish Children's Rights Impact Assessment (CRIA) processes, consulting with child development experts and privacy advocates.
Practical steps for Indian platforms to protect children's data rights
Multi-factor age verification including date of birth, document verification, and behavioral analysis
Maximum privacy controls enabled by default for all users under 18
Comprehensive parental dashboards with activity monitoring and control capabilities
Collect only the data essential for service functionality, with no profiling of children
Quarterly children's privacy compliance audits with external verification
Simplified mechanisms for children and parents to exercise data rights
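The data-minimisation step above can be enforced mechanically: filter every record for a child account against an allow-list of essential fields before it is stored, so that advertising identifiers and location never reach the database in the first place. A sketch, with a hypothetical allow-list:

```python
# Illustrative allow-list: only fields needed for core service functionality.
ESSENTIAL_FIELDS = {"user_id", "timestamp", "action"}

def minimise_for_child(event: dict) -> dict:
    """Strip any field not on the essential allow-list from a child's event record."""
    return {k: v for k, v in event.items() if k in ESSENTIAL_FIELDS}
```

An allow-list (keep only what is named) is safer here than a block-list (drop what is named), because any new field added elsewhere in the pipeline is excluded by default until someone deliberately justifies collecting it.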