Instagram's Underage Dilemma: Meta's Ten Challenges in Safeguarding Youth

Age Verification Issues

Children under 13 can easily access Instagram by falsifying their age, exposing them to inappropriate content.

Sexual Exploitation Risks

Meta has been criticized for underinvesting in preventing the sexualization of minors on Instagram, leading to unwanted sexual advances and harassment.

Addictive Design Concerns

Meta faces lawsuits alleging that its platform features, designed to be addictive, harm children's mental and physical health.

Mental Health Impact

Meta's own internal research indicates Instagram can worsen mental health issues, particularly among teen girls, contributing to suicidal thoughts and eating disorders.

Historical Reluctance to Safeguard

Court documents indicate Meta's past reluctance to implement adequate child safety measures on Instagram.

Privacy Violations

Meta has been accused of collecting personal information from children under 13 on Instagram without parental consent, violating privacy laws.

Algorithmic Content Concerns

Internal communications at Meta have raised alarms about Instagram's algorithm steering children toward content harmful to their mental well-being.

Lack of Parental Consent Compliance

Meta allegedly collected data from underage users without first obtaining parental consent, despite knowing such users were on the platform.

Inadequate Enforcement of Age Restrictions

Until December 2019, Instagram did not require new users to disclose their age, and even afterward, Meta did not effectively enforce its ban on users under 13.

Policy Updates for Teen Safety

Meta is updating its policies to hide more sensitive content from teenagers and to apply more restrictive content-control settings for teens on Instagram and Facebook.