Meta Faces EU Investigation Over Child Safety Risks
Brussels, 16 May (ONA)---European Union (EU) regulators have initiated a formal investigation into Meta, the parent company of Facebook and Instagram, over potential breaches of online content rules concerning child safety.
The European Commission expressed concerns that the algorithmic systems on these platforms could exploit children's vulnerabilities and encourage addictive behavior.
Investigators will scrutinize whether these systems contribute to a "rabbit hole" effect, leading users to increasingly disturbing content. Additionally, the Commission is examining Meta's age-assurance and verification methods.
The investigation falls under the Digital Services Act (DSA), which requires large technology companies to strengthen their efforts to protect European users online, especially children.
The DSA imposes strict rules to safeguard children's privacy and security.
Thierry Breton, the EU's internal market commissioner, expressed doubts about Meta's compliance with its DSA obligations, particularly regarding the physical and mental health risks that Facebook and Instagram may pose to young Europeans.
The investigation underscores the EU's commitment to ensuring online safety and holding tech companies accountable for their platforms' impact on users, especially minors.
---Ends/Thuraiya