EU Investigates Meta for Failing to Protect Children on Facebook and Instagram

The European Union has launched a formal investigation into Meta, the parent company of Facebook and Instagram, over concerns that it is not adequately protecting children on its platforms. This probe could result in significant fines if Meta is found to be in violation of EU regulations.

This latest action reflects a growing focus among regulators on the potentially harmful impact of social media on young users. Concerns include the encouragement of addictive behaviors and exposure to inappropriate content.

The European Commission, the EU’s executive branch, is examining whether Meta has met its obligations under the Digital Services Act (DSA), a comprehensive new law designed to regulate online platforms. The DSA mandates that platforms implement measures to safeguard children, such as preventing access to inappropriate content and ensuring high levels of privacy and safety. Non-compliance with these rules can lead to fines of up to 6% of a company’s global revenue or enforced changes to its software.

In a statement on Thursday, the European Commission expressed concern that Facebook and Instagram might exploit the vulnerabilities and inexperience of minors, potentially fostering addictive behavior. The Commission also questioned the effectiveness of Meta’s age-verification and age-assurance methods.

“We are not convinced that Meta has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans,” said Commissioner Thierry Breton. “We are sparing no effort to protect our children.”

Meta responded to these concerns through a spokesperson who stated, “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”

However, a report Meta submitted to the European Commission last September detailing its protective measures for minors has not alleviated regulators’ concerns. The EU remains unconvinced of Meta’s compliance with the DSA, which requires robust strategies to mitigate risks to the physical and mental health of young users.

The scrutiny of Meta is not new. In recent years, the company has faced increasing pressure over the impact of its platforms on youth. Various school districts and state attorneys general in the United States have sued Meta over issues related to youth mental health, child safety, and privacy.

Additionally, earlier this month, an investigation by the New Mexico attorney general into the dangers posed by Meta’s platforms led to the arrests of three men charged with attempted sexual abuse of children.

Meta’s challenges are not confined to youth protection. The company has repeatedly clashed with EU regulators over a range of issues, including its handling of advertisements by scammers, foreign election meddling ahead of upcoming EU elections, disinformation, and illegal content related to the war in Gaza.

The European Commission’s ongoing investigation into Meta underscores the significant regulatory pressures facing social media companies today. As the EU continues its efforts to enforce stringent protections for children online, the outcome of this probe could have far-reaching implications for Meta and the broader tech industry.
