On Thursday, the European Union launched an investigation into Facebook parent corporation Meta for potential violations of online content laws related to child protection.
The EU’s executive body, the European Commission, said in a statement that it is studying whether the social media giant’s Facebook and Instagram platforms “may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects.’”
The commission also expressed concern about age verification on Meta’s platforms, as well as privacy risks associated with the company’s recommendation algorithms.
“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” a spokesman for Meta told CNBC via email.
“This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
The commission stated that its decision to launch a probe stems from a preliminary study of a risk assessment report supplied by Meta in September 2023.
Thierry Breton, the EU’s commissioner for internal market, said in a statement that the regulator is “not convinced [that Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects on the physical and mental health of young Europeans on its platforms.”
The EU stated it will conduct a thorough examination of Meta’s child protection procedures “as a matter of priority.” The bloc can continue to collect evidence through demands for information, interviews, and inspections.
The opening of a formal DSA investigation enables the EU to pursue additional enforcement actions, such as interim measures and noncompliance decisions, according to the commission, which noted that it can also consider commitments from Meta to address its concerns.
Meta and other U.S. tech giants have come under increased scrutiny from the EU since the passage of the bloc’s landmark Digital Services Act, a sweeping law designed to combat harmful content.
The DSA allows companies to be fined up to 6% of their global annual revenues for violations, though the bloc has yet to impose penalties on any tech giants under the new rules.
In December 2023, the EU opened infringement proceedings against X, the company formerly known as Twitter, over its alleged failure to tackle content misinformation and manipulation.
The commission is also investigating Meta for potential DSA violations connected to its handling of election disinformation.
In April, the bloc opened an investigation into the firm, expressing concern that Meta had not done enough to counter disinformation ahead of the 2024 European Parliament elections.
The EU is not the only entity taking action against Meta due to child safety concerns.
In the United States, the attorney general of New Mexico is suing the company over allegations that Facebook and Instagram facilitate child sexual abuse, solicitation, and trafficking.
A Meta spokeswoman at the time stated that the corporation used “sophisticated technology” and other precautionary measures to weed out predators.