Facebook, Instagram under EU investigation over child safety

17 May 2024, 22:39 CET

Internet child - Image by Caliroi on Pexels

(BRUSSELS) - The European Commission opened formal proceedings Thursday to assess whether Facebook and Instagram may have breached the EU Digital Services Act in areas linked to the protection of minors.

The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children and create so-called 'rabbit-hole effects'.

The EU executive is also concerned about age-assurance and verification methods put in place by Meta, the provider of both Facebook and Instagram.

"We have concerns that Facebook and Instagram may stimulate behavioural addiction and that the methods of age verification that Meta has put in place on their services is not adequate and will now carry on an in-depth investigation," said EC vice-president Margrethe Vestager: "We want to protect young people's mental and physical health."

The opening of proceedings is based on a preliminary analysis of the risk assessment report sent by Meta in September 2023, Meta's replies to the Commission's formal requests for information (on the protection of minors and the methodology of the risk assessment), publicly available reports, and the Commission's own analysis.

The current proceedings address the following areas:

  • Meta's compliance with Digital Services Act (DSA) obligations on the assessment and mitigation of risks caused by the design of Facebook's and Instagram's online interfaces, which may exploit the weaknesses and inexperience of minors, cause addictive behaviour, and/or reinforce the so-called 'rabbit hole' effect. Such an assessment is required to counter potential risks to the exercise of the fundamental right to the physical and mental well-being of children, as well as to the respect of their rights.
  • Meta's compliance with DSA requirements in relation to the mitigation measures to prevent access by minors to inappropriate content, notably age-verification tools used by Meta, which may not be reasonable, proportionate and effective.
  • Meta's compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems.

If proven, the failures would constitute infringements of Articles 28, 34, and 35 of the DSA. The Commission stresses, however, that the opening of formal proceedings does not prejudge the outcome, and is without prejudice to any other proceedings that the Commission may decide to initiate on any other conduct that may constitute an infringement under the DSA.

EU Official Journal text on the DSA 

Very large online platforms and search engines under the DSA 

The enforcement framework under the Digital Services Act

