Meta is under investigation by the European Commission for potentially violating the Digital Services Act’s protections for minors on Facebook and Instagram.
The commission says it is concerned that the company’s algorithms could induce addictive behavior in minors and create “rabbit-hole effects.” It also plans to examine the platforms’ age verification and assurance methods.
“With the Digital Services Act, we established rules to protect minors during their online interactions,” said Margrethe Vestager, the commission’s executive vice-president for a Europe Fit for the Digital Age.
She added, “We have concerns that Facebook and Instagram may incite behavioral addiction and that the age verification methods implemented by Meta on their services are inadequate; we will now conduct a thorough investigation. We want to safeguard young people’s mental and physical health.”
The proceedings are based on a preliminary analysis of the risk assessment report Meta submitted last September, on Meta’s replies to the commission’s formal requests for information, on publicly available reports, and on the commission’s own analysis.
In particular, the commission is concerned that the platforms’ interfaces may exploit minors’ inexperience and vulnerabilities to induce addictive behavior, and that their recommended content may draw young users down a rabbit hole of increasingly harmful material.
It also says the company’s age verification measures for keeping minors away from inappropriate content may not be reasonable, proportionate, or effective.
The commission further questions whether Meta is complying with its DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors, particularly in the default privacy settings it applies to minors within its recommender systems.
If confirmed, these failings would constitute violations of Articles 28, 34, and 35 of the DSA.
“We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms, Facebook and Instagram,” said Thierry Breton, the commissioner for the internal market.
Breton added, “We will now investigate in depth the platforms’ potential addictive and ‘rabbit-hole’ effects, the effectiveness of their age verification tools, and the level of privacy afforded to minors through their recommender systems. We are sparing no effort to protect our youth.”
Meta says it offers numerous features to protect children, including parental supervision tools, “take a break” reminders, and Quiet Mode, and that it automatically limits the potentially harmful content teens can see. It also now prevents teens from receiving direct messages (DMs) on Instagram from anyone they do not follow or are not connected to, including other teens.
“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools, features, and resources designed to protect them,” a company spokesperson said.
“This is an industry-wide challenge, which is why we continue to develop industry-wide age-assurance solutions that are applied to all apps teens access. We look forward to sharing details of our work with the European Commission.”
The announcement is the latest in a string of DSA investigations. Last month, the commission opened a separate inquiry into Meta over its policies and practices on deceptive advertising and political content, and it is also investigating TikTok, X, and AliExpress.