EU Investigates Meta for Risks to Minors

The European Commission has opened formal proceedings against Meta, the company that owns the popular social networks Facebook and Instagram. Brussels fears that the Digital Services Act (DSA) may have been breached, specifically with regard to the protection of minors and the concern that “Facebook and Instagram, through their algorithms, may encourage behavioral addictions in children.” Such addictions could, in turn, lead to isolation, a risk of depression and, more broadly, harm to their mental health.

“We are not convinced that Meta has done enough to meet the DSA’s obligations to minimize the risks of adverse effects on the physical and mental health of young Europeans on Facebook and Instagram,” said Thierry Breton, the French politician who serves as EU Commissioner for the Internal Market.

In particular, the social networks are accused of creating the so-called “rabbit hole” effect, in which algorithms keep suggesting similar content and thereby “trap” the user. It is a dynamic that any social media user will recognize, but one that can have detrimental effects on the very young. The Commission also intends to examine the age-verification methods and the other safety measures Meta has put in place.

Just a few weeks ago, on April 22, the EU launched an investigation into TikTok on similar grounds; a few days later, the Chinese-owned social network suspended the rewards program it had rolled out in France and Spain.