The European Commission said Meta had failed to conduct proper risk assessments and lacked effective measures to block underage users from its platforms, or to remove them once they gained access.
Commission data showed that between 10% and 13% of children under 13 already hold accounts on Facebook or Instagram, leading Commission Executive Vice President Henna Virkkunen, who oversees digital policy, to conclude that the risk assessments Meta submitted to Brussels must have been flawed.
While Meta's own terms of service set 13 as the minimum age — acknowledging that platform content is inappropriate for younger children — the Commission found the rule is not enforced in practice. Existing age-verification tools can be circumvented simply by entering a false date of birth at registration.
The Commission also criticized Meta's reporting mechanism for underage accounts as cumbersome, requiring up to seven clicks to reach the reporting form, which must then be completed manually. Even when reports are filed, the platforms frequently take no action, the Commission found.
Brussels called on Meta to close the security gaps, improve its mandatory risk assessments, and strengthen tools for detecting and removing underage users — steps officials noted would align with the company's own rules.
If Meta fails to comply, the Commission may issue a formal finding of violation under the Digital Services Act (DSA) and impose a fine of up to 6% of the company's annual global turnover, with interim penalties possible until the violations are remedied. Meta has been given time to submit a written response.
Wednesday's findings stem from an investigation into Facebook and Instagram opened in May 2024. That probe also examines whether the platforms' interfaces have an addictive effect on minors, including so-called "rabbit hole" dynamics in which algorithms continuously serve engaging content, keeping users online for extended periods.
(jh)
Source: PAP