Meta has unveiled new measures to protect teenagers on its platforms, focusing on sensitive content related to suicide, self-harm, and eating disorders. The company announced that it will restrict teens from viewing such content, deeming it “not age-appropriate” for young users; the content will remain hidden even if a teen follows an account that shares it. When teens search for related topics on Facebook and Instagram, they will instead be directed toward “expert resources for help,” such as the National Alliance on Mental Illness. The move is part of Meta’s broader push to strengthen child safety amid increasing scrutiny from governments worldwide.
As part of the initiative, teen accounts will default to the most restrictive content-filtering settings on Facebook and Instagram, which limit recommended posts in Search and Explore that are considered “sensitive” or “low quality.” Meta will apply these settings to all teen accounts automatically, though users can later adjust them. The update is meant to address concerns about teens’ potential exposure to harmful content on social media, and it reflects Meta’s acknowledgment of its responsibility to create a safer online environment for younger users.
These changes come at a pivotal moment for Meta: CEO Mark Zuckerberg is scheduled to testify before the Senate on child safety on January 31, alongside other tech executives. The company also faces evolving regulations, such as the Digital Services Act in the European Union and the Online Safety Act in the United Kingdom, as governments push major tech companies for greater accountability on children’s safety, privacy, and the broader impact of digital platforms on society. Meta’s proactive measures to protect teens align with this trend of heightened regulatory attention on child safety and well-being in the digital landscape.