Meta, the parent company of Facebook and Instagram, has announced new measures to restrict teens' access to content related to suicide, self-harm and eating disorders on its platforms. The move comes in response to government pressure and concerns about the impact such content has on the mental health of young users.
These changes include ensuring that content in sensitive categories such as suicide and eating disorders is not visible to users under 18. Even if a teen follows an account that shares such content, they won't be able to see it. Instead, they will be directed to professional help resources such as the National Alliance on Mental Illness.
In addition to content restrictions, Meta is placing teen accounts into restrictive content-recommendation settings by default, which control the type of content that appears on Facebook and Instagram. This includes filtering recommended posts in Search and Explore that are deemed "sensitive" or "low quality." The most restrictive settings are the default, but users can choose to adjust them.
Meta's move comes amid increased government scrutiny of how technology companies treat children on their platforms. Meta CEO Mark Zuckerberg, along with other tech executives, is scheduled to testify before the U.S. Senate on child safety on January 31st. The hearing follows legislative efforts across the United States to restrict children's access to certain content online.
Governments including the European Union and the United Kingdom are enacting or considering regulations aimed at holding tech companies accountable for content shared on their platforms. The EU's Digital Services Act includes rules on algorithmic transparency and ad targeting, while the UK's Online Safety Act will require online platforms to comply with child safety rules under threat of fines. The regulations aim to create a safer online environment for users, especially minors, but have faced criticism over potential privacy concerns.
© Copyright 2023 Tekumbu. All rights reserved.