Meta, formerly known as Facebook Inc., announced on Tuesday that it will implement new restrictions on teenage users’ access to certain content on its social media platforms, Facebook and Instagram. The initiative aims to provide a "safe and age-appropriate" experience for its younger user demographic.
According to Agence France-Presse (AFP), the tech giant, headquartered in Menlo Park, California, has automatically adjusted the privacy settings for all teenage accounts to the strictest level. This preventive measure is intended to make it harder for teens to encounter potentially sensitive content.
The new protective settings will apply to accounts belonging to users between the ages of 13 (the minimum age required to register on both platforms) and 15, a threshold that may extend to 18 in some countries.
In addition to content restrictions, these settings will limit others’ access to a teenager's friends list, their ability to follow the teen's account, and their capacity to comment on posts published by young users.
Furthermore, the restrictions prevent teens from finding any search results related to certain terms, including "self-harm," "suicide," "eating disorders," and "bulimia." Should teenagers search for any of these terms, they will instead be shown a message encouraging them to reach out to a professional or a friend, or to review a list of helpful tips, as Meta explained on Tuesday via its website.
However, these measures will not prevent young users from discussing personal problems or difficulties with individuals on their friends list via Facebook or Instagram.
Meta has faced persistent criticism in recent years over its handling of its teenage user base. In the fall of 2021, the company abandoned its previously announced plans to launch a version of Instagram aimed specifically at users under the age of 13.
In October 2023, 41 U.S. states filed a civil lawsuit against Meta, accusing Facebook and Instagram of harming the "mental and physical health of young people." In response to this lawsuit, Meta reiterated its commitment to improving protections for young users, reminding the public that it had already launched a series of tools designed to enhance safety on its platforms.