Meta brings parental control features to Messenger and Instagram

Menlo Park, California - A series of safety features aimed at safeguarding teenagers will be rolled out across Meta's social media platforms, the company announced on Tuesday.

Facebook parent company Meta has introduced new safety features to Messenger and Instagram.  © Collage: 123RF / inkdrop & kovop58

The most significant announcement involves the introduction of parental supervision tools for Messenger.

Parents and guardians will now have access to information such as the amount of time their teens spend on Messenger, updates to their contacts list and privacy settings, and details about who can message them and view their Messenger stories.

Alongside Messenger, Meta is also developing features to restrict who can send direct messages on Instagram.

Users will need to send an invite and obtain permission from others before initiating direct messages on the platform.

Only one invite can be sent at a time, and users must wait until the invitation is accepted before sending another.

Invites will be limited to text, preventing the inclusion of images, videos, or voice messages and calls.

Facebook and Instagram introduce parental control features

Meta has come under pressure for failing to protect young users from damaging content.  © REUTERS

In addition to the new Messenger and Instagram tools, Meta has announced enhanced supervision features for Instagram.

When teens block someone, they will receive a notice encouraging them to add their parents for account supervision.

Though many teens may be unlikely to take this step, it provides an additional layer of protection.

Parents will have visibility into the number of mutual friends their teens have with the accounts they follow and are followed by, enabling a better understanding of their kids' relationships with other users.

These announcements come shortly after a Wall Street Journal investigation uncovered instances where Instagram's recommendation systems pushed connections related to the sharing of illicit material involving minors.

In response, Meta disabled thousands of hashtags and dismantled 27 networks associated with such content over the past two years.

While Meta's apps have a minimum age requirement of 13, the company acknowledges that children can find ways to bypass these restrictions.

In addition to the safety features, Meta has implemented measures to encourage healthier usage patterns among teens. After spending 20 minutes on Facebook, teens will receive a notification urging them to take a break from the app, similar to a feature introduced earlier for Instagram.

Meta feels the heat from legislators

The social media industry is facing increased scrutiny from lawmakers and regulatory agencies regarding the impact of social media on teenagers.

In May, US Surgeon General Dr. Vivek Murthy issued an advisory highlighting the effects of social media use on young users.

In the same month, the Biden administration established a task force dedicated to examining the influence of social media on children.

Meta faced additional scrutiny in 2021 when Facebook whistleblower Frances Haugen revealed internal documents indicating that the company was aware of the negative impact Instagram had on teen girls.

