Meta Platforms (formerly Facebook) announced a series of new measures aimed at enhancing the safety of teenage users on its Instagram and Facebook platforms. These safeguards are designed to curb unwanted direct messages and potentially harmful interactions.
Context and Background
The decision to implement these safeguards follows recent actions by Meta in response to heightened regulatory pressure. WhatsApp, a subsidiary of Meta, had previously committed to concealing more content from teenage users, prompted by calls from regulators to address concerns about minors’ exposure to harmful material on social media platforms.
The regulatory scrutiny intensified notably after a former Meta employee testified before the U.S. Senate. The ex-employee alleged that Meta was aware of harassment and other forms of harm affecting teenagers on its platforms but failed to take adequate steps to address these issues.
Enhanced Safeguards
Meta’s new measures primarily target direct messaging features on Instagram and Facebook, aiming to provide greater protection for teenage users:
Instagram Restrictions
- Teenagers on Instagram will now have default settings that prevent them from receiving direct messages from accounts they do not follow or are not connected with.
- Specific settings within the Instagram app will require parental approval before teenagers can make changes, ensuring greater parental oversight and control.
Messenger Restrictions
- Similarly, users under the age of 16 (or under 18 in certain regions) on Messenger will only be able to receive messages from people on their Facebook friends list or from contacts in their phone's contact list.
- Meta has also restricted adults over the age of 19 from sending direct messages to teenagers who do not follow them on Messenger.
Meta's introduction of these safeguards reflects a recognition of the importance of protecting young users on its platforms. By addressing regulators' concerns and tightening controls over direct messaging, Meta aims to foster a safer online environment for teenagers while navigating the complex landscape of social media regulation and accountability.