Your children will soon be protected from messages from strangers
Meta strengthens parental controls and restrictions on private messages
Meta, Mark Zuckerberg’s company that owns Facebook and Instagram, has just announced new controls to protect its young users from private messages sent by strangers. It’s about time, especially when we know that 68% of primary school children between the ages of 8 and 10 have at least one account on a social network. The initiative responds to growing concerns about the online safety of young users, particularly the dangers of cyberbullying, which affects nearly one in four households, and exposure to inappropriate content. From now on, on Instagram and Messenger (Facebook’s messaging service), users under 18 will no longer receive private messages (also called DMs) from people they don’t follow.
At the same time, Meta has improved its parental control features. On Instagram, any change a teenager makes to an important setting, such as switching an account from private to public, now requires parental approval. This allows parents to take a more active role in overseeing their children’s online experience, strengthening their ability to protect their digital well-being.
Prevention and Education: A Dual Approach
Meta doesn’t just limit interactions; the company is also committed to filtering inappropriate content. Its algorithmic adjustments are designed to avoid recommending content that could harm minors’ mental health, such as posts related to self-harm or eating disorders. This approach aims to provide a healthy digital environment suited to young users.
Additionally, Meta plans to roll out awareness campaigns and educational resources on Facebook and Instagram to inform teens and their parents about safe online practices. The goal is to promote conscious and responsible use of social networks, equipping users with the knowledge they need to navigate these platforms safely.