Meta, parent company of Facebook and Instagram, is implementing new default settings that will stop strangers from messaging minors. Parents and guardians will be able to access enhanced parental supervision tools allowing them increased control over the safety and privacy settings of their teens’ Instagram accounts.
These changes are the tech firm’s latest effort to offer better protection to minors on its platforms.
In a Newsroom post, Meta explained that the stricter default messaging settings will apply to teens under 16 (or under 18 in some countries). This means they won’t be able to receive messages or be added to group chats by users they’re not connected to on Instagram or Facebook (including Facebook Messenger).
Meta began its current push to strengthen safeguards for minors in 2023, after years of debate about the impact social media platforms like Instagram and Facebook have on children and teenagers reached a boiling point. Last year, New Mexico’s attorney general leveled heavy accusations at Meta’s algorithms and Mark Zuckerberg in court, as reported by The Verge, calling Facebook and Meta a “marketplace for predators in search of children.”
This development builds on an earlier policy that barred users over the age of 19 from direct-messaging minors who don’t follow them; Meta has now extended the restriction to all users who try to message minors. Teens whose accounts are supervised and linked to a parent’s or guardian’s account will have to ask for permission to change this new default setting.
According to Meta, Instagram users will be formally informed of the changes with a message at the top of their feeds.
New parental controls incoming
This isn’t the only change parents and guardians should be aware of: parental supervision tools are also being revamped. If a child tries to change their safety and privacy settings, parents will now see a prompt asking whether they want to approve the request.
Previously, parents were only notified after children made changes to their Instagram accounts. This strikes me as a pretty important change, and I’m a little surprised it wasn’t the case already. While there’s a balance to be struck between control and freedom for children and parents, especially in an age of rapidly changing tech, I think parents will appreciate changes that make it easier to protect their children proactively while still respecting their privacy.
According to The Verge, Meta is also working on a new feature intended to protect users from inappropriate, unwanted, or harmful images sent by people they’re already connected with, and to discourage users from sending such content in the first place. Meta hasn’t given a date for when this change will arrive, saying only that more information is on the way, hopefully soon.
This is a good move from Meta, in my opinion, even if it’s pretty overdue. Of course, it depends on parents being aware of and involved in their children’s digital lives in the first place, and that remains one of the biggest issues facing child safety on the internet. Moves like this from social media companies are certainly welcome, but they shouldn’t be a replacement for active and considerate parenting.
Source: TechRadar