Snap puts in place new measures to guide youth towards safer online experiences
Posted on 11 September 2023

Snap Inc. has unveiled a suite of new safeguards to further protect 13-to-17-year-old Snapchatters from potential online harms. These safeguards, which will begin to roll out in the coming weeks, are designed to protect youth from being contacted by people they may not know in real life, to provide a more age-appropriate viewing experience on Snapchat’s content platforms, and to enable Snap to more effectively remove accounts that may be marketing or promoting age-inappropriate content, using a new strike system and new detection technologies. Together, these measures strengthen enforcement, guide youth towards safer experiences, and limit unwanted contact from suspicious accounts.

 

Real relationships that are safer to navigate

Moving forward, before youth can begin communicating with someone on Snapchat – such as a close friend, family member, or other trusted person – the two will need to be existing Snapchat friends or phone book contacts. It is now also harder for a user to appear as a suggested friend to another user outside their friend network.

As part of the rollout, Snapchatters can expect to see the following upgrades to their user experience:

  • In-App Warnings: Youth will now receive pop-up warnings if they are contacted by someone with whom they share no mutual friends and who isn’t in their contacts. This message will urge them to carefully consider whether they want to be in contact with this person, and not to connect if it isn’t someone they trust.
  • Stronger Friending Protections: Snapchat already requires a 13-to-17-year-old to have several mutual friends with another user before they can show up in Search results. Snapchat is now raising this bar, requiring a greater number of mutual friends based on the number of friends a Snapchatter has. The aim is to further reduce the ability of strangers to connect with youth.

 

New Strike System to Crack Down on Accounts Promoting Age-Inappropriate Content

Across Snapchat’s two main content platforms – Stories and Spotlight – users can find public Stories published by vetted media organizations, verified creators, and Snapchatters. On these public content platforms, Snapchat applies additional content moderation to prevent violating content from reaching a large audience.

To help remove accounts that market and promote age-inappropriate content, Snapchat recently launched a new Strike System. Under this system, any inappropriate content that is detected proactively or reported will be immediately removed. If an account repeatedly tries to circumvent the rules, it will be banned.

“From the start, Snapchat was designed to be different, built as an antidote to traditional social platforms - prioritizing the safety, privacy and wellbeing of our community, especially our younger audiences. A huge share of our audience comes from the GCC region, and we continue to work on creating a better online ecosystem that also offers safe avenues for young Snapchatters. In the coming months, we will build on these upgrades by launching new and improved resources for families to help encourage safer habits and practices for using Snapchat,” said Georg Wolfart, Head of Public Policy at Snap Inc.

Beyond these developments, Snap has released new in-app educational content that will be featured on Snapchat’s Stories platform and available to Snapchatters using relevant Search terms or keywords. Snap will also be releasing an updated safety guide for parents and a new YouTube explainer series that walks through suggested protections for youth.