Instagram will start alerting parents if their kids repeatedly search for terms clearly associated with suicide or self-harm.
The announcement on Thursday comes as Instagram’s parent company, Meta, is in the midst of two trials over harms to children.
A trial under way in Los Angeles questions whether Meta’s platforms deliberately addict and harm minors. Another in New Mexico seeks to determine whether Meta failed to protect kids from sexual exploitation on its platforms.
The alerts will only go to parents who are enrolled in Instagram’s parental supervision program. The company said it already blocks such content from showing up in teen accounts’ search results and directs people to helplines instead.
Thousands of families – along with school districts and government entities – have sued Meta and other social media companies claiming they deliberately design their platforms to be addictive and fail to protect kids from content that can lead to depression, eating disorders and suicide.
Meta executives, including Mark Zuckerberg, have disputed that the platforms cause addiction. During questioning at the Los Angeles trial last Wednesday, Zuckerberg said he still agreed with a previous statement he made that the existing body of scientific work has not proved that social media causes mental health harms.
The head of Instagram, Adam Mosseri, took the stand a week earlier and also pushed back on the science behind social media addiction, denying that users could be “clinically addicted”. Mosseri described children’s high usage of Instagram as “problematic use” – similar to “watching TV for longer than you feel good about”.
Psychologists do not classify social media addiction as an official diagnosis, but researchers have documented the harmful consequences of compulsive use among young people, and lawmakers around the world have repeatedly voiced concern about social networks’ addictive potential.
Instagram’s new alerts will be sent via email, text or WhatsApp, depending on the contact information the parent has provided, as well as through a notification in the parent’s Instagram account.
Setting up parental supervision on Instagram requires both the teen and their parent to agree, with one sending an invite to the other through the platform. Teens must be aged 13 to 17, and only one parent can supervise a child’s account.
“Our goal is to empower parents to step in if their teen’s searches suggest they may need support. We also want to avoid sending these notifications unnecessarily, which, if done too much, could make the notifications less useful overall,” Meta said in a blogpost.
Meta said it was also working on similar notifications to parents about their children’s interactions with artificial intelligence.
“These will notify parents if a teen attempts to engage in certain types of conversations related to suicide or self-harm with our AI,” Meta said. “This is important work and we’ll have more to share in the coming months.”