Instagram is launching a new safety feature that alerts parents when their teenagers search for content related to suicide, self-harm, or eating disorders on the platform. The update, announced today by Meta, expands the company's parental controls and marks its latest effort to address growing concerns about teen mental health on social media, as lawmakers and parents demand stronger protections for young users.
The alert system works within Instagram's existing Family Center, which allows parents to supervise their teen's account activity. When a teen conducts a search using keywords associated with self-harm or suicidal ideation, their parent or guardian receives an immediate notification through the Family Center dashboard. Along with the alert, Meta provides access to mental health resources and guidance on how to start conversations about online safety.
The timing isn't coincidental. Meta has spent the past two years fighting a public relations battle over its impact on teen mental health. Internal documents leaked in 2021 revealed the company knew Instagram made body image issues worse for one in three teen girls. Since then, the company has faced lawsuits from dozens of state attorneys general and congressional hearings where CEO Mark Zuckerberg apologized directly to families affected by social media harms.
"We recognize parents want more visibility into their teen's online experiences, especially around sensitive topics," a Meta spokesperson said in a statement. The company didn't specify whether the alerts are opt-in or automatic for all accounts using Family Center supervision.
But the feature raises questions about privacy and trust between parents and teens. Mental health advocates have long debated whether surveillance-style monitoring helps or harms young people in crisis. Some experts argue that teens might avoid seeking help online if they know searches trigger parental alerts, potentially cutting them off from support communities and resources.
The notification system appears to focus on search behavior rather than content consumption, meaning parents won't be alerted if their teen simply scrolls past sensitive content in their feed. Meta uses a combination of keyword detection and machine learning to identify concerning searches, though the company hasn't disclosed specifics about which terms trigger alerts.