Starting next week, Instagram will notify parents when their teens search for terms related to self-harm or suicide. Meta says a similar alert system for its AI chatbots is coming later this year.
Instagram’s new feature sends an alert to parents when their child “tries to search terms clearly related to suicide or self-harm within a short period of time.” It’s rolling out next week in the US, UK, Australia and Canada, but only for parents and teens who have opted into Instagram’s supervision features. It’s expected to expand to other regions later this year.
“The majority of teens do not attempt to search for suicide and self-harm content on Instagram, and when they do, our policy is to block those searches and instead direct them to resources and helplines that can provide help,” Instagram said in the announcement. “Our goal is to empower parents if their teen’s searches indicate they may need help.”
Parent alerts will be sent via email, text, or WhatsApp – depending on available contact information – along with in-app notifications that offer optional resources on how to discuss sensitive topics with their child.
