Instagram will alert parents when their teens search words related to suicide, self-harm

Instagram said it will notify parents if their teen repeatedly searches for terms related to suicide or self-harm within a short period, as pressure grows on governments to follow Australia’s ban on social media use for people under 16.
Britain said in January it was considering restrictions to protect children online, following Australia’s move in December. Spain, Greece and Slovenia have said in recent weeks they are also considering limiting access.
Meta Platforms Inc.-owned Instagram said Thursday it will start alerting parents who have signed up to its optional supervision setting if their children try to access content suggesting suicide or self-harm.
“These alerts build on our existing work to help protect teens from potentially harmful content on Instagram,” the platform said in a statement. “We have strict policies against content that promotes or glorifies suicide or self-harm.”
Instagram said its current policy is to block such searches and redirect users to support resources, adding that the alerts will begin rolling out next week for parents who have signed up in the United States, Britain, Australia and Canada.
Governments are trying to protect children from harm online, particularly after concerns over the AI chatbot Grok, which generated non-consensual sexual images.
In the UK, measures designed to prevent children’s access to pornography sites have affected the privacy of adults and led to tensions with the US over limits on free speech and regulatory access.
Instagram’s “teen accounts” for users under 16 require parental permission to change settings, and parents can add an extra layer of monitoring with their teen’s consent.