19 February 2019
Instagram has launched a new sensitivity feature in a bid to reduce minors' exposure to provocative, self-harm, and disturbing content on its platform. The feature blurs questionable pictures and video thumbnails in the app until the viewer opts in to see them. It follows the suicide of a British teenager whose parents said she had been exposed to self-harm and suicide-related content on the social network.
The new feature also blocks images of cutting and other self-harm from appearing in search results, recommendations, and hashtags that could put minors at risk of physical harm.
Adam Mosseri, Head of Instagram, announced the rollout of the new "sensitivity screens" feature and expressed grief over the suicide of British teenager Molly Russell.
"We are not yet where we need to be on issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe," Mosseri wrote.
UK Health Secretary Matt Hancock has warned Facebook-owned Instagram to improve protections for young people on its apps or face legal action.
"We already offer help and resources to people who search for such hashtags, but we are working on more ways to help," Mosseri stated.
"At Instagram, nothing is more important to us than the safety of the people in our community and we do not allow posts that promote or encourage suicide or self-harm. We rely heavily on our community to report this content and remove," Mosseri added.
Facebook and Instagram are already facing criticism over the spread of violent content and fake news. Last year, Instagram introduced a "prompt" feature aimed at curbing drug abuse and drug sales on the social network. It includes a "get support" option that directs people seeking help with drug-abuse issues to recovery and treatment organisations.