‘Unacceptable’ self-harm images still on Instagram

Image caption The NSPCC’s Tony Stower said social media firms should face “really tough regulation”

The NSPCC has criticised Instagram for continuing to allow “distressing” pictures of self-harm to remain on the site.

Last month, Instagram said all graphic images of self-harm would be removed.

The BBC reported three images of people cutting themselves to Instagram. The social media platform added warnings to two but ruled that all three could remain on the site.

A spokesman for Instagram said it “will take time… to get this right”.

‘Disturbing’

NSPCC head of child safety online Tony Stower said leaving the posts on the site was “simply not acceptable”.

The Victoria Derbyshire programme showed Mr Stower an image of a person cutting their wrist, which Instagram said did “not violate our community guidelines”. He said it was “clear… this is a distressing image that should be taken down”.

He said the images could be damaging “to the victims who have self-harmed and may be thinking about [self-harming]”.

Alisha Cowie, the current holder of the Miss England title, cut herself as a teenager, and said it was that same kind of image “that caused me to self-harm” aged 13.

“What’s [leaving them on the site] saying to other children, or even teens or adults on Instagram?” she asked.

She said the images were still “disturbing” for her to see now, as an adult.

Image caption Alisha Cowie said she was inspired to self-harm by images she saw online

Instagram says it does allow pictures of healed scars if they appear to be posted by people who no longer self-harm and who offer support to others.

But in February, Instagram head Adam Mosseri said all graphic images of self-harm would be removed.

His pledge came after the father of 14-year-old Molly Russell, who took her own life in 2017, said Instagram had “helped kill” his daughter.

Mr Stower said it was necessary for the government to place “really tough regulation” on social media firms.

“We’ve seen time and again for the last 10 years these companies will only do the bare minimum.

“They won’t do anything until they’re forced to,” he added.

The government said it would “soon publish a White Paper which will set out the responsibilities of online platforms, how these responsibilities should be met and what would happen if they are not”.

Image caption Molly Russell, 14, took her own life in 2017.

Instagram said in a statement: “Nothing is more important to us than the safety of the people who use Instagram.

“As part of an ongoing review with global experts, we are making changes to no longer allow any graphic images of self-harm, such as cutting, and we are making it harder for people to discover non-graphic, self-harm related content.

“We have a responsibility to get this right and are committed to making this change as quickly as possible, but it will take time.”

