Facebook’s policy of allowing the live broadcast of self-harm on its platform has drawn concern from mental health advocates.
The power to go live and unfiltered to the internet from a phone has opened up enormous possibilities to transform the way we communicate.
But it has created a wave of ethical questions that the world’s biggest social network, Facebook, is being forced to navigate very publicly.
Users have live streamed the aftermath of fatal shootings in the US, drive-by attacks and racially motivated abuse.
Police around the world now fear there could be a disturbing trend of suicides being live streamed.
In March, the social media giant expanded its suicide prevention tools to Facebook Live, which gives Australian support groups the opportunity to target young people in the moment of their distress.
The scale of the task at hand is growing — a recent leak of documents published in The Guardian reported that moderators were escalating thousands of reports each fortnight.
According to The Guardian, a recent policy update shared with moderators highlighted they were “now seeing more video content — including suicides — shared on Facebook” and that “[Facebook doesn’t] want to censor or punish people in distress who are attempting suicide”.
“However, because of the contagion risk [that some people who see suicide are more likely to consider suicide], what’s best for the safety of people watching these videos is for us to remove them once there’s no longer an opportunity to help the person.
“We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up.”
According to SANE Australia — a national charity that helps those affected by mental illness — emerging platforms like Facebook Live present new and complex ground for mental health advocates.
Originally published by abc.net.au.