Report Suicidal Content on Facebook

Facebook launched a new Report Suicidal Content feature for those of us in the US and Canada, which lets friends report a concern about a post, picture, or status update someone wrote. But it's not obvious how to actually create this report. One way is to go directly to the Report Suicidal Content page.

The other is to choose the option "Edit or Remove" when you hover over the post (you'll see a small pencil appear in the right-hand corner). Choose "Report/Mark as Spam" and then, instead of clicking OK, click the small text asking you to "file a detailed report".

Next, select "Violence or Harmful Behavior" and then, from the drop-down menu, select "Suicidal Content".

So…what actually happens when suicidal content is reported?

“Facebook’s User Observations team will email the user who posted the content marked suicidal a link to a private web chat with a crisis representative from the National Suicide Prevention Lifeline.

[Facebook aims to] avoid abuse of the tool, and people crying wolf, by having the User Observations team carefully screen all the reports and only send the private link out to those that have been deemed "actionable."

However…in cases where Facebook has information indicating the threat is imminent, the company will “take all possible action to get help”…[by calling] local emergency and law enforcement authorities to respond, which the company also urges users to do on its “Report Suicidal Content” entry form.”

Facebook wants to expand the feature worldwide, but did not give a timeline in the article.

From what the media is reporting, the response to this new feature has been mixed in both the US and Canada.

I know FB is trying to reduce the number of false alarms by not making the tool easily accessible, but if you don't stay up on the news, how would you know you can report suicidal content when it's hidden under the guise of "Report/Mark as Spam"? (Plus, it looks like you'd report cyber-bullying in the same place. I had no idea!) I'm sure I can speak for everyone when I say that I hope this does help prevent suicidal behavior. I'm just not sure how effective the feature can be when word of mouth is basically the only thing informing users of its existence.

I also hope the data is being collected and researched, as it will be very useful in understanding the warning signs of online suicidal ideation, and its similarities to and differences from the presentation in "real life".
