Facebook has a plan to address its users' suicidal posts. Facebook CEO Mark Zuckerberg is seen on stage during a town hall with Indian Prime Minister Narendra Modi at Facebook's headquarters in Menlo Park, California, Sept. 27, 2015. Reuters/Stephen Lam

Facebook posts aren't always about great times, cute baby photos or funny YouTube videos. Sometimes they're about suicide or suicidal thoughts.

The world was reminded of this Sunday, when Sinead O'Connor wrote on Facebook: “The last two nights finished me off. I have taken an overdose. There is no other way to get respect. I am not at home, I’m at a hotel, somewhere in ireland, under another name.”

Police in Ireland later found the singer-songwriter and said she was receiving medical assistance, Fox News reported. On Monday, however, O’Connor shared another Facebook post that rang with pain: “Jake, Roisin, Jr., frank, Donal, Eimear, I never wanna see you again. You stole my sons from me.” On Tuesday, she called out to her family for help via Facebook.

O’Connor’s first post about an overdose was shared more than 5,000 times and liked by more than 7,000 Facebook users. It received over 12,000 comments. One user sent a poem. Another wrote, “don’t give up ever.” Another comment, which received one like, read, “0 f---s given.”

But while those comments -- ranging from kind words to heartless insults -- may be public, Facebook relies on private acts to set a process in motion to help.

Facebook does not use algorithms or have employees monitoring posts that could be suicidal. Rather, it relies on its users to flag distressing content to the company. As it has grown, Facebook has taken an increasingly active role in helping those who may attempt suicide and connecting them to others -- be it mental health specialists or friends.

The social network provides local authorities and the National Suicide Prevention Lifeline with identifying information about the person -- such as name, email and city. Facebook declined to disclose whether the company offers more exact information about the user's location.

Facebook continues to expand how it empowers users to act during times of crisis. Currently, Facebook is testing alternative versions of its traditional “Like” button -- including a heart -- in overseas markets. The network had previously tested a “sympathize” button. Facebook, in partnership with suicide prevention organization Forefront, also recently completed a focus group on ways it can improve its services, such as offering more online support groups.

Not To Like, To Act

Over the years, as Facebook has grown in employees and technical sophistication, it has worked to further empower both those who report posts and those in distress. “Keeping you safe is our most important responsibility,” read a February post announcing updates to its prevention tools in the U.S.

For example, when a user reports a post as related to self-harm, Facebook suggests messaging another friend. A pre-filled message reads, “Hey, this post [NAME] made makes me worried. Can you help me figure this out?”

The tool also advises reporting the post to Facebook. “If you think we should know about this post, we'd like to take a look at it as soon as possible and offer support to [NAME]. Your name will be kept confidential,” Facebook explains on the site.

Among its 12,000 employees, Facebook has a team -- referred to as the Safety Team and also as Community Operations -- that is staffed around the world and online 24/7 to review every report sent in. Reports related to “self-harm” are prioritized.

If a user is evaluated as at risk, Facebook will send that user an email. If it’s a serious incident, in which an employee -- trained in Facebook’s protocol and by suicide prevention experts -- believes the person might be in imminent danger, Facebook’s team can choose to alert emergency services and provide information, as it has done since the company's early days.

Facebook is the first to admit it's no expert when it comes to addressing mental health. “We realized there’s a lot we don’t know,” Jennifer Guadagno, a Facebook researcher, wrote in February. And so, Facebook has consulted and continues to work with nonprofit organizations and specialists.

Since 2006, Facebook has partnered with the National Suicide Prevention Lifeline, known as the Lifeline, to help establish the company's protocol and to report cases. (For cases outside the U.S., Facebook works with what it has dubbed the "appropriate international organization.") In 2011, Facebook and the Lifeline teamed up to add a feature that lets Facebook users chat in real time with suicide prevention specialists.

In 2014, Facebook established a relationship with Forefront, a suicide prevention organization affiliated with the University of Washington, to develop new ways to help users who are deeply upset. The two remain in continuous conversation.

"Facebook is about sharing experiences and supporting their users in the best times, but Facebook also wanted to provide support to its users that are potentially in crisis," Matthew Taylor, Forefront’s executive director, said. "They were looking to better understand how people can support one another safely during a suicide crisis and even the appropriate language."

The language used in Facebook’s tools is one result of the partnership with Forefront. The nonprofit helped "Facebook, and by extension, society refrain from using the word 'committed' suicide. We advocate for changing it to, for example, died from suicide or took one’s own life."

Facebook's February update also included a suggestion to message one-on-one with the person at risk, instead of immediately directing them to a specialist. "Our belief is suicide and depression is often about loneliness and isolation. A direct connection with somebody who cares is one of the most powerful prevention and intervention strategies," Forefront's Taylor said.

Facebook also partnered with two other suicide prevention organizations, Now Matters Now and Save.org, in 2014.

Uncensored

For now, Facebook posts and Twitter messages that are reported as suicidal aren’t censored. All of O’Connor’s posts remain up, and Facebook users have continued to comment, share and like.

Still, social networks have not sat idly by. “What social media platforms do these days ... is encourage people not to like these posts, not to jump on the bandwagon and make inflammatory comments. They ask people to flag these,” Dr. Karen North, a clinical professor at USC Annenberg’s School of Communication, said.

“In the world of social responsibility, most of the bigger social networks have created very simple click here opportunities to report content that should be attended to,” she continued.


Twitter has a page on its Support site titled “Dealing with Self-Harm and Suicide.” “We will provide the reported user with available online and hotline resources and encourage them to seek help,” the site reads.

Like Facebook, Twitter has also taken on responsibility for acting itself: “If you don’t feel comfortable reaching out to the person on your own or aren’t sure how to reach them, you can also alert Twitter.”

Facebook is currently taking more steps to address suicide prevention. Forefront's Taylor said his team worked with Facebook on a focus group where they asked, "Could an online support group affiliated with Facebook be helpful, and if so, what would be the parameters, and what would be the concerns?" The focus group included both active Facebook users and people who are not on the social network.

"I think that’s very proactive and innovative for Facebook to consider how they could use the power of their platform not just to prevent suicide in the moment but also provide resources for an ongoing conversation,” said Taylor, who is continuing to consult with Facebook on what steps to take in the future. “I admire them for that.”