Facebook's 'Lifesaving' Feature: What Will Mark Zuckerberg Unveil On Tuesday's 'GMA' Interview?

ANALYSIS

 @redletterdave on April 30, 2012, 10:53 AM

On Monday, ABC News released a teaser for an upcoming Good Morning America exclusive with Facebook founder Mark Zuckerberg, who invited GMA's Robin Roberts to Facebook's headquarters in Palo Alto to share a feature that could save lives.

But what could it be? If the feature truly is lifesaving, it likely relates to online bullying and suicide prevention, two initiatives Facebook has given serious attention to in recent months.

In March, Facebook announced a new suite of tools at the White House Conference on Bullying Prevention, in an effort to protect users from bullying and create a culture of respect among users.

"We're thrilled that President Obama has convened today's summit to discuss an issue we all care about deeply: bullying prevention," Facebook wrote in a blog post. "Bullying can happen anywhere -- in the classroom, in the schoolyard, on college campuses, in the workplace, and through the use of new technologies. So it's only fitting that all of us -- parents, kids, educators, safety experts, researchers, and companies, including Facebook -- come together to discuss how we can create a culture of respect wherever we are."

Facebook introduced two major features to its safety platform. A new social reporting option lets users report content to someone in their support system (e.g., a parent or a teacher) to help root out offensive material. The company also updated its Safety Center with more multimedia resources for parents and educators, adding educational videos, expert opinions, and downloadable materials that encourage conversations about safety so people can make smart choices wherever they are.

Arturo Bejar, Facebook's director of engineering, told Mashable at the time that bullying isn't always intentional, and that Facebook's tools aim to curb it in constructive ways.

"We want this to be a learning experience where people learn how to deal with bullying and feel empowered," Bejar said. "In talking to safety advocates, we learned that a lot of these things are accidental. People post things they think are funny, but they don't realize it's stressful. There's no malicious intent, and it might not violate the terms of service, but it still needs to be resolved."

Even before March's announcement, back in December, Facebook announced a suicide prevention program that allowed users to instantly connect with real crisis counselors through Facebook's chat messaging system.

"One of the big goals here is to get the person in distress into the right help as soon as possible," said Fred Wolens, Facebook's manager for public policy.

Previously, whenever Facebook believed a user was at risk of self-harm (based on a post or a note from a friend), the company would e-mail the user or their friends, urging them to contact law enforcement or call the National Suicide Prevention Lifeline. The new platform sought to be far more proactive, offering a familiar space where a person in distress can instantly find someone to talk to.

"The science shows that people experience reductions in suicidal thinking when there is quick intervention," said Lidia Bernik, the Lifeline's associate project director. "We've heard from many people who say they want to talk to someone but don't want to call. Instant message is perfect for that."

In Facebook's new system, if a friend believed someone was in danger of harming themselves, they had a quick and easy way to report the post to Facebook by clicking a link at the bottom of the comment. Facebook promised to respond promptly with an e-mail to the person who made the comment, encouraging them to either call a hotline or click a link to launch a confidential chat with a suicide prevention expert.

"We have effective treatments to help suicidal individuals regain hope and a desire to live, and we know how powerful personal connections and support can be," said U.S. Surgeon General Regina Benjamin. "Facebook and the Lifeline are to be commended for addressing one of this nation's most tragic public health problems."

Facebook's Lifeline is available to users 24 hours a day, with crisis workers always ready to chat.

The 'Lifesaving' Feature: What Could It Be?

It's highly unlikely that the lifesaving feature Zuckerberg referred to would relate to something as frivolous as shopping, gaming, or networking. The fact that the announcement will be made on Good Morning America suggests it's not a major piece of technology -- that would be saved for the f8 developer conference -- but it's big enough to warrant a one-on-one interview with Robin Roberts at Facebook HQ.

All things considered, this new lifesaving feature most likely relates to bullying and suicide prevention, and Facebook will likely announce a change to its software that relates to these initiatives. Could it be that Facebook has finally created an algorithm that can detect suicidal thoughts or expressions?

Facebook has said it has never deployed scanning software to seek out suicidal thoughts or expressions, but only because such a system would be logistically difficult to build and its results easy to misinterpret.

"The only people who will have a really good idea of what's going on is your friends, so we're encouraging them to speak up and giving them an easy and quick way to get help," Wolens said in December.

While having friends and family report these questionable posts is ideal -- after all, nobody knows users better than their friends and family -- users can shield themselves from even loved ones with certain privacy settings. If a user feels isolated and decides to raise their privacy walls, there's no way their friends or family could know how much they're suffering.

If Facebook could create software that detects whether a user may resort to violence, it would be a major coup for the company right before its initial public offering, which is expected to be the largest IPO since Google's Wall Street debut in 2004. Facebook hopes to raise between $5 billion and $10 billion at a $100 billion valuation, a target made all the more likely by the recent $1 billion acquisition of Instagram -- and by the well-timed announcement of a lifesaving feature.

The most revolutionary feature Facebook has released in the past six months is Timeline profiles, and even that wasn't a hit with all users. If Facebook could pull off a feat like suicide detection and intervention, it would certainly raise some eyebrows -- but it could also save the lives of bullied children and teenagers, and spare their parents unimaginable grief.
