Facebook Outlines Coming Mental Health Support Tools

There’s no other way to put it – 2020 has been tough, and we still have three and a bit months to go, which, ideally, will deliver some form of good news and/or relief. But more than ever, nobody knows what’s coming, and that uncertainty has had a significant impact on the mental health of many people as they work through their various concerns and try to manage the outcomes as best they can.

Really, we all need to give each other a break, and seek to reach out to friends where we can. The lack of in-person interaction takes its toll, in many ways, and while tensions and frustrations are running high, it’s worth trying to maintain a level of compassion, in all situations. Because it’s tough. People are facing major, ongoing challenges and battles, many of which you can’t possibly know about.

Recognizing this, Facebook, as part of World Suicide Prevention Day, has today announced a range of new mental health support tools that it’s looking to roll out in the coming months.

As per Facebook:

“Since the pandemic began, we have taken a number of additional steps to keep people safe, including providing people with tips we developed with global experts, localized resources and easy access to over 100 local crisis helplines through our COVID-19 Information Center. Experts have made clear that making these tips and resources easier to find is key to those seeking help.”

Expanding on this, Facebook will soon also be adding:

New rules around the sharing of self-harm related content

Facebook already has a range of rules in place in regards to self-harm related content, with Instagram expanding its ban on images of self-harm late last year. Now, Facebook’s also looking to implement restrictions on content that may relate to self-harm, but doesn’t violate the current regulations – “such as depressing quotes or memes”.

“We’ll share our approach to address this content soon. These issues are complex and nuanced, but we are committed to doing all we can to address potentially harmful content without stigmatizing mental health.”

This is a difficult area – for some people, posting memes may actually help them deal with such challenges. But Facebook’s working with various mental health organizations to improve its approach on this front, which will see it extend its parameters around self-harm content in the next few months.

Crisis support via chat

Facebook’s also looking to provide a new, real-time assistance option via Messenger chat.

“Getting people help in real time is especially important when they are in distress. In the coming months, we’ll make it easier for people to talk in real time with trained crisis and mental health support volunteers over Messenger.”

This could be a critical initiative, with the non-invasive, less confronting nature of a Messenger chat likely to appeal to many people in need.

Combine this with the fact that some 1.3 billion people currently use Messenger, and that Messenger will also soon be integrated with Instagram Direct chats and WhatsApp, and it could facilitate connection with many at-risk individuals. 

Expanded resources to help young people

Facebook’s also expanding its online resources for educators to help them provide assistance for students in distress. 

“[We’re] adding Orygen’s #chatsafe guidelines on how to help young people talk safely online about suicide to Facebook’s Safety Center. These will be available first in English, and seven more languages next month.”

Facebook has continued to expand its digital literacy resources, which is crucially important given that more students are now spending increasing amounts of time online.

This is another key element for educators to watch for, and these new resources could be critical in connecting students with better support structures.

Instagram Wellness Guides

Facebook’s also looking to expand on its wellness guides on Instagram to cover more aspects of suicide prevention and mental health:

“We’re launching localized guides that address ways to prevent suicide and support those who might be struggling. For example, in the US, the American Foundation for Suicide Prevention created a guide to help people understand the warning signs of suicide. In India, the Suicide Prevention India Foundation’s guide focuses on how to foster social connectedness; in Hong Kong, Samaritans HK’s guide shares ways to check in on your friends; and in Nigeria, Mentally Aware Nigeria’s guide focuses on having safe conversations about suicide.”

Evolving industry guidelines

Finally, Facebook’s also working with suicide prevention experts to continue to revise and improve its approach to suicide and self-harm content.

“We welcome the guidelines Samaritans launched today, which are designed to help the tech industry address these issues as sensitively and effectively as possible.”

Staying up to date with the latest information is key amid the ongoing pandemic and its impacts.

This is a key area of concern – in the US, suicide is the tenth leading cause of death overall, claiming the lives of more than 48,000 people every year. But more critically, in the case of Facebook, according to the CDC, suicide is the second leading cause of death among individuals between the ages of 10 and 34. 

Younger users generally over-index on social platforms, especially Instagram. With the pressure of comparing themselves against people’s highlight reel updates online, coupled with the added impacts of COVID-19, this is something that needs more focus, and anything that can be done to assist is a positive.

As such, Facebook should be praised for seeking new ways to help those at risk.   
