Toxicity on the web is nothing new. Most websites whose content is co-created by users struggle with this phenomenon, and at its extremes it takes the form of self-harm, brutality, and suicide. Facebook is now trying to fight this wave harder.
Facebook failed to stop homicide and suicide streams
Do you remember the 2017 tragedies that could be seen on Facebook? Thanks to the live streaming feature, viewers came across broadcasts such as a father murdering his own daughter, less than a year old, before hanging himself; a frustrated pensioner shooting a random passerby; and the repeated rape of a 15-year-old.
All of this happened in the USA, but that does not mean Europe is free of the problem. Similar content also appears here, often on YouTube, though it usually takes a slightly different form. What is alarming is the lack of response, not only from Facebook, but above all from viewers who did not call the emergency number.
Zuckerberg's company has been trying to fight the problem for years
In a new announcement, the company boasts that since 2006 it has been actively cooperating with experts from around the world on the problems of suicide and self-harm (as the cases above show, at least once that cooperation was not very fruitful). On the occasion of World Suicide Prevention Day, celebrated yesterday, Facebook summarized its activities in this area.
Apparently, between April and June of this year, Facebook took action on as many as 1.5 million pieces of content concerning suicide or self-harm. Its detection mechanisms are reportedly good enough that 95% of this content was flagged before any user reported it. On Instagram, that figure was 77% across more than 800,000 pieces of content.
Are there signs of improvement?
It is good that, at least according to the company, the situation has improved. Pathological, violent, and self-destructive behavior should have no place in such a space. In principle, no place at all. Facebook has tightened its policy on depictions of self-harm, and content of this type can no longer appear in Instagram's Explore section.
The company also speaks of creating and improving procedures to prevent such harmful content. Moreover, it has announced further research into the specifics of conversations about suicide. Is detecting a future suicide attempt from content and conversations still only in the early stages of its plans?
A new position to better address harmful behavior
Facebook would also like to educate its users through the Facebook Safety Center and #chatsafe. There you can find recommendations on how to respond to suicide-related content posted on the web, and what to do if you want to share your own feelings on the topic.
The last noteworthy item in the company's announcement is the creation of a new position on its public policy and safety team. As the announcement puts it, "This person's responsibilities will focus on health and well-being issues, in particular on examining the application's impact on users' health."
Good PR is one thing; reality may not be so rosy. Let's hope that toxicity and self-destructive behavior on the Internet will not get worse, and that Facebook will do at least a little more on this issue. Perhaps equally good steps toward limiting the processing of private data? The company has just claimed that sharing your location is a benefit to users, although common sense suggests the picture is not so pretty.
Source: Facebook, Wyborcza press materials