As with other social media platforms, Tumblr has an enormous reach (18,878,347,183 total posts at the time of this writing). It therefore has great potential to both help and hurt the public's health as it facilitates communication among millions of people.
A few weeks ago, Tumblr presented its users with a challenge (and a possible solution) regarding blogs that promote self-harm, and it is asking them to weigh in on the proposed policy. I think that is a smart move.
Here is an excerpt from the Tumblr staff blog:
Our Content Policy has not, until now, prohibited blogs that actively promote self-harm. These typically take the form of blogs that glorify or promote anorexia, bulimia, and other eating disorders; self-mutilation; or suicide. These are messages and points of view that we strongly oppose, and don’t want to be hosting. The question for us has been whether it’s better to (a) prohibit them, as a statement against the very ideas of self-harm that they are advancing, or (b) permit them to stay up, accompanied by a public service warning that directs readers to helplines run by organizations like the National Eating Disorders Association.
We are planning to post a new, revised Content Policy in the very near future, and we’d like to ask for input from the Tumblr community on this issue.
The post goes on to say that the staff currently believe the right answer is to implement a policy against pro-self-harm blogs, focusing only on blogs that actively glorify or promote these behaviors. They also intend to start showing public service announcements (PSAs) alongside searches for specific terms like "anorexic" or "thinspiration". It is unclear from the post how this policy will actually be implemented; comprehensively reviewing the site and removing concerning material would take enormous staff resources.
Other online and social media platforms have struggled with similar questions about how to respond to users who may be searching for or posting worrisome content. Here are a few examples of other challenges and solutions:
- When someone searches Google or Yahoo for the term "suicide", a link to the National Suicide Prevention Lifeline appears as the first result.
- In December 2011, Facebook launched a new service that allows "Friends" to report concerning posts. Facebook then sends an email to the person who posted the suicidal content, encouraging them to call the hotline or click a link to begin a confidential chat. Note that Facebook does not proactively monitor the site itself; instead, it waits for reports to come in.
- Pop Health has previously posted about the debate regarding Twitter's duty to restrict hate speech vs. its duty to promote free speech among its users.
These examples raise important questions:
- What strategy or combination of strategies is best for the public's health?
- If users are prohibited from posting, does that make them more isolated and less likely to connect to services?
- Should the overall health of the user community outweigh the health of an individual user?