MY BBC1 'NEWS AT TEN' APPEARANCE! | 22.05.2017 | Ad

Waking up to a missed phone call from Raf Hamaizia and a text asking me to call him back, I didn't for one minute think that returning the call would result in an appearance on the BBC national news!
A BBC Producer (Elizabeth Neeham-Bennett) had been in contact via the Group Medical Director at Cygnet Healthcare, with the opportunity for an Expert By Experience to be featured on the BBC 'News at Ten' later that day (22nd May).
The piece was in response to The Guardian's publication of leaked Facebook policies given to their moderators (the people who get your message when you click that little 'report' button). The policies covered 'revenge porn', 'threats of violence', 'hate speech' and 'terrorism', but the BBC were focusing their attention on the Facebook guidelines surrounding self-harm- and suicide-related content posted on Facebook.


“We’re now seeing more video content – including suicides – shared on Facebook. We don’t want to censor or punish people in distress who are attempting suicide. Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers."
https://www.theguardian.com/news/2017/may/21/facebook-users-livestream-self-harm-leaked-documents
"Such footage should be “hidden from minors” but not automatically deleted because it can “be valuable in creating awareness for self-harm afflictions and mental illness or war crimes and other important issues.”
https://www.theguardian.com/news/2017/may/21/revealed-facebook-internal-rulebook-sex-terrorism-violence


As I was talking to Liz (the Producer) on the phone, she asked lots of questions and I had to bite my tongue and try to contain myself at some points because I was just like, "I have a lot to say about this!" We talked about social media and mental health in general, and I explained that I feel so much attention is paid to the negatives; if we focused more on the positives, then maybe people would realise that they do exist, and that there are support networks out there for someone struggling with their mental health. Then we moved on to Facebook specifically...


"The documents also tell moderators to ignore suicide threats when the 'intention is only expressed through hashtags or emoticons' or when the proposed method is unlikely to succeed. Any threat to kill themselves more than five days in the future can also be ignored, the files say."

https://www.theguardian.com/news/2017/may/21/facebook-users-livestream-self-harm-leaked-documents




I told Liz about an instance a few years ago where Facebook contacted me to say that there had been a number of reports about one of my photos, that they found the content inappropriate and concerning, and that they advised I remove it. There was another instance where I posted a completely different photo and received messages from friends and family telling me that it had upset them. No word from Facebook. And the photos? One was taken whilst I was an inpatient in a Psychiatric Hospital: a photo I took of the healing scars on my arms, with a message alongside that read:
"IT'S TIME TO TALK #endmentalhealthstigma p.s I'm sorry if some find this upsetting but it has to be done. Mental health problems ARE real"

And the other photo? That one was taken by a member of my family whilst I was in Intensive Care after a suicide attempt had resulted in me being put on life-support. The message alongside it read:
"Tomorrow will be six years since my 'trauma' ended. But it wasn't the end. In fact, in many ways it was the beginning of a whole new one. This photo is of me on a ventilator with a central line giving me treatment for an overdose. It wasn't that the overdose was that bad, it was that I felt that bad so I refused treatment. I wanted the memories to stop. This photo reminds me of how far I've come, it tells me to keep going too. I don't want the picture to upset people, but inspire people; you can be at your very lowest and come back. Come back fighting."

I used these examples to get my opinion across to Liz that, to me, photographs of scars and photos like the other one I've posted are there to help people see two things: 1. stop and think before hurting yourself, because it can be dangerous and the results can be permanent; and 2. there is hope. You can recover. You can feel better - feel differently. You can heal - whether that be physically or mentally, or both!
I feel that posting photos of open wounds, vlogs of having your cuts stitched up in A&E, and especially livestreaming the actual act of self-harm, cannot possibly have a positive influence, impact or effect on anyone.
Liz asked whether I saw the guidelines being leaked as a good thing or a bad thing... I told her that I actually think it's been a good thing, because it has given everyone the opportunity to voice their opinions, and hopefully that will mean Facebook revisit and reconsider their guidelines for moderators. I also found it interesting because there have been a number of times when I've reported comments on videos that've gone viral, inappropriate photos etc., and got the report back saying that Facebook have reviewed the reported content and decided to let it remain online. It sometimes had me thinking, "what on earth has to be posted for them to actually do something?!" I found my answer...

On 'Hateful, Hurtful, and Violent Content': "Remarks such as “Someone shoot Trump” should be deleted, because as a head of state he is in a protected category. But it can be permissible to say: “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, or “fuck off and die” because they are not regarded as credible threats."
On 'Graphic Violence' and 'Animal Abuse': "Photos of animal mutilation, including those showing torture, can be marked as disturbing rather than deleted. Moderators can also leave photos of abuse where a human kicks or beats an animal."
On 'Graphic Violence' and 'Non-Sexual, Physical Child Abuse': "We do not action photos of child abuse. We mark as disturbing videos of child abuse. We remove imagery of child abuse if shared with sadism and celebration."
One slide explains Facebook does not automatically delete evidence of non-sexual child abuse to allow the material to be shared so “the child [can] be identified and rescued, but we add protections to shield the audience”. This might be a warning on the video that the content is disturbing.
https://www.theguardian.com/news/2017/may/21/revealed-facebook-internal-rulebook-sex-terrorism-violence

Since it was about 1:30/2pm in the afternoon by the time everything was confirmed, Liz sent down a Producer and Cameraman (whose names I can't remember... and I feel really bad for it!) who were in Leeds, so that they would have a few hours for filming and time to send the footage to London, where it would have to be edited within two hours in order for the piece to be ready for the 10 o'clock slot!
The Producer at my home also asked how I felt about Facebook after the release of these policies. I told her that, in part, it was a relief to hear that they do actually consider 'reports' and review content; but it's also disappointing to hear that they don't seem to have their priorities in order when it comes to what should or shouldn't be permitted.
However, it's not going to stop me from using the social network (and others like it) the way I always have: being honest and open whilst keeping some parts of my life private. Unfortunately, as soon as you do anything online you make yourself vulnerable to negative responses; but for the website you're using to increase the chance of this by leaving inappropriate content online is careless. At the end of the day, someone livestreaming their self-harm or posting photos of it has only a certain amount of mental capacity in that moment, and there has to be a point where Facebook accept a level of duty of care.
Tweet me your own thoughts and opinions on this issue:


