Facebook Moderators & Their Broken Moral Compass

Facebook’s motto is ‘Move fast and break things’. That ethos might be good for innovation, but it appears to be contributing to a growing problem. Quite simply, Facebook has lost control of its content. And it’s having real-world, negative effects on people’s lives.

At the beginning of the month we wrote about the escalating problem of hate speech and hate crimes on social media. People are using Facebook to showcase assaults, suicides and murders in real time. And just recently we’ve had Evan Williams, a co-founder of Twitter, telling the New York Times that he believes the internet is broken.

He says the current format of the internet rewards extremes. And those extremes, be they good or bad, get shared and ultimately generate profit. In the old days, newspapers just had to worry about the front page grabbing readers’ attention. Today, every single story needs front-page quality to get people to click. The internet is groaning under poor-quality content. And now the democratisation of communication has led to fake news and crimes being published online.

Yes Mr Williams, it does appear that the internet is broken. But society as a whole seems to be broken too.

We’ve been looking to the social media gatekeepers to help with this problem. After all, if we see something we don’t like online, the only recourse we have is to flag that content so it can be assessed and removed. But what if the moral code of those gatekeepers leaves a lot to be desired? What if their decisions seem inconsistent and concerning to us? What can we do then?

Facebook Moderators’ Training Manuals

The Guardian recently published what they’re calling the Facebook Files. Facebook’s training manuals for moderators, long shrouded in secrecy, have been leaked. Their rules for dealing with violence, hate speech, terrorism, pornography, racism and self-harm certainly make for disconcerting reading. Facebook’s moral code seems quite askew.

Threats against President Trump would be removed because he’s a public figure. But general threats against women or minority groups are permitted because they aren’t specific enough to be deemed credible. Images and videos of violent deaths can be left online because they may raise awareness of mental health issues. Sometimes content showing children being bullied and harassed is allowed to remain live. Abortion videos are permitted if there is no nudity. These inconsistencies are staggering and quite odd. Facebook wants to protect free speech whenever possible, which is admirable. However, some of the content users are posting reminds me of the saying “this is why we can’t have nice things”.

In addition to the strangeness of Facebook’s rules, moderators are faced with an endless stream of distressing images, text and video day in, day out, and are often given only 10 seconds to decide whether to delete reported content. That is hardly enough time to make an adequate assessment.

Facebook is drowning beneath a sea of questionable material, while at the same time giving us new tools that make it easier than ever to broadcast to large audiences. One thing is certain: this problem is only going to get worse. Social media is holding up a mirror to society, and sometimes we don’t like what we see. And now that I’ve had a look at Facebook’s moral compass, I don’t like what I see there either.

Move fast and break things indeed.

WHO ARE VKN DIGITAL?

Christian Taylor is a writer and digital marketer and a member of the VKN Digital team. VKN Digital is a digital marketing agency in Hertfordshire, UK. We aim to help SMEs overcome marketing hurdles and create compelling digital content that drives results. Need help creating and launching a native advertising campaign? How about assistance with SEO or social media? Please contact us for a free Skype consultation.
