Social media hate speech is back in the headlines. Today the Home Affairs Select Committee published its paper ‘Hate crime: abuse, hate and extremism online’. It highlights major failings by social media’s key players in effectively tackling hate speech and inappropriate content on their networks. MPs believe that enough is enough. They state that organisations like Google, Twitter and Facebook should face fines if they don’t deal with these important issues as a matter of urgency.
In a statement, Committee Chair MP Yvette Cooper says, “Social media companies’ failure to deal with illegal and dangerous material online is a disgrace. They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful. These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people’s lives. This isn’t beyond them to solve, yet they are failing to do so. They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe.”
This year we’ve seen some shocking incidents take place on social media platforms. Most recently there was the murder streamed on Facebook Live. Back in March it was alleged that extremists and hate preachers had made at least £250,000 by allowing advertising to be shown alongside their YouTube videos. In February, a young girl in Wales was bullied so heavily via Snapchat that it drove her to take her own life.
Social media: how is it different?
Social media is always in our palms, purses and back pockets. It goes where we go. Our digital communication ultimately makes money for these massive tech organisations. Social media is more influential than traditional media for some demographics. However, these companies are largely unregulated, compared with TV and newspapers, which are bound by legislation. Today, the UK government is putting its foot down and saying that this needs to change.
Social media is a reflection and an extension of the real world. If the content coming through these platforms is broken, doesn’t that mean society is broken too? Social media giants have given us the tools to communicate to whoever we like, to spread messages and cultivate audiences of our own. Who is really to blame if we don’t like what we see online? Are the social media organisations themselves to blame? After all, YouTube sees 65 years worth of video content uploaded every single day. Who on earth is going to screen all of that content to ensure it meets community guidelines? Can we really lay the blame solely at the door of the social media giants?
Social media hate speech: what should the big players be doing?
It goes without saying that social media organisations could all be doing a better job. According to the Guardian, Google plans to extend its “trusted flagger” programme. Facebook told MPs that it will review the way it deals with violent and objectionable content. It needs to. It took more than two hours for the company to remove the aforementioned murder video. Meanwhile, if Facebook doesn’t like an advertisement that you create on its site, it is very swift to delete your entire account.
It’s good to see Facebook tackling the issue of fake news with new tools for users. Fundamentally, if we are going to tackle the issue of harassment, and inappropriate, extremist content, we need a multi-faceted approach.
Sure, we need better reporting mechanisms and faster response times. But we also need people to take more responsibility for themselves, for the things they post and for the things they view. Victims of online bullying should have enough offline support available to ensure they get the help they need to deal with the situation. If we see something inappropriate online, we shouldn’t just turn a blind eye, we should report it. Social media is a mirror for society. We can’t expect the powers that be to fix the problem alone. We all have a role to play in making our world a better place, and that includes our online world too.
WHO IS VKN DIGITAL?
Christian Taylor is a writer and digital marketer and a member of the VKN Digital team. VKN Digital is a digital marketing agency in Hertfordshire, UK. We aim to help SMEs overcome marketing hurdles and create compelling digital content that drives results. Need help creating and launching a native advertising campaign? How about assistance with SEO or social media? Please contact us for a free Skype consultation.