The problem with social media is that, sometimes, it’s anything but social.
Since 2020, toxic social media content has increased by 20%, according to a study commissioned by youth charity Ditch the Label.
This harmful content often includes attacks on people’s physical appearance, sexual orientation, gender, age, political views, religion, race or ethnicity, but can also cover pornography and material promoting violence and self-harm.
Victims of these online harms (and their families and loved ones) often report psychological and physical symptoms, including anxiety, depression, low self-esteem, panic attacks, self-harming behaviour and suicidal thoughts. Some even attempt suicide.
The 2019 Online Harms White Paper argued that existing regulatory and voluntary initiatives had ‘not gone far or fast enough’ to keep users safe, and proposed a single regulatory framework to tackle a range of online harms, including a duty of care for social media platforms. The Draft Online Safety Bill (May 2021) builds on this, placing much of the responsibility on platforms and regulators.
But the networks and regulators can’t solve this problem themselves. For one, how do you define what’s offensive? This is highly subjective across countries, regions and cultural groups.
Networks could set their filters very conservatively to avoid huge fines, blocking anything even slightly questionable, but that risks doing real damage to freedom of speech. Plus, even if the networks could define it, how would regulators go about enforcing it? The sheer volume of content means each regulator would need large teams with strong data science and engineering skills, both of which are in short supply.
So, how do we tackle online harms?
In reality, everyone needs to play more of a role in confronting online hate and toxicity — helping to make social media social again…
Nip harmful content in the bud
Arwen is a comment moderation tool that sits alongside your existing social media listening or posting platforms and uses artificial intelligence (AI) to automatically remove unwanted content from your channels.
The tool monitors social media comments, scanning text, emojis and misspellings for 24 different types of online hate and unwanted content (across 29 languages) and removing it in under a second. Because the content is removed before anyone sees it, Arwen stops pile-ons before they start, nipping problems in the bud.
As for freedom of speech… did you know that 37% of social media users are less likely to engage in an online environment where toxic content is present? That toxicity effectively forces them out of the conversation: a toxic minority is limiting the majority’s freedom of speech. Arwen brings everyone back in and makes it safe for them to speak freely. Plus, we don’t enforce blanket rules on what speech is or isn’t allowed: each customer sets their own filters.
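If you’re curious what that pattern looks like under the hood, here is a minimal, purely illustrative Python sketch: score each new comment against a set of harm categories, compare the scores with that customer’s own thresholds, and hide anything that crosses the line before it becomes visible. Everything in it (the keyword-based `classify` stand-in, `CustomerFilterConfig`, `hide_comment`) is hypothetical and not Arwen’s actual model or API; a real deployment uses trained AI models covering 24 categories and 29 languages.

```python
import re
from dataclasses import dataclass, field
from typing import Dict

# Toy category labels for illustration only; a production system
# scans far more categories across many languages.
CATEGORIES = ["identity_attack", "threat", "spam"]

# Trivial keyword lists standing in for a trained AI classifier.
_KEYWORDS = {
    "identity_attack": {"loser", "freak"},
    "threat": {"hurt", "destroy"},
    "spam": {"giveaway", "click"},
}


@dataclass
class CustomerFilterConfig:
    """Each customer sets their own per-category thresholds (0.0 to 1.0)."""
    thresholds: Dict[str, float] = field(
        default_factory=lambda: {c: 0.5 for c in CATEGORIES}
    )


def classify(text: str) -> Dict[str, float]:
    """Stand-in scorer: the fraction of a category's keywords found in the text.
    A real model would also handle emojis, misspellings and many languages."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return {c: len(words & kw) / len(kw) for c, kw in _KEYWORDS.items()}


def hide_comment(comment_id: str) -> None:
    """Placeholder for the platform-specific 'hide or delete comment' call."""
    print(f"comment {comment_id}: hidden before anyone sees it")


def moderate(comment_id: str, text: str, config: CustomerFilterConfig) -> bool:
    """Score a new comment and hide it if any category score meets the
    customer's own threshold, so a pile-on never gets the chance to start."""
    scores = classify(text)
    if any(scores[c] >= config.thresholds[c] for c in CATEGORIES):
        hide_comment(comment_id)
        return True
    return False


if __name__ == "__main__":
    config = CustomerFilterConfig()  # defaults; each customer can loosen or tighten these
    moderate("c1", "You absolute loser, nobody wants you here", config)
    moderate("c2", "Great race this weekend, congratulations!", config)
```

The important part is the ordering: classification and the customer’s own filter settings are applied before the comment is ever shown, which is what stops a pile-on from gathering momentum.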
Creating a safe space for engagement
Arwen is currently protecting 75,126,673 people from online harms, connecting with people and communities to discuss and tackle this growing problem. We work with a range of organisations, from Comic Relief to sports clients like Northampton Saints rugby club and Mercedes Formula 1, to turn their social media channels into safe places, pushing back the toxic minority so that the majority of followers feel safe to comment and engage. For one of our elite sports clients, we increased the number of followers actively engaging and commenting by 29.4% in just four months.
Our clients also include individuals like Lewis Hamilton and comedian Rosie Jones, who said: ‘I feel much more reassured that I can be my authentic self on social media and that my community can engage with me in a safe environment. We don’t live in fear that a comment might trigger a tidal wave of abuse. I couldn’t recommend Arwen enough’.
Making social media 100% safe is unrealistic. After all, online human behaviour reflects offline behaviour, and there has always been offline hate. But with Arwen, you know you’ve got a shield to protect you — removing harmful content before you ever see it. For more information, book a demo today.