Should social media platforms adopt a human-centered moderation system to tackle abuse?



The Drum’s social media manager, Amy Houston, examines whether social media needs to take a more humane approach to content moderation, and why it’s not moving fast enough to tackle racism online.

All eyes are on social media again this week after the disgusting display of online racism targeting three Black footballers – Marcus Rashford, Bukayo Saka and Jadon Sancho – following England’s defeat in the Euro 2020 final.

Statements from Facebook (which owns Instagram) and Twitter condemned the racial abuse, but the words of these Silicon Valley giants rarely match their actions.

Digital abuse is nothing new. A 2020 report from the Professional Footballers Association found that 43% of Premier League players in the study experienced targeted and explicitly racist abuse, and 29% of the abuse came in the form of an emoji. The study also found that “Twitter’s algorithms did not effectively intercept racist messages sent using emojis.”

At the time, England and Manchester City striker Raheem Sterling said: “I don’t know how many times I have to say it, but football and social media platforms need to step up, show real leadership and take the right steps to tackle online abuse … Technology is here to make a difference, but more and more I’m wondering if there is the will.”

This raises the question: should social media platforms move towards a more human approach to moderation? Algorithms can’t keep up with the speed at which our language changes, but human moderation carries a price of its own. Matthew Cook, culture and brand manager at Gravity Road, points out that those moderators are exposed to “offensive and traumatic material. We end up shifting the human cost out of sight.”

“It comes down to a question for society: what do we want from technology, and what are we not willing to give up for it? Technology will always play a role in moderation, but maybe we need a combination of increased human moderation to oversee better technology,” he adds.

The subject of content moderation has been debated for years, with many users feeling that social media has been extremely slow to implement truly meaningful practices. An open letter written this week by advertisers and brands, led by the Conscious Advertising Network, condemned how major social networks allow racist abuse.

Human interaction “will always be the key,” says Aaron Seale, senior director of creative content at Jellyfish Social. “But as platforms expand the opportunities for users to create content and have a voice, it’s unimaginable how much content these moderators would need to review.”

He notes that “seeing all of these posts constantly will undoubtedly create emotional and physical exhaustion”, and suggests another potential solution: getting social media to focus more on education and “invest in curriculum-level services, from infancy to early adulthood”.

A widely shared online petition calls for verified ID to be a requirement for opening a social media account, to combat the anonymity used as a weapon to spread hatred. In theory that sounds like a good solution, but it could end up harming the very people it’s trying to help: requiring government-issued ID to open a social account could alienate large groups of people who lack access to, or the means to obtain, identification. And in countries where speaking openly online about your sexuality or your government can get you arrested, “protecting anonymity is crucial for free speech,” says Cook.

Are newer social media platforms such as TikTok making more progress on content moderation? In a July 9 statement, Eric Han, TikTok’s head of US safety, said the video-sharing app is advancing its approach to user safety, adding that while no technology “can be completely accurate in moderating content, where decisions often require a high degree of context or nuance, we will continue to improve the accuracy of our technology to minimize incorrect removals”.

While no social media platform can be perfect, “we need only look at Black TikTok creators feeling suppressed in the wake of Black Lives Matter last year, or the recent strike protesting the appropriation of Black creativity, to see that TikTok isn’t doing enough,” Essi Nurminen, chief strategy officer at Born Social, tells me.

Many social media users have noted that, when attempting to report racist comments on the pages of the three footballers, they received generic responses from Instagram stating that the platform was dealing with a high volume of reports and was unable to review them all. But it is not just social media that should be doing more, “but the government and us as a society,” Nurminen concludes.

Here’s hoping that starts to ring true.
