
Bust hate, all the time: Onus on moderators to take down content

With technology as an enabler, we can build and deploy mechanisms that can help us reduce the effect of hate speech instead of banning it.

"Hate crime victims left suicidal and afraid to leave home because of attacks 'unleashed after Brexit referendum'" - Independent.co.uk

"Facebook Live murder: North Carolina man arrested"- BBC

"A High School Student Committed Suicide After Posting A Message Of Her Intent On YouTube." - Buzzfeed News

"India: A country of cyberbullies?" - Business Standard

Back in 2004, when social media channels like Orkut and Hi5 came into existence, they were a passageway to connect the world through faster communication, in a gamified and interactive manner. Leveraging the power of speech, social media has always been a medium of expression, be it through 140 characters, a testimonial on someone's page, or a comment on a picture shared with friends.

That same power of expression also provides freedom which, when in the wrong hands, can be exploited beyond measure. It is then that headlines such as the ones above turn into sad truths.

In the early days, social media platforms were innocent. They were a place for techies, rebels, and outliers to share their thoughts. But the nature of the medium changed as soon as it became a place for almost everyone. With more than a billion people on it, coming from different nations, ethnicities, and demographics, the system started behaving like an echo chamber for many.

In the physical world, the majority lives within the confines of their culture and boundaries. Traditionally, victims of hate speech have mostly been minorities, immigrants, celebrities, and women. In the physical world, only a limited number of bullies get access to them to make offensive and hateful remarks. In the digital world, on the contrary, everyone is a click away. That is why it is so convenient for anyone to commit hate speech on the Internet, and these crimes on social media demand that platforms act beyond their role of mediator and become moderators.

These faceless moderators now need to pick sides in the digital world and take down content that is evidently hate speech and violates basic norms of humanity. But this gets tricky, as the difference between censoring hate speech and taking away the freedom of speech rests on the fine line between a playful comment and one clearly sharpened to cause pain.

As social media users and creators of a social media platform, we are advocates and can vouch that online platforms are made to spread everything but hate. With stronger privacy rights to safeguard individuals, mechanisms to manually identify and take down hate, AI to categorize the sensitivity of content, and collaboration with governments, online platforms need to take ownership of the world they are creating. It is a world that demands no passports, only respect among online users.

At first glance, increased censorship seems to be the only option to get rid of hate speech. But it is evident that strict censorship leads to the violation of freedom of speech. Strategies to combat it have to be more fundamental and multifaceted. With technology as an enabler, we can build and deploy mechanisms that help us reduce the effect of hate speech instead of banning it.

A strategy we have named RAMP (Reporting, Artificial intelligence, Measuring, Penalize) will become a major apparatus for social media 3.0 in the near future. Current platforms already have complex mechanisms to detect, assess and remove hate speech, but the RAMP approach mixes in new incentivization methods and is built to work along with authorities to combat alienating and politically influenced online hatred.

Reporting: On every social media platform, the option to report a post is fundamental. This is the very first step that helps platforms detect hateful behaviour. With incentivized social media, the task of reporting can be outsourced to every user of the platform instead of being limited to an army of manual reviewers. New platforms built on blockchain-powered incentivization and revenue sharing will reward their users for reporting what they think of as hate speech.
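As a rough illustration of the incentive idea, and not any platform's actual code, the Python sketch below credits a reporter with tokens once a report is upheld by review. The names (Report, RewardPool) and the reward value are purely hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: users flag posts, and reporters whose flags are later
# upheld by review earn a small token reward from a shared pool.

REWARD_PER_UPHELD_REPORT = 5  # tokens credited when a report is confirmed (illustrative)

@dataclass
class Report:
    post_id: str
    reporter_id: str
    reason: str
    upheld: bool = False  # set to True once review confirms the report

@dataclass
class RewardPool:
    balances: dict = field(default_factory=dict)  # reporter_id -> token balance
    reports: list = field(default_factory=list)

    def submit(self, post_id: str, reporter_id: str, reason: str) -> Report:
        # Any user can file a report; it simply joins the review queue.
        report = Report(post_id, reporter_id, reason)
        self.reports.append(report)
        return report

    def uphold(self, report: Report) -> None:
        # Called after human or automated review agrees with the reporter.
        report.upheld = True
        self.balances[report.reporter_id] = (
            self.balances.get(report.reporter_id, 0) + REWARD_PER_UPHELD_REPORT
        )

if __name__ == "__main__":
    pool = RewardPool()
    r = pool.submit(post_id="post-42", reporter_id="user-7", reason="targeted slur")
    pool.uphold(r)  # review confirms the report, so the reporter is rewarded
    print(pool.balances)  # {'user-7': 5}
```

In a blockchain-powered platform the reward pool would live in a smart contract rather than in memory, but the flow of report, review, reward stays the same.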

Artificial Intelligence: On Facebook, once a post is reported, AI categorizes it into tier-1 to tier-3 buckets, with tier-1 being the most serious. After that, it is up to human experts to decide what to do with it; taking down the content or limiting its reach to groups that might feel offended is a common step. Irrespective of the nature of the social media platform (traditional or 3.0), progress on the AI front will continue. Actions such as removing fake accounts and accounts used only to spread particular kinds of messages are already being taken to combat hate speech.
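The following toy sketch shows the bucketing idea only; real platforms use trained models rather than keyword lists, and the patterns, tier definitions and thresholds here are illustrative assumptions, not any platform's policy.

```python
import re

# Toy tier routing: bucket a reported post into tier-1 (most serious, e.g.
# credible threats), tier-2 (dehumanizing abuse) or tier-3 (milder insults)
# before a human expert decides the final action.

TIER1_PATTERNS = [r"\bkill\b", r"\battack\b", r"\bbomb\b"]        # credible threats
TIER2_PATTERNS = [r"\bslur\b", r"\bvermin\b", r"\bsubhuman\b"]    # dehumanizing abuse
TIER3_PATTERNS = [r"\bidiot\b", r"\bstupid\b"]                    # milder insults

def classify_tier(text: str) -> str:
    lowered = text.lower()
    if any(re.search(p, lowered) for p in TIER1_PATTERNS):
        return "tier-1"
    if any(re.search(p, lowered) for p in TIER2_PATTERNS):
        return "tier-2"
    if any(re.search(p, lowered) for p in TIER3_PATTERNS):
        return "tier-3"
    return "no-violation"

if __name__ == "__main__":
    for post in ["We should attack them at dawn",
                 "You people are vermin",
                 "Nice picture, idiot",
                 "Have a great day"]:
        print(classify_tier(post), "->", post)
```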

Measuring: Detecting the source and quantity of online hatred is another mechanism. For instance, a one-time angry response from an individual is very different from an account that constantly uses censored words on its page; the latter is a clear signal of organized abuse. Measuring the quantity and intensity of hatred coming from a particular demography can sometimes expose funded and organized operations of online hate crime.
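One simple way to operationalize this, sketched below under assumed values, is to count upheld reports per account inside a sliding window: a single report looks like a one-off outburst, while an account that crosses a threshold in a short window is flagged for deeper review. The window and threshold are arbitrary illustrative choices.

```python
from collections import Counter
from datetime import datetime, timedelta

# Illustrative values only: flag accounts with 5+ upheld reports in 7 days.
WINDOW = timedelta(days=7)
REPEAT_OFFENDER_THRESHOLD = 5

def flag_repeat_offenders(upheld_reports, now=None):
    """upheld_reports: iterable of (account_id, timestamp) pairs."""
    now = now or datetime.utcnow()
    # Count only reports that fall inside the sliding window.
    recent = Counter(account for account, ts in upheld_reports if now - ts <= WINDOW)
    return {acct: n for acct, n in recent.items() if n >= REPEAT_OFFENDER_THRESHOLD}

if __name__ == "__main__":
    now = datetime.utcnow()
    reports = [("acct-a", now - timedelta(days=1))]                      # one-off outburst
    reports += [("acct-b", now - timedelta(hours=h)) for h in range(6)]  # repeated behaviour
    print(flag_repeat_offenders(reports, now))  # only acct-b is flagged
```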

Penalize: The last stage of this strategy is to engage law-enforcement and cybercrime authorities to take justified legal action. The vast number of digital users has turned the online space into a world of its own, and at such a scale there need to be legal actions as swift as the ones we expect in the real world.

Combating online hatred without strict censorship, so that the fundamental right to free speech isn't violated, is a hard task. It took the Internet decades to realize that what was expected to become a liberating power has ended up inviting censorship and becoming a place to attack groups and individuals.

Educating future users of the Internet in schools about values, and about how to put forth opinions without hateful targeting, could be a good place to start on the non-technological side of the problem. Renaming "online hate speech" as simply "hate speech" is another good start, so that the problem is reassessed and viewed as one whole issue to tackle, because hate, be it online or offline, is still a powerful word with all negative connotations attached to it.

(The author is Co-founder and partner – Huddle)
