Hate Speech Loads the Gun, Misinformation Pulls the Trigger - And It Is Profitable

Hate speech loads the gun, misinformation pulls the trigger - and that's the kind of relationship that we've come to understand over the years. Credit: Shutterstock.
  • by Baher Kamal (Madrid)
  • Inter Press Service

Hate speech has now reached dangerous records, fuelling discrimination, racism, xenophobia and staggering human rights violations.

It mainly targets whoever is not “like us”, i.e. ethnic minorities; Black, ‘coloured’ and Asian peoples; and Muslims worldwide, through widespread Islamophobia; as well as millions of migrants and billions of poor people. In short, it targets the most vulnerable human beings, not least the world's girls and women.

The UN reports that new communications technologies are among the most common means of spreading divisive rhetoric on a global scale, threatening peace around the world.

A new UN Podcasts series, UNiting Against Hate, explains how this dangerous phenomenon is being tackled worldwide.

Online hate speech on a staggering rise

According to Minority Rights Group, a leading international human rights organisation, one analysis recorded a 400-fold increase in the use of hate terms online in Pakistan between 2011 and 2021.

Monitoring hate speech can provide authorities with valuable information to predict future crimes or to take measures after they occur.

There is concern amongst human rights experts and activists that hate speech is becoming more prevalent, with views once perceived as fringe and extreme, moving into the mainstream.

An episode of UNiting Against Hate features Tendayi Achiume, the outgoing UN Special Rapporteur on contemporary forms of racism, and Jaroslav Valůch, project manager for fact-checking and news literacy at the Prague-based media development organisation “Transitions”.

‘Hate speech is profitable’

For Tendayi Achiume, a former independent UN human rights expert, more attention needs to be paid to the business models of social media companies.

“A lot of the time people want to talk about content moderation, what should be allowed on these platforms, without paying close attention to the political economy of these social media platforms. And it turns out hate speech is profitable”.

Hate speech and misinformation, closely related

Chris Tucker, the executive director of the Sentinel Project, warns that hate speech and misinformation are closely related: “Hate speech loads the gun, misinformation pulls the trigger.”

“And that's the kind of relationship that we've come to understand over the years”.

It is now theoretically possible for any human being with an Internet connection to become a producer of that sort of content, which really does change things, and with a global reach, adds Chris Tucker.

The Sentinel Project is a Canadian non-profit organisation whose Hatebase initiative monitors the trigger words that appear on various platforms and risk morphing into real-world violence.

Tucker describes it as an “early warning indicator that can help us to identify an increased risk of violence.”

It works by monitoring online spaces, especially Twitter, for certain keywords in several different languages, and then applying contextual rules to determine whether a post is likely to be genuinely hateful content.
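
As a rough illustration of that kind of keyword-plus-context approach (a minimal sketch only: the term list, cue phrases and thresholds below are hypothetical and are not taken from Hatebase or the Sentinel Project's own code), the basic logic might look something like this in Python:

```python
# Illustrative sketch of a keyword match followed by simple contextual rules.
# The monitored terms, cue phrases and scoring here are placeholders, not real data.
import re

MONITORED_TERMS = {"terma", "termb", "termc"}            # hypothetical multilingual term list
AGGRESSIVE_CUES = ["get rid of", "no place for", "attack"]  # cues that raise the risk
BENIGN_CUES = ["research", "reporting on", "definition of"]  # cues that lower the risk

def classify_post(text: str) -> str:
    """Classify one post as 'likely hateful', 'needs review' or 'likely benign'."""
    lowered = text.lower()
    words = set(re.findall(r"[\w']+", lowered))

    # Step 1: keyword match against the monitored-term list
    if not words & MONITORED_TERMS:
        return "likely benign"

    # Step 2: apply simple contextual rules around the match
    score = 1
    if any(cue in lowered for cue in AGGRESSIVE_CUES):
        score += 1   # aggressive framing makes hateful intent more likely
    if any(cue in lowered for cue in BENIGN_CUES):
        score -= 1   # research or reporting contexts make it less likely

    if score >= 2:
        return "likely hateful"
    return "needs review" if score == 1 else "likely benign"

# Example: run a small stream of posts through the filter
for post in ["We must get rid of termA", "New research on the term termB"]:
    print(post, "->", classify_post(post))
```

In practice, a monitoring system of this kind would also weigh the language, source and wider conversation around each post, and flag borderline cases for human review rather than deciding automatically.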

In the Balkans

Another organisation doing a similar kind of hate speech mapping is the Balkan Investigative Reporting Network.

The Network monitors every single trial related to war crimes and atrocities in Bosnia and Herzegovina, amounting to some 700 open cases.

In mapping hate, it looks out for four different aspects: “hateful narratives by politicians, discriminatory language, atrocity denial and actual incidents on the ground where minority groups have been attacked.”

Politicians fuelling hatred

According to Dennis Gillick, the executive director and editor of the Network's branch in Bosnia and Herzegovina, the primary drivers of hate narratives in the country are populist, ethno-nationalist politicians.

“The idea behind the entire mapping process is to prove the correlation between political statements and political drivers of hate and the actual atrocities that take place.”

The Network also wants to prove that “there is a lack of systematic prosecution of hate crimes and that the hateful language allows for this perpetuating cycle of violence, with more discriminatory language by politicians and fewer prosecutions.”

As a result of hate speech, we have seen a rising number of far-right groups being mobilised, explains Gillick.

Fake humanitarian groups spreading hateful language

“We are seeing fake NGOs or fake humanitarian groups being mobilised to spread hateful or discriminatory language, in order to expand this gap between the three different ethnic and religious groups in this country.”

The real-life consequences reported by the Network have included the defacing or vandalising of mosques or churches, depending on where a specific faith group is in the minority, as well as open calls to violence.

According to Gillick, this is fuelling the agenda of ethno-nationalist parties who want to cause divisions.

Need to create counter narratives

The way to combat this toxic environment, according to Gillick, is to create counter-narratives, disseminating accurate, factual information and stories that promote unity rather than division.

However, he acknowledges that this is a big ask.

“It is difficult to counter public broadcasters, big media outlets with several hundred journalists and reporters with thousands of flights a day, with a group of 10 to 15 journalists who are trying to write about very specific topics, in a different way, and to do the analytical and investigative reporting.”

Minorities under attack

Another organisation trying to create counter-narratives is Kirkuk Now, an independent media outlet in Iraq, which aims to produce objective, quality content about minority groups and share it on social media platforms.

“Our focus is on minorities, internally displaced people, women and children and, of course, freedom of expression,” says editor-in-chief of Kirkuk Now, Salaam Omer.

“We see very little content in the Iraqi media mainstream. And if they are actually depicted, they are depicted as problems.”

Social media moguls urged to change

The heads of many of the world’s biggest social media platforms were urged to change their business models and become more accountable in the battle against rising hate speech online.

In a detailed statement, more than two dozen UN-appointed independent human rights experts - including representatives from three different working groups and multiple Special Rapporteurs - called out chief executives by name.

They said that the companies they lead “must urgently address posts and activities that advocate hatred, and constitute incitement to discrimination, in line with international standards for freedom of expression.”

They also said the new tech billionaire owner of Twitter, Elon Musk, Meta’s Mark Zuckerberg, Sundar Pichai, who heads Google’s parent company Alphabet, Apple’s Tim Cook, “and CEOs of other social media platforms”, should “centre human rights, racial justice, accountability, transparency, corporate social responsibility and ethics, in their business model.”

They also reminded the companies that being accountable as businesses for racial justice and human rights “is a core social responsibility”, advising that “respecting human rights is in the long-term interest of these companies, and their shareholders.”

The human rights experts underlined that the International Convention on the Elimination of Racial Discrimination, the International Covenant on Civil and Political Rights, and the UN’s Guiding Principles on Business and Human Rights provide a clear path forward on how this can be done.

Corporate failure

“We urge all CEOs and leaders of social media to fully assume their responsibility to respect human rights and address racial hatred.”

As evidence of the corporate failure to get a grip on hate speech, the Human Rights Council-appointed independent experts pointed to a “sharp increase in the use of the racist ‘N’ word on Twitter”, following its recent acquisition by Tesla boss Elon Musk.

This showed the urgent need for social media companies to be more accountable “over the expression of hatred towards people of African descent”, they argued.

Soon after Mr. Musk took over, the Network Contagion Research Institute at Rutgers University in the US highlighted that the use of the N-word on the platform “increased by almost 500 per cent within a 12-hour period,” compared to the previous average, the human rights experts said.

© Inter Press Service (2023) — All Rights Reserved. Original source: Inter Press Service
