Commentary: Redpilling, rabbit holes and how far-right ideology spreads in online spaces
The employment of slick, sophisticated messaging in the far reaches of the Internet is worrying, but more alarming is the appropriation of popular culture in redpilling, says Gareth Tan.
SINGAPORE: The Internal Security Department (ISD) recently announced the detention of a 16-year-old Protestant Christian of Indian descent for planning to attack two mosques, after he was radicalised through exposure to far-right extremist ideology on the Internet.
This shocking event is a chilling demonstration that Singapore is not immune to the allure of such movements, despite seeming so distant from its places of origin in Europe and North America.
We tend to think the threat is far removed.
After all, past news coverage on the ideologies of the far-right in the West closely associates the concept with white supremacy and predominantly racial ideologies asserting the superiority of Caucasians and seeking to create exclusively Caucasian states.
But far-right ideologies cover a broader spectrum of attitudes, being linked, as the United Nations recognises, by “hatred and racism towards minorities, xenophobia, Islamophobia or anti-Semitism”.
Their transnational popularity has surged in recent years, due to their growing presence in both mainstream and underground English-speaking internet spaces.
A UN Security Council report details a 320 per cent rise in attacks by individuals identifying with such causes over the last five years. They have spread in North America, Europe, Australasia and South Asia.
A RABBIT HOLE
Key to their expanded reach has been the growth of online platforms such as 4chan, which achieved wild popularity as a venue for indoctrination, where ground-up, amateur but resonant videos, images and illustrations are widely shared.
Part of this rabbit hole dynamic stems from 4chan’s function as an imageboard, which encourages users to communicate using popular Internet memes - images overlaid with text which can come across as creative, amusing, even engaging, if often crude and almost deliberately unprofessional.
This casual mode of communication has become a cornerstone of the Internet, persisting on sites like Reddit, a social news platform which allows like-minded individuals to swap thoughts on subjects of mutual interest.
But because such message boards are loosely moderated, if at all, and allow users to remain anonymous, they can become swirling pots of hateful speech.
Research has demonstrated that sub-communities within 4chan and Reddit sympathetic to the far-right have been vital to the creation of hateful memes and their dissemination to a wider audience.
These same sub-communities have played major roles in the incubation of alternative news stories and disinformation, which feed into mainstream Internet platforms such as Twitter.
Popular items are often broadcast by users on Twitter, resulting in just two communities on 4chan and Reddit being responsible for around 6 per cent of mainstream news and 4.5 per cent of alternative news posted on Twitter as a whole.
The influence of these sub-communities has grown so much, there is even slang for the subtle process of introducing someone to far-right ideas: Redpilling.
Redpilling can happen in many ways.
In practice, a single thread on 4chan can begin as a genuine request for information about Islam, progress into theological debate, and conclude with multiple users having posted uncensored, gory pictures of extremist terror attacks.
Because these conversations tap on the general Internet culture where lines between satire, entertainment and radicalisation are blurred, attempts to identify tipping points and stages of radicalisation face an uphill climb.
TAPPING ON MAINSTREAM INTERESTS
Often, platforms like 4chan play host to diverse communities dedicated to specific hobbies, discussing various topics and areas of interest. Unfortunately, these communities often retain shared vocabularies, which may work to normalise extremist ideology.
A thread discussing a popular television series might, for example, use misogynist terms crafted on the website’s far-right leaning communities to describe a character. Interest in that term could lead participants to visit those communities to seek answers.
This cross-pollination of language originating in extremism, across communities, feels organic. Strategies for radicalisation can be honed with impunity on a self-replenishing supply of users.
Far-right terrorists including the Christchurch and El Paso shooters have been directly linked to these sites, citing them as inspiration in the development of ideas and using them as platforms for disseminating their manifestos prior to attacks.
Both notably published their manifestos on 8chan (now 8kun), founded in 2013 by a 4chan user who felt the original site had grown too regulated.
Research further suggests mainstream platforms like YouTube assist in this process, with suggestions for videos featuring far-right influencers often appearing after content focusing on comparatively innocuous topics, like fashion or video games.
Investigations on the Christchurch and Quebec mosque shootings found YouTube influencers promoting Islamophobia played significant roles in radicalising both individuals.
For now, detailed assessments of the radicalised Singaporean’s online activities have not been made public. But we have been told he was triggered after viewing videos depicting the execution of Christians by members of the Islamic State in Iraq and Syria (ISIS).
Indeed, highlighting such media is a favoured tactic on far-right communities on 4chan and similar sites, which promote Islamophobia by falsely presenting Muslims as inherently violent and barbaric.
Members of these communities are often exhorted to respond in kind with violence, after buying into the narrative of an inevitable clash of cultures framed in religious terms.
Sometimes the language, images and memes used evoke a martial Christianity – frequently embodied in the image of the European Crusader – turbocharging these sentiments and pushing them over the brink by associating them with violence.
This was so for the 2011 Norway terrorist and the Christchurch shooter, whose manifesto directly quoted Pope Urban II, the Catholic pope whose call to arms launched the First Crusade.
These sentiments are again spurred on by content on mainstream platforms like YouTube, through which polemicists have developed cohesive alternative networks, selling themselves to audiences as truth-telling alternatives to mainstream media.
NO SPECIFIC CUES
While it can be tempting to suggest specific cues, or even participation in singular online communities, might directly lead to radicalisation, there is often no tell-tale indicator distinguishing ordinary Internet users from potentially violent actors.
Promoters of extremism have successfully tapped into the mechanisms of Internet culture production to build online communities, while also actively absorbing popular concepts from those communities for their own appropriation.
These tactics bait a vulnerable minority of users who may, in delving deeper, explore and absorb increasingly extreme perspectives promoted by diffuse but ideologically aligned networks of far-right content creation and commentary.
These networks carefully cloak hateful ideology in satire and self-deprecation until vulnerable users are fully, if unintentionally, indoctrinated.
THE ONLY DEFENCE
The only defence we have must involve all stakeholders. Parents must address the development of harmful ideological leanings before they evolve into justifications for violent behaviour.
Schools should commit greater resources to developing programmes to enhance Internet literacy, expanding extant initiatives targeted at cyberwellness and digital misinformation to address the spread of extremist ideology online, in concert with ongoing national education initiatives.
Religious communities must likewise develop more assertive countermeasures to the threat of extremism by curbing the spread of potentially inflammatory rhetoric and equipping leaders at the grassroots level to identify and counsel congregants expressing potentially dangerous ideologies.
This recent sign of Singapore’s susceptibility to far-right ideology is a stark reminder that extremism is not inherent to any specific creed or culture.
Ultimately, these movements and their proponents are reliant on strategies designed to undermine confidence in principles which underpin core social, cultural, and governmental institutions – democracy, multiculturalism, and the rule of law.
Our response should be to ensure the value of these principles is clearly communicated, and that their contributions to the prosperity and security of our nation are made abundantly clear to all segments of our society.
Gareth Tan is a research analyst at TRPC, an information technology consulting and research firm. These are his own views.