Dark ops: Isis, the far-right and the gamification of terror


2020-02-14 / www.ft.com

Adding game-playing elements to situations that are not games - a concept known as gamification - can be a wonderful thing. Getting lost in an imagined world, where reality is no more than a simulation or a thought experiment, can be oddly comforting.
But like so many brilliant innovations, this can also be used as a weapon. When a far-right gunman livestreamed his attack on two Christchurch mosques in New Zealand in March 2019, the first comment to appear beneath the video said: "Get the high score."
The second one asked: "Is this a Larp?" Larps, short for Live Action Role Plays, are games in which the participants physically embody their characters.
To some of those users observing events unfold on the far-right extremist 8chan forum, this was still a game. Even after watching the perpetrator shoot dozens of Muslims in real time, they were unable to grasp that 51 innocent civilians had just been killed.
Some sympathizers of the Christchurch attacker were quick to create video-game-style remixes of the livestreamed atrocity, which was filmed from a first-person shooter perspective. The loss of reality was by design. The attack had been carefully orchestrated as an entertainment programme for those who lurk in the darkest parts of the internet.
Seen in this context, gamification seems to be a particularly 21st-century phenomenon. Its roots, however, go back a lot further and stem from a far more wholesome source.
Arguably, it was invented by the breakfast-cereal industry in the early 20th century, when small games could be found in cornflakes boxes. The idea was to turn the family breakfast into a feel-good experience, with a little surprise that would create an additional entertainment incentive.
More than 100 years on, this idea is everywhere. Marketing efforts, customer loyalty programs and employee incentive systems all use elements of gamification. Interactive ranking systems, real-time progress bars and achievement updates provide stimuli that tap into our competitive instincts and love of games.
From Isis to the international alt-right, ideological fringe movements have exploited gamification techniques to tap into new audiences. By tailoring their propaganda and recruitment in this way, they have been able to attract young members and raise their profile globally.
I spent two years undercover in 12 extremist groups across the ideological spectrum - from Isis hackers and female misogynists to the white nationalist networks that radicalized the Christchurch perpetrator.

At the Institute for Strategic Dialogue (ISD), my colleagues and I use sophisticated social-media monitoring and analysis tools and collaborate with top data scientists at MIT. We are able to perform complex network analysis, trace the roots of disinformation campaigns and assess the reach of violence-inciting materials.
And yet, there were questions that were left unanswered, channels that remained impenetrable and concerns that left me sleepless at night: how do extremist groups' recruitment and socialization processes work? What are the motivators that drive people to extremist networks, and what makes them stay there?
In the aftermath of the Isis-inspired terrorist attacks across Europe, the UK and the US, I realized that the international far-right was benefiting from the fears sparked by the Islamist extremist atrocities.
In 2016, I wrote my first book on the vicious circle between Islamist and far-right extremism. My aim was to shed light on the effects of "reciprocal radicalization" that I felt were profoundly underestimated in policy and security circles.
It was in 2017 that I witnessed the gamification of hate first-hand. One day, the far-right influencer Tommy Robinson, a co-founder of the nationalist English Defence League, paid me a surprise visit at the Quilliam office.
He and his cameraman came to confront me over a Guardian article I had written that mentioned him. He live-streamed the confrontation, which he dubbed "Troll Watch 3", to his 300,000 supporters on Twitter.
The video was part of a larger operation to discredit "mainstream" media outlets and think-tanks reporting on the far-right. But the consequences were real. What followed was a massive hate storm, with death threats and sexual threats reaching me and my colleagues. I learnt how the gamification of hate works the hard way.
Since then, my investigations have shown me clearly how the lines between the digital and the real have become disturbingly blurry for many members of virtual subcultures.
Sometimes they only realize this after a moment of shock or surprise. Following the Christchurch attack, I observed multiple members of far-right extremist online networks announce their departure from these groups, saying they were afraid they would go mad.
After I reached out to some of my own harassers on Twitter, they apologized and said they hadn't wanted to hurt me. It was only then that, in their minds, I made the transition from a non-player character (NPC) in their game to a human being.
Far-right extremist groups have hijacked video-game platforms, including the gaming chat app Discord. For example, the organizers of the white nationalist rally in Charlottesville in 2017 established several channels on Discord to facilitate communication and co-ordination ahead of the event, which was followed by a white supremacist driving his car into a crowd of protesters, killing one of them.
In the Discord channel of the neo-Nazi trolling army Reconquista Germanica, members were even promoted to higher "military ranks" for carrying out particularly successful hate campaigns against minority communities or political opponents. At its peak, the group counted 10,000 "foot soldiers" and had the declared goal of influencing the German national elections.
Beyond gamifying their operations, violent far-right extremists have also created their own "mods" (modifications) for actual video games. There are white power versions of the most popular shooting games like Counter-Strike and strategy games like Civilization and Crusader Kings.
The neo-Nazi website Daily Stormer launched its own Pokémon Go challenge, which involved finding "gyms" - locations used as battlegrounds by Pokémon Go players - and distributing recruitment flyers there.
At the Institute for Strategic Dialogue we have long warned of the exploitation of gaming culture and gamification mechanisms by extremist movements.
Having spent time within the ranks of extremist groups both online and offline, I knew it would not be long before we saw the first attack at the intersection of trolling and terrorism: "gamified terrorism".
Worse still, games are designed to be repeatable. When terrorism meets gaming, the likelihood of copycat attacks is high. In 2019 alone, the Christchurch shooting set off a series of deadly incidents that all followed similar gamification patterns.
Video-game-like elements and gaming language featured heavily in the documents left behind by the perpetrators of attacks in Poway and El Paso in the US, and Oslo and Halle in Europe.
The shooter in Poway cited Christchurch as a catalyst and posted a musical playlist to accompany his attack. The gunman in Halle live-streamed his attack on the gamers' platform Twitch and used his own 3D-printed gun in reference to "weapon crafting" in gaming.
Again, glorifying pictures of the shooters were shared across far-right extremist online communities. Sympathizers hailed the Christchurch attacker as a "saint" for inspiring a new wave of attacks that would accelerate their expected race war.
The Encyclopedia Dramatica, a Wikipedia alternative used by far-right gamers, ran half-ironic, half-celebratory entries on the attacks, ranking the Norwegian terrorist Anders Breivik as the record holder among real-life first-person shooters. Calls for "higher scores" can turn into calls for inspirational terrorism, giving rise to a new competition in which virtual scores are earned with real lives.
In many corners of the internet, therefore, this kind of radicalization, intimidation and manipulation continues to thrive. Platforms such as 8chan (now replaced by 8kun) and the Encyclopedia Dramatica have been taken down, but little has been done to counter the wider problem.
The business models of social-media platforms and online forums have amplified and rewarded the gamification of hatred and terrorism and, increasingly, mainstream politics.
The use of memes and trolling rapidly spread to Europe and the UK in the run-up to strategically important elections and referendums.
The algorithms of most online platforms are designed to maximize the time users spend there. This means that they prioritize content that captures our attention. Sadly, it turns out that addiction and emotional arousal through fear, anger and indignation yield maximum human attention.
It is the responsibility of both lawmakers and tech firms to prevent extremists from using their architecture as a megaphone for gamified hate and terrorism. Unless we regulate the online ecosystem of the extreme fringes, we are likely to see more such heinous real-life shootings repeated.

Julia Ebner is a senior research fellow at the Institute for Strategic Dialogue. Her book "Going Dark" is published by Bloomsbury on February 20