How Neo-Nazis Are Taking Advantage of the Covid-19 Crisis

Spreading their violent ideology and recruiting new members…

Anahide T
10 min read · Dec 22, 2020
A person types “corona” into Google
Source: Nathana Rebouças on Unsplash

In March 2020, the Western world finally grasped the scale of the Covid-19 pandemic and imposed the first restrictions to contain its spread.

Quarantines implemented all around the globe have been, for many, a source of anxiety or even depression: isolation, fear for loved ones, time spent behind screens, worries about restricted freedoms… According to the sociologist Cynthia Miller-Idriss, the climate was perfect for conspiracy theories to grow, and extremists wasted no time using them for propaganda and recruitment.

“This dynamic creates a perfect storm for extremist recruitment and radicalization. Extreme isolation and increased online presence on gaming platforms, social media, and more, creates growing possibilities for exposure to extremist content and expands the gateways that can lead to extremist radicalizations.”

The ultra-right, or far-right, is more dangerous than the mainstream extreme right: what distinguishes it is the use of violence. These groups are described as “terrorist” by the organizations in charge of tracking them.

Ultra-right partisans believe in white nationalism, white supremacism, cultural nationalism, and neo-Nazism. Their ideologies center around racism, ultra-nationalism, ethnocentrism, anti-Semitism, and opposition to equality and democratic principles.

Since the beginning of the pandemic, Tech Against Terrorism has observed neo-Nazi recruiters infiltrating social media, taking advantage of massively shared conspiracy theories to sell their doctrines. Born of cooperation between tech companies and public authorities, Tech Against Terrorism monitors the terrorist threat on internet platforms and supports companies and institutions in addressing it.

How Did the Covid-19 Crisis Become a Vehicle for Neo-Nazi Ideology?

Conspiracy theories are not the preserve of neo-Nazis. So many theories circulate on social media that it is becoming difficult to distinguish what is real from what isn’t.

Crises are invariably vectors of fake news and speculation of all kinds, and the virus is, of course, an inexhaustible source of both. The European think tank Fondapol listed a few of the fabricated plots that became popular in 2020:

  • The coronavirus is linked to the rollout of 5G: China supposedly used this new technology to spread the virus across the planet.
  • The virus is said to be a tool for implementing social control. As always, a small group of people is plotting to take power over the planet and is using the virus as a pretext for imposing social control on humanity.
  • The virus is allegedly a biological weapon that targets certain parts of the population, specific races or ethnicities, in order to eliminate them.
  • And, of course, the one we have heard the most about: QAnon and the Deep State. These theories have been around since 2017, but they have grown remarkably since the beginning of 2020. On Facebook, between January and April, QAnon content gathered more interactions than WHO guidelines: 80 million for QAnon against 6.2 million for the WHO. Meanwhile, on Twitter, use of the hashtag #QAnon increased by 21%.

Beyond being false, these conjectures are rooted in anti-government and xenophobic ideas, especially the last two, which often imply the existence of a global-elite conspiracy to control mankind. Over the course of two weeks, Blackbird AI studied 40 million posts related to Covid-19 on social media; approximately 18 million were found to push harmful narratives to the public.

Although they are not necessarily behind their creation, neo-Nazis thrive on conspiracy theories. They love exploiting conspiratorial narratives, which lend credibility to their idea of accelerationism.

1. Conspiracy Theories and Fake News, Open Doors to Accelerationism

According to Vox, accelerationism “rests on the idea that Western governments are irreparably corrupt. As a result, the best thing white supremacists can do is accelerate their demise by sowing chaos and creating political tension.”

Accelerationism has its origins in the writings of the infamous American neo-Nazi James Mason, published during the 1980s. He claimed to have uncovered the “Great Replacement,” a global-elite plot to replace native Europeans with non-Europeans in a “genocide by substitution.”

He predicted that the world was destined to descend into a period of chaos called the “race war,” when fascists would destroy the existing system and eliminate “race traitors.” After that, with the Iberian Peninsula cleared of non-white people, they would re-establish Christian rule in their homeland. Unfortunately, James Mason’s essays still find an echo in white supremacist and neo-Nazi activities today. And those groups did not wait long to link the coronavirus to accelerationism.

Tech Against Terrorism analyzed over 100 Telegram channels related to the neo-Nazi movement. They found mentions of a virus coming from China as early as January 2020. By March, the conspirators had already associated Covid-19 with accelerationism.

According to the Global Network on Extremism and Technology, the ultra-right refers to the disease with the tag “based-corona.” “Based” is a term they use for anything that advances the supremacist movement. Indeed, for some followers, the coronavirus is a blessing. They rejoice at seeing the film industry, a symbol of the depraved contemporary world, shut down. They praise the virus for making borders effective again and for devastating the African continent.

In this regard, they entertain the idea that the virus is more harmful to certain people because of their origins. They go as far as encouraging one another to get infected and then visit mosques and synagogues to spread the virus among those they want to eliminate.

Far-right followers rapidly promoted their “revelations” on social media and started to recruit.

2. The Use of Social Media as Recruitment Platforms for Neo-Nazis

Usually, supporters of the ultra-right operate on websites or forums they administer. But extensive sharing of fake news and conspiracy theories provided them with a new marketing strategy.

Recruiters infiltrated social media and mainstream websites to join conspiratorial discussions that were already circulating. Without revealing their allegiance, they professed anti-government and xenophobic ideas while connecting various theories to apocalyptic scenarios. Only later would they refer prospective recruits to supremacist websites.

In particular, survivalist communities, or “preppers,” who plan for all kinds of emergencies, have become gold mines for neo-Nazi recruiters. Indeed, end-of-the-world narratives fit well into accelerationism theories.

They also took advantage of anti-quarantine protests in the United States. Although white supremacists don't generally organize those protests, they saw the climate of anger as an opportunity to disseminate their ideas.

Once propagated and absorbed into neo-Nazi ideology, conspiracy theories become a real threat to public order and people’s safety.

3. Violent Actions on the Internet and Real-Life Attacks

As noted above, the ultra-right differs from the extreme right in its use of violence, which gives these movements a terrorist character.

Initially, their actions were limited to the virtual sphere. The recent expansion of videoconferencing applications enabled a practice called “zoom-bombing”: neo-Nazi trolls entered colleges’ virtual classrooms, disrupting classes with insults and threats. They also took advantage of this system to steal teachers’ and students’ personal information for the purpose of harassment.

The New York Times studied this phenomenon and found more than 150 active social media accounts where people gathered to organize zoom-bombings. The FBI even issued a public warning against the practice, inviting users of videoconferencing apps to take all necessary precautions against unwanted intrusions into private meetings.

A man poses with a firearm and neo-Nazi symbols as a sign of his participation in terrorist activities
Credit: Global Network on Extremism and Technology

But neo-Nazis don’t hide their intention to lead real-life attacks. In fact, they embrace their ideology’s brutality, as in this picture, by posing with firearms and displaying Nazi symbols. This person conceals his identity by wrapping his head in a scarf bearing the death’s head, the trademark of Hitler’s Schutzstaffel (SS).

In recent years, the ultra-right has successfully recruited members of the police and military in Germany. Such enrollments have raised concerns about these groups’ capacity to access weapons or to act under the cover of public authority.

The threat is significant enough that these groups require constant surveillance. In March 2020, in Missouri, the FBI thwarted an attack against a hospital hosting Covid-19 patients. A 36-year-old man was shot dead during his arrest after the FBI uncovered his plan to bomb the building. His motive was, unsurprisingly, purely terrorist: he intended to draw attention to white supremacist ideology while benefiting from “the media attention on the health sector.”

This failed attempt is a reminder of what happened on March 15, 2019, in Christchurch, New Zealand, when an alt-right supporter carried out mass shootings at two mosques, killing 51 people. The threat is often overlooked, but it is real.

The spread of neo-Nazi theories must be contained in order to prevent such events. The question is, by what means? Experts warn against adopting methods that could compromise freedom of expression.

Can We Prevent Terrorism While Protecting Fundamental Liberties?

There are two types of actors that can play a role in preventing cyber terrorism:

  • Private actors: websites and social media platforms, which moderate users’ content.
  • Public actors: regulatory institutions, i.e., governments and parliaments, as well as their partner organizations, which establish binding standards.
    In our example, Tech Against Terrorism is a sort of hybrid, since it was born out of cooperation between governments and tech companies.

Content-sharing platforms watch official guidelines carefully and quickly update their community standards. No one wants to be seen as contributing to the development of terrorism.

The measures implemented are often the same: removing fake news as soon as it is detected. But this type of moderation is problematic.

The Center for Democracy and Technology released a statement concerning the use of automated moderation, whether performed by a simple bot or by artificial intelligence. Automation increases the risk of false positives. It also raises the question of responsibility for takedown decisions, or rather of the lack of accountability that platforms could invoke to deflect accusations of censorship.

Moreover, content documenting human rights violations, broadcast by journalists for example, could serve as evidence of those violations and yet might disappear from platforms.
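To make the false-positive problem concrete, here is a minimal, hypothetical sketch in Python of a context-blind keyword filter. The blocked terms and sample posts are invented for illustration; this does not describe any real platform’s system.

```python
# A minimal sketch (not any platform's real system) of keyword-based
# automated moderation, illustrating why false positives are hard to avoid.

BLOCKED_TERMS = {"race war", "based-corona", "great replacement"}

def should_remove(post: str) -> bool:
    """Flag a post if it contains any blocked term, regardless of context."""
    text = post.lower()
    return any(term in text for term in BLOCKED_TERMS)

# A recruiter's post and a journalist's report both trip the same filter:
recruiter_post = "The lockdown proves it: join us and prepare for the race war."
journalist_post = ("Investigation: Telegram channels are urging followers to "
                   "prepare for a 'race war' during the pandemic.")

for post in (recruiter_post, journalist_post):
    print(should_remove(post), "->", post[:60])
# Both print True: context-blind automation removes the reporting
# that documents the threat along with the propaganda itself.
```

Real moderation pipelines are far more sophisticated, but the underlying trade-off is the same: the incriminating signal appears both in propaganda and in the reporting that exposes it, which is exactly the risk the Center for Democracy and Technology points to.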

In France, for example, the government tried to pass a bill through an accelerated procedure that would have prohibited the publication of any element that could help identify police officers, apart from their identification numbers, which are not always visible. The bill was widely criticized by legal experts and journalists, precisely because it would prevent the publication of images documenting police violence.

The bill generated friction in the journalism industry, especially after a journalist became a victim of police brutality while covering a demonstration. Videos of the assault were broadcast on Twitter, which led to the opening of an investigation. The government finally backed down after facing criticism from the public and NGOs.

Of course, websites have greater flexibility to self-regulate, and changes to their terms and conditions often go unnoticed. They don’t have to divulge how they program their bots to remove problematic content.

Reporters Without Borders declared they feared the crisis would be used by governments to “establish or tighten control over the national media and to step up state censorship.” Indeed, internet access was blocked or restricted to certain sites in several countries, such as Ethiopia, India, Myanmar, and Bangladesh.

David Kaye, the UN Special Rapporteur on the promotion of the right to freedom of expression, Harlem Désir, of the Organization for Security and Co-operation in Europe, and Edison Lanza, of the Inter-American Commission on Human Rights, published a joint statement. They stressed the importance of governments and businesses working together to provide reliable information in response to fake news.

“Resorting to other measures, such as content take-downs and censorship, may result in limiting access to important information for public health and should only be undertaken where they meet the standards of necessity and proportionality”.

However, this is not the direction the European Union decided to take when it considered imposing an automatic filtering system for uploaded content. According to experts, this is the worst way to handle moderation: the censorship it enables is both sweeping and opaque. Whether this regulation will be implemented remains to be seen.

As for ultra-right terrorist groups, some have already found a workaround. When the neo-Nazi forum Fascist Forge was shut down in February 2020, users migrated to Gab, whose Dissenter browser tool lets users comment anonymously on any URL, thus escaping censorship.

Far-right supporters are considered terrorists because of the threat they represent to public safety and order, inciting violence that ranges from verbal intimidation on internet platforms to real-life killings.

In social media, they have found a tool to market their ideas, profiting from the spread of fake news and conspiracy theories to rally new members to their cause.

The Internet era is a fast-moving world whose rules aren’t clearly established yet. The growing threat of terrorism forces us to think about the way we want our society to use this technology.

We are revisiting old debates about public freedoms, particularly freedom of expression, now that we have tools that allow us to share ideas with everyone, everywhere, at any time.

Public and private actors are working hand in hand to find the best possible ways to prevent the terrorist threat while preserving our freedoms. The question is, will they be heard?
