The recent wave of riots in the UK has been fuelled by a subversive combination of online incitement to violence, xenophobic rhetoric from ultra-nationalist radicals, disinformation and the possible intervention of hostile states intent on destabilising the country. With the internet acting as an accelerant, false claims about the brutal murder of three girls in Southport (north-west England), for which a 17-year-old boy born in the UK to a family of Rwandan refugees is in custody, opened the Pandora’s box that has produced the worst far-right riots in the country since the 1970s. But the architects of the chaos were not in the street; they were behind a screen, and in contrast to the bottles and bricks thrown at the police, their weapon is a keyboard.
Although the first sentences have already been handed down against some of the roughly 750 people detained, those chiefly responsible for fomenting the disorder have yet to be brought before a judge. Behind the serious damage caused during the anti-immigration marches in Liverpool, Blackpool, Manchester and Belfast (the only place outside England where there were incidents) lies a heterogeneous group ranging from hooligans and born provocateurs to ultra-nationalist activists and even teenagers (the youngest detainee was 11 years old).
The initial catalyst was fake news attributing the July 29 massacre to a Syrian citizen, Ali al Shakati, a fictitious name that first appeared on Channel3 Now, a controversial news platform with ties to Russia. The information, though false, was amplified by outlets such as Russia Today, a Russian state corporation, on Facebook, X (formerly Twitter) and far-right Telegram channels such as Reality Reports, Dismantling the Cabal and Freedom Warriors.
The crisis has exposed far-right agitators in the virtual sphere who did not withdraw their false accusations even when the police confirmed that the Southport suspect had been born in Wales. Among them are Tommy Robinson (the alias of Stephen Yaxley-Lennon), founder of the English Defence League (EDL), an ultra-nationalist group that has been technically inactive since its promoter decided to focus his activism on the internet, and the extremist, misogynist influencer Andrew Tate, with almost 10 million followers on X. Both took it upon themselves to perpetuate a false narrative: analysis of internet data shows that in less than 24 hours, posts claiming the alleged killer was Muslim, a migrant or a refugee had been viewed at least 27 million times.
Once the poison had been administered, social media did the rest. Robinson has not even needed to be in the United Kingdom, despite the fact that his name has been the most chanted during the riots, where T-shirts proposing him as prime minister have even been seen. On the day of the attack in Southport, July 29, Robinson skipped a court date for contempt and, despite his leading role in the mobilisation, has followed the disturbances from a distance, from a five-star resort in Cyprus. There, all he needed was his mobile phone, after the tycoon Elon Musk, on acquiring X, restored the account that Twitter had blocked in 2018, precisely for encouraging hate speech.
Stephen Parkinson, the director of public prosecutions who heads the Crown Prosecution Service, has warned that the glorification of violence detected online provides grounds for more serious charges, including terrorism. “When there are organised groups planning an activity that tries to push for a particular ideology, by means of very, very serious public disorder, then we will consider it a terrorist offence,” he said. And in a veiled warning to the Robinsons and Tates, both outside the UK, Parkinson added: “Some of those responsible [for the unrest] are abroad, but that does not mean they are safe. We will consider extradition.”
Although Robinson does not even reach a million followers, his influence is far greater than that figure suggests. According to the Center for Countering Digital Hate, his posts on X since the crisis began have been viewed more than 434 million times, five times the average he recorded before the outbreak of violence.
Meanwhile, channels on Telegram, a platform that boasts of its lack of moderation, and on TikTok carried the virtual storm onto the streets through multiple accounts and forums, some of them anonymous. Joe Mulhall, director of research at Hope Not Hate, an organisation that combats racism and extremism, explains that the initial wave of violence was “organised in an organic way.” “Many of the incidents have been coordinated by people in the places where they occurred, with support from racists and local far-right activists,” he says via email.
The difference between this crisis and previous ones, according to Mulhall, is the mobilisation capacity offered by social media, as demonstrated on Wednesday night, when 41 of England’s 43 police forces were on high alert in the face of hundreds of planned protests outside migration management centres. Ultimately, the threat did not materialise, but Mulhall points out that “the fact that the list [of protest locations] was initially shared on a relatively small Telegram channel but then spread nationwide shows how easy it is for the far right to spread fear and mobilise violence via social media.”
The lack of regulation of the virtual sphere has allowed not only the EDL (identified by Merseyside Police, the force covering the county where Southport is located, as a key driving force behind the initial disturbances) but also small far-right groups to freely share messages of hate and spread disinformation. Some have long been in the sights of the security forces, such as Patriotic Alternative, a white supremacist party founded by Mark Collett, a neo-Nazi activist and former member of the far-right British National Party. The organisation, however, has been careful not to directly incite violence, in order to avoid being banned by the Home Office.
British intelligence suspects hostile states are also involved in fomenting anti-immigration sentiment, using bots and fake accounts. The Home Office and the National Crime Agency are investigating suspicious online activity after Tech Against Terrorism, a UN-backed initiative that combats disinformation, warned that the events in the country suggested “state-level disinformation efforts, encouraging extremism to destabilise the UK”.
Other fronts of violence to emerge in the riots include football hooligan networks, closely monitored by the police, which shared protest locations via Telegram, and the so-called Unity News Network, a platform with more than 105,000 followers on Facebook and nearly 21,000 on Telegram that promotes markedly anti-immigration rhetoric. It describes itself as a news hub and is one of the main sources of information for the far right, having become especially popular thanks to the conspiracy theories it promoted during the coronavirus pandemic.