
Teenagers as young as 13 under suspicion for UK far-right terrorism


Teenagers as young as 13 are coming under suspicion of engaging in terrorism after being exposed to a toxic cocktail of easily accessible far-right extremism online, experts have warned.

Insiders describe “a horrible hateful soup” of social media content where children can “pick and mix” terrorist narratives, including those of the Terrorgram network of white supremacist Telegram channels, recently banned in the UK.

Experts have tracked 49 children convicted of terror offences since 2016 – all but one of whom are boys – and this week Ken McCallum, the head of MI5, said “13% of all those being investigated by MI5 for involvement in UK terrorism are under 18”, a threefold increase in three years.

But the growing proportion of children under scrutiny poses its own problems: questions are being raised over whether teenagers should be criminalised, and both MI5 and experts acknowledge that such cases often involve issues of mental health or grooming.

Hannah Rose, an analyst with the Institute for Strategic Dialogue (ISD) thinktank, said there had been a “surge in online extremist ecosystems” over several years, that these ecosystems remain “very easy for children to access”, and that “offline vulnerabilities, which kids are more likely to have, can make somebody more prone to adopting extremist views”.

In April Britain proscribed Terrorgram, a loose online neo-fascist group. An ISD study in August found that its coordinators helped run three Telegram channels purporting to be news feeds, with more than 70,000 subscribers between them; one claimed to be associated with Steve Bannon’s War Room podcast.

The channels are not overtly linked to Terrorgram, but all three pushed readers or subscribers from around the world to join related group chats where, the ISD said, “content supporting mass violence and societal collapse is rife” – with investigators considering children potentially more susceptible to violent influence.


Investigators consider some of the online radicalisation a hangover from the Covid period, when youngsters and families were forced to spend more time online. But that immersion has also led to a blurring of ideas and motivations, making it hard at times for investigators to discern what ideology, if any, is involved.

“We’re also seeing the mainstreaming of ideas such as extreme misogyny online, or school shooter fandoms,” said Rose. “Some of this is aestheticised online with graphics or fast music, presented as a computer game – and one result is that people without an ideology become motivated to carry out acts of extreme harm.”

Reflecting online culture, Terrorgram is largely a loose association of shared content. Though US authorities charged two of its alleged leaders, both American, in September with 15 counts including hate crimes, soliciting the murder of federal officials and conspiring to support terrorists, the network is expected to evolve and endure.

Child terror cases are largely related to extreme rightwing networks, but there has been a resurgence in Islamist activity. Last month a 15-year-old from Nottingham received three-year community behaviour and youth referral orders for sharing violent videos relating to Islamic State and pledging allegiance to the terror group on Telegram.

MI5 has previously found itself investigating a 13-year-old from Cornwall, who became leader of the UK arm of the banned neo-Nazi Feuerkrieg Division, an online group that was itself set up in 2018 by another 13-year-old from Estonia.


The British boy was sentenced to a two-year non-custodial rehabilitation order in 2021, while one of his recruits, 17-year-old Paul Dunleavy, was jailed in November 2020 for five and a half years for helping prepare acts of terrorism. Though Dunleavy’s terror efforts were described as “inept”, he advised other members, some of whom were later convicted of terror offences abroad, on how to use firearms.

Adam Hadley, the executive director of Tech Against Terrorism, which aims to disrupt extremist groups online, said there had been “significant job cuts” at social media companies, particularly in “trust and safety content moderation”, which has helped extremist content escape removal.

Specialists say extremist material is most prevalent on X, which has cut content monitoring since its takeover by Elon Musk, and on Telegram. Last month Telegram’s boss, Pavel Durov, announced he would seek to improve moderation on the network after he was arrested in France amid an investigation into the platform’s use for sharing child sexual abuse images and for drug trafficking.

Questions have also been raised about whether terror prosecutions are an appropriate way to deal with cases of childhood radicalisation. Last month it emerged that MI5 had been monitoring Rhianan Rudd, a teenager who took her own life after being groomed by an American far-right extremist and charged on suspicion of obtaining instructions to make firearms and explosives.

At a pre-inquest hearing the Security Service said it was not involved in the police decision to charge her with terror offences, a prosecution that was later discontinued. A lawyer representing Rudd’s family said it “appears she was monitored by MI5 while being subjected to online exploitation” – and her family has argued she should have been treated as “a victim rather than a terrorist”.

McCallum appeared to reflect such concerns on Tuesday, when he tried to differentiate between types of terror cases involving children. “For those planning attacks, a criminal justice outcome is generally needed. But for some vulnerable individuals, alternative interventions delivered by a wider range of partners may be more effective,” the head of MI5 said.


