
anonymity of users) and/or are unwilling to collaborate with UK law enforcement requests to remove terrorist content. For example:

  • Some platforms hosting ERWT content, such as Gab, 4chan and 8kun, are designed as so-called 'free speech' platforms; they are U.S.-based, regard themselves as abiding by U.S. law and claim protection under the First Amendment. This means that UK engagement with such platforms is particularly challenging.
  • The trajectory for the online space is one in which an increasing number of platforms evolve or emerge on principles of free speech and privacy—indeed, privacy is increasingly prioritised above security in the design of platforms.

278. With the emergence of many 'free speech' unmoderated platforms specifically aimed at the Extreme Right-Wing, the Government will also need to consider the levers that can be used to influence sites such as 8kun and BitChute. The Head of CTP explained that:

other than with the top six major [CSPs] . . . the stuff that the CTIRU takes down is a voluntary process. We [CTP] can only work in taking down extremist material because the companies actually co-operate with us. There are many other providers . . . Bitchute is an example . . . that want nothing to do with law enforcement, will not co-operate and do not volunteer.[1]

279. The importance of finding a solution to extremist content on free-speech platforms was underlined to the Committee by Nick Lowles:

I think the major question here, and the other major question remaining is around bringing smaller platforms around the table, holding them to account, because if they can't be held to account and brought around the table, then we're just going to be playing whack-a-mole continually.[2]

The Government's 'Online Harms' legislation

280. Homeland Security Group advised the Committee that HMG maintains an active dialogue with the CSPs, alerting them to terrorist exploitation of their platform(s). The CTIRU (a Metropolitan Police unit set up in 2010 to actively identify and assess online content, which is then referred to the CSPs for removal if it breaches UK terrorism legislation[3] and platform terms of service) has succeeded in getting platforms to remove 310,000 pieces of terrorist online material since its inception in 2010.[4]

281. This does, however, appear to be a rather modest achievement when contrasted with action taken by Facebook just over a year later—in the period April-June 2020, Facebook reported that it had removed 8.7 million pieces of terrorist content, and that over 99% of


  1. Evidence to the Home Affairs Select Committee - CTP, 23 September 2020.
  2. Oral evidence - Nick Lowles, Hope Not Hate, 16 December 2020.
  3. Counter-Terrorism Strategy (CONTEST) - June 2018.
  4. CTP, 'Together, we're tackling online terrorism', 19 December 2018, counterterrorism.police.uk/together-were-tackling-online-terrorism
