Extremism lacks a clear definition, which has contributed to the difficulty of regulating it. The UK Government characterises extremism as “opposition to fundamental values, including democracy, the rule of law, individual liberty, and respect and tolerance for different faiths and beliefs”. However, this definition lacks the legal precision needed for extremism to be grounded in UK law. Extremism is not the same as terrorism, but exposure to extremism can encourage an individual’s support for terrorist tactics. Assessing the scale of exposure to extremist content is difficult: people may access it covertly or unknowingly, or may be unsure whether they should report it.

The internet may facilitate extremism in multiple ways, including recruitment, socialisation, communication, networking and mobilisation. Content can be posted instantly without verification, so information can be produced rapidly and disseminated widely. Contributors deliberately evade detection by using multiple accounts and avoiding specific terms. Responsibility for regulating the internet is shared across multiple public and private bodies, including government, counter-terrorism police and private internet companies. Existing counter-extremism strategies include content removal (often using automatic detection) and deplatforming (the removal of a group or individual from an online platform), as well as social interventions. Many stakeholders believe that current counter-extremism responses are too focused on law and technology, and do not address the underlying reasons that people are drawn to extremist content. The UK Government, public sector practitioners and industry stakeholders have called for a coordinated response to address the range of social and technological challenges around extremism. In 2019, the UK Government put forward proposals to address online extremism in the Online Harms White Paper.

Key Points 

  • The internet can leave users vulnerable to social challenges, which creates opportunities for extremism to spread. Users can be exposed to extremism in multiple ways, including through recruitment and socialisation. 
  • Extremist content may be found on mainstream social media sites and ‘alt-tech’ platforms, which replicate the functions of mainstream social media but have been created or co-opted for the unconventional needs of specific users. 
  • Automatic detection can be used to moderate extremist content at scale. However, it is prone to false positives and may disproportionately affect particular groups, which can fuel mistrust in the state. 
  • Many stakeholders believe that current counter-extremism responses are too focused on law and technology, and do not address the underlying reasons that people are drawn to extremist content. Individual and societal interventions aim to identify underlying socio-economic and cultural contributors and strengthen protective factors, reducing the number of people who develop extremist views. 

Acknowledgements 

POSTnotes are based on literature reviews and interviews with a range of stakeholders and are externally peer-reviewed. POST would like to thank interviewees and peer reviewers for kindly giving up their time during the preparation of this briefing, including: 

  • Clean Up The Internet 
  • Commission for Countering Extremism 
  • Counter-Terrorism Internet Referral Unit 
  • Home Office* 
  • Moonshot CVE* 
  • Ofcom 
  • Prevent* 
  • Dr Stephane Baele, University of Exeter 
  • Professor Mark Bellis, Public Health Wales* 
  • Ben Bradley, techUK* 
  • Joseph Briefel, Integrity UK 
  • Dr Chico Camargo, Oxford Internet Institute 
  • Professor Maura Conway, Dublin City University 
  • Professor Paul Gill, University College London* 
  • Dr Scott Hale, Oxford Internet Institute 
  • Dr Katie Hardcastle, Public Health Wales* 
  • Ivan Humble, Me & You Education 
  • Haydn Kemp, College of Policing* 
  • Ashton Kingdon, Southampton University 
  • Alex Krasodomski-Jones, Demos 
  • Dr Benjamin Lee, Lancaster University* 
  • Dr Caroline Logan, Greater Manchester NHS Mental Health Trust 
  • Professor Stuart Macdonald, Swansea University 
  • Dr Raffaello Pantucci, Royal United Services Institute for Defence and Security Studies* 
  • Dr Lorenzo Pasculli, Coventry University* 
  • Dr Bertie Vidgen, Alan Turing Institute and Oxford University* 
  • Charlie Winter, King’s College London 

* denotes people and organisations who acted as external reviewers of the briefing 
