Documents to download

Extremism lacks a clear definition, which has made it difficult to regulate. The UK Government characterises extremism as “opposition to fundamental values, including democracy, the rule of law, individual liberty, and respect and tolerance for different faiths and beliefs”. However, this definition lacks the legal precision needed to ground extremism in UK law. Extremism is not the same as terrorism, but exposure to extremism can encourage an individual’s support for terrorist tactics. Assessing the scale of exposure to extremist content is difficult: people may access it covertly or unknowingly, or be unsure whether they should report it.

The internet may facilitate extremism in multiple ways, including recruitment, socialisation, communication, networking and mobilisation. Content can be posted instantly without verification, so information can be produced rapidly and disseminated widely. Contributors consciously evade detection by using multiple accounts and avoiding specific terms. Responsibility for regulating the internet is shared across multiple public and private bodies, including government, counter-terrorism police and private internet companies. Existing counter-extremism strategies include content removal (often using automatic detection) and deplatforming (removal of a group or individual from an online platform), as well as social interventions. Many stakeholders believe that current counter-extremism responses are too focused on law and technology, and do not address the underlying reasons that people are drawn to extremist content. UK Government, public sector practitioners and industry stakeholders have called for a coordinated response to address the range of social and technological challenges around extremism. In 2019, the UK Government put forward proposals to address online extremism in the Online Harms White Paper. 

Key Points 

  • The internet can leave users vulnerable to social challenges, creating opportunities for extremism to spread. Users can be exposed to extremism in multiple ways, including through recruitment and socialisation. 
  • Extremist content may be found on mainstream social media sites and ‘alt-tech’ platforms, which replicate the functions of mainstream social media but have been created or co-opted for the unconventional needs of specific users. 
  • Automatic detection can be used to moderate extremist content on a large scale. However, this is prone to false positives and may disproportionately impact a particular group, which can fuel mistrust in the state. 
  • Many stakeholders believe that current counter-extremism responses are too focused on law and technology, and do not address the underlying reasons that people are drawn to extremist content. Individual and societal interventions aim to identify underlying socio-economic and cultural contributors and implement protective factors to reduce how many people develop extremist views. 


POSTnotes are based on literature reviews and interviews with a range of stakeholders and are externally peer-reviewed. POST would like to thank interviewees and peer reviewers for kindly giving up their time during the preparation of this briefing, including: 

  • Clean Up The Internet 
  • Commission for Countering Extremism 
  • Counter-Terrorism Internet Referral Unit 
  • Home Office* 
  • Moonshot CVE* 
  • Ofcom 
  • Prevent* 
  • Dr Stephane Baele, University of Exeter 
  • Professor Mark Bellis, Public Health Wales* 
  • Ben Bradley, techUK* 
  • Joseph Briefel, Integrity UK 
  • Dr Chico Camargo, Oxford Internet Institute 
  • Professor Maura Conway, Dublin City University 
  • Professor Paul Gill, University College London* 
  • Dr Scott Hale, Oxford Internet Institute 
  • Dr Katie Hardcastle, Public Health Wales* 
  • Ivan Humble, Me & You Education 
  • Haydn Kemp, College of Policing* 
  • Ashton Kingdon, Southampton University 
  • Alex Krasodomski-Jones, Demos 
  • Dr Benjamin Lee, Lancaster University* 
  • Dr Caroline Logan, Greater Manchester NHS Mental Health Trust 
  • Professor Stuart Macdonald, Swansea University 
  • Dr Raffaello Pantucci, Royal United Services Institute for Defence and Security Studies* 
  • Dr Lorenzo Pasculli, Coventry University* 
  • Dr Bertie Vidgen, Alan Turing Institute and Oxford University* 
  • Charlie Winter, King’s College London 

* denotes people and organisations who acted as external reviewers of the briefing 


Related posts

  • Green steel

    Greenhouse gas (GHG) emissions from the iron and steel industry make up 14% of industrial emissions in the UK. Decarbonisation of the steel industry is needed if the UK is to meet its target of net zero GHG emissions by 2050. This POSTnote outlines current steelmaking processes in the UK, the technologies and measures that can be used to reduce CO2 emissions, and the supporting infrastructure and policies that could enable a ‘green steel’ industry in the UK.

  • The impact of digital technology on arts and culture in the UK

    This POSTnote provides an overview of the impact of digital technology on the arts and culture sector in the UK. It focuses on the uses of emerging digital technologies and the impact of COVID-19 on stakeholders. It summarises the policy priorities, challenges and barriers in accessing technology in the sector.

  • Geothermal energy

    Geothermal energy is a source of low-carbon, homegrown, renewable energy. It is available throughout the UK and can provide heat or power all year long independent of weather conditions. It currently delivers less than 0.3% of the UK’s annual heat demand, using only a fraction of the estimated available geothermal heat resource. There is the potential to increase this proportion significantly, but this will require long-term government support to develop a route to market and overcome high upfront capital costs and geological development risks.
