Extremism can arise in any ideology, including (but not limited to) politics and religion. Extremism can affect mental well-being, amplify hostility and threaten democratic debate. The global reach of the internet poses social and technological challenges for safeguarding citizens from extremism online. When the Commission for Countering Extremism surveyed over 2,500 members of the public in 2019, 56% agreed that a lot more should be done to counter extremism online. This POSTnote outlines how the online environment can be used for extremist purposes, how exposure to online extremism can influence people, and potential strategies to counter extremist content online.
Online extremism (243 KB, PDF)
Extremism lacks a clear definition, which has contributed to difficulty in regulating it. The UK Government characterises extremism as "opposition to fundamental values, including democracy, the rule of law, individual liberty, and respect and tolerance for different faiths and beliefs". However, this definition lacks the legal precision needed for extremism to be grounded in UK law. Extremism is not the same as terrorism, but exposure to extremism can encourage an individual's support for terrorist tactics. Assessing the scale of exposure to extremist content is difficult: people may access it covertly or unknowingly, or may be unsure whether they should report it.
The internet may facilitate extremism in multiple ways, including recruitment, socialisation, communication, networking and mobilisation. Content can be posted instantly without verification, so information can be produced rapidly and disseminated widely. Some contributors deliberately evade detection by using multiple accounts and avoiding specific terms. Responsibility for regulating the internet is shared across multiple public and private bodies, including government, counter-terrorism police and private internet companies. Existing counter-extremism strategies include content removal (often using automatic detection) and deplatforming (removing a group or individual from an online platform), as well as social interventions. Many stakeholders believe that current counter-extremism responses are too focused on law and technology, and do not address the underlying reasons that people are drawn to extremist content. The UK Government, public sector practitioners and industry stakeholders have called for a coordinated response to the range of social and technological challenges around extremism. In 2019, the UK Government put forward proposals to address online extremism in the Online Harms White Paper.
Acknowledgements
POSTnotes are based on literature reviews and interviews with a range of stakeholders and are externally peer-reviewed. POST would like to thank interviewees and peer reviewers for kindly giving up their time during the preparation of this briefing, including:
* denotes people and organisations who acted as external reviewers of the briefing