Online Information and Fake News

Social media platforms and Internet search engines filter content so that users see the information likely to be most interesting or relevant to them. Content is filtered both by algorithms and by user behaviour (e.g. choosing whom to follow or which pages to like). Views differ on how these technological changes may be affecting the opinions of individual users.

Some have suggested that filtering could lead to users only seeing content that conforms to their pre-existing beliefs, and that it could unintentionally limit the range of information that users see. Two phenomena have been proposed:

  • Echo-chambers – in which people form social networks with those who largely reflect their own viewpoints.
  • Filter bubbles – in which search engines, social media sites and news aggregators automatically recommend content that an individual is likely to agree with, based on the previous behaviour of the user and others.

However, a growing body of research suggests that these filtering effects do not fully eliminate exposure to attitude-challenging information, for example because users on social media typically have a diverse social network spanning multiple geographic regions.

Concerns have been raised internationally by politicians, journalists and others about the spread of false information (“fake news”) online, and the effect that it may have on political events such as elections. There is no single agreed definition of fake news. It is generally understood as content intended to misinform or influence the reader, and is often financially or politically motivated.

The UK Government has no specific policies for addressing fake news, filter bubbles or echo-chambers. Attempts to address these issues have mainly focused on fake news, and have been largely industry-led, although other approaches include regulation and user education.

Key Points

  • Social media platforms and Internet search engines have made it easier to produce, distribute and access information and opinions online.
  • These technologies, combined with user behaviour, filter the content that users see. On the one hand, some studies suggest that this limits users’ exposure to attitude-challenging information and that echo-chambers or filter bubbles may form. On the other hand, other studies argue that users still see a wider range of information than offline.
  • Online fake news has the potential to confuse and deceive users, and is often financially or politically motivated.
  • UK efforts to address these issues are largely led by industry and focus on fake news. They include better identification, fact-checking and user education. 

Acknowledgements

POSTnotes are based on literature reviews and interviews with a range of stakeholders and are externally peer reviewed. POST would like to thank interviewees and peer reviewers for kindly giving up their time during the preparation of this briefing, including:

  • Prof Rob Procter, University of Warwick*
  • Dr Jonathon Bright, Oxford Internet Institute, University of Oxford*
  • Prof Philip Howard, Oxford Internet Institute, University of Oxford
  • Sam Woolley, Oxford Internet Institute, University of Oxford
  • Monica Kaminska, Oxford Internet Institute, University of Oxford*
  • Dr Helena Webb, Computer Science, University of Oxford*
  • Dr Richard Fletcher, Reuters Institute for the Study of Journalism, University of Oxford*
  • Prof Adam Joinson, University of Bath*
  • Dr Emma Williams, University of Bath*
  • Dr Ana Levordska, University of Bath*
  • Dr Felipe Romero Moreno, University of Hertfordshire*
  • Dr Frederik Zuiderveen Borgesius, University of Amsterdam*
  • Amy Sippett, Full Fact
  • Phoebe Arnold, Full Fact*
  • Claire Wardle, First Draft News
  • Jessica Montgomery, The Royal Society*
  • Fergus Bell, Dig Deeper Media*
  • Emma Collins, Facebook*
  • Karim Palant, Facebook
  • Nick Pickles, Twitter*
  • Dave Skelton, Google
  • Niall Duffy, Independent Press Standards Organisation*
  • Jim Waterson, BuzzFeed News*
  • Patrick Worrall, Channel 4 News*
  • Department for Digital, Culture, Media and Sport*
  • Ofcom*
  • Department for Education*
  • Cabinet Office

*Denotes people who acted as external reviewers of the briefing.
