DOI: https://doi.org/10.58248/RR14

Overview

  • Many general election polls ask which party the person being polled would vote for if an election were held tomorrow.
  • There is no official regulator for the polling industry in the UK. Polling organisations may choose to be members of relevant industry bodies.
  • Poll estimates can differ for multiple reasons, including differences between the general population and the group who have been polled, and differences in approach between pollsters, for example, in how they analyse the data.
  • Some commentators suggest that it is becoming harder to predict the electorate, and that response rates to polls are falling. They have also noted that education and age have become more significant factors in determining how people vote. There are challenges around how polls are reported in the media.
  • There is uncertainty around how polling may shape public behaviour. For example, there is mixed evidence around whether polling may influence who people vote for.

What is general election polling?

Polls are usually online or telephone surveys of a sample of individuals. Polls can examine the public’s views on a range of policy issues or ask opinions on political parties.

This briefing focuses on public opinion polling carried out in the run-up to a general election.

Polls provide a snapshot estimate of voting intention. Many general election polls ask which party the person being polled would vote for if an election were held tomorrow.

Stakeholders have commented that, when interpreting a poll, it is relevant to consider when it was carried out relative to key events.

This briefing does not cover exit polls, which sample voters in person as they leave polling stations, and which are published on the day of the election after polling stations close.

Who conducts and funds general election polling in the UK?

As of April 2024, there were 34 members of the British Polling Council (BPC), an association of polling companies (‘pollsters’). BPC members commit to publishing certain information about their polls, including the client commissioning the work, sample size and geographic coverage.

Polls may also be carried out by other organisations and members of the public, including through social media. The House of Lords Committee on Political Polling and Digital Media’s 2018 inquiry report on ‘The Politics of Polling’ concluded that an increase in use of the internet has made polling easier and more affordable to carry out.

Election polling may be funded by a range of stakeholders, including news organisations, television programmes, pressure groups, political parties, and polling organisations themselves.

There may be further sources of funding behind the organisation commissioning the poll. The Committee on Political Polling and Digital Media recommended that funding sources of polls be declared.

In January 2024, some commentators questioned the funding source of a YouGov poll that predicted an ‘electoral wipeout’ for the Conservative Party. The Electoral Commission is monitoring the organisation that funded the poll to see if it falls under its regulatory remit.

Ipsos, a polling company, polls the public about their trust in different professions. In 2023, 45% of the public said they generally trusted pollsters to tell the truth.

What oversight is there of the polling industry in the UK?

There is no official regulator for the polling industry in the UK. Polling organisations may choose to be members of the relevant industry bodies – the BPC and Market Research Society (MRS). The think tank UK in a Changing Europe has observed that polls published by non-BPC members tend to be viewed with more scepticism.

The Representation of the People Act 1983 prohibits the publication of the results of polls conducted on election day whilst voting is taking place. This is also set out in the Broadcasting Code of Ofcom, the communications services regulator.

Some countries ban political opinion polling, or have more stringent regulations about how far in advance of an election the publication of opinion polls must stop.

How are general election polls carried out?

Polls seek the views of a random sample of people. Different pollsters use different ways to identify their sample. For example, some pollsters use random-digit phone dialling, while others may use panels of recruited respondents to carry out their polls online.

The main ways of carrying out general election polls are:

  • Quota sampling with post-stratification adjustments: This involves finding a sample of people that is representative of the wider population by setting quotas for characteristics such as age. Results can be further ‘weighted’ during analysis, using information collected in the survey, to improve representativeness (the first sketch after this list illustrates this weighting step).
  • Multi-level regression with post-stratification (MRP): Since the 2000s, MRP has become more popular as a technique that may improve polling accuracy. MRP uses larger national poll samples to generate local-level estimates. Pollsters use the samples to create a statistical model to predict, for example, how voting intention varies between groups with different characteristics. They then combine this model with other, local information on these groups (such as census data) to create estimates for smaller individual areas such as parliamentary constituencies (the second sketch after this list illustrates this two-step structure).
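
The weighting step can be illustrated with a small, simplified example. The Python sketch below weights on a single variable (age group) using made-up figures for the population, the achieved sample and the support for a hypothetical ‘Party A’; real pollsters weight on several variables at once and may also adjust for factors such as stated likelihood to vote.

```python
# Minimal sketch of post-stratification weighting on a single variable
# (age group). All figures are illustrative, not real polling or census data.

# Share of each age group in the target population (e.g. from census data)
population_shares = {"18-34": 0.28, "35-54": 0.34, "55+": 0.38}

# Share of each age group actually achieved in the poll sample
sample_shares = {"18-34": 0.20, "35-54": 0.35, "55+": 0.45}

# Support for a hypothetical 'Party A' within each age group of the sample
support_party_a = {"18-34": 0.50, "35-54": 0.40, "55+": 0.30}

# Weight each group by population share / sample share, so under-represented
# groups count for more and over-represented groups count for less
weights = {g: population_shares[g] / sample_shares[g] for g in population_shares}

# Unweighted estimate: average support using the sample's own composition
unweighted = sum(sample_shares[g] * support_party_a[g] for g in sample_shares)

# Weighted estimate: the same responses, re-weighted to the population profile
# (no further normalisation needed, as the shares each sum to 1)
weighted = sum(weights[g] * sample_shares[g] * support_party_a[g] for g in weights)

print(f"Unweighted estimate for Party A: {unweighted:.1%}")  # 37.5%
print(f"Weighted estimate for Party A:   {weighted:.1%}")    # 39.0%
```

In this illustration, re-weighting raises the estimate because the younger, more supportive group is under-represented in the sample relative to the population.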

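The MRP approach can be sketched in a similarly simplified way. A real MRP model fits a multilevel (hierarchical) regression; in the illustrative Python sketch below, a crude ‘shrinkage’ calculation stands in for that modelling step, pulling each group’s estimate towards the national average, before the group estimates are post-stratified using census-style counts for two hypothetical constituencies. All group definitions, figures and constituency names are invented for illustration.

```python
# Toy illustration of the MRP idea: model voting intention by respondent group,
# then post-stratify the group estimates using local population counts.
# A real MRP model fits a multilevel regression; here a simple shrinkage
# calculation stands in for that step. All figures and names are invented.

# Pooled national poll responses per group:
# (number of respondents, number intending to vote for 'Party A')
poll_by_group = {
    ("18-34", "graduate"):     (300, 180),
    ("18-34", "non-graduate"): (200, 100),
    ("55+",   "graduate"):     (250,  90),
    ("55+",   "non-graduate"): (450, 140),
}

total_n = sum(n for n, _ in poll_by_group.values())
total_a = sum(a for _, a in poll_by_group.values())
national_rate = total_a / total_n  # overall Party A support in the poll

# 'Model' step: estimate support in each group, shrinking small groups towards
# the national rate (a crude stand-in for partial pooling in multilevel regression)
PRIOR_STRENGTH = 50  # shrinkage equivalent to 50 'pseudo-respondents'
group_estimate = {
    group: (a + PRIOR_STRENGTH * national_rate) / (n + PRIOR_STRENGTH)
    for group, (n, a) in poll_by_group.items()
}

# Post-stratification step: census-style counts of each group
# in two hypothetical constituencies
constituencies = {
    "Northtown": {
        ("18-34", "graduate"): 12_000, ("18-34", "non-graduate"): 20_000,
        ("55+",   "graduate"):  8_000, ("55+",   "non-graduate"): 35_000,
    },
    "Southvale": {
        ("18-34", "graduate"): 30_000, ("18-34", "non-graduate"): 10_000,
        ("55+",   "graduate"): 20_000, ("55+",   "non-graduate"): 15_000,
    },
}

for name, counts in constituencies.items():
    electorate = sum(counts.values())
    estimate = sum(counts[g] * group_estimate[g] for g in counts) / electorate
    print(f"{name}: estimated Party A support {estimate:.1%}")
```

The two constituencies receive different estimates from the same national poll because their populations contain different mixes of the modelled groups.
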
How accurate is general election polling in the UK?

Following polling errors at the 1992 general election, the accuracy of opinion polling at UK general elections was perceived to have been acceptable or high for the 1997, 2001, 2005 and 2010 elections.

However, polling companies received criticism in 2015 and 2017 when their polls did not reflect the result. Polling accuracy was perceived to have been high for the result in the 2019 general election.

Research published in 2018, which looked at 30,000 general election polls in 45 countries between 1942 and 2017, did not find evidence to suggest that polling errors have increased over time, although average error had fluctuated.

The researchers also suggested that, for a group of countries where there had been regular election polling over almost 40 years, “the evidence is that polls have become more accurate, not less”.

Poll accuracy for recent UK general elections

For the 2015 general election, the polls failed to accurately predict what party would win. Some stakeholders suggested that inaccurate polling “informed party strategies and media coverage during the campaign and may ultimately have influenced the result itself”.

Others suggested that the pollsters had “herded” together, making decisions about their own polls that were influenced by other published polls, leading to similar results.

In 2016, a panel of polling experts published a report about the 2015 poll results. They concluded that the primary cause of the polling errors was unrepresentative sampling. The panel made recommendations, including improvements to how uncertainty is calculated and reported. In response to the report, the BPC started to require its members to publish the variables they used to weight data. BPC members also made some changes to their practices.

Many polls carried out in advance of the 2017 general election did not reflect the outcome. However, YouGov used MRP to correctly anticipate that the Conservative Party would lose their majority.

Following the 2017 election, the Committee on Political Polling and Digital Media:

What factors can lead to polling errors?

Poll results can differ for multiple reasons, including:

  • sampling error, caused by differences between the general population and the group who have been polled. Many polls publish their expected ‘margin of error’ (or ‘confidence interval’). This reflects the variability that can be expected from poll results, as each one is based on a different sample of people. For polls of more than 1,000 people, the margin of error is usually plus or minus 3% (see the worked example after this list). Strictly, the statistical theory that margins of error are based on cannot be applied to all modern polling techniques; however, they are still frequently used as a guide to the robustness of the data.
  • differences in approach between pollsters, such as question wording, sampling, response rates, or weighting and analysis techniques.
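
To illustrate where the ‘plus or minus 3%’ figure comes from, the snippet below applies the classical margin-of-error formula for a simple random sample, assuming a poll of 1,000 people and a reported share of 50% (the value that gives the widest margin). As noted above, this theory does not strictly apply to all modern sampling techniques.

```python
import math

# Classical 95% margin of error for a proportion from a simple random sample.
# It does not strictly apply to quota or panel-based samples, but is widely
# quoted as a rough guide to sampling variability.
def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """p: reported share (0 to 1); n: sample size; z: 95% critical value."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 people reporting a party on 50% (the widest possible margin)
print(f"+/- {margin_of_error(0.5, 1000):.1%}")  # about 3.1 percentage points
```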

Pollsters and academics have observed risks around biased polls that are likely to give an inaccurate result. Different forms of bias can arise, including:

How are polls used?

Academics and other stakeholders have observed uses of polling carried out by reputable pollsters:

Challenges of general election polling

Several challenges have been identified by stakeholders:

Impact of polling on public perceptions and behaviour

There is some complexity and uncertainty around how, and under what circumstances, polling may shape public behaviour, including whether people turn out to vote and, if they do, who they vote for.

Impact on vote decision

Evidence submitted to the Political Polling and Digital Media Committee highlighted a research project on the 2016 Scottish Election that asked voters how much attention they had paid to the polls, and about any influence of the polls on their choice. This research suggested that polls do not exert a strong influence on voters.

Several specific effects have received academic attention through analysis of whether they may be influenced by polling. These include:

The historic evidence for these effects, drawn from polling contexts in a range of countries, is mixed and limited.

Commentators disagree over the extent to which voters may use polls to plan tactical voting, and over how effective tactical voting (for example, through websites driven by polling data) may be.

Impact on turnout

Some academic stakeholders have suggested that polling can increase turnout in close elections. There is limited research to show the impact of polling on the turnout of different groups of people.

A research study from 2022 suggested that the electoral history of a constituency may influence how opinion polls affect voter behaviour. For example, the research indicated that turnout is lower when polls predict non-competitive elections, with a stronger effect in safe seats.

Further reading

Acknowledgements

POST would like to thank Professor Susan Banducci, Director of the Exeter Q-Step Centre, College of Social Sciences and International Studies, University of Exeter, and Professor Stephen Fisher, Professor of Political Sociology, Trinity College, University of Oxford, who acted as external peer reviewers in preparation of this article.

