Uncertainty is a fundamental part of all research. It requires that those who produce research, and those who communicate it to others, can recognise uncertainty and understand how best to explain it. Uncertainty in research can take several forms:
- numbers accompanied by extra data that explain statistical uncertainty
- claims based on limited data or low-quality research
- contested research or disagreement between experts
- a complete lack of evidence.
There has been a lot of research into how people interpret and understand uncertainty, and how best to communicate it clearly. High quality communication about uncertainty changes how people understand and interpret research, and the level of trust they place in it.
Several reports outline the consequences of poor communication about uncertainty and highlight best practice. In a policy context, failing to effectively communicate uncertainty in research evidence can lead to suboptimal decisions.
This is why, when summarising research evidence for parliamentarians, communicating uncertainty is critical to presenting research findings in a responsible, transparent and meaningful way. Those providing impartial research services for parliamentarians must therefore be clear about the strengths and weaknesses of studies, explain numerical data in ways that are easy to understand, and be clear about how and when uncertainties might be resolved.
Understanding how POST communicates uncertainty in its briefings
POST publishes three types of reports:
- POSTnotes – short summaries of research evidence set in a policy context
- POSTbriefs – longer in-depth reports that discuss evidence in more detail and in response to major developments in current affairs or select committee inquiries
- Rapid Responses – a format created to respond to the need for rapid analysis during the pandemic, these are shorter research briefings on topical issues. Most so far have focused on COVID-19.
In order to understand how POST communicates uncertainty in research in these briefings, and if this might be improved, we worked with Dr John Kerr (Winton Centre for Risk and Evidence Communication, University of Cambridge), who joined POST on a Parliamentary Academic Fellowship in 2021. The project involved two analyses. One examined the readability and use of uncertain language across all POST’s outputs. The other was an in-depth content analysis of how uncertainty about evidence is communicated in a sample of 40 recent health-related briefings.
Readability and general use of language conveying uncertainty
Figure 1 shows readability scores across POST’s three briefing types, calculated using the Flesch Reading Ease formula. The formula uses the length of words and sentences to calculate a score for a given text, with higher scores indicating text that is easier to read.
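The Flesch Reading Ease calculation can be sketched in a few lines. The syllable counter below is a rough vowel-group heuristic (standard tools use dictionary-based counts), so its scores will only approximate those produced by the software used in the analysis:

```python
import re

def estimate_syllables(word: str) -> int:
    """Rough syllable count: number of consecutive-vowel groups, minimum 1."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease:
    206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words).
    Higher scores mean easier-to-read text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(estimate_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

On this scale a short, plain sentence scores well above 90 (‘very easy’), while dense polysyllabic prose can fall below 30 (‘very difficult’) — the band most POST briefings sit just above.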
Most POST briefings had readability scores in the ‘difficult’ range of 30–50 on the readability scale, indicating they would be challenging for people without a university education. On average, Rapid Responses were more readable than POSTnotes and POSTbriefs. Nearly all POST documents were more readable than the average summary (called an abstract) in a scientific research paper (a score of approximately 10), but less readable than news articles in The Times newspaper (about 49). Analysis shows that the average readability of POST briefings does not vary by author and has not changed over time.
Considering the use of uncertain language (words like ‘unsure’ and ‘approximately’), there are no significant differences, on average, between POST’s different report types. The briefings with the most uncertain language were those that focus on future scenarios. Examples include population growth, natural hazards and climate change.
Figure 2 shows how the percentage of uncertain words in Rapid Responses decreased over time. Almost all Rapid Responses focused on COVID-19. This pattern tracks with the overall trajectory of decreasing uncertainty, as the scientific understanding of the virus and its impact increased.
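The uncertain-language measure can be sketched as a simple lexicon match. The word list below is a hypothetical stand-in — the actual lexicon used in the analysis is not reproduced here:

```python
import re

# Hypothetical lexicon for illustration; the study's actual word list
# is not specified in this write-up.
UNCERTAIN_WORDS = {
    "unsure", "uncertain", "approximately", "around",
    "estimate", "estimated", "may", "might", "possibly", "likely",
}

def uncertain_word_percentage(text: str) -> float:
    """Percentage of tokens in the text that appear in the uncertainty lexicon."""
    tokens = [t.lower() for t in re.findall(r"[A-Za-z']+", text)]
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t in UNCERTAIN_WORDS)
    return 100.0 * hits / len(tokens)
```

Tracking this percentage across briefings over time is what produces the downward trend shown in Figure 2.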
Detailed analysis of how POST communicates uncertainty
POST briefings frequently communicate uncertainty about research evidence. However, some ways of expressing uncertainty in POST’s work are more common than others.
Uncertainty around specific statistics is rarely quantified, for example by providing a range around a given number. It is more often expressed verbally, through words such as ‘approximately’ or ‘around’.
Briefings often include information to indicate how much confidence should be placed in research evidence. For example:
- When citing results of specific studies, briefings often provide information about the type of study, or less frequently, the size of the study and population.
- A lack of peer review is mentioned only in relation to COVID-19 research and has not been highlighted consistently.
- Briefings occasionally highlight consensus, or lack thereof, between research findings or experts.
- Briefings regularly note where there is a lack of evidence or data for a given issue.
- Briefings often discuss the quality of research, occasionally explicitly (describing evidence as high or low quality), but more often by outlining specific limitations or strengths of studies. Limitations are highlighted more often than strengths. In several briefings, entire sections are devoted to discussing the quality of available research.
- Only very rarely do briefings link to POST’s research glossary or other online resources as a way of providing more information about the strengths and limitations of different study types.
POST’s response to these findings
Based on the recommendations in the report, POST is updating its training and guidance for authors, making better use of relevant resources, and enhancing editorial practices.
To improve readability, POST will be using software tools to provide authors with greater insight on readability during drafting and editorial stages. POST aims to publish briefings that have higher readability scores, scoring at least 30 on the Flesch scale (to avoid the ‘very difficult’ category).
POST will be working to ensure that best practice in communicating uncertainty is applied to its research. This involves updating training and guidance for staff, and providing clear information on how to communicate uncertainty in all its forms. Editorial processes will also be amended so that the communication of uncertainty is considered specifically for each publication. Examples include:
- Clearer and more precise explanations of numeric uncertainty in research evidence. For example, citing margins of error and describing data using both numbers and words.
- Including enough context for a non-specialist reader to interpret the significance of studies.
- Providing clearer descriptions of research quality.
- Being clearer about citing research that has not yet been peer reviewed.
- Considering how additional information about uncertainty and research quality could be integrated into briefings. This might include providing readers with links to other resources that can provide additional information or explanations.
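The first recommendation above — describing a number together with its margin of error, in both numbers and words — might look like the following. This is a hypothetical formatting helper for illustration, not POST’s actual editorial tooling:

```python
def describe_estimate(value: float, low: float, high: float, unit: str = "%") -> str:
    """Render a point estimate with its interval using both numbers and words.

    Hypothetical helper: pairs a hedged verbal description ("around", "likely
    between") with the explicit numeric range, so readers get both framings.
    """
    return (f"around {value:g}{unit} "
            f"(the true value is likely between {low:g}{unit} and {high:g}{unit})")
```

For example, `describe_estimate(42, 38, 46)` yields “around 42% (the true value is likely between 38% and 46%)” — the range is stated rather than hidden behind a bare point estimate.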
For general information about this project please contact Dr Sarah Bunn at email@example.com.
For academic inquiries about this research please contact Dr John Kerr, Winton Centre for Risk and Evidence Communication, University of Cambridge.
References
- How to Communicate Uncertainty, 2020, Full Fact
- Uncertainty Toolkit for Analysts in Government, 2020, Government Actuary’s Department
- Communicating Quality, Uncertainty and Change, 2018, Office for National Statistics
- Fischhoff, B., Communicating Uncertainty: Fulfilling the Duty to Inform, 2012, Issues in Science and Technology, 28(4)