Five Rules for Evidence Communication

Affiliation

Winton Centre for Risk and Evidence Communication, University of Cambridge

Summary

"During the coronavirus pandemic (and 'infodemic') the problem of poor evidence communication has been writ large, with questions about whether science can be trusted, and confusion about what science is or means."

"Trust is crucial." This is the conviction of the Winton Centre for Risk and Evidence Communication, an interdisciplinary group that collects data on issues such as how to communicate uncertainty, how audiences decide what evidence to trust, and how narratives affect people's decision-making. This document offers a guide to what the group calls "evidence communication" in the context of the COVID-19 pandemic. In this approach, the aim is to design communications that do not steer people toward a particular decision but instead help them understand what is known about COVID-19 and decide what to do on the basis of that evidence. The key for communicators - e.g., politicians or scientists - is to be clear about motivations, to present data fully and clearly, and to share sources.

Specifically, the 5 rules include:

  1. Inform, not persuade - A survey by the Winton Centre of people across 13 countries early in the pandemic revealed that people are sensitive to the aims and interests of communicators. The philosopher Onora O'Neill, cited here, argues that, rather than focusing on trust, we should focus on trustworthiness. Learning that someone is an expert is not enough to make us trust a message; beyond that, researchers need to demonstrate traits like honesty, good intentions, reliability, and competence, presenting evidence in ways that make it accessible, intelligible, useful, and easily assessed (so that laypeople can check the workings for themselves, if they wish). "The media might urge us to aim for memorable sound bites or go beyond the strength of the data: be honest and aware of such traps."
  2. Offer balance, not false balance - The authors caution that partial presentation of evidence can exacerbate the psychological biases that lead us to interpret evidence in ways that confirm our existing beliefs. One tip when communicating written information is to display the pros and cons in a table rather than expounding upon them in the text. Also, know which pros and cons are pertinent to each particular audience.
  3. Disclose uncertainties - "Part of telling the whole story is talking about what we don't know." Research has found that people correctly interpret messages without having their trust undermined by an upfront description of uncertainties. Other research finds little downside in expressing findings as a range rather than an exact number.
  4. State evidence quality - Again with a focus on trust, the authors suggest overtly stating whether a piece of evidence is of high or low quality and even specifying the size and source of data sets; even a non-specialist audience can notice these types of clarifications and use them to gauge, say, the relevance of the data to them.
  5. Inoculate against misinformation - Studies show that, if people are pre-emptively warned against attempts to sow doubt ("prebunking"), they resist being swayed by misinformation or disinformation. This approach requires understanding the concerns of the audience, such as by reading public forums and popular news sources.

An extended version of the article (available as a PDF at the source link below) offers a set of questions to ask yourself as you embark on any kind of scientific communication, including:

  • What principles guide your communication? - What is your personal ethical standpoint? What are the red lines you will not cross?
  • What are the specific aims of your communication? - Where do your objectives lie on the spectrum from information (e.g., improving understanding about alternative options) to overt persuasion (e.g., changing intended or actual behaviour by choosing the "preferred option")?
  • Who are your intended audiences? - How have they come to hold particular views, and why?
  • What to communicate? - If your aim is to inform, follow the "5 rules" articulated above.
  • When to communicate? - When does your audience need information? (Consider the ethics and implications of not sharing information.) Professor John Krebs' checklist for trustworthy communication in a crisis says you should tell people:
    • What you know - knowledge
    • What you don't know - uncertainty
    • What you are doing to find out - plans
    • What they can do in the meantime to be on the safe side - self-efficacy
    • That advice will change - flexibility
  • How best to communicate your information? - To minimise biases, use multiple media, frames, and formats - text, numbers, and graphics - when possible; for example, provide both survival and mortality rates.
  • Are you achieving your aims, within your ethos? - In evaluating, consider applying O'Neill's criteria for "intelligent information", described in #1, above.

One last question the authors pose: How can professional practice in risk and evidence communication and public critique of such information be improved? They suggest that, "[t]o improve the ability of audiences to critique what they hear, we need education at all levels to identify issues with claims based on data, whether in schools or professions. The encouragement of fact-checking organisations alongside public recognition and celebration of accurate, impartial communications - a clear separation of information and opinion - would help. But ultimately, scientific researchers hold the key and the main responsibility. We in science can choose to research and incentivise good, clear and ethical communication above popularity and unwarranted simplicity....Fail, and the dangers are becoming all too clear."

Source

Nature 587, 362-364 (2020). https://doi.org/10.1038/d41586-020-03189-1 - sourced from: "Those Who Tell Us What to Do during the Pandemic Must Earn Our Trust", by David Spiegelhalter, The Guardian, November 26, 2020.