A blanket of misinformation has enveloped every corner of this pandemic. Endorsements of pseudoscientific cures have resulted in public confusion, death and financial loss. Conspiratorial origin stories – from the idea that the coronavirus is a bioweapon to the belief that it is caused by 5G technology – have enabled an ideological polarization of public discourse and helped to erode public trust in public health authorities seeking to promote needed prevention strategies.
Aggressively fighting the spread of misinformation has emerged as a top public health priority. And there is a growing body of research that suggests that countering misinformation can work, if it is done well. But we can’t fight misinformation unless the public trusts the relevant science and the public entities that are using that science to craft policy.
Alas, a host of recent scientific controversies and communication glitches have made it increasingly difficult to rely on "good science" as a remedy for the infodemic.
As such, there is an urgent need for greater attention – from researchers, research institutions, clinicians, public health authorities and the media – to three (rather obvious, but seemingly forgotten) science policy fundamentals.
First, we need the science to be done well. One of the fastest ways to create confusion and lose public trust is to publish and publicize weak, shoddy or, worse, fraudulent research. Unfortunately, this has happened too often in this era of panicky, pandemic publishing – as exemplified by the recent and much publicized research on the dangers of hydroxychloroquine. The study, which was based on questionable and unverifiable data, was published in the renowned journal The Lancet. The paper was quickly retracted, but the damage to public trust was done.
The desire for quick results (and high impact) should not be allowed to erode scientific standards. As succinctly put by Alex John London and Jonathan Kimmelman, “Crises are no excuse for lowering scientific standards.”
Second, we need the science to be communicated well. Much of the evidence surrounding the pandemic remains uncertain. Given this reality, it is essential for public representations – whether in a government recommendation, the popular press or on social media – to be honest about the actual state of the evidence and the limits of the relevant methodologies.
Hyping science is almost always a mistake. Indeed, much of the kerfuffle surrounding hydroxychloroquine was the result of U.S. President Donald Trump’s endorsement of one small and methodologically flawed study. The subsequent noise regarding the alleged benefits led to unjustified and heightened public expectations. (In Canada, for example, despite a lack of good clinical data, 23 per cent of Canadians – and 30 per cent of Quebecers – wrongly believe the drug is effective.) The science hype also helped to fuel unnecessary and potentially harmful prescriptions and questionable public investment in further research, including clinical trials.
Research has shown that the public can handle the truth about scientific uncertainty, whether it is about masks, asymptomatic transmission or potential therapies. Indeed, being explicit about the unknowns and limits of knowledge can actually heighten credibility, trust and public understanding. But, as we have already seen, failure to be transparent will inevitably lead to both confusion and a breakdown in trust.
Finally, we need to be clear that science is a process, not an immutable list of facts. It is always evolving and, as a result, public health recommendations will (and should) also evolve. By avoiding overly dogmatic language about science-based policies – such as those surrounding mask use – we can moderate public frustration (and the concomitant loss in trust) if the science and recommendations change.
Good science is an essential tool in the fight against the spread of misinformation. But it also needs to be presented to the public in a sensible and respectful manner. As noted in a recent study from the London School of Economics on this point, how we handle the dissemination of science during the pandemic will have long-term ramifications on the public’s relationship with science and, as such, we “should think harder about how to communicate trustworthiness and honesty.”
*This was originally published in La Presse.
Hats off! Respect!
Well put!
The (cruel) irony is that Canada / CIHR has ignored research funding into post-viral illnesses, like long COVID and ME/CFS. This, in spite of Canada having the highest ME/CFS rates in the world.
Why? Because the medical system told sick patients it was ‘in your head’.
So when you write of medical misinformation, it should start with dismantling the embedded medical error of psychologizing post-viral illnesses.
Would this involve having the federal government shipping a container of personal protective equipment to China or recommending that the borders need not be closed?
Love your well-reasoned advice, as usual. Keep up the good work!
Excellent points Tim. I have been writing a few articles about viral transmission to explain the concept of exponential increase and exponential decline related to transmission rate. I circulate mainly to friends and relatives and to residents in my own condo. I don’t understand why the scientific community and the press have not put emphasis on this concept because it is at the heart of controlling transmission. I would like to be on your mailing list and will circulate your articles to others as well. Stay safe. Ron Worton