Role models: The need for responsible modelling in the age of pandemic
By: Neil Vowles
Last updated: Friday, 10 July 2020
Too many examples of coronavirus modelling have been insufficiently transparent about their uncertainties and too precise in their claims, a team of international science policy experts has warned.
In an article published today in Nature, the academics are critical of modelling designed to inform decision-making on the pandemic, raising concerns that it has often been too opaque about uncertainty and too slow to admit limitations.
Drawing on modelling experience across a variety of fields, the group of 22 experts warn that bold and under-substantiated claims by some modellers have given politicians the ammunition to pursue predetermined agendas and offload accountability.
The group are calling on modellers to follow five key principles of best practice around uncertainty, complexity, transparency and the acknowledgement of ignorance to ensure responsible mathematical modelling.
Andrew Stirling, Professor of Science and Technology Policy in the Science Policy Research Unit (SPRU) at the University of Sussex Business School, said:
“There is no substantial aspect of this pandemic for which any researcher can currently provide precise, reliable numbers. There are simply too many known unknowns. Excessive regard for asserting simple single numbers can push modelling away from being roughly right toward being precisely wrong.”
Examples of poor practice with unrealistically precise figures cited by the academics include Imperial College London's prediction of 510,000 UK deaths if no mitigation action were taken, and the “highly speculative” claim from WHO Africa that 190,000 deaths would occur on the continent.
Among the recommendations the academics are making is for modellers to perform global uncertainty and sensitivity analyses in order to explore a wide range of the uncertain variables, mathematical relationships and boundary conditions that are always involved in any modelling.
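By way of illustration, and not drawn from the Nature article itself, a variance-based global sensitivity analysis varies all of a model's uncertain inputs at once and attributes the spread of its outputs to each input. The sketch below is a minimal Python example under stated assumptions: the toy epidemic model, its three inputs (R0, infection fatality ratio, population) and their ranges are invented for demonstration, and first-order Sobol indices are estimated with a Monte Carlo “pick and freeze” estimator.

```python
# Minimal sketch of a variance-based global sensitivity analysis.
# The toy model and parameter ranges are illustrative assumptions,
# not figures from the research reported above.
import numpy as np


def toy_epidemic_model(params):
    """Toy outcome: expected deaths from a simple final-size calculation.

    Columns of `params`: R0, infection fatality ratio, population size.
    Purely illustrative.
    """
    r0, ifr, pop = params[:, 0], params[:, 1], params[:, 2]
    # Solve the final-size relation z = 1 - exp(-R0 * z) by fixed-point iteration.
    z = np.full_like(r0, 0.5)
    for _ in range(100):
        z = 1.0 - np.exp(-r0 * z)
    return pop * z * ifr


def first_order_sobol(model, bounds, n=20_000, seed=0):
    """Estimate first-order Sobol indices S_i by Monte Carlo sampling."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    d = len(bounds)
    # Two independent sample matrices spanning the input ranges.
    A = lo + (hi - lo) * rng.random((n, d))
    B = lo + (hi - lo) * rng.random((n, d))
    f_A, f_B = model(A), model(B)
    total_var = np.var(np.concatenate([f_A, f_B]))
    indices = np.empty(d)
    for i in range(d):
        AB_i = A.copy()
        AB_i[:, i] = B[:, i]          # vary only the i-th input
        f_AB_i = model(AB_i)
        indices[i] = np.mean(f_B * (f_AB_i - f_A)) / total_var
    return indices


if __name__ == "__main__":
    # Illustrative uncertainty ranges for the three toy inputs.
    bounds = [(1.5, 4.0),        # R0
              (0.002, 0.02),     # infection fatality ratio
              (5e7, 7e7)]        # population
    s = first_order_sobol(toy_epidemic_model, bounds)
    for name, value in zip(["R0", "IFR", "population"], s):
        print(f"First-order sensitivity of deaths to {name}: {value:.2f}")
```

In this kind of analysis, an index near one means that input alone drives most of the output variance, while indices near zero flag inputs whose uncertainty barely matters, which is the sort of information the authors argue should accompany headline numbers.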
They are also calling for the establishment of a set of social norms about how to produce a model, assess its uncertainty and communicate the results in order to avoid the risk of models hiding their assumptions.
The group argue for the universal adoption of international guidelines already established in several disciplines, which outline how best to produce a model, assess its uncertainty and communicate the results through processes that involve stakeholders, accommodate multiple views, and promote transparency, replication, and sensitivity and uncertainty analysis.
Andrea Saltelli, Professor at the Centre for the Study of the Sciences and the Humanities at the University of Bergen, said: “Rather than using models to inform their understanding, political rivals often brandish them to support predetermined agendas. To make sure their predictions do not become mere adjuncts to a political cause, modellers, decision makers and citizens need to establish new social norms such that modellers are not permitted to project more certainty than their models deserve, and politicians are not allowed to offload accountability to models of their choosing.”
Daniel Sarewitz, Professor of Science and Society at Arizona State University, said: "In our view, good modelling cannot be done by modellers alone. It is a social activity. Like the sensible use of weather forecasts, all must take responsibility for helping to acknowledge and manage the unavoidable uncertainties – so that these can be openly discussed and their implications examined. We are not calling for an end to quantification, nor for apolitical models, but for full and frank disclosure."