These figures were based on Imperial College modelling that has since been challenged by Mark Harper, deputy chair of the Covid Recovery Group of MPs. He argued the model did not account for key factors shown to change the course of the pandemic, such as the most up-to-date evidence on the protective effect of the vaccines and the “seasonal effect” as the country moves into summer.

Modelling has driven much of the pandemic response. The initial reaction in the UK, the US and other European countries was shaped by dramatic headlines in March last year suggesting 550,000 deaths in the UK and 2.2 million in the US if mitigation measures were not put in place.
Dr Thomas House, who sits on SPI-M – the group that models the pandemic for the government – said that while models were an essential tool, they were far from foolproof.
Speaking in a personal capacity, Dr House said: “The thing that worries me most is when people expect too much accuracy from models of a complex system.
“This does not mean we need to ditch models. You can’t do science without them.
“Modelling tells you about what might happen. Models can also be useful conceptual tools.
“However, the models are only as good as the assumptions that they are based upon. They may not account for many things, such as seasonality, details of vaccine rollout, or individual behaviour.
“Nor can they account for our values and culture, for example that people have different tolerances of risk. There are always values in politics, and these need to be explained; it needs to be clear how the government has made its decisions.”
Dr House, a reader in applied mathematics at the University of Manchester, added: “It’s no good waving curves and graphs at people – if these only show Covid cases and deaths. We cannot be clear that the precautionary principle of first do no harm applies, because we don’t know the costs of lockdown measures.
“Models are a valuable tool but my worry is that the epidemic curve coming out of a computer represents just one part of a complex situation.
“You cannot make a decision without a model in a pandemic, but it needs to be clear what has been assumed in making the model, what is missing, and what might happen that does not follow the model.”
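Dr House’s caution about assumptions can be seen in even the simplest epidemic models. The sketch below is a minimal, hypothetical SIR model (all parameter values are illustrative assumptions, not figures from SPI-M or Imperial): changing a single assumed input, the transmission rate, markedly changes the projected peak of the epidemic.

```python
# Minimal SIR sketch (hypothetical parameters, simple Euler time steps).
# Illustrates how one assumption -- the transmission rate beta -- changes
# the projected peak. Not a real forecasting model.

def sir_peak_infected(beta, gamma=0.1, s0=0.99, i0=0.01, days=365, dt=1.0):
    """Return the peak infected fraction of the population for a given beta."""
    s, i = s0, i0          # susceptible and infected fractions
    peak = i
    for _ in range(int(days / dt)):
        new_infections = beta * s * i * dt   # S -> I
        recoveries = gamma * i * dt          # I -> R
        s -= new_infections
        i += new_infections - recoveries
        peak = max(peak, i)
    return peak

# The same model under two plausible assumptions about transmission:
low = sir_peak_infected(beta=0.2)    # implies R0 = beta/gamma = 2
high = sir_peak_infected(beta=0.3)   # implies R0 = 3
print(round(low, 3), round(high, 3))
```

The structure of the model is identical in both runs; only the assumed transmission rate differs, yet the projected peak is substantially higher in the second case. This is the sense in which a model’s output is only as good as its inputs.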
Professor David Paton, an expert in health economics at Nottingham University, said: “The use of models during this pandemic has been a great source of frustration. The government has relied too heavily on models predicting deaths, hospitalisations and cases.
“The problem is that the assumptions behind many of these have been flawed.
“Modellers have a responsibility for explaining the assumptions and scenarios that they have based their modelling on so that the public and politicians understand their weaknesses.”
The modelling that led to the Prime Minister imposing drastic curbs last March was carried out by Imperial’s Neil Ferguson.
Prof Ferguson was also behind research that sparked the mass culling of farm animals during the 2001 foot and mouth disease epidemic, a crisis in which millions of cattle and sheep – many of them not infected – were pre-emptively culled.
Prof Ferguson has said that so many animals were infected by the time the outbreak was uncovered that a more precise cull was impossible.
But Michael Thrusfield, professor of veterinary epidemiology at Edinburgh University, carried out an investigation into the modelling that triggered the cull, concluding it was based on flawed assumptions.
He said: “The basic principles on modelling described in our papers and textbooks apply to this Covid-19 crisis as much as they did to the foot and mouth outbreak.
“We need to show the fault lines of trying to use mathematical models in real disease outbreaks. Modellers sit for many years theoretically exploring models and when a real disease occurs it’s like a feeding frenzy.
“But modellers do not have to consider the consequences attached to their conclusions or the conclusions drawn by policy makers. With Covid we risk catastrophe by computer. With foot and mouth it was carnage by computer. You can only understand a subject and the consequences if you experience it yourself.
“Modellers do not have to accept personal responsibility for the fact their scientific advice translates to policy and that policy as we can see can be extremely damaging. Millions of animals unnecessarily lost their lives, not to mention the emotional toll it took on farmers.”
Professor Carl Heneghan, director of the Centre for Evidence Based Medicine at Oxford University, said: “At the moment we have a tug-of-war. Many argue the government has over-relied on models, which are then torn apart by critics. The key is not to be panicked by models.
“They are not a fait accompli. We need a new approach to integrate the emerging data with the models to keep them up-to-date so we can react appropriately as data emerges. We need to use models to help guide our thinking, but in a pandemic we also need to be flexible.
“The data is shifting all the time. In the last week, deaths are down by a third. We should speed up our responses with the data as it emerges, and this has not been happening. The world will look very different in two weeks’ time, and the idea that you can sit down and predict this – or even predict that by mid-June we will be out of lockdown – is not evidence based.”
A Department of Health and Social Care spokesman said: “The Government’s response to the pandemic has always been informed by evidence-based scientific advice which helps us prepare for a wide range of scenarios.
“An extensive range of data and advice is used in our decision-making process. This includes, but is not limited to, scientific advice and infectious disease modelling from SAGE and SPI-M.
“Modelling is not a forecast or prediction of what will happen. It reflects a responsible government ensuring we are ready for all eventualities and following expert scientific advice to inform our response.”