ATLANTA (AP) — Last fall, when Martin Meltzer calculated that 1.4 million people might contract Ebola in West Africa, the world paid attention.
This was, he said, a worst-case scenario. But Meltzer is the most famous disease modeler for the nation’s pre-eminent public health agency, the Centers for Disease Control and Prevention (CDC). His estimate was promoted at high-level international meetings. It rallied nations to step up their efforts to fight the disease.
But the estimate proved to be off.
Way, way off.
Like, 65 times worse than what ended up happening.
Some were not surprised. Meltzer has a lot of critics who say he and his CDC colleagues have a habit of willfully ignoring the complexities of disease outbreaks, resulting in estimates that over-dramatize how bad an outbreak could get — estimates that may be skewed by politics. They say Meltzer and company also overestimate how much vaccine is needed and how beneficial it has been.
Overblown estimates can result in unnecessary government spending, they say, and may further erode trust in an agency that recently has seen its sterling reputation decline.
“Once we cry wolf, and our dire predictions turn out not to be the case, people lose confidence in public health,” said Aaron King, a University of Michigan researcher who, in a recent journal article, took Meltzer and others to task for making what he called avoidable mistakes.
Meltzer, 56, is unbowed. “I am not sorry,” he said.
He dismisses his peers’ more complicated calculations as out of touch with political necessities, telling a story about President Lyndon Johnson in the 1960s. Johnson was listening to an economist describe the uncertainty in his forecast and why a range of estimates made more sense than a single figure. Johnson was unconvinced.
“Ranges are for cattle,” Johnson said, according to legend. “Give me a number.”
Meltzer does not shy away from providing a number.
What Meltzer does is not particularly glamorous. He and others use mathematical calculations to try to provide a more precise picture of a certain situation, or to predict how the situation will change. They write equations on chalkboards, have small meetings to debate which data to use, and sit at computers. Meltzer spends a lot of time with Excel spreadsheets.
But modelers have become critical in the world of infectious diseases.
Epidemics often have a “fog of war” aspect to them, in which it’s not clear exactly what just happened or what’s about to happen next. That’s true both of common infections and rare ones.
Take flu, for example.
Each winter, flu is so common that it’s impossible to test and confirm every illness. It’s also difficult to determine every flu-related death — it’s often not clear flu was responsible for the demise of everyone who had flu-like symptoms when they died. So, when the CDC cites an average of 24,000 flu-related deaths in the U.S. each year, that comes from modeling, not an actual count.
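The arithmetic behind a modeled count like that can be sketched simply: take the deaths that were confirmed, then scale up for the ones surveillance could not catch. The sketch below is a toy illustration with hypothetical numbers; it is not the CDC’s actual flu-burden method.

```python
# Toy illustration of how a modeled death count can be built from
# incomplete surveillance data. A simplified sketch for intuition,
# NOT the CDC's actual flu-burden method; all numbers are hypothetical.

confirmed_flu_deaths = 4_000   # hypothetical: deaths with a lab-confirmed flu test
p_tested = 0.30                # assumed share of flu deaths that were ever tested
p_detected = 0.70              # assumed test sensitivity near the time of death

# Each correction inflates the raw count to cover deaths the
# surveillance system could not have captured directly.
estimated_flu_deaths = confirmed_flu_deaths / (p_tested * p_detected)

print(f"Modeled flu-related deaths: {estimated_flu_deaths:,.0f}")
# prints roughly 19,000 from a raw count of 4,000
```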
Ebola is another example. CDC leaders came to Meltzer early last August, when the epidemic was spiraling out of control and international health officials were quickly trying to build a response. Meltzer was asked to project how bad things could get if nothing was done, as well as to estimate how stepped-up aid could bend the curve of the epidemic.
Meltzer and his colleagues created a spreadsheet tool that projected uninterrupted exponential growth in two countries, Liberia and Sierra Leone.
His prediction — published last September — warned that West Africa could be on track to see 500,000 to 1.4 million Ebola cases within a few months if the world sat on its hands and let the epidemic blaze.
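The mechanics of such a projection are simple enough to sketch. The snippet below shows uninterrupted exponential growth of the kind the spreadsheet assumed, together with the 2.5 under-reporting correction described later in this story; the starting count and doubling time are assumptions for illustration, not figures from Meltzer’s published model.

```python
# Minimal sketch of an unchecked-epidemic projection: deterministic
# exponential growth plus a correction for under-reporting. The seed
# count and doubling time are assumptions for illustration only; they
# are not taken from Meltzer's published spreadsheet tool.

REPORTED_CASES_NOW = 8_000    # hypothetical reported case count at the start
UNDERREPORT_FACTOR = 2.5      # correction factor the CDC model applied
DOUBLING_TIME_DAYS = 20       # assumed doubling time, for illustration

def project_cases(days_ahead: int) -> float:
    """Project total (corrected) cases if growth continues unchecked."""
    corrected_now = REPORTED_CASES_NOW * UNDERREPORT_FACTOR
    return corrected_now * 2 ** (days_ahead / DOUBLING_TIME_DAYS)

for days in (30, 90, 120):
    print(f"{days:>3} days out: {project_cases(days):>12,.0f} cases")
```

With these made-up inputs, four months of uninterrupted doubling already pushes the count past a million, which is why worst-case projections of this type climb so steeply.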
About 21,000 cases materialized by mid-January — a terrible toll, to be sure, but also just a tiny fraction of the caseload Meltzer and his CDC colleagues warned about. Today, the epidemic is considered to be on its last legs.
No modeler claims to be 100-percent correct. Indeed, modelers have a saying: “All models are wrong, but some are useful.”
They mean that a model’s mathematics can be correct, but the resulting predictions can still prove to be terrible if the wrong kinds of data are used or key assumptions are off. Unexpected intangibles, like a change in the weather, can also mess things up. (Of course, the math can also be wrong — as in a ballyhooed 2004 CDC estimate of how many Americans die annually from obesity. It later proved to be inflated, with officials blaming a computational error.)
During last year’s Ebola crisis, the World Health Organization made its own set of projections for the epidemic’s course, released at about the same time as the CDC’s. But the WHO chose to project cases only as far out as early November, saying 21,000 people could be infected in Guinea, Liberia and Sierra Leone by then.
Also, the WHO decided not to make a key assumption Meltzer did — that Ebola cases were being under-reported by a factor of 2.5.
Did Meltzer blow it? Many say no. He and his colleagues clearly stated they were providing a worst-case scenario of how bad things could get. They also predicted a far lower number of cases if more help was sent — which already was happening when the model estimates were released.
But the worst-case figures got the most attention. The media focused on them in headlines. Health officials highlighted them in their push to get more money and manpower devoted to the epidemic. And interestingly, those are the numbers health officials describe as the most successful part of Meltzer’s prediction paper.
“I think it galvanized countries — and people — to put in more effort” into fighting the epidemic, said Dr. Keiji Fukuda, formerly a colleague of Meltzer’s at CDC who is now assistant director-general of the World Health Organization.
Dr. Tom Frieden, the CDC’s director, said the estimates were helpful in those difficult days of pushing for more action. But he disagrees with contentions that the agency was crying wolf. The agency was clear that the estimates were a worst-case scenario and probably wouldn’t come true, he said. But “I don’t think it’s possible to have exaggerated the risk the world faced in the fall.”
Columbia University’s Jeffrey Shaman, a modeling leader, echoed the perception that existed when Meltzer was given his assignment. As far as Ebola epidemics go, “we’d never seen anything like this before. This thing looked like AIDS on steroids,” he said.
Meltzer was born in 1958 in Southern Rhodesia, a British colony in Africa — a white, Jewish boy growing up in a privileged enclave in a country that was 99 percent black. Drafted into the military at 18, he went on reconnaissance missions in the Zambezi valley during the later stages of a civil war that led to 1980 elections that brought independence and created the nation of Zimbabwe.
His early scientific interest was in the health of animals, not humans. He earned a degree in agricultural economics in Zimbabwe, then wrote a doctoral dissertation at Cornell University on control of tick-borne diseases in African livestock. He was working on animal diseases at the University of Florida when some work on rabies brought him to the attention of CDC, which was recruiting economists to develop numbers for policy discussions. He joined the agency in 1995, when disease modelers were still a tiny group on the margins of public health.
“At the time I came on, hardly anyone at CDC did modeling,” said Anne Haddix, who joined the agency in 1992 and became Meltzer’s mentor.
Three factors were prodding more infectious disease modeling in the United States:
- Advances in computers and mathematics enabled modelers to do increasingly sophisticated work.
- British scientists successfully used models to guide government decision making. Most notably, modelers influenced how the United Kingdom handled a devastating 2001 epidemic of foot-and-mouth disease in animals. The epidemic was tamed by the end of that summer, after the slaughter of millions of animals.
- In the aftermath of 9/11, government officials pushed for greater preparations against bioterrorism and disease disasters, and needed to know how much money they needed to budget.
Haddix and Meltzer helped establish a corps of dozens of economists at the CDC who performed such tasks as assessing the effect and cost of prevention programs. Their work became crucial when agency officials went to Congress for funding. The economists also were the ones who ended up doing the bulk of the agency’s disease modeling work.
Some of Meltzer’s peers build sophisticated models that have been likened to jet aircraft, sometimes requiring a large team of experts to create and maintain them. These are known as stochastic models; they focus on the effects of chance and other unpredictable factors, and they emphasize the range of possible outcomes. Most stochastic modeling work is done at universities.
Meltzer’s models are more like a bicycle: far simpler to understand and to modify. Deterministic, they’re called. They describe what might happen in a population given general trends. Meltzer says he uses these models because that’s what plays well with policy makers — they are easy to explain, can be quickly altered to answer a new question, and spit out simple answers fast.
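To make the jet-versus-bicycle distinction concrete, here is a bare-bones sketch of both styles applied to the same toy epidemic. All of the parameters are illustrative, and neither function reproduces any actual CDC model.

```python
# Schematic contrast between the two modeling styles described above,
# using a bare-bones susceptible-infected model. Parameters are
# illustrative assumptions, not values from any real model.
import random

BETA = 0.3      # assumed transmission rate per day
GAMMA = 0.1     # assumed recovery rate per day
N = 10_000      # population size

def deterministic_step(s: float, i: float) -> tuple[float, float]:
    """Bicycle-style: one expected trajectory, same answer every run."""
    new_infections = BETA * s * i / N
    recoveries = GAMMA * i
    return s - new_infections, i + new_infections - recoveries

def stochastic_step(s: int, i: int) -> tuple[int, int]:
    """Jet-style, in miniature: chance decides, so each run differs."""
    p_inf = 1 - (1 - BETA / N) ** i   # per-susceptible infection risk today
    new_infections = sum(random.random() < p_inf for _ in range(s))
    recoveries = sum(random.random() < GAMMA for _ in range(i))
    return s - new_infections, i + new_infections - recoveries

# Iterating deterministic_step always traces the same curve; running
# stochastic_step many times yields a range of outcomes instead.
```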
Within CDC, he’s been lauded for his work. One Meltzer project was the creation of free software — called FluAid — that gave local health officials an idea of how pandemic flu might affect different geographic areas. He’s also been praised for co-creating a model that helped CDC officials make the case for dropping a long-standing federal restriction that prevented HIV-infected foreigners from staying and working in the United States. The restriction was dropped in 2010.
In 2011, Meltzer discovered an error in CDC estimates of how many illnesses, hospitalizations and deaths were prevented during the 2009 flu pandemic through use of vaccines and medications. He initiated a published correction.
But some of his work has drawn ridicule. In 2001, shortly after the nation endured a series of anthrax attacks, Meltzer co-authored a paper that forecast a global smallpox epidemic could reach 77 trillion cases within a year if there were no intervention and an unlimited supply of smallpox-susceptible people. He included the number, he said, to give people an idea just how dramatically cases could escalate if unchecked by public health measures.
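The scale of that figure is less mysterious once the compounding is written out. Below is a back-of-envelope sketch using an assumed seed count, transmission number and generation time, rather than Meltzer’s published inputs.

```python
# Back-of-envelope check on how unchecked exponential growth reaches
# astronomical case counts. The seed count, transmission number and
# generation time are assumptions for illustration; they are not
# Meltzer's published inputs.

CASES_START = 100       # hypothetical seed cases
R = 3                   # assumed new infections per case, per generation
GENERATION_DAYS = 14    # assumed time between generations of cases

cases = CASES_START
for day in range(0, 365, GENERATION_DAYS):
    cases *= R

print(f"Cases after one year: {cases:.2e}")
# After 27 tripling generations the count runs into the hundreds of
# trillions, far beyond any real population. That is exactly the
# critics' point about assuming unlimited susceptible people.
```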
Some viewed that number as nonsensical, given that the Earth had only about 6 billion inhabitants.
“Every now and again, Dr. Meltzer loses control of his computer,” said Dr. Donald A. Henderson, a revered public health expert who led a global smallpox eradication effort in the 1970s.
There is no doubt some envy among modelers for the influence Meltzer holds. Modeling-produced numbers become valuable currency in debates about what public health measures to take and what programs to fund; they can drive policy decisions.
Many modelers go into the field because “it has real implications you can see in your lifetime,” said Irene Eckstrand, who until last year was scientific director of a government-funded modelers network called MIDAS — Models of Infectious Disease Agent Study.
CDC is supposed to prepare America for the worst, so it makes sense for its modelers to explore extreme scenarios, supporters say. If Meltzer’s estimates push policymakers to bolster public health defenses, it’s all to the greater good.
“The primary purpose of these models is to say why we need to take action,” said Glen Nowak, a former CDC director of media relations who now heads the University of Georgia’s Center for Health and Risk Communication.
But there are those who feel that the result corrupts both science and politics.
“Public health officials are well aware that their statistics get used — and misused — to justify an increase in their funding” or to bolster vaccination campaigns and other efforts, said Peter Doshi, an assistant professor at the University of Maryland School of Pharmacy and an associate editor of BMJ, the British medical journal.
Modeling — so poorly understood by the public, the media, and even many people in public health — provides an opportunity to bend numbers to support goals, he argued.
“This is an area, again, where the CDC is free to produce numbers and nobody can really say they’re right or wrong. You can say, ‘Well, they don’t seem plausible,’ but then it just looks like some experts are arguing over whose model is better,” he said.
Said David Ozonoff, a Boston University environmental health professor who formerly — under the pseudonym Revere — wrote a blog on public health policy and science called “Effect Measure” that was closely read by CDC employees: “The way risk assessment is done in this country is the policy makers shoot the arrow and the risk assessors paint a target around it. There’s a flavor of this with modeling, too. If you say the purpose (of a modeling estimate) is motivational, that’s another way of saying it’s not scientific.”
Some say more of a separation between CDC administrators and the modelers might engender more trust in the numbers the agency uses. Perhaps an outside agency — an NIH institute on public health, if one were ever created — could do the modeling and report their findings to CDC, said Lone Simonsen, a research professor at George Washington University who formerly worked at the CDC and at the National Institutes of Health.
More immediately, CDC could increase its collaboration with top academic modelers, she added.
But some experts noted that’s not always possible, especially in fast-moving and sensitive situations like Ebola, when the agency might receive information about epidemics from countries or organizations that don’t want the data shared with the academic community or others.
Meltzer is wary of proposals for greater collaboration or reliance on non-agency modelers. And more sophisticated models do not interest him.
“Accuracy for the sake of accuracy is merely interesting,” he said. “And interesting is not good enough.”