See these news stories about the WHO being blamed for overreacting to the 2009 swine flu pandemic, which probably taught it the wrong lesson:
https://web.archive.org/web/20100420235803/http://www.msnbc.msn.com:80/id/36421914
https://web.archive.org/web/20100531094130/http://www.timesonline.co.uk/tol/news/world/article7104253.ece
You might also be interested in the 1976 US mass vaccination program for swine flu, another case of perceived overreaction (the anticipated pandemic never materialized) that hurt the reputation of public health generally: https://www.discovermagazine.com/health/the-public-health-legacy-of-the-1976-swine-flu-outbreak
Or in “The Cutter Incident” of 1955, where the rush to get a polio vaccine out ahead of the next polio season left some batches containing live polio virus; several children who received the vaccine contracted polio as a result: https://en.wikipedia.org/wiki/Cutter_Laboratories#The_Cutter_incident
There’s definitely a history in public health of perceived overreaction followed by public backlash, which could well be weighing on public health officials’ minds nowadays. I’m not sure that becoming more conservative and slower to act is necessarily the wrong lesson, though: even if you think, purely on the numbers, that taking preventative measures in each of these incidents was correct ex ante given the stakes involved, reputational risks are real and have to be taken into account. As much as “take action to prepare for low-probability, high-consequence scenarios when the expected cost < expected benefit” applies to personal preparation, it doesn’t translate easily to governmental action, at least not when “expected cost” doesn’t factor in “everyone will yell at you and trust you less in the future if the low-probability scenario doesn’t pan out, because people don’t reason well about probabilities.”
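To make that concrete, here’s a toy expected-value calculation (every number below is invented purely for illustration) showing how pricing the reputational penalty into the “threat fizzles” branch can flip the decision:

```python
# Toy expected-value comparison; all numbers are made up for illustration.
p_pandemic = 0.10            # assumed probability the threat is real
cost_of_acting = 1.0         # direct cost of preventative measures (arbitrary units)
damage_averted = 50.0        # harm prevented by acting, if the pandemic happens

# Naive calculus: act whenever expected benefit exceeds expected cost.
expected_benefit = p_pandemic * damage_averted          # 5.0
print(expected_benefit > cost_of_acting)                # True -> act

# Fold in the reputational penalty paid in the ~90% of worlds where the
# threat fizzles and the agency gets blamed for overreacting.
reputational_penalty = 6.0   # assumed cost of lost public trust
expected_cost = cost_of_acting + (1 - p_pandemic) * reputational_penalty  # 6.4
print(expected_benefit > expected_cost)                 # False -> don't act
```

On the direct numbers the agency should act; once the near-certain blame for a fizzled threat is priced in, holding back becomes the locally rational choice.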
This does put us in a bit of a bind, since ideally you’d want public health authorities to be able to take well-calibrated action against <10%-likely scenarios. But they are, unfortunately, constrained by public perception to some extent.