When does being careful become its own kind of reckless?
The Story
Twenty-one days
On December 11, 2020, the FDA issued an Emergency Use Authorization for the Pfizer-BioNTech COVID-19 vaccine. Pfizer had submitted efficacy data on November 20. Twenty-one days from submission to authorization — and every one of those days, roughly 2,500 Americans died of COVID. By the time full approval arrived in August 2021, the cumulative toll had passed 600,000.
Operation Warp Speed had compressed a ten-year development timeline into eleven months. But the regulatory apparatus that received the results operated at its own metabolic rate, one calibrated to thalidomide, Vioxx, fen-phen. The immune system that protected the public from bad drugs could not distinguish between a pandemic and a normal Tuesday. It processed both at the same speed because it was designed to process both at the same speed.
The receipts on both sides
The precautionary principle carries the memory of every drug or chemical that passed its initial safety review and then crippled a generation — thalidomide’s 10,000 children, tetraethyl lead in every gallon of gasoline for sixty years. Speed has a body count; the cautious have the receipts. The action-biased have receipts too. They calculated the cost of each deliberation day in QALYs — quality-adjusted life years, the unit health economists use when they want to make suffering arithmetic. Their estimate for the three-week gap between submission and EUA: 5,000 to 10,000 lives.
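A back-of-envelope restatement of those figures, as a sketch: it uses only the numbers already cited in this section, not the analysts' actual QALY models, and simply converts the attributed total into a per-day cost of deliberation.

```python
# Back-of-envelope arithmetic using only the figures cited in the passage above.
DAYS = 21                                      # Nov 20 submission -> Dec 11 EUA
DEATHS_PER_DAY = 2_500                         # approximate U.S. COVID deaths per day in that window
LIVES_ATTRIBUTED_TO_DELAY = (5_000, 10_000)    # the action-biased estimate for the whole gap

# Total deaths that occurred while the application sat in review (~52,500).
print(f"COVID deaths during the deliberation window: ~{DAYS * DEATHS_PER_DAY:,}")

# Convert the attributed range into a per-day cost of deliberation (~240 to ~475 lives/day).
for lives in LIVES_ATTRIBUTED_TO_DELAY:
    print(f"  {lives:,} attributed lives -> ~{lives / DAYS:.0f} per day of review")
```

Read this way, the claim is not that an earlier authorization would have averted every death in those three weeks, only a few hundred per day of delay.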
The context weighers hold the position that makes both sides uncomfortable: the binary between "move fast" and "be careful" is the actual recklessness. The FDA was careful about Type I error — approving a bad drug. It was reckless about Type II error — failing to approve a good one during a mass casualty event. The system had no dial for weighing them against each other. The practitioners — the nurses, the ER doctors, the people running clinics in January 2021 — watched both camps argue about frameworks while their waiting rooms filled.
The fault line runs deeper than vaccines. It surfaces in AI regulation, where precaution against harms that do not yet exist may come at the cost of benefits that would have arrived sooner. It surfaces in climate policy, where caution about economic disruption produced the very atmospheric disruption that precaution exists to prevent. The question is whether a safety system built to prevent one kind of catastrophe can recognize when it is causing another — or whether the institutional immune response has an off switch at all.
Every safety system ever built has a failure mode it cannot see: the harm it causes by doing exactly what it was designed to do, at exactly the speed it was designed to move, while the world outside the process changes faster than the process can adapt.
Perspectives:
- Precautionary principle
- Action bias
- Context weighers
- Practitioners