The hidden cost of zero-tolerance food safety: safe food thrown away, real risks ignored
A food product tests positive for Listeria monocytogenes. The batch is destroyed. The recall is issued. Consumers lose trust. And in many cases, the amount of bacteria detected was so small it posed no realistic threat to anyone who might have eaten it.
This is the paradox at the center of a new paper published in Frontiers in Science: the tools we use to detect food pathogens have become extraordinarily sensitive, capable of picking up trace quantities of microorganisms that may never cause disease. But the regulatory frameworks built around those tools have not kept pace. The result, according to an international team of researchers, is a system that wastes vast quantities of perfectly edible food, inflates consumer costs, and - perhaps most counterintuitively - may actually make us less safe by pulling attention and resources away from the contamination pathways that pose the greatest danger.
When detection does not equal danger
The core argument is straightforward. Foodborne pathogens sicken roughly 600 million people and kill approximately 420,000 every year worldwide. Nobody disputes that food safety matters. But the current system, the authors contend, confuses hazard with risk.
A hazard is the mere presence of something that could potentially cause harm. A risk is the probability that it actually will, given the dose, the food matrix, how the food is stored and prepared, and who is eating it. Modern testing methods can detect vanishingly small quantities of bacteria - quantities that, in many foods and for most consumers, fall well below the threshold that would cause illness. Yet many regulations treat any detection as a failure, regardless of context.
Consider the bacterium Listeria monocytogenes. It is genuinely dangerous at high concentrations, particularly for pregnant women, elderly people, and those with compromised immune systems. But a trace positive on a surface swab in a processing facility does not mean the finished food product poses a meaningful risk to consumers. The distinction between these scenarios is precisely where the current framework falls short.
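The hazard-versus-risk distinction the authors draw is the core idea behind quantitative microbial risk assessment, where illness probability depends on dose, not mere presence. A minimal sketch using the standard exponential dose-response model illustrates the point; the parameter value here is purely illustrative (published dose-response parameters for Listeria vary by subpopulation across many orders of magnitude and are not reproduced here):

```python
import math

def p_illness(dose_cfu: float, r: float) -> float:
    """Exponential dose-response model: P(illness) = 1 - exp(-r * dose).

    A standard form in quantitative microbial risk assessment;
    r is the probability that a single ingested cell causes illness.
    """
    return 1.0 - math.exp(-r * dose_cfu)

# Illustrative r value only -- NOT an authoritative Listeria parameter.
R_ILLUSTRATIVE = 1e-12

trace = p_illness(1e2, R_ILLUSTRATIVE)   # trace-level contamination
heavy = p_illness(1e9, R_ILLUSTRATIVE)   # gross contamination

print(f"trace dose: ~{trace:.1e} risk per serving")
print(f"heavy dose: ~{heavy:.1e} risk per serving")
```

Even with a single fixed parameter, the model separates the two scenarios by seven orders of magnitude, which is exactly the context a detection-only standard throws away.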
The cascade of consequences
When zero-detection standards trigger recalls, the immediate cost is wasted food. But the downstream effects compound. Packaging requirements increase to extend shelf life and minimize contamination risk. Heat treatment protocols intensify, degrading nutritional content. Cold chain requirements become more stringent, driving up energy consumption. Each of these interventions carries environmental and economic costs that are rarely weighed against the marginal safety benefit they provide.
There is also an opportunity cost. Resources spent responding to trace-level detections - investigating, recalling, retesting, reformulating - are resources not spent on the interventions that actually prevent foodborne illness. Process controls, sanitation protocols, and hazard analysis at critical control points have a far stronger track record of protecting public health than end-product testing. Yet the regulatory emphasis on detection can crowd out investment in these more effective strategies.
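The statistical reason end-product testing performs poorly on its own can be sketched in a few lines: when contamination prevalence in a lot is low, the chance that a random sample catches it is small even for large sample sizes. The prevalence figure below is a hypothetical scenario, not data from the paper:

```python
def detection_probability(prevalence: float, n_samples: int) -> float:
    """Chance that at least one of n randomly drawn units tests positive,
    assuming units are independently contaminated at the given prevalence
    and the test itself is perfectly sensitive."""
    return 1.0 - (1.0 - prevalence) ** n_samples

# Hypothetical lot in which 0.1% of units are contaminated.
for n in (10, 60, 300):
    p = detection_probability(0.001, n)
    print(f"n={n:3d} samples: P(detect) = {p:.1%}")
```

Under these assumptions, even 300 samples per lot miss roughly three out of four contaminated lots, which is why validated process controls, rather than testing alone, carry most of the public health benefit.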
A speed-limit analogy
Lead author Professor Martin Wiedmann from Cornell University frames the issue bluntly: zero risk does not exist. We accept calculated risk in virtually every other domain of public life - highway speed limits balance safety against mobility, and building codes balance structural integrity against construction costs. Food safety, the authors argue, should apply the same logic.

That does not mean relaxing standards. It means calibrating them to the actual risk rather than to the mere presence of a pathogen. In practice, this would involve replacing hazard-based assessments - which ask only whether a pathogen is detectable - with risk-based approaches that evaluate the probability and severity of harm under realistic conditions of exposure.
Competing hazards, competing priorities
One challenge the paper addresses is prioritization. In the United States, norovirus causes thousands of times more cases of illness than Listeria monocytogenes, yet Listeria causes more deaths per year. How should regulators weigh incidence against mortality? How should they factor in the economic and environmental costs of different intervention strategies?
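The tension between incidence and mortality can be made concrete with simple arithmetic. The figures below are illustrative round numbers, loosely in line with published US foodborne illness estimates rather than data from the paper itself:

```python
# Illustrative round figures, loosely based on published US foodborne
# disease estimates; not authoritative data from the paper.
pathogens = {
    "norovirus":        {"cases": 5_500_000, "deaths": 150},
    "L. monocytogenes": {"cases": 1_600,     "deaths": 260},
}

for name, d in pathogens.items():
    cfr = d["deaths"] / d["cases"]  # case-fatality ratio
    print(f"{name:18s} cases={d['cases']:>9,} "
          f"deaths={d['deaths']:>4} case-fatality={cfr:.3%}")
```

Ranked by incidence, norovirus dominates by three orders of magnitude; ranked by case fatality, Listeria dominates by a similar margin. No single-number metric resolves that inversion, which is the prioritization problem the authors say risk-based frameworks must confront explicitly.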
The authors propose that computational tools incorporating data from across the food production system - geographic information, genomics, artificial intelligence - could help answer these questions with greater precision. Co-author Dr. Sriya Sunil, also at Cornell, points to evidence showing that end-product testing is generally ineffective at ensuring safety when used in isolation. Validated process controls, applied at the right points in production, tend to deliver greater public health benefit.
International food safety standards add another layer of complexity. Countries with different food systems, disease burdens, and economic constraints may need different approaches to balancing safety against sustainability. What works in a well-resourced European supply chain may not translate directly to a developing economy where food availability itself is a public health concern.
Sustainability enters the equation
Co-author Professor Sophia Johler at Ludwig Maximilian University of Munich emphasizes the sustainability dimension. At a time when global food systems face mounting pressure from climate change, population growth, and resource constraints, the waste generated by overly cautious safety policies is not a trivial concern. Food that is microbiologically safe to eat but fails a zero-detection standard represents a direct loss - of the land, water, energy, and labor that went into producing it.
The paper calls for a framework that considers food safety, food security, nutritional health, and environmental impact together rather than treating safety as an isolated variable. That integration would require collaboration across social sciences, economics, and life sciences - disciplines that have traditionally operated in separate lanes when it comes to food policy.
Not a call for complacency
The authors are careful to distinguish their argument from any suggestion that food safety should be deprioritized. Foodborne illness remains a massive global burden, and the pathogens involved - Salmonella, E. coli, Listeria, norovirus, among others - are genuinely dangerous under the right conditions. The point is not that testing should stop or standards should weaken. It is that the standards themselves should be grounded in evidence about actual risk, not in the technological capacity to detect ever-smaller quantities of microorganisms.
Whether regulators will embrace this shift remains an open question. Zero-detection policies are politically defensible - no one gets blamed for being too cautious about food safety. Risk-based approaches require explaining to the public that some level of pathogen presence is acceptable, a message that demands both scientific literacy and public trust. Building that trust may prove the hardest part of the transition.