The detection of food contaminants—whether it be pesticide residue in produce, antibiotics in beef, Salmonella in poultry, gluten in baked goods, or wood slivers in grain—is a continually improving science. Not so long ago, state-of-the-art testing detected to parts per million; today testing is more likely to be in parts per billion, with detection limits continuing to drop toward nano and even “mega-nano” levels.
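To put those detection scales in perspective, the following sketch uses the standard mass-fraction definitions (1 ppm = one part in 10^6, 1 ppb = one part in 10^9). The figures are illustrative round numbers, not values from the article:

```python
# Illustration only: standard mass-fraction definitions with assumed
# round numbers; these figures are not taken from the article.
for name, fraction in [("ppm", 1e-6), ("ppb", 1e-9), ("ppt", 1e-12)]:
    # mass of food (in tonnes) that contains a single gram of
    # contaminant at the given concentration (1 tonne = 1e6 grams)
    tonnes = (1.0 / fraction) / 1e6
    print(f"1 g of contaminant per {tonnes:,.0f} tonne(s) at 1 {name}")
```

At 1 ppm, a single gram of contaminant is spread through one tonne of food; at 1 ppb, through a thousand tonnes. Each step in detection sensitivity, in other words, multiplies by a thousand the quantity of product a single detectable gram can implicate.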
With such evolving technology, the question is: Can elimination keep up with detection? That is, as the detection technologies serve to define zero by ever more minute traces, can (and should) the industry feasibly and cost-effectively “chase zero” to reach those same levels in eliminating residues, antibiotics, or even pathogens?
The question of chasing zero is further complicated by the variation in accepted tolerances, based not only on the contaminant itself, but also on the country manufacturing or importing the food. And even within a single class of contaminants, e.g., pathogens, countries vary in the strictness of their regulations.
Take Listeria and Salmonella as examples: In the U.S., the USDA Food Safety and Inspection Service (FSIS) “zero-tolerance” policy applies to the detection of Listeria monocytogenes in ready-to-eat (RTE) products. FDA also maintains a policy of zero tolerance in RTE foods. The EU, however, currently allows up to 100 cfu/g during the shelf life of non-infant, non-medical RTE products in which L. monocytogenes cannot grow throughout the stated shelf life.
Within the last month, though, the European Food Safety Authority (EFSA) put out a call for research and development for “closing gaps for performing a risk assessment on Listeria monocytogenes in ready-to-eat (RTE) foods.” The requested R&D is intended to provide EFSA with a quantitative risk characterization of L. monocytogenes in various RTE food categories, starting from the retail stage.
In relation to Salmonella, there is inconsistency within the U.S.: FDA maintains zero tolerance, within the detection limit of the methodology, for human and pet foods, while USDA permits certain levels in some meat and poultry products.
With the ever-lowering detectability levels, what, exactly, is zero? What does chasing it to the nth degree really mean for the industry and to the consumer—pro and con? And with the variance among “science-based” global regulation, what level truly ensures food safety?
In this article, we present a look at the pros and cons of chasing zero from both the industry and consumer viewpoints. However, because of the controversial nature of the issue and the risk that an individual’s opinion could be misread as reflecting the safety practices of his or her company, we present the arguments without attribution. Additionally, given the number and breadth of factors involved in zero tolerance, this can be only a simplification of the two sides, offered as a starting point for discussion.
We’d love to hear your views in response.
Tolerance: Balance the Risk
“As the amount of microbial contamination gets low, the chances of accurately identifying contaminated food plummets, especially when the contamination is not uniformly distributed throughout the food. ... Furthermore, microbiological testing methods sometimes mis-identify harmless microorganisms as dangerous pathogens. ... Greater attention to preventing cross-contamination and undercooking may have more impact on the public’s health than further reductions in the already small numbers of pathogens occasionally present in certain foods.” (IFT - FAQ: Application of Science to Food Safety Management)
Tolerance. While we must do everything we can to ensure that food is safe to eat, there are a number of factors to take into consideration when discussing zero tolerance, including infective dose, industry and consumer cost, food security, sampling methods, farm-through-fork responsibility—and the source of the food itself.
In one way or another, all food is grown in nature. By its very definition, nature is not contaminant free. Produce sits on or in the ground, takes in ground waters, and is subject to droppings of birds flying overhead. In fact, studies have shown that “natural” free-range or farmers-market poultry often carries higher levels of foodborne illness-causing bacteria.
It is for such reasons that Salmonella in chicken is not considered to be an adulterant, and for such reasons that the responsibility for food safety, and pathogen elimination, must rest with the entire food chain—including the consumer. There is no question that farmers, breeders, suppliers, and manufacturers must all exercise due diligence in preventing and eliminating pathogens, and other contaminants, to ensure that food is safe to eat. In most cases, elimination comes down to the application of a kill step; and, in some cases, that kill step needs to reside with the consumer.
Chicken is, again, an obvious example here. Salmonella is a known hazard of raw chicken, but thorough cooking of the meat (to at least 165°F as measured with a thermometer) is a known kill step. Thus, in this case, the chicken is safe to eat, if consumers do their part in applying the kill step. That is not to say that the industry doesn’t have a responsibility, but that responsibility is to ensure that food safety practices are followed in the plant, that research continues toward solutions, and that any necessary consumer kill steps are clearly communicated on the label.
It is at this point that cost, sampling, and infective dose also come into play. One standard food safety practice is that of sampling. But sampling is, by definition, probabilistic: whether contamination is found depends on where and how many samples happen to be taken, particularly when contamination is not uniformly distributed. Add to that the fact that technology has enabled detection at ever lower levels, below infective doses and below the ability to eliminate, then put it into the context of zero tolerance. If a nano level of a pathogen is detected in a single sample taken from a lot, the entire lot will be destroyed—or recalled, if already distributed—even when the detected level is below an infective dose; that is, even when it has no relevance to food safety.
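The hit-or-miss nature of sampling can be made concrete with a simple probability sketch. Assuming, hypothetically, that a fraction p of units in a lot is contaminated and n units are drawn at random, the chance the lot passes undetected is (1 - p)^n; the numbers below are illustrative assumptions, not figures from any regulatory sampling plan:

```python
# Hypothetical illustration (assumed numbers, not from the article):
# if a fraction p of units in a lot is contaminated and n units are
# sampled at random, the lot passes undetected with probability (1-p)^n.
def miss_probability(p: float, n: int) -> float:
    """Probability that all n random samples miss the contamination."""
    return (1.0 - p) ** n

# Contamination present in 0.1% of units is missed roughly 94% of the
# time even when 60 samples are drawn from the lot.
print(round(miss_probability(0.001, 60), 2))
```

The same arithmetic cuts both ways: a lot that passes sampling is not proven clean, and a single positive sample condemns product that is overwhelmingly uncontaminated.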
The lower this detection goes, the higher the costs to both the industry and the consumer. It is a cost that is not simply related to dollars and cents, but also to sustainability. It has been widely reported that the world’s population is expected to approach 10 billion by the year 2050, and that we will not be able to feed that world at our present rate of food growth, consumption, and destruction. As we go forward in time, can we afford to throw away ever larger amounts of product due to ever-lower detection when there are options for end-user elimination?
Consumers need to understand the role they play in food safety and do their part to follow label directions, guard against cross-contamination, and maintain food-safe storage and handling practices. It does not seem right for society in general to spend a great deal of money—and destroy a great deal of food—to protect the few who do not follow those practices. Zero tolerance is still zero tolerance, but it applies to the entire food chain; the consumer has a responsibility too, and sometimes needs to be the source of the kill step.
Zero Tolerance: Go as Low as We Can
Raw Pet Food Diets Can Be Dangerous to You and Your Pet
If you choose to feed raw pet food to your pet, be aware that you can infect yourself with Salmonella or L. monocytogenes by spreading the bacteria from the contaminated food to your mouth. ... If you get Salmonella or L. monocytogenes on your hands or clothing, you can also spread the bacteria to other people, objects, and surfaces. (The Food & Drug Administration)
Zero Tolerance. In determining the need for zero tolerance, there are a number of factors to consider, but it really all comes down to consideration for the consumer. We are all consumers. As such, do you want to buy food that is contaminated with Salmonella, Listeria, E. coli, or any other bacteria? Do you want to take the chance of serving a food to your family that may have levels of these pathogens because it is tolerated? If a pathogen can be detected, shouldn’t it be eliminated—or kept out of the marketplace?
This is already true of some pathogens: E. coli O157:H7, for example, is considered an adulterant in raw ground beef, with zero tolerance for any detectable level. But that is not enough. We need to continue to work to gain better control of all pathogens, such as Salmonella.
Technology is enabling lower and lower detection, and there is no question that this is providing a challenge for the industry in controlling to those levels. However, zero is achievable in control as well as detection.
If you look at where we were ten years ago and where we are now, you see significant improvement in the effectiveness of technology. We need to continue moving in that direction—for both detection and control. We need to strive for zero. And the industry should have the desire to do so, to continually improve; not simply take on responsibility in response to regulation.
Yes, the consumer has a responsibility to properly prepare and cook food, but consumers should be able to expect that the industry is doing everything it can to place safe food on their tables. And manufacturing safe food is not just about eliminating pathogens and contaminants, it’s about preventing their incursion and ensuring the entire supply chain is applying comprehensive food safety practices—from Good Agricultural Practices (GAPs) at the farm to Good Manufacturing Practices (GMPs) in the plant, and the preventive and food safety controls of the pending rules of the Food Safety Modernization Act (FSMA) across the food chain.
The ability of CDC to track outbreaks to their sources has helped to drive industry toward constantly improving practices of food safety, and we can expect that FSMA’s final rules will lead to further improvement. However, regulation will not carry us to zero unless industry has the desire to get there. The cost of chasing zero is a challenging question, but we have to look at all the different ways we can work toward elimination. There will likely be some investment needed, but you have to look at the big picture—the overall cost of recalls, liability, closures, etc. And look at the human costs associated with foodborne outbreaks caused by pathogens that go undetected and uncontrolled: the “costs” of illness, hospitalizations, and even deaths. Although this may or may not come back to bear on a specific company, it is a cost—and responsibility—that industry must bear.
It is also understood that industry’s investment in striving toward zero (to whatever level detection takes “zero”) will likely cause food costs to rise for consumers. While consumers may be willing to pay a higher price for safer food, there should be a balance in which the industry accounts for the lowered risk, and associated costs, of outbreaks and recalls before passing along the full investment cost to the consumer.
GAPs, GMPs, FSMA, supply chain management, evolving technologies … all play a part in “chasing zero,” and all need to be applied to their utmost potential to take zero as low as we can go.
The author is Editor of QA magazine. She can be reached at email@example.com.