The seventh annual CPS Research Symposium was held in Seattle, WA on June 28-29, 2016. As in years past, the produce safety research community came together to share the latest results from CPS-funded programs and to discuss how the data can be used to build risk- and science-based food safety programs for produce companies all along the supply chain. Interpreting food safety research results and applying them to individual companies is most appropriately the work of those who reside within those specific operations. However, we highlight these key learnings from the CPS Symposium to create awareness and stimulate thought.
It is important to learn from illness outbreaks and recalls to prevent repeating the same mistakes. The U.S. apple industry was dramatically impacted by a 2014 outbreak of listeriosis traced back to caramel apples originating from a relatively small California apple producer. A general session at the Symposium dissected the outbreak from scientific and public health perspectives. Speakers examined the role of whole genome sequencing as a tool in the epidemiological traceback, observations from the production facility investigation, research results demonstrating the potential for Listeria monocytogenes (Lm) growth in caramel apples, failures in communications between industry and regulatory agencies and between the U.S. and export partners, and the role that social media played. Most importantly, the actions of the apple industry in the Pacific Northwest subsequent to the outbreak were highlighted. The industry moved quickly to provide education and training to apple producers about Lm and focused on equipment and facility sanitation. It is important to evaluate equipment and sanitation practices to ensure that cleaning and sanitation are effective and Lm is not permitted to become resident. It is also important to understand how produce is being used in the manufacture of other products and what impact that might have on its safety. Lastly, being prepared for a food safety event is imperative. Proper lines of communication within your company, across the industry, with regulatory agencies and even at a country-to-country level are key to ensuring that public health protection is maintained and the response stays properly focused.
Generic E. coli has limitations as an indicator for irrigation water quality. We have three options for measuring the microbial quality of irrigation water: (a) test for microorganisms that serve as indicators of fecal contamination and a proxy for potential public health risk, (b) test for indexing microorganisms that estimate a microbial hazard, whereby an increase in number correlates with an increased probability that a human pathogen is present, and (c) measure human pathogens directly. Generic E. coli is often used as an indicator for irrigation water testing. It is relatively inexpensive and a number of test procedures are available. It is also the law as mandated by the Food Safety Modernization Act (FSMA), more specifically the produce rule provisions dealing with irrigation water. However, testing for generic E. coli may have limited value. In irrigation ponds in Georgia, Kahler (CDC) reported that generic E. coli was not correlated with positive findings for Salmonella. Jay-Russell (UC-Davis) reported the same lack of correlation between generic E. coli and Salmonella when sampling irrigation ponds. Suslow (UC-Davis) reported similar observations in western U.S. irrigation waters, which tested positive for Salmonella and STECs at a rate of 24% even though generic E. coli counts fell below the 126 MPN/100 ml threshold used in EPA recreational water standards and, subsequently, in irrigation water action levels. However, Suslow noted that generic E. coli has value when used within the context of longer-term evaluations of irrigation water. When an irrigation water source and delivery system has been characterized using generic E. coli over a period of growing seasons, significant fluctuations from the baseline may indicate a compromised irrigation source and/or delivery system and should trigger a system inspection and perhaps more concentrated testing.
Photo Credit: Dr. Trevor Suslow, University of California, Davis
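The 126 MPN/100 ml figure above is a geometric mean criterion, so a grower's longer-term baseline is typically summarized the same way. A minimal sketch of the calculation follows; the sample values are hypothetical, not from any study cited here:

```python
import math

def geometric_mean(counts):
    """Geometric mean of generic E. coli counts (MPN/100 mL)."""
    return math.exp(sum(math.log(c) for c in counts) / len(counts))

# Hypothetical periodic samples from one irrigation source (MPN/100 mL)
samples = [35, 80, 150, 60, 110]
THRESHOLD = 126  # MPN/100 mL, the EPA recreational water figure cited above

gm = geometric_mean(samples)
print(f"geometric mean = {gm:.1f} MPN/100 mL")
print("exceeds threshold" if gm > THRESHOLD else "within threshold")
```

Note how the geometric mean (about 77 here) sits well below the arithmetic mean (87), which is why a single high reading does not by itself push a well-characterized source out of compliance.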
Alternative microbial water quality indicators and indexing organisms are on the horizon. There are a number of exciting efforts to find better indicators and indexing organisms to permit actionable irrigation water quality evaluations. Bright (University of Arizona) pointed out the importance of understanding the physical and chemical parameters of irrigation waters regionally when looking for indicator organisms. Some potential indicators, Enterococci and bacteriophage, were not associated with the presence of Salmonella in pond water samples (Kahler), while work continues on certain enteric viruses (Bright) and a number of bacteria including Bacteroidales, Bacteroides, Bacillales, Fluviicola and Rhizobacter (Gu and Suslow). Bacteroidales have the advantages of not growing in the environment and of being strongly associated with their hosts, enabling source-tracking assessments. The take-home message for most growers is to keep a watch on this area of research. As these candidate indicators and indexing organisms are better understood, we may have more relevant targets for irrigation water testing. It is likely that no single indicator organism will meet all needs, and collections of indicators may well hold the key for future water assessments.
It is important to sample irrigation water sources correctly. That certainly sounds logical, but in recent years a number of questions have been raised about how to sample various irrigation water sources when testing microbial quality. Previously, we learned from Vellidis et al. (University of Georgia), working on ponds in the Suwannee River region, about the importance of not stirring the pond bottom, and that sampling safely from the edge of the pond is comparable to sampling in the middle of the pond. At this year's Symposium, we heard Verhougstraete (University of Arizona) conclude that it is appropriate to sample irrigation canals anywhere that provides safe access. This builds upon previous reports from Rock (University of Arizona) at the 2015 Symposium, whose results also cautioned against stirring up the silt that might reside in the bottom of irrigation canals. Verhougstraete also reported that time of day matters when taking water samples; higher levels are found before noon, i.e., perhaps before UV from the sun has an opportunity to kill bacteria. He also found that taking five samples and making a composite yielded comparable results to running five separate tests and calculating a geometric mean. For some, this might be an option to save resources in certain irrigation water testing scenarios. Kahler (CDC) provided data showing that dead-end ultrafiltration, in which large volumes of water can be filtered and concentrated into small volumes, has value for detecting low-level contamination of Salmonella in irrigation ponds. This certainly supports previous reports in which increased sample volumes provided better opportunities for detecting pathogens. Lastly, when samples that exceed compliance values are found, Verhougstraete's data point to the importance of inspecting the canal system 1,000 meters upstream to look for possible sources of contamination.
These results all point to the importance of sampling correctly when characterizing the quality of irrigation water. Sampling early in the day and collecting larger sample volumes provides a grower with the best chance of finding a potential contamination problem and affords them the opportunity to make early, proactive decisions to reduce contamination risks. The data also point to the produce safety axiom of knowing your irrigation sources and delivery systems and inspecting them frequently during crop production.
Photo Credit: Dr. Marc Verhougstraete, University of Arizona
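Verhougstraete's composite-versus-separate-tests finding reflects a simple resource tradeoff that can be sketched numerically. A composite of equal-volume samples approximates the arithmetic mean of the five sampling points, whereas five separate tests would be summarized by their geometric mean; the counts below are hypothetical, for illustration only:

```python
import math
import statistics

def geometric_mean(values):
    """Geometric mean via the mean of logs."""
    return math.exp(statistics.fmean(math.log(v) for v in values))

# Hypothetical generic E. coli counts at five canal sampling points (MPN/100 mL)
points = [40, 55, 70, 30, 62]

# Option A: five separate tests, summarized by the geometric mean
gm = geometric_mean(points)

# Option B: one composite sample -- equal volumes pooled before a single test,
# which approximates the arithmetic mean of the five points
composite = statistics.fmean(points)

print(f"geometric mean of five tests: {gm:.1f}")
print(f"single composite sample:      {composite:.1f}")
```

The two summaries land close together when point-to-point variability is modest, which is consistent with the comparable results he reported; a composite trades four lab tests for one.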
Irrigation water sources can be treated with disinfectants, but... If a grower finds an irrigation water source that is out of compliance, it would be desirable to be able to treat the water to mitigate the problem. CPS has previously funded work by Kniel (University of Delaware) to explore the use of sand filters enhanced with activated iron. At the 2016 Symposium, Gu (Virginia Tech) reported that short treatments with common disinfectants like sodium hypochlorite, chlorine dioxide or peroxyacetic acid (PAA) may not be sufficient to eliminate human pathogens from irrigation water. The required contact time and concentration of disinfectant are heavily influenced by the organic matter in the irrigation water. Buchanan (University of Tennessee) observed the same relationship between time and concentration, and the impact organic matter in pond water can have on any treatment. One can reduce the contact time needed to kill pathogens if the disinfectant concentration is high enough and the water has low levels of organic matter, but such ideal conditions are not commonly found. He also cautioned that high concentrations of disinfectants may have negative consequences for soil health. Similarly, low concentrations of disinfectants might be employed, but obtaining sufficient contact time with the water is difficult owing to the flow rates used in many irrigation systems. Lastly, Buchanan warned that irrigation pipes and delivery systems can become sources of contamination if they are not properly drained and maintained. These results again point out the importance of knowing your irrigation water sources and water delivery systems and inspecting them frequently for potential contamination risks. Preventing contamination from occurring, though difficult in some instances, is preferable to trying to mitigate contamination once it has occurred.
Photo Credit: Dr. Robert Whitaker
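The time-and-concentration relationship Gu and Buchanan describe is commonly expressed as a CT value (residual disinfectant concentration multiplied by contact time). The sketch below is illustrative only; the CT target, doses and organic-matter demands are assumed numbers, not values from the studies cited above:

```python
def required_contact_time(ct_target, dose_ppm, demand_ppm):
    """Minutes of contact needed to hit a CT target (ppm * min) after
    organic matter consumes part of the applied disinfectant dose."""
    residual = dose_ppm - demand_ppm  # free disinfectant left after demand
    if residual <= 0:
        raise ValueError("organic demand consumes the entire dose")
    return ct_target / residual

# All numbers are assumed for illustration
CT_TARGET = 10.0  # ppm * min assumed to achieve the desired kill

clean = required_contact_time(CT_TARGET, dose_ppm=5.0, demand_ppm=1.0)
turbid = required_contact_time(CT_TARGET, dose_ppm=5.0, demand_ppm=4.0)
print(f"low organic load:  {clean:.1f} min contact needed")
print(f"high organic load: {turbid:.1f} min contact needed")
```

The same dose requires four times the contact time in the high-demand water in this example, which illustrates why fast-flowing irrigation systems rarely provide enough contact time at low disinfectant concentrations.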
It is also important to note that FDA has recognized in the produce rule that different irrigation systems pose different levels of risk (e.g., overhead spray irrigation, where the water contacts the edible portion of the plant, versus drip irrigation, which delivers water directly to the root system) and that pathogens, if present, die off owing to UV from the sun and the harsh conditions encountered in the typical production environment.
Validation and verification – know the difference. These terms have certainly become a focal point in industry food safety discussions as a result of FSMA and the preventive controls rule. Basically, validation means demonstrating that a preventive control works; verification means confirming that the preventive control has been implemented correctly and is operating as intended. More precise definitions can be found elsewhere (21 CFR §117.3). The CPS Symposium has featured a number of research reports over the years on wash water disinfection and efficacy. In 2016, Wang (University of Maryland) noted that the chemical composition of various commodities can affect the organic content of the wash water and hence the effectiveness of chlorine sanitizers. As we have seen before, as the organic load increases, free chlorine becomes less effective in killing planktonic E. coli O157:H7 because it reacts with the organic constituents. Additionally, pH, product contact time, chemical oxygen demand (COD) and turbidity all play a role in the efficacy of free chlorine as a disinfectant. Wang has developed a microfluidic device that permits testing of wash water parameters against various commodities and pathogens and may hold promise for basic efficacy testing. Certainly, wash water validation and verification methods are critical to industry efforts to meet regulatory requirements and to provide packers and processors with a level of assurance that their systems are operating properly. Tortorello (IFSH) presented the work of a multidisciplinary industry working group that has examined the various factors involved in validating wash water control. Their work has been submitted for publication and represents a working guide for the produce industry in establishing validation protocols for wash systems.
It is important to recognize that wash systems take on many formats and designs, use different disinfectants and water sources, and process many different commodities, which in turn creates variable water chemistries, contact times, parameter measurement challenges and, ultimately, variable control. It is important for operators to understand the parameters of their own systems to achieve the most effective results.
Photo Credit: Dr. Mary Lou Tortorello, FDA, IFSH
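One reason pH appears alongside COD and turbidity in the list of wash water parameters is basic chlorine chemistry: free chlorine is most effective as hypochlorous acid (HOCl), and the HOCl fraction falls as pH rises (pKa is approximately 7.5). A small illustrative calculation:

```python
PKA_HOCL = 7.5  # approximate pKa of hypochlorous acid near 25 C

def hocl_fraction(ph):
    """Fraction of free chlorine present as HOCl (the stronger biocide),
    from the Henderson-Hasselbalch relation."""
    return 1.0 / (1.0 + 10 ** (ph - PKA_HOCL))

for ph in (6.5, 7.0, 7.5, 8.0):
    print(f"pH {ph}: {hocl_fraction(ph):.0%} of free chlorine is HOCl")
```

A shift of one pH unit, from 6.5 to 7.5, cuts the HOCl share from roughly 90% to 50%, which is why many wash systems control pH as tightly as they control the free chlorine dose itself.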
The search for surrogates continues. Surrogate microorganisms represent an essential tool for conducting validation studies. A surrogate is simply a microorganism that can be used to test preventive controls, like wash water disinfection, or treatments that might reduce the survivability of pathogens in the production environment. Facility operators and growers cannot use human pathogens in process facilities for fear of cross-contamination, so they need a surrogate that closely mimics or exceeds the survivability of the authentic pathogen without the public health consequences. CPS has prioritized surrogate research over the last several years. Most notably, Suslow has reported on the use of attenuated strains of Salmonella and E. coli O157:H7 as surrogates in field-level persistence studies, and Teplitski (University of Florida) has described Salmonella surrogates with specific deletions in virulence factors. In 2016, Cook (USDA) reported that the genetic diversity of E. coli strains might hold great promise for identifying useful surrogates. She proposes to "mine" E. coli strain collections to find suitable surrogates that closely mimic or exceed the survivability of STECs. For example, non-pathogenic E. coli strains isolated from lettuce leaves might have the lettuce-binding factors of an STEC and the genetic and physiological capacity to survive the lettuce production environment, which would permit their use in validating wash water systems without the risk that a pathogenic STEC would represent. Both Cook and Wiedmann (Cornell University) suggested that the physiological state of a surrogate, i.e., the organism's response to the production environment or a sanitizer, is as important as strain diversity when evaluating a surrogate's suitability. While the search continues for pathogen surrogates, it is important to understand that a researcher's ideal surrogate might not be required for some immediate industry needs.
For example, to measure the efficacy of a cleaning and sanitation process, it may be enough to know that the total plate count has been reduced to non-detectable levels by the treatment, or that a wash water disinfectant has produced a 4-5 log reduction in total plate count in water samples. As surrogate research matures, specific surrogates or collections of surrogates can be substituted and preventive controls fine-tuned using these more representative tools.
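The 4-5 log reduction mentioned above is simply the log10 ratio of plate counts before and after treatment. A one-line sketch with hypothetical counts:

```python
import math

def log_reduction(before_cfu, after_cfu):
    """Log10 reduction between pre- and post-treatment plate counts."""
    return math.log10(before_cfu / after_cfu)

# Hypothetical total plate counts (CFU/ml) before and after wash water treatment
print(f"{log_reduction(1_000_000, 50):.1f} log reduction")
```

Each whole log is a ten-fold reduction, so a 4-log reduction means 99.99% of the starting population was eliminated.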
Bacterial detection is not really the problem; separating the pathogen from the other bacteria is the key. This quote came from Dr. Sam Nugen (University of Massachusetts) when he discussed his research on using bacteriophage to specifically bind pathogens and permit their extraction from other, non-target organisms in complex food matrices. He uses a 90-minute digestion of the plant tissues followed by phage-based magnetic separation to isolate the pathogen so that it can be measured. Bacteriophage have a higher degree of specificity for host bacteria than antibodies and bind irreversibly. Joining the phage with a nano-magnet permits recovery of the pathogen for measurement without an enrichment period. The downside is that phage are so specific for their host strains that multiple phage cocktails would likely be required to sample for multiple pathogen species. This program perhaps provides a glimpse of the next generation of human pathogen detection strategies for produce, combining biologics with nanoparticles.
Photo Credit: Dr. Sam Nugen, University of Massachusetts
The challenge of balancing the risk of animal intrusion against conservation is benefiting from emerging data acquisition technologies and a growing understanding of the impact of the environment on pathogen growth and persistence. Franklin (USDA) reported on alternative strategies for preventing animals from entering production fields and for understanding the types of animals that might enter a field, the duration of their visits, and their activity while in the field, using field-implanted cameras to record animal movements. His team set up motion-sensitive cameras to capture wildlife visitations in a leafy greens field in Colorado's San Luis Valley. With pictures taken approximately every 5 minutes, the researchers took nearly 2 million photos. These images are currently being evaluated, and the team hopes to establish a method for estimating wildlife visitation rates, patterns and frequencies by species, the impact of weather patterns on animal movements, and perhaps adjacent crop and land use factors. Wiedmann also reported on the development of a predictive model to estimate risks related to Lm contamination in the field production environment. He identified weather, specifically rain events, as a predictor of higher risk of Lm contamination; e.g., there is a 25-times higher likelihood of finding Lm the day after a rain event. Proximity to roads, pastures and water sources was also a predictor of Lm contamination. The value of these programs to growers is in their predictive, as opposed to reactive, nature. It is unlikely that growers will routinely deploy cameras in fields to monitor wildlife intrusion, but this technology may give growers with recurrent problems an opportunity to quantify the scope of the problem, identify the species that visit their fields and perhaps help direct soil or raw product sampling efforts to the areas most impacted by intrusion.
Photo Credit: Dr. Alan Franklin, USDA, APHIS, WS
Understanding the predictive factors for Lm can also help inform grower decisions. For example, knowing that Lm is 25 times more likely to be detected following a rain event might lead a grower to delay harvest for 24 hours following rain, or to time irrigation activities to permit a 24-hour period before harvest. These studies also point out how important it is for a grower to assess the production environment and the risk factors that may be present.
Understanding the genetics and gene expression in production environments will drive the next level of understanding in produce food safety. A theme that surfaced across a number of presentations was the importance of not only the genetic make-up of a pathogen but also its physiological state, reflecting gene expression. Wiedmann noted that growth conditions can impact survivability when conducting challenge studies to validate preventive controls. In other words, the growth environment can cause a microorganism to turn genes on and off in ways that might make it more tolerant of harsh conditions like treatments with oxidizing sanitizers. Suslow also cautioned attendees against over-reacting when they find clinically insignificant STECs, stressing that it is important to know the genetic composition (i.e., virulence factors) of these strains. Teplitski and De Moraes (University of Florida) reported on comparative genomic analysis and physiological assessments of avirulent Salmonella surrogates and suggested that some Salmonella strains may have gene clusters that facilitate adaptation to tomatoes as a host. This report, together with previous reports from Teplitski suggesting that some tomato varieties may be more susceptible to Salmonella infection, raises interesting avenues for further research into the genetic interaction between human pathogens and certain produce items. For growers and others in the supply chain, the key learning from this focus on genetics, including genome sequencing, is to monitor this area of research going forward. It has the potential to yield new diagnostic tests for pathogens, better inform growers on how to interpret pathogen tests in fields and packinghouses, provide public health officials and operators with tools for conducting traceback investigations and evaluating sanitation programs, respectively, and perhaps lead to the development of targeted, next-generation cleaning and sanitation treatments for equipment.
Note: The authors would like to thank Bonnie Fernandez-Fenaroli, Executive Director of CPS, the scientists who participated in the seventh annual CPS Research Symposium and the industry representatives who volunteered their time to participate at the Symposium. Their presentation of research and discussion of what that research might mean to the produce industry certainly informs the content of this paper. More detail on these research projects can be found at www.centerforproducesafety.org. The authors would also like to thank Susan Leaman and Diane Wetherington of IDS for their help in capturing notes during the presentations and preparing an original draft of this document. Lastly, we wish to thank Cyndi Neal at PMA for her efforts in formatting this document and doing all the things it takes to publish this work.
This work is meant to inform and provoke thought, with an eye toward inspiring readers to examine their own food safety programs and use the research to make improvements. It is not meant as a directive on what must be done to produce safe food. As discussed in several places in this paper, food safety needs to be determined on an operation-by-operation basis; there are no one-size-fits-all solutions. If you have additional questions, please feel free to contact Dr. Bob Whitaker, PMA Chief Science and Technology Officer ( [email protected] ) or Dr. Jim Gorny, PMA Vice President of Food Safety and Technology ( [email protected] ).
This and all of our food safety-related material is made possible by the members who support PMA's Gold Circle Campaign for Food Safety . Find out how your company can help improve produce food safety throughout the supply chain.