Editor’s note: This column was originally posted by Growing Produce. To subscribe, visit https://www.growingproduce.com/subscribe/
“What do you think was different this season?” While not the first question that needs to be asked when facing a product recall or being implicated in a foodborne illness outbreak, that question generally ranks in the top five. In my experience, invariably something, or several somethings, were different in a season when an incident hits.
It isn’t always possible to pinpoint the specific root cause and the contributing or compounding factors. However, the effort of recalling information and practices, reviewing documents, interviewing personnel, and pursuing investigative discovery always yields benefits. Most often, the benefits of an internal food safety incident review translate into real monetary gains in efficiencies, product quality, and food safety system enhancements.
THE DISCOVERY PROCESS
One important part of that discovery process, for the individual firm but equally for the commodity or category, is a deep dive into the official, public investigative reports provided by federal and state regulatory agencies. There are always important insights within these investigation reports, regardless of whether the industry always agrees with or can unequivocally contradict specific details with data and records.
One good example is the FDA 2020 Investigation Report: Factors Potentially Contributing to the Contamination of Peaches Implicated in the Summer 2020 Outbreak of Salmonella Enteritidis. This brief column isn’t the place to review the key findings in relation to broader stone fruit grower and shipper questions and concerns, but these findings provide broad and important context for deciding which actions must be taken and which should be taken.
At the top of the list is the finding of different Salmonella strains associated with peach tree and environmental samples. The fact that the outbreak-defining strain, Salmonella Enteritidis (the cause of the illnesses), was not recovered is not that unusual. Neither, in fact, is the recovery of multiple different Salmonella strains during the environmental investigative effort. As with any crop, Salmonella strains are going to be around and found in an open environment if you look hard enough.
Tree fruit are no exception. Different known factors influence the prevalence and likelihood of transference to fruit and survival. Clearly, the FDA and state partners in California identified hazards and risk factors of concern in some of the implicated orchards.
The California stone fruit industry has been evaluating this information and has held several forums to discuss the implications, prioritize research needs, and assess immediate orchard and packing operation actions and options.
This brings us back squarely to “What do you think was different this season?” For now, let us set aside the understandable concerns and broadly communicated precautionary statements by FDA regarding adjacent land uses and the animal feeding operations associated with several recent outbreaks. The situation described in the report is neither new nor prevalent in California tree fruit districts, but it is also not unique or isolated to a specific firm. If proximity to animal feeding operations was, in fact, the fundamental root cause, the brain-torturing question is “What tipped the balance of risk exposure in 2020?” The follow-up question has been how to prevent or dramatically reduce this vulnerability in a perennial crop at these land-use interfaces.
OPTIMIZE POSTHARVEST MANAGEMENT
Given the high degree of uncertainty of year-to-year risk, the primary opportunity has focused on further optimizing the postharvest management phase. If I had to pick one overarching priority for risk reduction in peaches and other stone fruit, I would pick this aspect for the concentration of industry effort and resources. With the issue of animal operations’ proximity to stone fruit having been addressed decades ago, the most immediate positive action will be to greatly improve the physical removal and lethality achieved by spray-brush bed operations.
So, what is the point of this column? For many years, the produce industry, in general, has been encouraged to emphasize wash line performance criteria for preventing cross-contamination over optimizing reductions on product surfaces. This is sensible for many commodities but appears to be less applicable to several tree fruits, including peaches, nectarines, and plums/pluots.
This renewed work is in its early phases, but one cardinal rule in designing validation and verification studies toward this goal is that performance should not be judged on the average degree of removal and lethality achieved on contaminated fruit.
The range of reduction across a population of individual fruit is the essential comparator of performance. Recent preliminary studies reinforce past experience that the range of reductions among uniformly inoculated fruit can be very large. In collaboration with longtime stone fruit and industry colleague George Nikolich, and with the support of the California industry, we hope to be able to share some promising optimization findings soon.
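To make the distinction concrete, here is a minimal sketch, using hypothetical, illustrative log-reduction values (not data from the studies above), of why the average can mask the individual fruit that matter most:

```python
import statistics

# Hypothetical log10 reductions measured on 12 individually inoculated
# fruit (illustrative values only, not data from the studies above).
log_reductions = [3.8, 4.1, 2.9, 4.4, 1.2, 3.6, 4.0, 0.8, 3.9, 4.2, 3.3, 1.5]

mean_reduction = statistics.mean(log_reductions)  # looks reassuring
worst_case = min(log_reductions)                  # the fruit that matters
spread = max(log_reductions) - worst_case

print(f"Mean reduction:     {mean_reduction:.2f} log10")
print(f"Worst-case fruit:   {worst_case:.2f} log10")
print(f"Range across fruit: {spread:.2f} log10")
```

A mean above 3 log10 can coexist with individual fruit achieving less than 1 log10 of reduction, which is exactly why the range, not the average, should drive performance judgments.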
About the author: Trevor Suslow is an Emeritus Extension Research Specialist faculty member at University of California-Davis and continues to interact across trade associations and consult for the produce supply chain on safety and quality issues. His career has focused on the quality and safety of whole and fresh-cut produce from planting to postharvest management.
What are auditors doing to protect against transmission or acquisition of SARS-CoV-2?
In the midst of the produce industry responses to the supply chain and marketplace disruptions from the SARS-CoV-2 (the virus that causes COVID-19, commonly called coronavirus) pandemic, audits under the Global Food Safety Initiative (GFSI) benchmarked and FSMA-integrated audit schemes for Good Agricultural Practices, Good Handling Practices, Good Manufacturing Practices, and various other aspects of food safety systems are continuing. Audit service providers appear very uniform in providing assurance and sound policies for conducting ranch and facility site audits and, where possible, virtual audits.
In line with the fluidity of guidance and specific protective measures from authoritative sources, these policies reflect the current information provided by the WHO, CDC, FDA, and the local, state, and federal authorities in the countries where they operate. Flexibility and accommodations for certificate extensions have been communicated to support the industry during this challenging time if full audits must be delayed because of COVID-19. A few examples are provided here for those interested in the measures being taken to protect the firm, its employees, and the auditors themselves as they must travel within and among affected regions. Some key elements include daily self-monitoring, 14-day self-isolation after travel from or through a restricted area, travel in a separate vehicle from the firm’s host, frequent handwashing, and frequent sanitization of any hand-held equipment and clothing.
Post-harvest water audit challenges have sparked controversy
Few produce safety control point operations have generated as much confusion and heated discourse as water quality management during postharvest cooling, washing, fluming, and quality retention treatments. This is triply true for recirculated water systems.
Prior to the coronavirus pandemic, the challenges of optimizing postharvest wash and cooling water quality management had once again risen to the forefront as the fully FSMA-“covered” produce industry reached its compliance, inspection, and enforcement dates. Industry market-access, marketing association, and marketing order audit schemes reflected this increased expectation and scrutiny around scientifically valid water quality management parameters and verification programs. One of the more controversial compliance elements frequently arriving in my email inbox (nobody calls anymore) involves having a measurable and verifiable foundation for the frequency of partial or complete clean water exchanges.
Added to this, several firms brought the issue of some auditors and inspectors insisting that measurement of turbidity (water clarity) in postharvest water uses is a compliance requirement, a must, of the FSMA Produce Safety Rule. So, let’s deal with this here; I can find no evidence of this specific requirement or provision. Determining an effective and reproducible method for maintaining adequate water quality in postharvest applications is expected but not prescribed.
One of the common battle lines has been drawn around the use of turbidity as the practical and inexpensive trigger for freshwater introduction to recirculated dump systems, flume transport, cooling, and wash/treatment water in postharvest handling. Freshwater replenishment is one key tool to maximize the performance and dose management of diverse antimicrobial treatments to postharvest water. This is essential to minimize the risk of cross-contamination within and among lots over time, whether hours or, in rare cases, days.
Simply stated, turbidity is a simple measurement readily deployable on-site, but a limited indicator of an operation’s ability to manage microbial water quality in postharvest handling. Turbidity, or clarity, is an optical measurement of the light-scattering properties of a liquid; for simplicity, let’s just say water. The intensity of light scattering is related to the specific traits and concentration of the materials in the water. These turbidity “factors” may include any single one or combination of clay, silt, small and very small inorganic or organic suspended aggregates, dissolved organic substances, humic acids, and pigmented plant cell particulates and exudates, to name a few.
The more suspended and dissolved materials in the water, the greater the turbidity, or cloudiness. Suspended particulates have been shown, in several recent studies and models of postharvest water quality, to provide a protective effect to human pathogens already attached to these surfaces or matrices. These particulates interfere with optimal dose efficacy of common water antimicrobials or prevent contact through their hydrophobic (water-repelling) properties.
Many operations have selected various methods to measure or judge postharvest water turbidity and use these adopted or in-house generated values to determine when to add fresh water or execute a full off-schedule water exchange due to prevailing seasonal conditions. Obviously, diluting particulates and reducing dissolved organic compounds will help an operation stay within scientifically valid limits and avoid exceeding any established critical food safety limits or levels provided as guidance.
The controversy encountered during inspections and audits arises when a turbidity standard has been set within an SOP and the firm is challenged to provide acceptable evidence of a reference validation study. These are exceptionally hard to come by; very few peer-reviewed studies accurately reflect the specific, or even the general, commercial systems. Auditors or inspectors correctly observe that the boundaries of experimentally defined limits may be difficult to manage with accuracy (correct) and precision (consistent) in commercial systems. The studies that do reflect commercial systems, based on on-site testing, typically report that turbidity measurement alone correlates poorly with antimicrobial dose-management control and with achieving microbial water quality management goals.
Interestingly, all sources of turbidity are not the same. It would be very easy to get deep into the weeds on this subject. You might even be ready to take a weedwhacker to this, but it is complex and specific to the situation and there are a number of recent journal papers and a few lengthy reviews on the topic. Suffice it to say that all current research points to the fact that all turbidity is not created equal.
The same measured turbidity in different soil and organic constituent burdens may be perfectly adequate or inadequate to facilitate antimicrobial dose uniformity and maintenance. These differences directly influence the risk of cross-contamination by foodborne pathogens (primarily bacteria) in recirculated postharvest water. In some water systems, a turbidity of 300 to 500 NTU (nephelometric turbidity units, the units of clarity; FAU is an alternative but comparable unit) composed of dissolved simple organic compounds would be too high to prevent microbiological exceedances. In recirculated water carrying largely inorganic soil, with low humic acids and other sources of phenolic compounds, 1,500 NTU may be a manageable operational upper limit.
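As a thought experiment only, that compositional point can be expressed as a toy decision rule. The threshold values and category names below are hypothetical placeholders, not validated limits; any real trigger values must come from a system-specific validation study:

```python
def needs_fresh_water(ntu: float, water_profile: str) -> bool:
    """Toy trigger for fresh-water addition in a recirculated system.

    Thresholds are hypothetical placeholders, echoing the point that the
    same NTU reading can be acceptable or excessive depending on what the
    turbidity is made of; real limits require on-site validation.
    """
    ntu_limits = {
        "high_dissolved_organic": 400,  # dissolved organics: tighter ceiling
        "mostly_inorganic_soil": 1500,  # inorganic soil: higher NTU tolerable
    }
    return ntu >= ntu_limits[water_profile]

# The same 800 NTU reading triggers an exchange in one system but not the other.
print(needs_fresh_water(800, "high_dissolved_organic"))  # True
print(needs_fresh_water(800, "mostly_inorganic_soil"))   # False
```

The point of the sketch is simply that a single universal NTU limit cannot be defended; the trigger has to be conditioned on what the turbidity is made of.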
This same body of research, from several research groups, points to Chemical Oxygen Demand (COD) as one of the better-correlating traits for predicting the typical and worst-case accumulation of acute and long-term chlorine demand as raw or minimally processed product and non-product materials (such as soil, leaf debris, and decayed or damaged product) are added repeatedly to water systems. Even here, the specific composition that makes up the COD matters. Much of the focus has been on hypochlorites (chlorine) as the antimicrobial more impacted compared with peroxyacetic acid and other effective organic acids, such as lactic acid. However, specific components of a system’s turbidity will interfere with these formulations’ efficacy and, as with chlorine and chlorine dioxide, can interfere with accurate dose measurement.
The current research is detailed, systematic, and robust, but it remains very difficult to derive and develop simple guidance for end-users from it. One good source of open access to these details is producefoodsafety.org, a website dedicated to the outcomes of a large multi-investigator and multi-institution project under the Principal Investigator leadership and administrative coordination of Dr. Yaguang (Sunny) Luo (https://www.producefoodsafety.org/).
There are many in-line turbidity sensors and combined turbidity and conductivity sensors for commercial systems. My lab at University of California-Davis had an opportunity to make multiple visits to a grower-shipper to assist with assessing a newly installed hydrocooler management system with an in-line turbidity sensor, an in-line PAA sensor, and a suspended solids removal system. Focusing just on turbidity, the in-line sensor gave a different but very consistent read-out compared with replicated “grab samples” from the system return water. Compared with the portable colorimeter, the sensor reading was always lower, but uniformly so over several visits. With this outcome, it would be possible to derive a correction factor and use the on-site verification values, together with the other measurements of COD, dose, and microbial water quality control, to establish process control parameters.
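A back-of-envelope sketch of that correction-factor idea, using hypothetical paired readings (invented for illustration, not the actual data from these site visits), might look like this:

```python
import statistics

# Hypothetical paired turbidity readings (NTU): portable-colorimeter grab
# samples vs. the in-line sensor, which read consistently lower, as in
# the system described above (values invented for illustration).
grab_samples = [120, 260, 410, 560, 700]
inline_sensor = [95, 205, 330, 450, 565]

# Simple ratio-based correction factor: average of grab/in-line ratios.
correction = statistics.mean(g / s for g, s in zip(grab_samples, inline_sensor))

# Apply the factor to future in-line readings to approximate grab values.
corrected = [s * correction for s in inline_sensor]
print(f"Correction factor: {correction:.3f}")
```

A consistent bias like this is the easy case: one multiplicative factor, verified periodically against fresh grab samples, brings the in-line values into line with the reference method.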
Lastly, there is a growing interest in the potential for using ultraviolet light absorbance at 254 nm (UV254) as a surrogate for COD measurements. Several labs have been evaluating the predictive value as a real-time measure of antimicrobial demand. Thus far, across several fresh and fresh-cut commodities, the results have shown good but, again, incomplete correlations and a limited consistency across reported studies. However, as in many imperfect systems, integrating multiple measurements in models being developed would appear to provide a functional and practical approach to improved postharvest wash water management for food safety and quality.
About the information: This outreach article was developed as part of the objectives of National Institute of Food and Agriculture, U.S. Department of Agriculture, agreement number 2016-51181-25403.
(To sign up for a free subscription to Food Safety News, click here.)
It goes without saying that the current crisis response to SARS-CoV-2 (coronavirus) has dominated, and will continue to dominate, our conversations and activities for some time to come. The economic, social, and emotional impacts across the produce supply chain have been monumental but also asymmetric in their specific effects within diverse sectors.
The current surge in consumer demand for produce, experienced within wholesale membership stores, brick-and-mortar retail markets, and diverse online and home delivery services, has been met with still-evolving capacity responses among suppliers. At the same time, new policies and practices to prevent interpersonal transmission of SARS-CoV-2 (the virus that causes COVID-19), from farm to shipping dock, are being implemented. These “new normal” SOPs are demanding and are designed to ensure uniform and clear communication to cross-functional teams to achieve consistent implementation. We need to recognize and appreciate that these more intensified measures are consistent with over two decades of long-standing Best Practices guidance, but significantly increased in their demands for meticulous execution by all personnel.
During this serious and shared battle to minimize new illnesses and spread of SARS-CoV-2, we cannot afford to falter in attention to our prerequisite and foundational prevention and food safety systems management.
Some reported SARS-CoV-2 prevention responses include assigning individuals to clean and sanitize all common facility operational contact surfaces after each human touch-event. This example of the newly adopted policies and practices, along with many others, together with increased order fulfillment demands, has the potential to lessen the attention and rigor given to standard food safety practices and stretch the limits of process and associated perishables inventory control systems. Without suggesting in any way that lapses in food safety systems have occurred, I am confident that we all recognize that any outbreak event would further tax public health agencies and the medical and health services systems.
Prior to the current coronavirus pandemic, several implementation challenges had risen to the forefront of attention as the fully FSMA-“covered” produce industry reached its compliance, inspection, and enforcement dates. We all recognize that more needs to be known and done to prevent outbreaks attributed to consumption of fresh produce. However, it is valid to point to substantial improvements in the general level of baseline compliance with FSMA provisions across the industry over the past five years. Some of this increase in awareness and adoption of food safety programs is, in part, a reflection of the increased expectation and scrutiny around basic prerequisite programs. In addition, where recommended or required, best efforts to identify and adopt scientifically valid process management parameters and verification programs have increased.
As the current response to meet the elevated demand for diverse fruits and vegetables may tax handling and throughput capacities and labor, several key areas stand out for mention, on an individual-firm and case-by-case basis, as requiring renewed commitment to basic food safety system management beyond normal peak packing and processing efforts.
One final comment, generally applicable to the content above, relates to making risk management decisions based on preliminary studies or typical journal publication summaries. If I haven’t learned anything else from decades of applied microbiological research and extension outreach activities in produce food safety, I am confident this is something I know I know.
The microbial world, including pathogens, doesn’t behave in the real world around the averages of research outcomes. This is especially true in relying on predictions of survival, die-off, and persistence. In commercial risk decisions and policy or standard setting, it is the high outliers that matter most, and these tend to be greatest as timelines progress. Very low probability of survival may seem inconsequential. However, when dealing with tens of thousands or millions of individuals in a plant population, the frequency of these surviving outliers may represent a significant public health risk.
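The arithmetic behind that point is simple; the numbers below are purely illustrative, not drawn from any study:

```python
# Back-of-envelope illustration with purely hypothetical numbers: even a
# very low per-unit survival probability yields expected survivors at
# field scale.
survival_probability = 1e-5   # 1 in 100,000 units carries a survivor
population = 2_000_000        # e.g., heads in a large planting block

expected_survivors = survival_probability * population
print(f"Expected surviving contaminated units: {expected_survivors:.0f}")
```

A probability that looks negligible per unit still predicts a meaningful number of surviving outliers once multiplied across a commercial-scale population, which is why averages and low probabilities alone are poor guides for risk decisions.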
The commitments across the produce supply chain to protect employees, customers, and consumers from exposure to SARS-CoV-2 have been tremendous. Let’s all guard against reduced vigilance in basic food safety practices.
At this time, two diametrically opposed interpretations of the compliance dates by which growers must have completed initial water-testing profiles are still being propagated in some PSA Grower Training events.
The purpose of this communication is to assist in correcting this situation and supporting PSA efforts to have one standardized message on agricultural water testing as the specific provisions and requirements evolve and are resolved.
Recognizing that the PSA Director and staff made an effort in December 2017 to clarify the modified dates proposed in Extension of Compliance Dates for Subpart E Agricultural Water ∙ § 112.41 (title abbreviated for simplicity), released in September 2017, the correct interpretation did not uniformly penetrate the full cadre of trainers and industry food safety professionals. When the proposal was first released, I, like many others, failed to consult my copy of Bureaucracy-speak for Dummies and thought we understood the intent and implications, especially for those growers not currently conducting routine water testing.
Though some may be shocked to learn the proposed revisions to compliance dates for establishing a Microbial Water Quality Profile were unclear, the nuance of when initiation of baseline testing was required evaded even ardent PSA-engaged individuals. To many of us, a common-sense view of compliance meant you were ready to go, not ready to start.
A few of us, notably including Don Stoeckel of the PSA staff ([email protected]), have been working for some time to get definitive and simplified clarification, for trainers and growers, on the proposed compliance dates by which covered farm operations must have a fully developed and on‐going Microbial Water Quality Profile.
I have recently communicated with the Center for Food Safety and Applied Nutrition (CFSAN) staff at the Food and Drug Administration (FDA) as this variable interpretation of compliance dates has come up multiple times and been problematic in grower training conducted in California. The uncertainty led to subsequent vigorous debates between suppliers and buyers as they work to update audit specifications to meet Food Safety Modernization Act (FSMA) requirements.
Once made more acutely aware of the ongoing split in interpretation, FDA CFSAN staff were very helpful and responsive in making their intent clear, unofficially, and in indicating a timeframe for finalization of the dates.
I was informed that the proposed extension of compliance dates is moving close to review by FDA general counsel and finalization for publication is anticipated in the near future. Though not yet published, I felt this note was important to disseminate now and as broadly as possible to PSA trainers and the industry in the interim.
The clarification and confirmation received (see below for the formal request submitted to the FSMA Technical Assistance Network (TAN) in November 2017 and the very recent official response*) is immensely valuable to PSA trainers and university extension personnel responding to questions of industry and auditor interpretation.
Equally, as state agency staff have been attending PSA training and preparing for their role in the On-Farm Readiness Review program and, ultimately, in compliance inspections, it is critical that uniform and clear information regarding the tiered compliance dates is communicated (see this PSA web page).
In closing the circle, through sharing my FDA CFSAN communication discussions with PSA staff, here is our best “unofficial” understanding of when water testing must be initiated.
In accordance with the revised and extended compliance dates for establishing a microbial water quality profile (MWQP) for covered activities under Subpart E– Agricultural Water (this, of course, assumes that some testing program and an MWQP is retained during the interval between 1/26/2018 and 1/26/22), the following applies:
Bottom line: Those of us who interpreted the language of the proposed extension to mean that, if further revisions were not developed, the earliest testing programs would need to be in progress following the 1/26/18 compliance date for the Produce Rule and completed for a compliant MWQP by the 1/26/22 Subpart E compliance date for larger businesses (and so forth for the other business tiers), and who shared that understanding with many others, got it wrong, according to FDA.
A copy of this note and a simplified table is being sent to all attendees of our PSA training and the clarification will be shared at the upcoming Water Summit, February 27‐28, 2018.
Question to TAN
This question is about FDA’s Proposed Rule “Standards for the Growing, Harvesting, Packing and Holding Produce for Human Consumption; Extension of Compliance Dates for Subpart E.”
The proposed rule states: “As part of this proposed extension, we also propose to simplify the subpart E compliance period structure such that all compliance dates for subpart E provisions as applied to non-sprout produce would occur at the same time.”
This language of the proposed rule is not explicit about whether the proposed compliance dates of 2022, 2023 and 2024 (depending on business class size) apply to the provisions about calculating the microbial water quality profile (MWQP) and using the results to determine the appropriate ways in which water may be used (112.45 (b)).
This question has immediate relevance to farms, because if the intent of the proposed rule is to have an MWQP in place at the proposed compliance dates, then some farms would be required to begin sampling as early as 2018.
Thank you for whatever information you can provide about the intent of the proposed rule related to timing of sample collection in support of MWQP calculations and operational decision making.
TAN Response
Thank you for writing. First, we note that since finalizing the Produce Safety Rule, FDA has received feedback that some of the standards outlined in Subpart E, “Agricultural water” (21 CFR Part 112, §§ 112.41-112.50), which include numerical criteria for pre-harvest microbial water quality, may be too complex to understand, translate, and implement. These factors can be important to achieving high rates of compliance.
In response to these concerns, the FDA is exploring ways to simplify the microbial quality and testing requirements for agricultural water while still protecting public health. For more information, see “FDA Considering Simplifying Agricultural Water Standards” at https://www.fda.gov/Food/GuidanceRegulation/FSMA/ucm546089.htm
As discussed in your inquiry, FDA has proposed to extend, for covered produce other than sprouts, the dates for compliance with the agricultural water provisions in the Produce Safety Rule (82 FR 42965 (Sept. 13, 2017)). The proposed compliance dates are in the table, below. See 82 FR 42965 (Sept. 13, 2017) for more information on this proposed rule.
Table: Proposed Compliance Dates for Requirements in Subpart E for Covered Activities Involving Covered Produce (Except Sprouts Subject to Subpart M). Proposed time periods start from the effective date (January 26, 2016) of the November 27, 2015, produce safety final rule.

Size of covered farm | Compliance period | Compliance date
Very small business  | 8 years           | January 26, 2024
Small business       | 7 years           | January 26, 2023
All other businesses | 6 years           | January 26, 2022
If finalized, these compliance dates would mean, for example, that a farm that is not small or very small must begin sampling (emphasis and highlight added by Suslow) and testing untreated surface water in accordance with § 112.46(b)(1)(i)(A), as applicable, no later than January 26, 2022.
Additionally, the farm has discretion under § 112.46(b)(1)(i)(A) as to both (1) the number of samples they include in their initial survey, provided that the total must be 20 or more samples (unless the farm establishes an alternative testing frequency in accordance with § 112.49); and (2) the time period over which such samples are taken, provided that the period must be at least 2 years and no more than 4 years.
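For readers who want to see the mechanics, here is a sketch of the MWQP statistics themselves, using hypothetical sample values. The geometric mean (GM) and statistical threshold value (STV) criteria of 126 and 410 CFU generic E. coli per 100 mL, and the lognormal-based STV estimate, reflect my reading of the rule and FDA's agricultural water calculations; confirm against the current rule text before relying on them:

```python
import math
import statistics

# Hypothetical generic E. coli counts (CFU/100 mL) from 20 untreated
# surface water samples; the 20-sample minimum comes from the TAN
# response above. GM/STV criteria (126 and 410 CFU/100 mL) and the
# lognormal STV estimate reflect my reading of the rule and FDA's
# water calculations; verify against the current rule text.
samples = [18, 45, 120, 33, 260, 15, 88, 52, 140, 30,
           22, 75, 310, 41, 66, 12, 95, 180, 58, 27]

logs = [math.log10(x) for x in samples]
gm = 10 ** statistics.mean(logs)                                   # geometric mean
stv = 10 ** (statistics.mean(logs) + 1.282 * statistics.stdev(logs))  # ~90th percentile

print(f"GM:  {gm:.0f} CFU/100 mL (criterion: no more than 126)")
print(f"STV: {stv:.0f} CFU/100 mL (criterion: no more than 410)")
```

Note that the STV is driven by the spread of the log-transformed counts, not just their average, so a water source with occasional high counts can fail the STV criterion even when its geometric mean looks comfortable.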
FDA intends to use the extended time period to work with stakeholders as it considers the best approach to address their concerns while still protecting public health. The extended compliance dates will also give farms an opportunity to continue to review their practices, processes, and procedures related to agricultural water and how it is used on their farms.
It is important that as FDA implements FSMA, the agency strikes an appropriate regulatory balance and decreases regulatory burdens whenever appropriate. FDA remains committed to protecting public health while implementing rules that are workable across the diversity of the food industry.
Thank you for contacting FDA’s FCIC/TAN.
View popular Food Safety Modernization Act (FSMA) questions and answers identified by the Technical Assistance Network (TAN), on our website.
This communication is intended for the exclusive use of the inquirer and does not constitute an advisory opinion (21 CFR 10.85(k)). Also, note that this response is not intended to be a comprehensive list of all applicable requirements. Please check FDA’s web page (www.fda.gov) regularly for guidance reflecting our current thinking.
Additional information on FSMA can be found on FDA’s FSMA web page (www.fda.gov/fsma). This communication may contain information that is protected, privileged, or confidential. If you have received it in error, please immediately delete all copies.
Yes, once again this type of bacterial testing activity has caused a flurry of concern and confusion. I support the notion that there is always room for improvement in food safety management and that FDA should increase the specificity of their guidance and regulations, where warranted and defensible, to include science-based standards and microbiological limits for fresh produce.
However, I feel it is grossly unfair to consumers to raise a specter of fear well beyond what is supported by available science and our everyday shared experiences. What I rely on for my personal confidence in regularly consuming lettuces, spring mix, and spinach salads is that there are billions and billions of servings of these items consumed every year in the U.S. alone and the predominant experience we have is of safe consumption.
No one wishes to dismiss the fact that such consumption likely results in sporadic cases of illness that never become known to the public health system, and that it has caused multiple outbreaks with tragic consequences for individuals and families. Continued efforts by the industry, FDA, and consumer advocacy groups to elevate performance standards for prevention and process management along the whole food chain at a national level are certainly warranted.
Uniform and accepted microbiological standards, as stated in the Consumer Reports report (See Study Finds Bacteria in Packaged Greens, Feb. 3), are not available at this time. I believe the criteria that were chosen do not provide sufficient information, by themselves, to judge the sanitation performance or risk to consumers.
First, let’s take care of one issue, from my perspective: a normal head of lettuce is colonized, not contaminated, with a diversity of microbiota, including diverse types of bacteria. Only a small fraction of the total normal bacteria on lettuce can be grown, or cultured, in the lab. The total number of bacteria on a leaf far exceeds the number in any single group, such as the total coliforms that were a prime target in the survey. A smaller subset of total coliform bacteria are the fecal coliforms. We eat lots and lots of microbes all the time.
Second, total coliforms and fecal coliforms are defined by a set of culture-dependent lab criteria. This long-standing and convenient trait-based classification includes non-harmful E. coli and other related bacteria associated with fecal origin.
An estimate of the number of total coliforms generated by the lab tests also includes many other related bacteria that are part of the normal and expected group of plant colonizers. We are all exposed to plant-associated bacteria and consume them on a regular basis, often in large numbers like those reported in the survey.
Some coliforms that are not necessarily of fecal origin are recognized, as a group, to be opportunistic pathogens, but the role of environmental isolates in causing human illness, as compared to the same taxonomic species from a hospital environment, is much less certain. Even then, illness with this group is more associated with inhalation, injection with non-sterile medical devices and equipment, and other predisposing health factors.
However, I am certainly not a medical or public health expert, and I am simplifying this quite a bit just to ensure you are aware that total coliforms or fecal coliforms don’t necessarily indicate fecal contamination in the plant world. Their numbers on a leaf or fruit do not relate well to the risk of illness or to true and serious pathogens being present. When one follows standard protocols for enumerating total coliform populations from plants, protocols developed for dairy, meat, drinking water, and wastewater reclamation, for example, one often gets high numbers of these plant colonizers. They are very tough to wash off and are not killed 100 percent even by the most elegant and sophisticated wash disinfection system.
It is certainly conceivable, and has happened, that contamination we should be concerned about is present among these coliform bacteria, but it isn’t automatic. The normal level of “fecal coliforms” (I prefer and always use the alternate classification thermotolerant coliforms; they grow at 42 to 44 degrees Celsius, or about 107.6 to 111.2 degrees Fahrenheit) is generally a subset of the total coliforms and often varies more widely from head to head and leaf to leaf; here again, this is not a strong predictor of pathogen presence or risk of illness to consumers.
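As a quick check on those numbers, the standard Celsius-to-Fahrenheit formula (F = C × 9/5 + 32) confirms the incubation range for thermotolerant coliforms; a minimal Python sketch:

```python
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

# Thermotolerant ("fecal") coliform incubation range cited above:
for c in (42, 44):
    print(f"{c} C = {c_to_f(c):.1f} F")  # 42 C = 107.6 F, 44 C = 111.2 F
```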
The suitability of enterococci as strong indicators of recent fecal contamination or pathogen presence is not well established for plant products. This group has also been shown to have an environmental phase (growth in soil and sediments) which complicates the interpretation of their presence. While enterococci are generally considered better indicators of fecal contamination, their presence is simply not a perfect associative indicator for direct environmental contact with fecal matter or gross sanitation failures.
That the survey results found higher numbers of total coliforms near the end of the Use By date is not at all surprising, as some bacteria will always survive even the most vigorous wash and sanitizer treatment. These survivors can grow (slowly) at typical refrigeration temperatures and certainly could multiply more quickly if exposed to warmer temperatures.
Growth would be expected especially if the product is exposed to fluctuating temperatures that cycle from cold to warmer and back to cold. Higher numbers are also consistent with the stage of declining freshness and natural plant senescence, the inevitable process of quality loss that goes hand in hand with an increase in spoilage organisms.
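The effect of temperature history on those wash survivors can be sketched with a simple exponential growth model. The starting count and the doubling times below are assumptions chosen purely for illustration, not measured values for any real organism or temperature:

```python
def population(n0, hours, doubling_time_h):
    """Exponential growth: N(t) = N0 * 2**(t / doubling_time)."""
    return n0 * 2 ** (hours / doubling_time_h)

n0 = 1_000  # hypothetical starting count per gram after washing

# Assumed doubling times (illustrative only): slow growth under steady
# refrigeration vs. much faster growth under temperature abuse.
steady_cold = population(n0, hours=72, doubling_time_h=24)
abused = population(n0, hours=72, doubling_time_h=6)

print(f"after 3 days, steady cold: {steady_cold:,.0f} per gram")  # 8,000
print(f"after 3 days, abused:      {abused:,.0f} per gram")       # 4,096,000
```

The point is qualitative only: the same survivors, over the same three days, end up at vastly different levels depending on the product's temperature history.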
The Consumer Reports study results may be consistent with widely held concerns for better cold-chain control, especially for packaged salads and other pre-cut or ready-to-eat fruits and vegetables, all the way to the home consumer. Have we seen high counts seasonally, or wash procedures that aren’t optimal? Sure, but there is another possible explanation. Because all the samples were taken from retail stores, the numbers of bacteria (not the fact that they were present) may tell us more about the temperature history of the product than provide clear evidence of poor sanitation.
Purchasing packaged salads or whole heads is a matter of personal choice. We do both in my family. I always wash loose leaf lettuces to remove any adhering soil. I never wash packaged salads. I do not support or believe that re-washing packaged salads should be a recommendation for the home consumer. A large and diverse panel of experts published a comprehensive article in 2007* detailing the scientific evidence for the lack of benefit and the greater risk of cross-contamination in the home.
If one chooses to take advantage of the convenience and diversity of greens available in sensible serving portions or as complete salad meals, it is always best to look at the Best if Consumed By dating and take notice of the display case arrangement. Bags should be vertical in a row, not laid one on top of the other in stacks. Clamshell containers are displayed in various stacking or slanted row patterns that allow generous space for airflow.
I always make it a habit to check the display temperature by hand. This isn’t perfect or necessarily an indication of safe or unsafe product, but it is at least easy to tell if the air is really cool and the bags are very cool to the touch. Maybe our cell phones and smartphones should come with an infrared digital thermometer function.
Comments regarding cold-chain management, product temperature at point of purchase (POP), and the role of the home consumer in handling of packaged salads have prompted additional requests for information. Two main questions regarding consumer recommendations emerged:
1. Is post-purchase temperature equally relevant for quality and safety?
2. Can consumers really judge if product has been temperature-compromised at POP?
Simple answers to Question #1 are not possible because exceptions toward lower risk or higher risk can always be made, and they are equally valid. The most responsible answer is “It depends.” However, that is unsatisfying, especially when trying to provide information consumers can use as an everyday rule of thumb. So I will make a brief general attempt and hope any backlash is not too intense. To limit the scope of the response, I will stick with packaged salads for the most part.
Is post-purchase temperature equally relevant for quality and safety?
Temperature management and cumulative cold-chain history are predominantly a quality issue and determine a product’s visual, sensory, and nutritive keeping potential. The FDA Food Code (2009) identifies Time/Temperature Control for Safety (TCS) limits, at or below 41°F (5°C), for certain value-added produce that must be applied to distribution, storage, and display. This includes cut leafy greens as well as fresh-cut cantaloupe and pre-sliced or diced tomatoes.
These are designated as TCS foods due to recurring outbreaks AND the known growth potential of bacterial pathogens on the product. The recognized low infectious dose of many pathogens may be sufficient to cause illness in highly susceptible individuals, and growth on the product is not necessary to cause great harm. However, not all possible pathogens, and variants of these pathogens, that may infrequently find their way onto or into product are equally infectious to all individuals.
Proper post-purchase temperature management may and likely has kept a bacterial contaminant, such as Salmonella or pathogenic E. coli, below the threshold for illness for an individual consumer. Improper post-purchase temperature management may and likely has contributed to elevating these pathogens above an individual’s personal threshold and, by cross-contamination in serving, increased the chance of exposure in an individual portion from the same bag.
The absence of visual signs of improper temperature exposure, such as spoilage or decay, provides no assurance that significant growth of bacterial pathogens has not occurred. Recent research evidence suggests that the pre-consumption environment may increase the aggressiveness of pathogens (lowering the illness threshold) by activating mechanisms needed for human infection.
In summary, with all best efforts at prevention and control, if pathogens are present in packaged salads the consumer is at risk of illness, possible long-term health effects, or death. Keeping packaged salads cold is essential to quality and may reduce risk to individual consumers though not likely all consumers of the same lot.
Can consumers really judge if product has been temperature-compromised at the Point of Purchase?
Yes and No. I’ll bet you knew that was coming. Realistically, the Yes is very small and No is the more sensible response. So, to keep this answer simple for a change, let’s stick with the No side of the equation and talk briefly about a potential consumer-oriented solution that always crops up.
Time-Temperature Indicators or Integrators (TTIs) have been around for a long time and have been used on many perishable products. The function of a TTI is to make improper and abusive temperature exposure, linked to known quality-defect-inducing conditions, readily apparent by a simple visual inspection, usually a color change, color development (invisible to highly visible), or progressive loss of color bars on a small patch or tag. No equipment is needed, and no special training is required for anyone to get the information.
There are many types, and there have been many improvements in accuracy and readability over the past 15 years. For the consumer, TTIs affixed to a bag, clamshell, or other individualized purchase unit would be the relevant location. These have been used in the EU for many years, including on value-added produce.
There are many arguments for and against the value of TTI labeling, a debate that is beyond the scope of this response; retailers in the U.S. have consistently argued against their use. Do TTIs tell the consumer anything about product safety? Not really, apart from the TCS considerations in the answer to Question #1 above.
If the TTI validations, and therefore the rate of color change, were adjusted to pathogen growth response rather than to quality loss and shelf-life parameters, it could be argued that a level of consumer protection had been achieved. Under the current boundaries at the low end of cold-chain performance, would safe product be destroyed? Highly likely. Could TTIs help simplify a consumer’s POP decision about quality? I think so. Would the use of TTIs complicate a retailer’s liability? I will let the experts answer that.
*Recommendations for Handling Fresh-cut Leafy Green Salads by Consumers and Retail Foodservice Operators. 2007. Food Protection Trends 27:892-898.