Testing is a tool: Improved microbial food safety assurance: tools and technologies to reduce the guesswork

My friend Dr. Tom Ross, of the University of Tasmania, gave a talk a while ago that was published in Food Australia. He has some important messages.

Most people know the basic rules of food hygiene, don’t they?

We teach our children to wash their hands after going to the toilet, we know to keep leftovers in the fridge, and to cook, or at least wash, raw foods because they might be contaminated with ‘germs’. We cover foods, we avoid mixing cooked and raw and, if the food is old or we’re not sure how it’s been stored, we apply the old adage of “if in doubt, throw it out”. They’re really simple rules that reflect our awareness that invisible microbes might make us sick, and ways to minimise the risk. It’s hardly rocket science, is it? And if these are simple rules that ordinary people apply, how much safer must it be when food professionals prepare and process foods?

If it is that easy, then it’s hard to understand why – particularly given the enormous advances in biological science and technology over the last few decades – the incidence of microbial food-borne illness has shown no reduction.

ANZFA (1999) estimated that there were 4–5 million cases of microbial food-borne illness in Australia every year: an average individual risk of foodborne illness of once every four to five years for all Australians. Since 1999 the incidence rate has changed little. National Notifiable Diseases Surveillance System data (NNDSS, 2014) show that the incidence of foodborne illnesses, including salmonellosis, campylobacteriosis, listeriosis and even typhoid, has not fallen since 1999: if anything, per capita rates have increased slightly. This situation seems to apply across the ‘developed’ world (CDC, 2013).

While some of that increase is due to better detection and surveillance systems, commentators (e.g., Altekruse and Swerdlow, 1996; Hall et al., 2002; Nyachuba, 2010) point to changes in the way food is provided to human populations, particularly the increasingly large proportion of people in ‘first world’ nations who live in urban areas. Our food comes from increasingly remote locations, and from increasingly large, centralised production and processing facilities, bringing increased challenges for the modern food industry and increased risks for consumers if those challenges are not met.

Our foods, at source, are not free from microorganisms. Irrespective of technological advances, foods are still produced in natural environments that can harbour pathogenic microbes. Common food animals have a gut microbiota that can also harbour pathogens. Cows’ udders can become infected and contaminate milk with pathogens like Staphylococcus aureus or Listeria monocytogenes. Microbial hazards arise from myriad sources, often without signs that contamination has occurred.

The problem is compounded by expectations that fresh food is inherently ‘healthier’, and by longer food supply chains that can extend across continents. Longer supply chains, with more handlers involved, and reduced use and choice of food preservatives, increase the chance of contamination and the opportunity for microbes to grow to hazardous levels before consumption. Nonetheless, consumers expect that food should not harbour hazards, an expectation encouraged by law firms now actively specialising in compensation claims and class action suits. The increasing proportion of consumers in developed nations with heightened susceptibility to infectious illness, due to age, underlying chronic disease, or reliance on chemotherapeutics that weaken the immune system, further heightens the challenge. Put simply, much higher standards of food hygiene are expected, and needed, but with fewer ‘weapons’ in the arsenal.

‘End product’ testing is useful only for batches of product that contain a high proportion of defective units, i.e., units that fail to meet relevant food safety criteria. If we assume that a just-tolerable foodborne illness risk is one per 100 meals, to assure this incidence by testing we would need to be able to detect batches of product that have ≥ 2 contaminated units per 100. We have the methods, particularly those involving enrichment and/or signal amplification (e.g., PCR), to detect a few microbes in a large amount (e.g., 125 g) of food, but only if we know where to look. The problem is finding those one or two contaminated units among 100 with confidence. The probability of detection can be estimated using the binomial distribution[1], which tells us how many samples are needed to be 95% certain that the batch as a whole has ≤ 1 unacceptable unit per 100.

The binomial distribution tells us we’d need to take 299 samples, and they’d all have to test negative! To be confident that the frequency of contaminated units was less than one in 10,000 (essentially the estimated status quo), however, we’d need to take nearly 30,000 samples, and for all of them to come back “clear”[2]. Those sorts of sampling numbers are simply not feasible.
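To make those numbers concrete, here is a minimal sketch (mine, not from the article) of the underlying calculation: if a proportion p of units is contaminated, the chance that all n samples test negative is (1 − p)^n, so we need the smallest n for which that chance falls below 5%. (As footnote [1] notes, the hypergeometric distribution is strictly correct, but the binomial gives essentially the same answer.)

```python
import math

def samples_needed(defect_rate: float, confidence: float = 0.95) -> int:
    """Smallest number of samples n such that a batch with the given
    proportion of contaminated units yields at least one positive
    sample with probability >= confidence.

    P(all n samples negative) = (1 - defect_rate)^n, so we need the
    smallest n with (1 - defect_rate)^n <= 1 - confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - defect_rate))

print(samples_needed(0.01))    # 1 contaminated unit per 100    -> 299
print(samples_needed(0.0001))  # 1 per 10,000 (the status quo)  -> 29956
```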

So, what is the answer?

On May 25, 1961, then US President John F. Kennedy set a vision for his nation: that the USA should “commit itself to achieving the goal … of landing a man on the moon and returning him safely to the earth. No single space project in this period will be more impressive to mankind, or more important for the long-range exploration of space”. That speech started the ‘space race’, but the race was not always smooth.

The US space program had had many spectacular failures resulting in massive explosions in the Gemini and Mercury rocket programs. To explain, the propulsion system of a rocket is essentially a controlled explosion, propelling the rocket forwards. Minor mistakes in construction of the rockets, particularly the boosters (engines), could lead to catastrophic failures. Fortunately, few such disasters resulted in loss of life. But the rocket scientists realised there was a weakness in the way that the rockets were constructed, particularly because that huge national project involved different regions constructing different components of the rockets, which were then transported to different locations for assembly. Through the massive failures it became clear that new techniques were needed for assuring the quality of far-flung systems manufacture and final integration. A technique called Failure Mode, Effects, and Criticality Analysis (FMECA), first developed by the US Army in 1949, was applied to the Apollo program. It is a procedure for analysing components and processes to determine those that, if they fail, could lead to catastrophic outcomes, especially those that endanger the crew. That analysis focused attention on ensuring the absolute reliability of ‘mission critical’ components. It also became apparent that the astronauts themselves were mission-critical components, and that anything that affected their performance at critical moments (such as atmospheric re-entry, where an incorrect manoeuvre could lead to incineration of the spacecraft) was also critical. Thus, the safety of the astronauts’ food supply was regarded as critical, which led to the application of FMECA to food production and, eventually, spawned the Hazard Analysis Critical Control Points (HACCP) system. HACCP is now the most widely endorsed approach to food safety management in the world.

Like FMECA, the basic principle of HACCP is that, by understanding where hazards arise in food processes and by putting in place procedures to prevent, control or remove them, those hazards can be controlled in the end product to ensure the safety of the food and to minimise reliance on ‘end product’ testing. Indeed, in the early space program, end-product testing for food safety assurance consumed most of the food produced!

But, sooner or later, if you do HACCP properly, you end up asking questions that need quantitative answers, like “how much control is needed?” and “how can it be achieved?”, e.g., what times, temperatures, or product formulations are needed to control specific microbial hazards?

To answer those questions requires a high level of expert knowledge because of the diversity of behaviour and environmental limits of different microbial hazards. Thus, while HACCP is founded on a logical system that allows for the early detection and elimination of specific hazards, the correct application of the concept requires comprehensive expert knowledge.

The zenith of the US space program is the International Space Station (ISS), orbiting some 330 km above Earth. It’s home to six astronauts/scientists at any given time and is so large that it can easily be seen from Earth after the sun has gone down, by reflected sunlight. NASA provides an email alert service (http://spotthestation.nasa.gov) that, for any location on Earth, advises when the ISS will become visible, from what direction, its height in the sky, and for how long it will be visible. You can set your watch by the space station’s appearance! Given the complexity of the interactions of the ISS’s orbit and the positions of the sun and moon, and that this information is calculated for any point on Earth for any day of the year, the accuracy of the predictions of the ISS’s appearances seems incredible. But, at some levels, the Universe is very predictable. Despite the experience of many food scientists, food microbiology is also predictable. While not with the same confidence as the position and visibility of the ISS, the reproducibility of microbial behaviour in foods does offer great potential to food safety managers.

Microbes can’t think, ergo Predictive Microbiology

Bacteria and fungi can’t think. They don’t have free will.  As such, they tend to behave reproducibly in response to their environment, which has led to the development of the discipline of predictive food microbiology.

The basic premise of predictive food microbiology is that the behaviour (growth potential, growth rate, inactivation) of microorganisms is deterministic and able to be predicted from:

  • specific characteristics of the micro-organism itself
  • the immediate environment of the micro-organism (i.e., food composition and storage conditions)
  • the time the organism is in those conditions, and – sometimes –
  • the previous environment (because it affects lag time, and may affect resistance to inimical conditions).

In practice, the information about those responses is derived from systematic studies in research laboratories or gleaned and collated from the published scientific literature. The patterns of response are characterised and the data and patterns summarised as mathematical equations, called “predictive microbiology models”. In essence, these equations represent condensed quantitative knowledge of the microbial ecology of foods. 
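As an illustration of what such an equation looks like (my example, not one given in the article), one of the oldest and best-known predictive microbiology models is the Ratkowsky ‘square-root’ model, in which the square root of the growth rate increases linearly with temperature above a notional minimum growth temperature: √μ = b(T − Tmin). A minimal sketch, with illustrative rather than authoritative parameter values:

```python
def growth_rate(temp_c: float, b: float = 0.025, t_min: float = 5.0) -> float:
    """Ratkowsky square-root model: sqrt(rate) = b * (T - Tmin).
    Returns the predicted specific growth rate (per hour); zero at or
    below the minimum growth temperature. Parameter values here are
    illustrative only -- real values are fitted per organism and food."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

for t in (4, 10, 15, 25, 37):
    print(f"{t:>2} degC: {growth_rate(t):.3f} /h")
```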

No matter how much a researcher knows, or how well that knowledge can be summarised in a mathematical model, to be useful that knowledge still needs to be communicated and made accessible to people in the food industry in a form that they can use to improve food safety or shelf life. Accordingly, the equations are usually integrated into computer software that automates the calculations to enable quick predictions of microbial changes in foods over time.

Many of these models can be downloaded, or used, for free. As an example of the depth of information available, ComBase, the most developed predictive microbiology application in the world, is based on ~50,000 determinations of microbial growth, inactivation rate, or survival relevant to foods.

Australia is an international leader in the use of predictive microbiology, having adopted the “Refrigeration Index” (RI), a predictive microbiology model, into legislation. The RI evaluates the effects of temperature and time on the safety of red meat by converting those data into the potential growth of E. coli. The RI is enshrined in Australia’s Export Controls (Meat and Meat Products) Orders (1985). Following the Garibaldi EHEC outbreak in Adelaide in 1995, another Australian model, which predicts the inactivation of enterohaemorrhagic E. coli in fermented meats, was developed and adopted by industry and regulators for the evaluation of process safety.
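The idea behind the RI can be sketched in a few lines (a simplification of mine; the legislated RI calculation is defined in the Export Control Orders and differs in detail). A logged time-temperature history is converted, via a temperature-dependent growth model like the one above, into the predicted log10 growth of E. coli:

```python
import math

def predicted_log10_growth(history, b=0.025, t_min=7.0):
    """Integrate a square-root-type growth model over a time-temperature
    history, given as (hours, deg_C) segments, returning the predicted
    log10 increase of E. coli. Illustrative only -- not the official
    Refrigeration Index calculation."""
    total = 0.0
    for hours, temp_c in history:
        if temp_c > t_min:
            rate = (b * (temp_c - t_min)) ** 2   # specific rate, per hour
            total += rate * hours / math.log(10)  # convert ln units to log10
    return total

# e.g., a carcase held 12 h at 15 degC during chilling, then 24 h at 4 degC:
print(round(predicted_log10_growth([(12, 15.0), (24, 4.0)]), 2))  # ~0.21
```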

Recently, Australia adopted Codex Alimentarius Commission (CAC) criteria for L. monocytogenes in foods. Those regulations differentiate between foods that do, or do not, support the growth of L. monocytogenes. For foods that do not support growth, the tolerance for L. monocytogenes is much higher (≤ 100 CFU/g) than in products that do support growth (< 1 CFU/25 g), greatly reducing the probability of product recalls and the burden of microbiological testing. In the guidelines, the use of predictive microbiology models to differentiate foods that do, or do not, support the growth of L. monocytogenes is specifically endorsed. Among such models, the Mejlholm and Dalgaard (2009) model, available in the SSSP software suite (see Table 1), is the most extensive and best validated.
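A drastically simplified sketch of the growth/no-growth idea (mine; the Mejlholm and Dalgaard model accounts for many more hurdles and their interactions) classifies a product using approximate pH and water-activity limits of the kind cited in regulatory guidance for L. monocytogenes — treat the thresholds as illustrative:

```python
def supports_lm_growth(ph: float, aw: float) -> bool:
    """Crude screen for whether a food supports growth of
    L. monocytogenes, using approximate limits of the kind cited in
    regulatory guidance (illustrative only; validated models such as
    Mejlholm & Dalgaard (2009) also consider temperature, organic
    acids, smoke components, and their interactions)."""
    if ph <= 4.4 or aw <= 0.92:    # either hurdle alone prevents growth
        return False
    if ph <= 5.0 and aw <= 0.94:   # combined hurdles prevent growth
        return False
    return True

print(supports_lm_growth(ph=6.1, aw=0.98))  # True: growth possible
print(supports_lm_growth(ph=4.2, aw=0.98))  # False: pH hurdle
```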

The discussion above does not consider the limits of application of predictive microbiology. It’s clear that to make predictions about the number of bacteria in a specific food after a certain amount of time, and under given storage conditions, requires that we know the initial number, and also how the storage conditions fluctuated over time. Low-cost data-logging technology now exists that can wirelessly communicate details of product storage conditions over time. But sources of variability remain, including differences between strains, and inhomogeneity in foods that might be enough to allow some cells to grow while others of the same population cannot. Worse still, under certain conditions, bacteria are genetically programmed to behave unpredictably, so that multiple phenotypes, with very different physiology, can be present within a single population. Fortunately, this so-called ‘bet-hedging’ behaviour is based on quorum sensing and would only be expected to occur when cell densities are very high (Veening et al., 2008). For these reasons, models increasingly take this variability into account and can provide predictions that include the probability of different responses occurring in different environments.
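One common way to handle that variability (a sketch of mine, not a method named in the article) is Monte Carlo simulation: sample the model’s parameters from plausible distributions, run the deterministic model many times, and report a distribution of outcomes rather than a single number:

```python
import math
import random

def log10_growth_percentiles(hours, temp_c, n_trials=10_000):
    """Monte Carlo over square-root-model parameters (illustrative
    distributions) to express strain and product variability as the
    5th, 50th and 95th percentiles of predicted log10 growth."""
    results = []
    for _ in range(n_trials):
        b = random.gauss(0.025, 0.003)    # slope: illustrative spread
        t_min = random.gauss(5.0, 1.0)    # min growth temp: illustrative
        rate = max(0.0, b * (temp_c - t_min)) ** 2
        results.append(rate * hours / math.log(10))
    results.sort()
    return [round(results[int(q * n_trials)], 2) for q in (0.05, 0.50, 0.95)]

print(log10_growth_percentiles(hours=24, temp_c=10.0))
```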

Both theory and experience show that end-product testing isn’t practical for food safety assurance, particularly at the low incidence of contamination that consumers expect. The HACCP approach provides the most reliable means of food safety assurance, but for that approach to be practical it’s necessary to prioritise among potential hazards and understand how to control them: from among the myriad potential hazards we need to identify those that represent the greatest risks, and to understand their individual behaviour and environmental limits so as to design foods and processes that limit their growth or inactivate them, while minimising effects on product quality. This challenge requires expert knowledge of the physiology of individual microbial hazards. That knowledge is increasingly being made available through the development of predictive microbiology models and software.

While the basic principles of food safety aren’t rocket science, the complexities of the modern food industry mean that food safety managers can gain much from lessons learnt and technologies developed in the space program. The HACCP concept had its genesis in the US space program. The modelling approaches and software now being used to optimise food safety management rely on high-level mathematics to develop tools and strategies to best satisfy the paradoxical consumer expectation of minimally processed foods with maximum levels of safety.

 

References

Altekruse, S. and Swerdlow, D. (1996). The changing epidemiology of foodborne disease. American Journal of the Medical Sciences, 311: 23-29.

ANZFA (Australia New Zealand Food Authority), (1999).  Food Safety Standards Costs and benefits: An analysis of the regulatory impact of the proposed national food safety reforms. ANZFA, Canberra, Australia. 154 pp.

CDC (Centers for Disease Control and Prevention) (2013). Incidence and trends of infection with pathogens transmitted commonly through food – Foodborne Diseases Active Surveillance Network, 10 U.S. sites. Morbidity and Mortality Weekly Report, 68:328-332.

CDNANZ (Communicable Diseases Network Australia and New Zealand – Foodborne Diseases Working Party) (1997). Foodborne Disease: Towards reducing foodborne illness in Australia.  Tech Report Series No. 2. Australian Commonwealth Department of Health and Family Services, Canberra, Australia. 85 pp.

Hall, G.V., D’Souza, R.M. and Kirk, M.D. (2002). Foodborne disease in the new millennium: out of the frying pan and into the fire? The Medical Journal of Australia, 177:614-618.

Mejlholm, O. and Dalgaard, P. (2009). Development and validation of an extensive growth and growth boundary model for Listeria monocytogenes in lightly preserved and ready-to-eat shrimp. Journal of Food Protection, 72:2132-2143.

Membré, J-M. and Lambert, R.J.W. (2008).  Application of predictive modelling techniques in industry: From food design up to risk assessment. International Journal of Food Microbiology, 128: 10–15.

NNDSS (National Notifiable Diseases Surveillance System), (2014). Notifications of a selected disease by State and Territory and year. Accessed on 20 September 2014 at: http://www9.health.gov.au/cda/source/rpt_4_sel.cfm

Nyachuba, D.G. (2010).  Foodborne illness: is it on the rise? Nutrition Reviews, 68:257–269.

Veening, J-W., Smits, W.K. and Kuipers, O.P. (2008). Bistability, epigenetics, and bet-hedging in bacteria. Annual Review of Microbiology, 62:193-210.

 

[1]      To be strictly correct, we should use another, related, equation called the ‘hypergeometric distribution’, but for almost all practical purposes the binomial distribution gives the same result.

[2]      Reliable on-line tools that can perform these calculations to design or assess the reliability of sampling plans can be found at:  http://www.icmsf.org/main/software_downloads.html, or http://www.fstools.org/samplingmodel/

 

