‘No food is safe’: Blue Bell, industry flout Listeria guidelines

I treat all food as a risk, because I know how it’s produced, I’m familiar with the outbreaks, and I don’t get invited to dinner much.

But I eat (probably too much).

Blue Bell Creameries, according to the Houston Chronicle, ignored critical parts of federal recommendations aimed at preventing exactly the kind of foodborne illness that thrust the Texas institution into crisis this year.

Among the most straightforward: If listeria shows up in the plant, check for it in the ice cream.

The draft guidelines for fighting the bacteria inside cold food plants were published seven years ago. They were optional and have yet to be finalized but nonetheless provide a road map for hunting and destroying the bug.

Ice cream companies large and small have flouted the guidelines.

Blue Bell “is no better or no worse than probably 90 percent of the rest of the companies,” said Mansour Samadpour, whose IEH Laboratories runs testing programs and crisis consulting for food producers.

Three ice cream makers got into trouble with listeria within the last year: Snoqualmie Ice Cream in Washington state, Jeni’s Splendid Ice Creams in Ohio and the much larger Blue Bell. It’s among the top purveyors of frozen treats in the United States. Of the other big companies, Unilever, which makes Breyers and Ben & Jerry’s, declined to say whether it follows the 2008 guidance. Nestlé S.A., which produces Häagen-Dazs and Dreyers, also wouldn’t say. Wells Enterprises Inc., maker of Blue Bunny, didn’t return messages.

The 2008 document, called “Guidance for Industry: Control of Listeria monocytogenes in Refrigerated or Frozen Ready-To-Eat Foods,” laid out a plan to attack one of the most ubiquitous and pernicious microbes in the environment. It lives in soil and animal feed. Refrigeration provides little deterrent to growth. It survives freezing. Once it enters a plant, it’s so hard to remove that, in extreme cases, entire facilities have been demolished to eliminate it.

When companies use the guidelines, they find that they work.

After the Nebraska Department of Agriculture found listeria in a random sample of Jeni’s, the CEO instituted a monitoring program as stringent as what the FDA prescribed in 2008. After destroying product worth $2 million and spending hundreds of thousands on thorough cleanings and plant upgrades, the company again found listeria in its product June 12 – illustrating the pathogen’s resiliency. But this time, Jeni’s caught it before it left the plant.

Blue Bell now is trying to follow suit, committed to becoming “first-in-class with respect to all aspects of the manufacture of safe, delicious ice cream products,” spokesman Joe Robertson said in an email. It now has a team of microbiologists and, like Jeni’s, will test and hold its ice cream until proven safe, once production resumes. He said the company “always tried to do the right thing to produce high-quality, safe products,” but pending lawsuits in the listeria outbreak prevented him from discussing whether Blue Bell previously followed any aspects of the 2008 guidance.

The FDA recommended that even the smallest companies regularly test food contact surfaces and the food itself for listeria. That may seem like an obvious strategy, but industry and consumer advocates have long fought over it.

FDA records show that Blue Bell had written plans to test its plant environments for pathogens. But they didn’t include sampling the surfaces that come into contact with food or the food itself, or finding the root cause of the contamination. From 2013 to early 2015, Blue Bell found listeria on drains, floors, pallets, hoses, catwalks and surfaces near the equipment that fills containers. But it never looked for listeria in the ice cream.

Mandatory microbial testing on plant surfaces and in food has long been viewed by industry groups as a one-size-fits-all approach that doesn’t work and costs too much, especially for small producers. Some have deemed it unnecessary when there are controls – like pasteurization – that kill pathogens. But consumer advocates say those arguments veil a deeper objection: Companies know that if they test for bugs, they will find them, and if they find them, the law says they must act.

If Blue Bell had followed the 2008 guidance, the first listeria positive would have set off an intense hunt for the source and likely triggered recalls or stopped shipment of potentially tainted products.

The list of foods at “high risk” for pathogens continues to grow, a fact that exasperates Samadpour. Peanuts and peanut butter, for example, weren’t on the radar until a series of outbreaks that caused hundreds of illnesses beginning in 2007.

“The way the food industry operates, they have an assumption that any food is safe until proven otherwise,” he said, noting that outbreaks of foodborne illness get detected by chance – Blue Bell’s was discovered only because South Carolina officials randomly tested ice cream early this year. “At what point are we going to say … no food is safe?”

The ultimate backstop – testing food before it gets shipped – isn’t foolproof. The only way to detect everything is to test everything, which is impossible because the tests destroy the product. But Samadpour points to advances in the ground beef industry, where E. coli O157 infections have dropped by half since 1997. After regulators declared the bacteria an unlawful “adulterant,” the industry ramped up testing.

Other food manufacturers balk at the cost, but there is a price either way. The FDA estimated the Food Safety Modernization Act will cost the food industry $471 million a year, while foodborne illness costs the nation $2 billion.

“One thing about food safety is it doesn’t regard the size of your company,” said Samadpour, who was hired by Snoqualmie. “You can be a tiny company and kill 50 people.”

New biosensor enables rapid detection of Listeria

A Texas A&M AgriLife Research engineer and a Florida colleague have developed a biosensor that can detect listeria bacterial contamination within two or three minutes.

“We hope to soon be able to detect levels as low as one bacteria in a 25-gram sample of material – about one ounce,” said Dr. Carmen Gomes, AgriLife Research engineer with the Texas A&M University department of biological and agricultural engineering.

The same technology can be developed to detect other pathogens such as E. coli O157:H7, she said. But listeria was chosen as the first target pathogen because it can survive even at freezing temperatures. It is also one of the most common foodborne pathogens in the world and the third-leading cause of death from food poisoning in the U.S.

“It can grow under refrigeration, but it will grow rapidly when it is warmed up as its optimum growth temperature ranges from 30 to 37 degrees Celsius — 86 to 98 degrees Fahrenheit,” Gomes said. “This makes it a particular problem for foods that are often not cooked, like leafy vegetables, fruits and soft cheeses that are stored under refrigeration.”

Currently, the only means of detecting listeria bacteria contamination of food requires highly trained technicians and processes that take several days to complete, she said. For food processing companies that produce and ship large quantities of foodstuff daily, listeria contamination sources can be a moving target that is often missed by current technology.

Not all meat juice is the same

I’ve always wanted to use the phrase “meat juice” in a peer-reviewed journal article, but researchers from Sweden beat me to it.

Meat juice samples are used in serological assays to monitor infectious diseases within the food chain. However, evidence of inferior sensitivity, presumably due to low levels of antibodies in the meat juice compared to serum, has been presented, and it has been suggested that adjusting the dilution factor of meat juice in proportion to its blood content could improve sensitivity.

In the present study, the agreement between Toxoplasma gondii–specific immunoglobulin G (IgG) levels in meat juice and serum was evaluated, as was whether the level of immunoglobulins in meat juice depended on its blood content.

Serum and meat juice from diaphragm, heart, tongue, Musculus triceps brachii and M. semitendinosus were collected from 20 pigs experimentally infected with T. gondii. Analysis of total IgG, heme-containing proteins (hematin), and hemoglobin (Hb) revealed significant differences between samples from different muscles, with the highest levels in samples from heart and tongue, and the lowest in samples from leg muscles. Comparison of T. gondii–specific antibody titers in meat juice and serum revealed a strong positive correlation for meat juice from heart (rs=0.87; p<0.001), while it was lower for M. semitendinosus (rs=0.71; p<0.001) and diaphragm (rs=0.54; p=0.02). Meanwhile, the correlation between total IgG and T. gondii titer ratio (meat juice/serum) was highest in diaphragm (rs=0.77; p<0.001) followed by M. semitendinosus (rs=0.64; p=0.005) and heart (rs=0.50; p=0.051). The correlation between Hb and T. gondii titer ratio was only significant for diaphragm (rs=0.65; p=0.008), and for hematin no significant correlation was recorded. In conclusion, the specific IgG titers in meat juice appeared to depend on the total IgG level, but the correlation to blood (Hb or hematin) was poor.

Importantly, large significant differences in total IgG levels as well as in specific antibody titers were recorded, depending on the muscle the meat juice had been extracted from.

 “Meat juice” is not a homogeneous serological matrix

Foodborne Pathogens and Disease. April 2015, 12(4): 280-288

Wallander Camilla, Frössling Jenny, Vågsholm Ivar, Burrells Alison, and Lundén Anna

http://online.liebertpub.com/doi/abs/10.1089/fpd.2014.1863#utm_source=ETOC&utm_medium=email&utm_campaign=fpd

 

Sample-based data model extended to veterinary drug residues

As two Australian Football League players (the ice hockey of footie) claim their positive tests for clenbuterol came from steak consumed in New Zealand (that’s just scientifically BS, as cyclist Alberto Contador proved in 2010), the European Food Safety Authority is extending the use of its harmonised sample-based data reporting model to the collection of data on veterinary medicinal product residues in animals and animal products.

Sample-based reporting using standardized description elements is already used to collect occurrence data from Member States in areas such as food additives, chemical contaminants, pesticide residues and antimicrobial resistance.

Monitoring data on veterinary medicinal product (VMP) residues are currently submitted annually in an aggregated format to a database maintained by the European Commission. EFSA then examines the data and presents the results in annual reports. However, aggregation does not lend itself to complex statistical analysis and is of limited value for quantitative exposure and risk assessments. The move to direct collection of data in a sample-based format will enable EFSA and the European Commission to tackle questions related to the risk assessment and risk management of VMP residues. 

The food taster: Turkish edition

In a modern twist on a self-preservation tactic used by cautious kings and pharaohs, President Recep Tayyip Erdogan of Turkey is having his food tested before he eats — not by a human taster, though, but in the lab.

Mr. Erdogan’s physician, Dr. Cevdet Erdol, revealed this week that at least one of the thousand rooms in the president’s extravagant $600 million palace in Ankara, the capital, will hold a special food analysis laboratory to test the president’s meals for radioactive materials, poison or certain types of bacteria that could be used in an assassination attempt.

False positives or not? Scientist skeptical of Winnipeg water monitoring procedures

CBC News reports the boil water advisory was lifted more than 24 hours ago, but questions remain over how the samples at the heart of the Winnipeg-wide scare could’ve tested positive for bacteria and E. coli in the first place.

Six routine water samples taken on Monday showed the presence of bacteria, and E. coli in some cases. After resampling and retesting, samples came up negative and the city lifted the advisory Thursday.

Mayor Brian Bowman assured Winnipeggers tap water was safe, and always had been, as the original tests were proven to be false positives.

Rick Holley, a professor at the University of Manitoba and expert in food safety, said that while mistakes like this do happen, they are unacceptable when hundreds of thousands of lives may be impacted.

“I still had concerns at that time and still do that the false positives might not be scientifically discredited,” said Holley. “It’s all too easy to continue testing until you get the results you want and any results you don’t want you discard as being false. That’s inappropriate.”

Holley said the only way to be sure Winnipeg water is safe is to understand what caused the positive results earlier this week.

“Why were those six samples positive? There has to be a reason why and that has to be established,” said Holley.

One telling detail released by the city was that all of the samples that tested positive were handled by the same employee.

The city provided a list of possible explanations for how the tests came to be positive, including:

  • A contaminated water tap or container
  • The water being contaminated during sampling
  • Mistakes made at the lab during analysis

Holley said while he hopes the city provides more detailed answers and soon, he remains concerned about the city’s water monitoring procedures.

Testing matters: EU interlaboratory comparison study on the detection of Salmonella in minced chicken meat

In 2013, it was shown that 32 out of 35 National Reference Laboratories (NRLs) in the European Union were able to detect high and low levels of Salmonella in minced chicken meat. Two laboratories made an initial transcription error when processing the raw data, which led to their performance being rated as ‘moderate’.

One laboratory continued to underperform during the follow-up study. Despite a significant improvement, this laboratory still had a sensitivity problem in the detection of Salmonella. Depending on the method used, the laboratories detected Salmonella in 61 to 78% of the contaminated samples. The detection of Salmonella in this study was made more difficult because of high levels of “interfering” bacteria in the minced chicken meat. These are some of the conclusions of the Sixth EU Interlaboratory Comparative Study of Food Samples, which was organized by the European Union Reference Laboratory for Salmonella (EURL-Salmonella).

Interlaboratory comparative study obligatory for EU Member States

The study was conducted in September 2013, with a follow-up study in January 2014. Participation was obligatory for all EU Member State NRLs that are responsible for the detection of Salmonella in food samples. EURL-Salmonella is part of the Dutch National Institute for Public Health and the Environment (RIVM).

The laboratories used three internationally accepted analysis methods (RVS, MKTTn and MSRV) to detect the presence of Salmonella in minced chicken meat. Each laboratory received a package of minced chicken meat contaminated with two different concentrations of Salmonella Infantis, or containing no Salmonella at all. The laboratories were required to analyse the samples for the presence of Salmonella in accordance with the study protocol. In this study, the RVS and MSRV analysis methods produced significantly better results than the MKTTn method in terms of detecting Salmonella in minced chicken meat. This underscores the benefits of using more than one analysis method.

Two new procedures were introduced and were positively received. For the first time, a food matrix was artificially contaminated with a diluted culture of Salmonella at the EURL-Salmonella laboratory. The NRLs were no longer required to combine the Salmonella samples. The feasibility of this procedure for subsequent studies will be assessed for each study. Furthermore, the participating laboratories were able to submit their findings via the Internet. This procedure will be optimized and continued.

Testing is a tool: Improved microbial food safety assurance: tools and technologies to reduce the guesswork

My friend, Dr. Tom Ross (left, exactly as shown) at the University of Tasmania, gave a talk a while ago that was published in Food Australia. He has some important messages.

Most people know the basic rules of food hygiene, don’t they?

We teach our children to wash their hands after going to the toilet, we know to keep left-overs in the fridge and to cook, or at least wash, raw foods because they might be contaminated with ‘germs’. We cover foods, we avoid mixing cooked and raw and, if the food is old or we’re not sure about how it’s been stored, we apply the old adage of “if in doubt, throw it out”. They’re really simple rules that reflect our awareness that invisible microbes might make us sick, and ways to minimise the risk. It’s hardly rocket science, is it? And if these are simple rules that ordinary people apply, how much safer must it be when food professionals prepare and process foods?

If it is that easy, then it’s hard to understand why – particularly given the enormous advances in biological science and technology over the last few decades – there seems to have been no reduction in the incidence of microbial food-borne illness in decades.

ANZFA (1999) estimated that there were 4 – 5 million cases of microbial food-borne illness in Australia every year, or an average individual risk of foodborne illness of once every four to five years for all Australians. Since 1999 the incidence rate has changed little. National Notifiable Diseases Surveillance System data (NNDSS, 2014) show that the incidence of foodborne illnesses, including salmonellosis, campylobacteriosis, listeriosis and even typhoid, has not fallen since 1999: if anything, per capita rates have increased slightly. This situation seems to apply across the ‘developed’ world (CDC, 2013).

While some of that increase is due to better detection and surveillance systems, commentators (e.g., Altekruse and Swerdlow, 1996; Hall et al., 2002; Nyachuba, 2010) point to changes in the way food is provided to human populations, particularly the increasingly large proportion of people in ‘first world’ nations that live in urban areas. Our food comes from increasingly remote locations, and from increasingly large, centralised production and processing facilities, that bring increased challenges for the modern food industry and increased risks for consumers if those challenges are not met.

Our foods, at source, are not free from microorganisms. Irrespective of technological advances, foods are still produced in natural environments that can harbour pathogenic microbes. Common food animals have a gut microbiota that can also harbour pathogens. Cows’ udders can become infected and contaminate milk with pathogens like Staphylococcus aureus or Listeria monocytogenes. Microbial hazards arise from myriad sources, often without signs that contamination has occurred.

The problem is compounded by expectations that fresh food is inherently ‘healthier’, and by longer food supply chains that can extend across continents. Longer supply chains with more handlers involved, and reduced use and choices of food preservatives, increase the chance of contamination and the opportunity for microbes to grow to hazardous levels before consumption. Nonetheless, consumers expect that food should not harbour hazards, an expectation encouraged by law firms now actively specializing in compensation claims and class action suits. The increasing proportion of consumers in developed nations with increased susceptibility to infectious illness due to age, underlying chronic disease, or reliance on chemotherapeutics that weaken their immune system further heightens the challenge. Put simply, much higher standards of food hygiene are expected, and needed, but with fewer ‘weapons’ in the arsenal.

‘End product’ testing is useful only for batches of product that contain a high proportion of defective units, i.e., units that fail to meet relevant food safety criteria. If we assume that a just tolerable foodborne illness risk is one per 100 meals, to assure this incidence by testing we would need to be able to detect batches of product that have ≥ 2 contaminated units per 100. We have the methods, particularly those involving enrichment and/or signal amplification (e.g., PCR), to detect a few microbes in a large volume (e.g., 125 g) of food, but only if we know where to look. The problem is finding those one or two contaminated units among 100 with confidence. The probability of detection can be estimated using the “binomial distribution”[1] equation, which tells us how many samples are needed to be 95% certain that the batch as a whole has no more than 1 unacceptable unit in 100.

The binomial distribution tells us we’d need to take 299 samples, and they’d all have to test negative!  To be confident that the frequency of contaminated units was less than one in 10,000 (essentially the estimated status quo), however, we’d need to take nearly 30,000 samples and for all of them to be “clear”[2].  Those sorts of sampling numbers are simply not feasible.
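(As an aside, the arithmetic behind those figures is easy to check. The minimal sketch below is not part of Ross’s article; it simply applies the binomial assumption that each sampled unit independently has probability p of being contaminated, so the chance that n samples are all negative from a batch at the tolerance limit is (1 − p)^n, and requiring that chance to be at most 5% gives the sample numbers quoted above.)

```python
import math

def samples_needed(p_defective: float, confidence: float = 0.95) -> int:
    """Number of all-negative samples needed to be `confidence` sure that
    the true proportion of contaminated units is below `p_defective`.
    Binomial approximation; the hypergeometric result is nearly identical
    for large batches (see footnote [1])."""
    # Solve (1 - p)^n <= 1 - confidence for n.
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_defective))

print(samples_needed(0.01))    # 299 samples for <= 1 contaminated unit in 100
print(samples_needed(0.0001))  # ~29,956 samples for <= 1 in 10,000
```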

So, what is the answer?

On May 25, 1961, then US President John F. Kennedy set a vision for his nation, that the USA should “commit itself to achieving the goal … of landing a man on the moon and returning him safely to the earth. No single space project in this period will be more impressive to mankind, or more important for the long-range exploration of space”. That speech started the ‘space race’, but the race was not always smooth.

The US space program had had many spectacular failures resulting in massive explosions in the Gemini and Mercury rocket programs. To explain, the propulsion system of a rocket is essentially a controlled explosion, propelling the rocket forwards. Minor mistakes in construction of the rockets, particularly the boosters (engines), could lead to catastrophic failures. Fortunately, few such disasters resulted in loss of life. But the rocket scientists realized there was a weakness in the way that the rockets were constructed, particularly because that huge national project involved different regions constructing different components of the rockets, which were then transported to different locations for assembly. Through the massive failures it became clear that new techniques for assuring the quality of far-flung systems manufacture and final integration were needed. A technique called Failure Mode, Effects, and Criticality Analysis (FMECA), first developed by the US Army in 1949, was applied to the Apollo program. It is a procedure for analysis of components and processes to determine those that, if they fail, could lead to catastrophic outcomes, especially those that endangered the crew. That analysis focused attention on ensuring the absolute reliability of ‘mission critical’ components. It also became apparent that the astronauts themselves were mission-critical components, and that anything that affected their performance at critical moments (such as atmospheric re-entry, where an incorrect manoeuvre could lead to incineration of the spacecraft) was also critical. Thus, the safety of the astronauts’ food supply was regarded as critical and led to the application of FMECA to food production and, eventually, spawned the Hazard Analysis Critical Control Points (HACCP) system. HACCP is now the most widely endorsed approach to food safety management in the world.

Like FMECA, the basic principle of HACCP is that by understanding where hazards arise in food processes and by putting in place procedures to prevent, control or remove them, those hazards can be controlled in the end product to ensure the safety of the food and to minimise reliance on ‘end product’ testing. Indeed, in the early days of the space food program, quality assurance based on end-product testing consumed most of the food produced!

But, sooner or later, if you do HACCP properly, you end up asking questions that need quantitative answers, like “how much control is needed?” and “how can it be achieved?” – e.g., what times, temperatures, or product formulations are needed to control specific microbial hazards?

To answer those questions requires a high level of expert knowledge because of the diversity of behaviour and environmental limits of different microbial hazards. Thus, while HACCP is founded on a logical system that allows for the early detection and elimination of specific hazards, the correct application of the concept requires comprehensive expert knowledge.

The zenith of the US space program is the International Space Station (ISS), orbiting some 330 km above Earth. It’s home to six astronauts/scientists at any given time and is so large that it can easily be seen from Earth when the sun has gone down, by reflection of sunlight or moonlight. NASA provides an email alert service (http://spotthestation.nasa.gov) that, for any location on Earth, advises when the ISS will become visible, from what direction, its height in the sky, and for how long it will be visible. You can set your watch by the space station’s appearance! Given the complexity of the interactions between the ISS’s orbit and the positions of the sun and moon, and that this information is calculated for any point on Earth for any day of the year, the accuracy of the predictions of the ISS’s appearances seems incredible. But, at some levels, the Universe is very predictable. Despite the experience of many food scientists, food microbiology is also predictable. While not with the same confidence as the position and visibility of the ISS, the reproducibility of microbial behaviour in foods does offer great potential to food safety managers.

Microbes can’t think, ergo Predictive Microbiology

Bacteria and fungi can’t think. They don’t have free will.  As such, they tend to behave reproducibly in response to their environment, which has led to the development of the discipline of predictive food microbiology.

The basic premise of predictive food microbiology is that the behaviour (growth potential, growth rate, inactivation) of microorganisms is deterministic and able to be predicted from:

  • specific characteristics of the micro-organism itself
  • the immediate environment of the micro-organism (i.e., food composition and storage conditions)
  • time the organism is in those conditions and – sometimes –
  • the previous environment (because it affects lag time, and may affect resistance to inimical conditions).

In practice, the information about those responses is derived from systematic studies in research laboratories or gleaned and collated from the published scientific literature. The patterns of response are characterised and the data and patterns summarised as mathematical equations, called “predictive microbiology models”. In essence, these equations represent condensed quantitative knowledge of the microbial ecology of foods. 

No matter how much a researcher knows, or how well that knowledge can be summarised in a mathematical model, to be useful that knowledge still needs to be communicated and made accessible to people in the food industry in a form that they can use to improve food safety or shelf life. Accordingly, the equations are usually integrated into computer software that automates the calculations to enable quick predictions of microbial changes in foods over time.
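To make that concrete, here is a minimal sketch of the kind of equation such software wraps: a generic square-root (Ratkowsky-type) model in which growth rate rises with temperature above a notional minimum growth temperature. The parameter values (b and Tmin) and the log10-per-hour rate units are illustrative assumptions, not taken from any published model.

```python
def sqrt_model_growth_rate(temp_c: float, b: float = 0.023, t_min: float = -1.0) -> float:
    """Generic square-root model: sqrt(rate) = b * (T - Tmin).
    Returns an assumed specific growth rate (log10 units per hour) at
    temperature `temp_c`, and zero at or below the notional minimum
    growth temperature. Parameters are placeholders for illustration."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

# Predicted rate under refrigeration versus mild temperature abuse.
for t in (4.0, 10.0, 25.0):
    print(f"{t:>4.1f} C -> {sqrt_model_growth_rate(t):.4f} log10/h")
```

Real models are multi-factor, accounting for pH, water activity, preservatives and more, which is exactly why they are packaged into software tools rather than left as bare equations.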

Many of these models can be downloaded, or used, for free. As an example of the depth of information available, ComBase, which is the most developed predictive microbiology application in the world, is based on ~50,000 determinations of microbial growth, inactivation rate, or survival relevant to foods.

Australia is an international leader in the use of predictive microbiology, having adopted the “Refrigeration Index” (RI), a predictive microbiology model, into legislation. The RI evaluates the effects of temperature and time on the safety of red meat by converting those data into the potential growth of E. coli. The RI is enshrined in Australia’s Export Controls (Meat and Meat Products) Orders (1985). Following the Garibaldi EHEC outbreak in Adelaide in 1995, another Australian model, which predicts the inactivation of enterohaemorrhagic E. coli in fermented meats, was developed and adopted by industry and regulators for evaluation of process safety.
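The general idea of converting a time-temperature history into potential growth can be sketched as follows. This is not the legislated Refrigeration Index calculation, only an illustration of the approach, using the placeholder square-root model above and a made-up temperature log.

```python
def sqrt_model_growth_rate(temp_c: float, b: float = 0.023, t_min: float = -1.0) -> float:
    """Same illustrative square-root model as in the previous sketch."""
    return 0.0 if temp_c <= t_min else (b * (temp_c - t_min)) ** 2

# Hypothetical time-temperature log: (hours at this temperature, temperature in C).
temperature_log = [
    (2.0, 7.0),    # loading dock
    (1.0, 12.0),   # brief temperature abuse
    (20.0, 4.0),   # refrigerated transport
]

# Accumulate predicted growth over the whole history.
potential_growth = sum(hours * sqrt_model_growth_rate(temp)
                       for hours, temp in temperature_log)
print(f"Predicted potential growth: {potential_growth:.2f} log10 units")
```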

Recently, Australia adopted Codex Alimentarius Commission (CAC) criteria for L. monocytogenes in foods. Those regulations differentiate between foods that do, or do not, support the growth of L. monocytogenes. For foods that do not support growth, the tolerance for L. monocytogenes is much higher (≤ 100 CFU/g) than in products that do support growth (<1 CFU/25 g), greatly reducing the probability of product recalls and the burden of microbiological testing. In the guidelines, the use of predictive microbiology models to differentiate foods that do, or do not, support the growth of L. monocytogenes is specifically endorsed. Among such models, the Mejlholm and Dalgaard (2009) model, available in the SSSP software suite (see Table 1), is the most extensive and best validated.

The discussion above does not consider the limits of application of predictive microbiology. It’s clear that to make predictions about the number of bacteria in a specific food after a certain amount of time, and under given storage conditions, requires that we know the initial number, and also how the storage conditions fluctuated over time. Low-cost data logging technology now exists that can wirelessly communicate details of product storage conditions over time. But sources of variability might include differences between strains, and inhomogeneity in the foods that might be enough to allow some cells to grow while others of the same population cannot. Worse still, under certain conditions, bacteria are genetically programmed to behave unpredictably, with multiple phenotypes of very different physiology present within a single population. Fortunately, this so-called ‘bet-hedging’ behaviour is based on quorum sensing and would only be expected to occur when cell densities are very high (Veening et al., 2008). For these reasons, models usually make predictions that take this variability into account and can provide predictions that include the probability of different responses occurring in different environments.
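One simple way to represent that variability, sketched below with made-up parameter distributions, is Monte Carlo simulation: sample the initial contamination level and growth rate from distributions rather than using single values, and report the spread of predicted outcomes rather than a single number.

```python
import random
import statistics

random.seed(1)  # reproducible illustration

def simulate_final_log_counts(hours: float = 72.0, n_runs: int = 10_000) -> list:
    """Monte Carlo sketch: sample assumed distributions for the initial
    contamination level and growth rate, and return the predicted final
    log10 counts for each simulated lot. All parameters are illustrative."""
    results = []
    for _ in range(n_runs):
        initial_log = random.gauss(-1.0, 0.5)       # log10 CFU/g, assumed
        rate = max(0.0, random.gauss(0.02, 0.005))  # log10/h, assumed strain/food variability
        results.append(initial_log + rate * hours)
    return results

counts = simulate_final_log_counts()
print(f"median: {statistics.median(counts):.2f} log10 CFU/g")
print(f"95th percentile: {statistics.quantiles(counts, n=20)[-1]:.2f} log10 CFU/g")
```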

Both theory and experience show that end-product testing isn’t practical for food safety assurance, particularly for the low incidence of contamination that consumers expect. The HACCP approach provides the most reliable means of food safety assurance, but for that approach to be practical it’s necessary to prioritise among potential hazards and understand how to control them: from among the myriad potential hazards we need to identify those that represent the greatest risks, and to understand their individual behaviour and environmental limits so as to design foods and processes that limit their growth or inactivate them, while minimizing effects on product quality. This challenge requires expert knowledge of the physiology of individual microbial hazards. That knowledge is increasingly being made available through the development of predictive microbiology models and software.

While the basic principles of food safety aren’t rocket science, the complexities of the modern food industry mean that food safety managers can gain much from lessons learnt and technologies developed in the space program. The HACCP concept had its genesis in the US space program. The modelling approaches and software now being used to optimise food safety management rely on high-level mathematics to develop tools and strategies to best satisfy consumers’ paradoxical expectations of minimally processed foods with maximum levels of safety.

 

References

Altekruse, S. and Swerdlow, D. (1996). The changing epidemiology of foodborne disease. American Journal of  Medical Science, 311: 23-29.

ANZFA (Australia New Zealand Food Authority), (1999).  Food Safety Standards Costs and benefits: An analysis of the regulatory impact of the proposed national food safety reforms. ANZFA, Canberra, Australia. 154 pp.

CDC (Centers for Disease Control and Prevention) (2013).   Incidence and trends of infection with pathogens transmitted commonly through food – foodborne diseases active surveillance network, 10 U.S. sites, 2006. 2013.  Morbidity and Mortality Weekly Report, 68:328-332.

CDNANZ (Communicable Diseases Network Australia and New Zealand – Foodborne Diseases Working Party) (1997). Foodborne Disease: Towards reducing foodborne illness in Australia.  Tech Report Series No. 2. Australian Commonwealth Department of Health and Family Services, Canberra, Australia. 85 pp.

Hall, G.V., D’Souza, R.M. and Kirk, M.D. (2002). Foodborne disease in the new millennium: out of the frying pan and into the fire? The Medical Journal of Australia, 177:614-618.

Mejlholm, O. and Dalgaard, P. (2009). Development and validation of an extensive growth and growth boundary model for Listeria monocytogenes in lightly preserved and ready-to-eat shrimp. Journal of Food Protection, 72:2132-2143

Membré, J-M. and Lambert, R.J.W. (2008).  Application of predictive modelling techniques in industry: From food design up to risk assessment. International Journal of Food Microbiology, 128: 10–15.

NNDSS (National Notifiable Diseases Surveillance System), (2014). Notifications of a selected disease by State and Territory and year. Accessed on 20 September 2014 at: http://www9.health.gov.au/cda/source/rpt_4_sel.cfm

Nyachuba, D.G. (2010).  Foodborne illness: is it on the rise? Nutrition Reviews, 68:257–269.

Veening, J-W., Smits, W.P. and Kuipers, O.P. (2008). Bistability, epigenetics, and bet-hedging in bacteria. Annual Review of Microbiology, 62:193-201.

 

[1]      To be strictly correct, we should use another, related, equation called the ‘hypergeometric distribution’, but for almost all practical purposes the binomial distribution gives the same result.

[2]      Reliable on-line tools that can perform these calculations to design or assess the reliability of sampling plans can be found at:  http://www.icmsf.org/main/software_downloads.html, or http://www.fstools.org/samplingmodel/

 

Australian importer fined $25K for not testing imported ham

An importer has been fined $25,000 for failing to test 2241 kg of Parma ham imported from Italy in 2011.

Paqualino Licastro, owner of Perth import company Topas Pty Ltd, was fined $3000 while the company was fined $22,000.

After breaching its import permit, the company then failed to act on a directive from the Department of Agriculture to move the ham to a cold-store facility. The department ordered that the ham be held pending sampling and testing for Staphylococcus, Listeria, E. coli and Salmonella before it could be sold or distributed.

Had the imported ham introduced foot-and-mouth disease into Australia, it could have cost more than $50 billion over 10 years, the department estimates.

14K tests, 98.7% compliance: Canadian annual microbiology report 2011-12

The Government of Canada verifies that food produced and/or sold in Canada meets federal food safety standards to ensure Canadians have confidence in what they buy. The Canadian Food Inspection Agency (CFIA) monitors and regulates food products that are produced domestically and moved inter-provincially, or are imported.

Within Canada, all food products must comply with the Food and Drugs Act and Regulations, which set out criteria for safe food and clearly prescribe restrictions on the production, importation, sale, composition and content of food.

The National Microbiological Monitoring Program (NMMP) is one of many tools utilized by the CFIA to verify that domestically produced and imported products meet Canadian standards. It is designed to sample and test a broad range of imported and domestic commodities for multiple hazards, including microbial hazards and extraneous material. The testing carried out under the NMMP covers red meat and poultry products, shell eggs and egg products, dairy products, fresh fruits and vegetables and processed fruit and vegetable products.

As CFIA focuses its monitoring activities towards specific food-related hazards that may impair the health and safety of Canadians, it is important to note that most testing is of commodities that are not further processed by the consumer, as well as of raw foods that, if not properly cooked, can lead to illness. It is generally accepted that proper precautions taken in the home will destroy any bacteria that may be present.

During the 2011/12 fiscal year under the NMMP, 14,307 tests were performed on 5,234 domestic and imported products. Specifically, 9,049 tests were performed on 3,678 domestic products and 5,258 tests were performed on 1,556 imported products to verify they were compliant with Canadian standards. Results indicated that domestic products were 99.0% compliant and imported products were 98.0% compliant. Overall, a 98.7% compliance rate for combined domestic and imported products was observed.

In addition to testing food products, wash water samples and surface swabs taken within the food production environment are used to verify that food products are produced under sanitary conditions. This type of environmental sampling was performed in domestic establishments to verify the ability of operators’ systems to control the presence of pathogens within the processing environment. During 2011/12, there were 2,300 tests performed on 1,878 environmental samples, which were assessed as 97.5% compliant.

The results of the 2011/12 NMMP sampling activities demonstrate that the products available in the Canadian marketplace are, for the most part, compliant with national standards.