Hello Charles,
Thanks for the two links. I have great respect for the technical capabilities of both FSIS and FDA. However, the lengthy attached document does contain some distinctly questionable statements. For example –
FSIS and FDA have a lot of resources but that does not make them immune to mistakes!
Or to amendments? I think one reason may be that a large proportion of the relevant document (itself huge) was actually reiterated from the previous version. Have a look at the listed reference dates. Updating is a problem for all of us.
………. Salmonella is considered an indicator of lethality for Lm
(Pg 2.28, 4-8)
I think you will find research papers both for and against this statement. Traditionally, Salmonella has been considered hardier than L. monocytogenes and E. coli O157:H7.
Maybe compare the typical (6D) theorised / approved cooking times for meats based on L. mono and Salmonella respectively?
Indirectly related to the Hamburger story also I believe.
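The 6D comparison above can be sketched numerically. This is a minimal illustration of log-linear thermal death kinetics only; the D-values below are hypothetical placeholders I have invented for the example, not regulatory or published figures (real D-values depend on product, temperature, fat content and the study cited).

```python
# Sketch: comparing 6-log (6D) thermal process times for two target
# organisms, assuming first-order (log-linear) thermal death kinetics.
# D-values here are ILLUSTRATIVE ONLY, not regulatory figures.

def six_d_time(d_value_min: float, log_reductions: float = 6.0) -> float:
    """Process time (minutes) for the requested number of decimal
    reductions: time = D * log reductions."""
    return d_value_min * log_reductions

# Hypothetical D-values at some fixed cook temperature (minutes):
d_values = {"Salmonella": 0.6, "L. monocytogenes": 0.9}

for organism, d in sorted(d_values.items()):
    print(f"{organism}: 6D time = {six_d_time(d):.1f} min")
```

If the hardier organism has the larger D-value, its 6D time is proportionally longer, which is the crux of the "which organism drives the approved cook time" question.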
Actually, I had thought that the US logic was fundamentally based more on epidemiological data, which demonstrate that health-related incidents involving Salmonella far exceed those involving L. monocytogenes? E. coli O157's recognition was perhaps related to the Jack in the Box episode and the (perceived) ostrich attitudes of manufacturers? (An analogous but differently caused incident sadly occurred in the UK also.)
Most food residue and all microbes are rich in ATP ……………
(2.49)
Not trying to defend this statement for its own sake, but technically microbes are indeed rich in ATP, and even a commercially sterile food is not free of microbes (think spoilage). This applies especially when the food sits on an equipment surface, where it provides a nutrient source and where ATP tests are usually conducted.
Published documents indicate large variations in ATP between bacterial species.
…….lots having the calculated mean concentrations or greater will be rejected with at least 95% confidence. Each of these plans achieves assurance that Lm is present at <1 in the sample size.
I don’t see any errors in these statements. They have been validated by ICMSF, published in several peer-reviewed journals, and confirmed by researchers from several countries.
I'm no statistician, but there are long-standing arguments (e.g. frequentist vs Bayesian) over the statistical validity of such statements. The subtleties involved probably represent nit-picking to the users of the conclusions, but such is life in the fast lane. It could be argued that the plans are more a support for HACCP.
(Pg 4.6)
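For what the "rejected with at least 95% confidence" wording usually rests on, here is a minimal sketch under the standard textbook assumptions: cells Poisson-distributed through the lot, and a presence/absence test that detects any sample containing one or more cfu. The plan parameters (20 samples of 25 g) are illustrative choices of mine, not taken from the FSIS/FDA document.

```python
# Sketch of the statistics behind "reject with >= 95% confidence",
# assuming Poisson-distributed cells and a perfect presence/absence test.
import math

def prob_rejection(conc_cfu_per_g: float, n_samples: int,
                   sample_mass_g: float) -> float:
    """Probability that at least one of n samples tests positive
    when the lot's mean concentration is conc_cfu_per_g."""
    p_sample_negative = math.exp(-conc_cfu_per_g * sample_mass_g)
    return 1.0 - p_sample_negative ** n_samples

# Hypothetical plan: 20 samples of 25 g (500 g total analytical mass).
# A mean concentration of ~3/total_mass gives ~95% rejection probability,
# since 1 - exp(-3) ~= 0.95.
c = 3.0 / (20 * 25)
print(f"{prob_rejection(c, 20, 25):.3f}")
```

Note this says nothing about lots at lower concentrations, which is roughly where the frequentist/Bayesian quibbling over "confidence that Lm is present at <1 in the sample size" starts.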
The IT document also contains a few interesting (in some cases repetitive) debatable statements –
“An E. coli contaminated beef product must not be distributed until it has been processed into a ready-to-eat product,” the regulations say.
As I have mentioned several times before, USDA allows reprocessing and sale of a 'contaminated' product as long as lethality is reached using a validated cooking process.
My point was that "E. coli" is not even a definitive pathogen, as stated above. The article contains numerous errors.
“Even though cooking it to 165 degrees makes it sterile,” said ……………..
Calling it sterile is where I draw the line, even if the statement comes from the chairman of the National Joint Council of Food Inspection!
Quote
But I doubt any other region of the world would be more demanding than “less than 0.04 cfu/g”
< 0.02 cfu/g is also quite popular.
And composites?
You can make it as tight as you like. In the US, as allowed by USDA, for Listeria monocytogenes you can go down to <0.008 cfu/g, and for E. coli O157:H7 in beef as low as <0.003 cfu/g, as the specified standards for 'absent'.
Composites are not allowed in the US, at least for RTE food product samples to be tested for Listeria monocytogenes.
The (meaningful) limits are also related to detection abilities / sensitivities / specificities. Unfortunately the associated maths can get quite unpleasant (to me anyway), especially where pooling is involved.
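The cfu/g figures quoted above translate directly into "absent in X grams" criteria: a presence/absence test on mass m grams has an effective limit of 1/m cfu/g. The sample masses below are my inference from the quoted limits (e.g. 1/0.04 = 25 g), not figures stated in the thread.

```python
# Sketch: "absent in X grams" as an effective cfu/g detection limit.
# Sample masses are inferred from the limits quoted in the discussion.

def effective_limit(mass_g: float) -> float:
    """Effective limit (cfu/g) for a presence/absence test on
    mass_g grams: one cell anywhere in the analysed portion."""
    return 1.0 / mass_g

for mass in (25, 50, 125, 325):
    print(f"absent in {mass:3d} g  ->  < {effective_limit(mass):.3f} cfu/g")
```

This reproduces the <0.04, <0.02, <0.008 and <0.003 cfu/g figures from the exchange; pooling complicates the arithmetic because a composite dilutes each contributing sub-sample.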