Guideline for Selecting the Size of Test Pieces for a Metal Detector


6 replies to this topic

#1 jglopez (Spain)

Posted 21 June 2012 - 03:28 PM

I'm looking for a directive or standard for assessing the size of test pieces for a metal detector. Thanks



#2 esquef (United States)

Posted 21 June 2012 - 04:30 PM

Unfortunately there's no "standard or directive" that determines test piece sizes. Your company's HACCP team does that in Principle 3 (in the 7-principle system) or step 8 in the newer 12-step system. This step is called Establish Critical Limits. Before you can effectively do that, the HACCP team needs to complete the previous steps, which include conducting a hazard analysis (based on the process flow diagrams previously created by the team).

The United States FDA defines a critical limit as:

"A maximum and/or minimum value to which a biological, chemical or physical parameter must be controlled at a CCP to prevent, eliminate or reduce to an acceptable level the occurrence of a food safety hazard." Although the FDA's Health Hazard Evaluation Board has supported regulatory action against product with metal fragments of 0.3" [7 mm] to 1.0" [25 mm] in length, I doubt you'll find many customers who'll accept a 6.9 mm minimum metal inclusion.

In my experience the minimum for ferrous is 1.0-2.0 mm, 1.5-2.5 mm for non-ferrous, and 2.0-3.5 mm for stainless steel. Your customer(s) may have minimum requirements that they'll require you to meet.
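For a quick sanity check, the typical ranges above can be encoded as a small lookup. This is a minimal sketch using only the illustrative values from this post; they are not a standard, and your HACCP team and customers set the actual limits:

```python
# Typical minimum test-piece ranges in mm, taken from the post above.
# Illustrative only: these are experience-based, not a regulatory standard.
TYPICAL_RANGES_MM = {
    "ferrous": (1.0, 2.0),
    "non-ferrous": (1.5, 2.5),
    "stainless": (2.0, 3.5),
}

def within_typical_range(metal_type: str, size_mm: float) -> bool:
    """Check whether a proposed test-piece size falls in the typical range."""
    low, high = TYPICAL_RANGES_MM[metal_type]
    return low <= size_mm <= high

print(within_typical_range("ferrous", 1.5))    # True
print(within_typical_range("stainless", 4.0))  # False
```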



#3 tsmith7858 (United States)

Posted 21 June 2012 - 08:05 PM


esquef is right: you need to establish it based on your findings in the hazard analysis. Things to consider beyond the FDA guideline (which is primarily for choking hazards and does not protect anyone from damaging their teeth or swallowing something harmful):
  • Your supplier requirements (no point in setting your metal limit at 1.0 mm if you accept 2.0 mm from your supplier, unless you have a way to remove metal, such as magnets or screens, in your process)
  • Where metal hazards exist in your process and their possible impact
  • What you as a company are willing to accept as a liability (as mentioned, the FDA says 7 mm, but not many companies are willing to accept the claims for biting into a 6 mm metal piece)

And most of all:
  • What do your customers want?


#4 Charles.C (IFSQN Moderator)

Posted 21 June 2012 - 11:02 PM

Dear jglopez,

This topic has an encyclopedia of comments on this forum. Unfortunately there is no consensus on an agreed, perfect answer.

IMHO, the simplest solution is to take the minimum detection levels for the appropriate test pieces at which your detector will validate. These are fairly similar for most detectors and quite well known from manufacturer publications, and illustrated in various places on this forum. :smile:

I doubt that any auditor will reject this approach. (Anybody?)

Rgds / Charles.C




#5 Brian Meek (United Kingdom)

Posted 22 June 2012 - 10:32 AM

Ahhhh, metal detection: the black art of food production.

Look at a metal detector as a difference detector: it is not capable of finding metal as such; it finds a difference between its settings and your product.

Metal detectors have a few basic settings, named differently depending upon the manufacturer, but all are essentially the same:

Sensitivity, threshold: these are the trigger points for the contaminant or test sample.
Phase angle, compensation: these are the settings for the product.

How to set up:
The detector will probably have an automatic calibration feature, which you should follow; this is normally a good starting point. Once calibrated, you should watch the signal produced by the product each time it is "scanned", by means of the display. This is represented either by a number or by a scale of LEDs. When the product produces little or no effect, the product is tuned out.

Next you can look at the sensitivity or threshold. Pass the product through the detector and introduce your contaminants in 0.5 mm increments to find which sample size works for the product. Do this for each type of metal: Fe, NFe and stainless steel (316 or 304). You can then determine the repeatability of the system by watching it in production and checking whether your product is consistent. Inconsistency in the product will produce false triggering, which will require further tuning. A clear differential between the product signal and the product-plus-test-sample signal must be achievable, or else you will get a system that rejects good product.
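The 0.5 mm stepping procedure above can be sketched as follows. Here `run_test_pass` is a hypothetical stand-in for physically passing product plus a test piece through the detector and noting whether it triggers; the step and limit values are assumptions for illustration:

```python
def find_minimum_detectable(run_test_pass, metal, start_mm=0.5, max_mm=5.0, step_mm=0.5):
    """Step test-piece sizes upward in fixed increments until the detector
    triggers; return the smallest detected size, or None if nothing is
    detected up to max_mm."""
    size = start_mm
    while size <= max_mm:
        if run_test_pass(metal, size):
            return size
        size += step_mm
    return None  # nothing detected; the detector likely needs retuning

# Example with a stand-in "detector" that triggers at 2.0 mm and above:
demo = lambda metal, size_mm: size_mm >= 2.0
print(find_minimum_detectable(demo, "stainless"))  # 2.0
```

In practice each size would be run several times, and at different positions in the aperture, to confirm the repeatability the post describes before settling on a test piece size.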

Why do you get differing results for different products?
A metal detector is looking for either a conductive or a magnetic effect from the product. A good conductor is something like gold, silver or copper, but another good conductor is salt water. Since your product contains both salt and water, you have a product effect which requires attention. The product effect and the non-ferrous and stainless samples give similar signals to each other, so this is where your woes begin.

In the UK the supermarkets publish performance requirements based upon the aperture size of the system, but sometimes these are not achievable. I will dig out the table for reference for you later.

Kind regards

Brian



#6 Scotty (Scotland)

Posted 29 June 2012 - 08:38 AM


The attached table is based on an old M&S standard; it is generally advised to get the best sensitivity for your particular product.

Hope this helps.

Regards

Attached Files



#7 Brian Meek (United Kingdom)

Posted 29 June 2012 - 11:12 AM

Hi Scotty

That's the one I was looking for. You do, however, have to consider the relationship between the size of the product signal, the metal detector frequency, and the suitability of both with regard to detection.

Kind regards

Brian





