Calibration/Validation of Equipment


Lucas

Posted 27 November 2008 - 09:13 AM

Hello all,

I've read some topics about calibration here on this site and, honestly, I think some of you are misusing the word calibration.

Calibration of a piece of equipment has to be done against defined rules (for example ISO 17025) and by an accredited lab (or internally, if we use the same methodology). Validation is something we can do internally every day or every shift.

For example, if we want to calibrate a scale, we have to have it calibrated by an external lab (or use the same methodology they use), but we can (and have to) validate the accuracy of the scale, using certain test weights, every day or every shift.

But it is not as simple as that. Some organisations state that we should calibrate all equipment every year, at least the items related to CCPs or to legal requirements. We all know that each piece of equipment should be calibrated at at least 3 different points. Imagine a temperature probe that is used to measure temperatures around 30°C to 50°C. In this case it may be useful to calibrate the probe at temperatures such as 30, 40 and 50°C.
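
As a rough sketch of what such a three-point check could look like (the readings and the 0.5°C tolerance below are made-up figures, only to illustrate the idea):

# Minimal sketch: comparing a probe's readings against a reference thermometer
# at three points spanning the working range 30-50 degC.
# All readings and the tolerance are hypothetical illustration values.

calibration_points = [30.0, 40.0, 50.0]                      # degC, chosen to cover the use range
probe_readings     = {30.0: 30.3, 40.0: 40.1, 50.0: 49.4}    # hypothetical probe values
reference_readings = {30.0: 30.0, 40.0: 40.0, 50.0: 50.0}    # hypothetical reference values
tolerance = 0.5                                              # degC, example acceptance limit

for point in calibration_points:
    error = probe_readings[point] - reference_readings[point]
    status = "OK" if abs(error) <= tolerance else "OUT OF TOLERANCE"
    print(f"{point:5.1f} degC: error {error:+.1f} degC -> {status}")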

Now imagine that you are making your calibration plan and your key equipment is:
- 1 oven (with 3 probes);
- 2 small scales;
- 4 data loggers (refrigerators);
- 2 automatic scales (looking like an octopus, used in automatic packaging machines);
- 1 metal detector;
- 1 portable probe.

Now, if you have to calibrate each of these pieces of equipment at 3 different points every year, you will get an astronomical bill. But this is something (I think) we can do nothing about.
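
Just to put rough numbers on it (the price per calibration point is purely an assumed figure):

# Rough arithmetic sketch of the bill, using the equipment list above.
# The 40 euro per-point price is an invented figure for illustration only.

equipment_points = {
    "oven probes": 3,        # 1 oven with 3 probes
    "small scales": 2,
    "data loggers": 4,
    "automatic scales": 2,
    "metal detector": 1,
    "portable probe": 1,
}
points_per_instrument = 3
assumed_price_per_point = 40.0   # euros, hypothetical

instruments = sum(equipment_points.values())            # 13 instruments
total_points = instruments * points_per_instrument      # 39 calibration points per year
print(f"{instruments} instruments x {points_per_instrument} points = {total_points} points/year")
print(f"Indicative yearly cost: {total_points * assumed_price_per_point:.0f} euros")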

Any opinions on this?

Regards,
Lucas



AS NUR

Posted 27 November 2008 - 09:47 AM



Dear Lucas..

According to the International Vocabulary of Basic and General Terms in Metrology (VIM), the definitions are:

1. Calibration:

"definition (a): operation establishing the relation between quantity values provided by measurement standards and the corresponding indications of a measuring system, carried out under specified conditions and including evaluation of measurement uncertainty

definition (b): operation that establishes the relation, obtained by reference to one or more measurement standards, that exists under specified conditions, between the indication of a measuring system and the measurement result that would be obtained using the measuring system"



2. Verification:

"confirmation through examination of a given item and provision of objective evidence that it fulfils specified requirements"

and

3. Validation

"confirmation through examination of a given item and provision of objective evidence that it fulfils the requirements for a stated intended use"

So IMO, what you call "validation" is, according to the definitions above, the same as verification.

And for the calibration process, IMO, we have to include the operating point among the calibration points. For example, if we use a thermometer with a 0-100°C scale and the operating point is 50°C, we have to include 50°C in the calibration; in that case we could calibrate the thermometer at, say, 30°C, 50°C, 70°C and 100°C.
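
A minimal sketch of that idea, where the point spacing and values are only illustrative:

# Sketch: whatever points are spread over the instrument's scale, the actual
# operating point must be among them. Values are illustrative only.

def choose_calibration_points(scale_min, scale_max, operating_point, extra_points=2):
    """Spread a few points over the scale and force the operating point in."""
    step = (scale_max - scale_min) / (extra_points + 1)
    points = {round(scale_min + i * step, 1) for i in range(extra_points + 2)}
    points.add(operating_point)            # guarantee the operating point is tested
    return sorted(points)

# Thermometer with a 0-100 degC scale, used mostly at 50 degC
print(choose_calibration_points(0.0, 100.0, 50.0))   # [0.0, 33.3, 50.0, 66.7, 100.0]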

IMEX, for the metal detector case, we just do verification every shift (the procedure is simply passing the metal test standard through without product) and validation every six months (the procedure is passing the metal test standard through with product).
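
As a sketch only, a shift check could be recorded along these lines (the field names and test piece size are assumptions, not an actual procedure):

# Hypothetical record of a metal detector check, following the verification
# (every shift, without product) vs validation (6-monthly, with product) split.

from dataclasses import dataclass
from datetime import date

@dataclass
class DetectorCheck:
    check_date: date
    check_type: str       # "verification" (every shift) or "validation" (e.g. 6-monthly)
    with_product: bool    # False for shift checks, True for validation runs
    test_piece: str       # e.g. "Fe 2.0 mm"
    rejected: bool        # the detector must reject every test piece

shift_check = DetectorCheck(date.today(), "verification", False, "Fe 2.0 mm", True)
print("PASS" if shift_check.rejected else "FAIL - stop the line and recheck product")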



Lucas

Posted 27 November 2008 - 10:06 AM

Hello As Nur,

I understand what you said, but that doesn't differ from what I said.

I know that we can avoid the calibration of metal detectors if we manage to prove that we validate them, by monitoring them every day with metal test pieces. However, the auditor can still ask for a calibration certificate, because we are talking about different things: calibration and validation/verification. And in terms of certification, all auditors want to see the calibration certificates.

I will give you an example. Here in Portugal, the law demands that the local council validate the scales used in the food industry. They test the scales (but do not calibrate them) and, if everything is OK, they put a validation stamp on them (valid for a year). However, the auditors do not consider this a valid calibration certificate (because it is not a calibration process; it is more like a validation step, which we can all do internally).

So, the problem is that the required calibration process creates a lot of financial problems for small companies that have a lot of equipment, as in the example I gave.

Lucas



a_andhika

Posted 27 November 2008 - 12:10 PM

Dear Lucas, long time no see...

I agree with AS Nur. IMO the validation of the scales in your case is a verification. And also, IMO, calibration is a verification activity. IMEX, the calibration activity is comparing the reading of the standard equipment (belonging to the accredited external party) with the reading of our equipment. Then, if we find the deviation between them is too wide, we need to adjust it until it meets the requirements (or maybe buy a new one). Or do you mean that calibration is an adjustment? Feel free to correct me.
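
As a minimal sketch of that comparison (the readings and the maximum permissible error are invented figures):

# Sketch: compare our instrument against the reference reading and flag it
# for adjustment (or replacement) when the deviation is too wide.

def needs_adjustment(our_reading, reference_reading, max_permissible_error):
    deviation = our_reading - reference_reading
    return abs(deviation) > max_permissible_error, deviation

needs_fix, dev = needs_adjustment(our_reading=10.12, reference_reading=10.00,
                                  max_permissible_error=0.05)   # hypothetical figures
print(f"deviation {dev:+.2f}: {'adjust or replace' if needs_fix else 'accept as found'}")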

But well... yeah, that kind of activity can be done by yourself, and you may find difficulties if the auditor asks you for the calibration certificate. But then, the Master Set that you use to internally calibrate (IMO) your equipment must of course be calibrated by the external party/metrology body (for example: Master Weights, Master Thermometer). IMO, the calibration certificate of your Master Set can be shown to the auditor. Plus, the calibration technician must be well trained too (you should also have that certificate).

You might love to take a look at this proposed guidelines from codex too:
http://www.codexalim...22/cxg_069e.pdf


Regards,


Arya


IF
safety and quality means perfection
AND
nobody's perfect
THEN
why should I bother?

Lucas

Posted 27 November 2008 - 01:43 PM

Hello Arya,

Big confusion here. Forget what you understand by verification.

Are you trying to say that I don't need to calibrate the scales, but only the Master Weights I use to check whether the scales are working properly? If so, please keep in mind that there is equipment we cannot do that with, for example the automatic scales (mentioned in a previous post) used for packaging.

And then, what is your solution for calibrating an oven 24 metres long with 3 probes? A Master Thermometer is of no use in this case.

Of course we can have a Master Thermometer if we use several thermometers in the monitoring procedures. However, we are not talking here about removable/portable thermometers. I don't know if I am making myself clear.

And then, a doubt: what do you mean by IMO? What does that stand for?

Please read the list of equipment that I put in a previous post and tell me how you would handle it when the auditor asks for proof (a certificate) of calibration.


Thanks
Lucas



Charles.C

Posted 28 November 2008 - 12:38 PM

Dear Lucas,

IMO = In My Opinion, IMEX = In My Experience

Interesting post. I think I agree and disagree with some of your comments. It is possible that Portugal is less flexible in its audit demands than elsewhere.

Without reading all this thread in detail, I suspect most of these posts are correct in different ways.

IMEX, calibration has certain uniformly accepted basic requirements, i.e. choice of a reference standard, choice of a reference method, use of primary/secondary standards, etc. However, within these items there may be substantial interpretational differences depending on your choice of reference. I will give two simple practical examples with non-simple theoretical backgrounds.

(a) Some "standard" references say you can calibrate the 100 degC point on a thermocouple-based thermometer by insertion into "boiling" distilled water. Other sources specify that it must be located in the steam, e.g. using a kettle. Personally I have found that there may indeed be significant differences, but often also depending on the actual instrument.

(b) Similarly, the typical method for the 0 degC point is insertion into "melting" ice/water. One of my local calibration companies consistently got results significantly different from my own (difference ca 0.2-0.5 degC) and claimed this was due to their use of distilled water for the ice and "highly" crushed ice for the actual test. Frankly I never believed them, but I used their certificate just the same and set the zero point according to my own results. A later company got results more closely matching my own and said they suspected the previous deviation was due to an electronic temperature simulation "black box" being used, which is indeed another commercial "standard" method. I shall never know the truth, of course.

In contrast, the terminologies validation/verification have a multitude of interpretational differences at both the principle and operational level, again depending on the source reference. For example, the current standard interpretation of HACCP validation in the USA (NACMCF) sets validation as a subset of verification, unlike (I think) ISO 22000. The former clarifies its usage by (from memory) including text like "for the purposes of this document, validation is defined as ...". The result is that the basic HACCP flow layout is somewhat rearranged. This is a well-known feature, although frequently not mentioned in subsidiary "standard" presentations. Codex and related authors have issued various nice documents on this subject, some of which even discuss these variations. The result is that "it's up to you".

My conclusion is: (a) there are various acceptable ways to calibrate, and even more acceptable ways to validate/verify, although local conditions may restrict the options (??); (b) you sometimes need to be very careful when comparing data of any kind, including the above, to avoid the "apples and oranges" effect. My first questions when claimed discrepancies occur in results are (1) the detailed procedure reference and (2) the method of calibration, if instrumentation is involved. It is amazing, IMEX, how often companies use shortcut methods without knowing the method limitations, the justification being that it is the "standard" company method. It's true that procedures like AOAC can be impossibly lengthy, so shortened versions are inevitable, but then knowledge is also a requirement.

I use something similar to Arya's method for scales. Invest in primary stainless steel master weights (externally calibrated yearly with a certificate [demanded by the auditor]) and then create sub-master secondary brass weights (cheaper) for routine use, after using the primary ones to "standardise" a chosen balance or balances (the choice depending on the absolute level and sensitivity required by the respective daily use). It's true a significant (but mainly one-off) initial cost is involved, plus some deep thinking as to one's exact usage, however the cost of paying someone to calibrate a large range of weighing scales on-site can be substantial unless they offer a package deal. I sometimes have it done professionally first for (particularly) high-weight units and then copy their method for ongoing work. However, I agree it can get complicated at levels like 100 kg or <= a few mg.

(I personally find it equally difficult to reliably achieve routine checking of accuracy for, say, 200 units of 5kg scales. In some respects, analog balances are easier than digital.)
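
As a rough sketch of the master/sub-master idea above, with invented certificate values and readings:

# Sketch: the stainless masters carry an external certificate, a chosen balance
# is checked against them, and a brass sub-master is then assigned a working
# value on that standardised balance for routine use. All numbers are made up.

master_weights_certified  = {"1 kg": 1.0000, "5 kg": 5.0001}   # from the external certificate (example)
balance_reading_of_master = {"1 kg": 1.0002, "5 kg": 5.0004}   # hypothetical as-found readings

# Correction of the chosen balance at each load
balance_correction = {k: master_weights_certified[k] - balance_reading_of_master[k]
                      for k in master_weights_certified}

# Assign a working value to a brass sub-master weighed on the standardised balance
brass_reading = 1.0010                                         # hypothetical
brass_assigned_value = brass_reading + balance_correction["1 kg"]
print(f"brass 1 kg sub-master assigned value: {brass_assigned_value:.4f} kg")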

Rgds / Charles.C




Lucas

Posted 30 November 2008 - 11:24 AM

Hello Charles,

It's always good to hear from you. Thanks for helping me with the acronyms. I never found this way of abbreviating easy, because we don't do it in Portuguese. When I was living in London, I was always asking what they meant by asap, spec, etc.

But then, in the end, I found it very useful.

Well, returning to the calibration/validation of equipment: I contacted several governmental bodies and found that they are also not very sure about what should be done. This is something I cannot understand, but it happens a lot. There is an EU regulation, transposed into our national legislation, which says that companies are entitled to find the best solution to ensure the equipment is working properly and that the results are reproducible and reliable. Based on this, I decided that I am going to calibrate the oven probes and the equipment related to CCP control (such as the scales used to weigh the preservatives, the data loggers and the monitoring thermometers). I will make sure that the metal detector is also calibrated. Equipment such as the climatisation rooms (used to proof the dough), the automatic scales, and the flux passage devices used to weigh other raw materials/ingredients I will just verify to confirm they are working properly (as I will also do with the calibrated equipment). So I am not planning to have the master weights calibrated; I will weigh them on a calibrated scale prior to using them to verify the automatic scales (where they need to be used). We can then compare the results.
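
Written out as a simple table (the frequencies are only indicative, not fixed yet), the plan above looks something like this:

# Sketch of the plan described above: CCP-related equipment is sent for external
# calibration, the rest gets internal verification only. Frequencies are assumed.

calibration_plan = {
    # equipment related to CCP control -> external calibration
    "oven probes":                    ("calibrate externally", "yearly"),
    "scales for preservatives":       ("calibrate externally", "yearly"),
    "data loggers":                   ("calibrate externally", "yearly"),
    "monitoring thermometers":        ("calibrate externally", "yearly"),
    "metal detector":                 ("calibrate externally", "yearly"),
    # everything else -> internal verification only
    "climatisation (proofing) rooms": ("verify internally", "per shift"),
    "automatic scales":               ("verify internally", "per shift"),
    "flux passage devices":           ("verify internally", "per shift"),
    "master weights":                 ("weigh on a calibrated scale", "before use"),
}

for item, (activity, frequency) in calibration_plan.items():
    print(f"{item:32s} {activity:30s} {frequency}")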

I think, after reading the EU regulations and the transposed internal law, this is more than enough.

IMEX and IMO (here I am using acronyms :thumbup: ), I think the people who write the codes and standards should be more precise. In this aspect (calibration), all the standards are very vague. However, comparing ISO 22k, BRC and IFS, I found the first one much, much more confusing.

A nice weekend for all of you.
Lucas




