
Thermometer Calibration - how to calculate the temperature deviation for hot boiling point?

Started by , Sep 24 2020 10:47 PM
12 Replies

Hi

I would like to know how to calculate the temperature deviation for the hot (boiling point) calibration check.

My company’s procedure for hot (boiling point) thermometer calibration is: once the water reaches a rolling boil (approx. 100°C), take the pot off the stove, then test the thermometer alongside the master thermometer for 1 minute.

We always get readings like this:
Master thermometer: 98°C
Calibrated thermometer: 97°C

Can we therefore say the deviation is -1°C? (A quick sketch of this arithmetic follows below.)

By taking the water off the stove we never get 100°C; the highest we ever see is about 98°C.

Is our procedure and method of calculating deviation correct?

Thanks in advance!
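A minimal sketch of the deviation arithmetic being asked about, assuming the deviation is defined as the reading of the thermometer under test minus the master (reference) reading taken at the same moment; the 98°C / 97°C values are just the example readings above:

master_reading_c = 98.0   # master / reference thermometer, deg C
test_reading_c = 97.0     # thermometer being calibrated, deg C

# Deviation of the test thermometer relative to the master read at the same time
deviation_c = test_reading_c - master_reading_c
print(f"Deviation: {deviation_c:+.1f} degC")   # prints: Deviation: -1.0 degC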


Type of master thermometer?

Has the master thermometer been recently, officially (e.g. traceable to NIST) calibrated at 100.0°C? (Typically a deviation of 2°C would justify a repair or dump of the thermometer.)

(Personally I use the generated steam since I find it more consistent, but most textbooks use water.)

Yes, the master thermometer is a probe thermometer and has been calibrated at 100°C.

Is it because we take the pot of water off the stove that it does not reach 100°C? If we put the thermometer in the water while it is boiling, it does read more than 100°C.


I assume the official calibration stated that the deviation at 100.0°C was 0.0°C?

Yes, the procedure is illogical.

Hopefully the indicated temperature when actually boiling is not 102°C?

Are you at a higher altitude? Water will boil at a lower temperature when you are at higher altitudes. I wonder if your problem is not with your thermometer, but with your altitude?

https://en.wikipedia...ltitude_cooking


Hi Kara,

Thanks for the link.

Roughly 0.5°C change per 150 m of elevation (sketched below).

I suppose it might be on a mountain.

I suspect the last paragraph in Post 3 indicates the main problem.
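A quick sketch of that rule of thumb, assuming a simple linear drop of about 0.5°C per 150 m of elevation (an approximation that only holds for modest altitudes; the 600 m example is hypothetical):

def approx_boiling_point_c(altitude_m: float) -> float:
    # Boiling point of pure water, approximated as 100 degC at sea level
    # minus ~0.5 degC for every 150 m of elevation (linear rule of thumb).
    return 100.0 - 0.5 * (altitude_m / 150.0)

print(approx_boiling_point_c(0))     # 100.0 degC at sea level
print(approx_boiling_point_c(600))   # 98.0 degC, so a site ~600 m up would boil near 98 degC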

If your reference / calibrated thermometer has been checked and verified, I wouldn't worry too much about your specific numbers. You need to make sure you are achieving a consistent reading and use the reference thermometer as the standard.

If your reference thermometer is reading 98°C and the thermometer you are checking is within 1 degree, you are fine. If it is outside of 1°C that could be a problem, but it depends on what you have stated in your calibration policy / program.



Hi Ryan,

Regarding the highlighted point above (readings outside of 1°C): IMEX it would be a problem.

IMEX it can be difficult to achieve a tolerance of 1°C at both 0.0°C and 100.0°C, but such is life. Sometimes one needs 2 thermometers.

True, it can be difficult; it really depends on the thermometers, how the temperature is being taken, and what is in the calibration policy. Most regulations do state ±2°C as an acceptable variance, but with the right equipment and settings, within 1°C is definitely achievable.
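A small sketch of the acceptance check being discussed, assuming the tolerance (±1°C or ±2°C) comes from whatever the site's calibration policy specifies; the function name and readings are illustrative only:

def check_calibration(reference_c: float, test_c: float, tolerance_c: float) -> bool:
    # Compare the thermometer under test against the reference reading taken
    # at the same time, and pass/fail it against the policy tolerance.
    deviation_c = test_c - reference_c
    passed = abs(deviation_c) <= tolerance_c
    print(f"deviation {deviation_c:+.1f} degC vs tolerance +/-{tolerance_c} degC: "
          f"{'PASS' if passed else 'FAIL - investigate / recalibrate'}")
    return passed

check_calibration(reference_c=98.0, test_c=97.0, tolerance_c=1.0)   # PASS
check_calibration(reference_c=98.0, test_c=95.5, tolerance_c=2.0)   # FAIL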

 


Maybe it varies.

(One could argue that if the master is ±1°C and the sub-master is ±1°C then the worst-case combined result is 2°C, as illustrated below, but this is perhaps somewhat disingenuous. :smile: )

The Food and Drug Administration model Food Code specifies that food service thermometers shall be accurate to ±1°C in the intended range of use.

Attachment: thermometer accuracy.pdf (277.56 KB)
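A tiny numerical illustration of the worst-case stacking argument above (the tolerance values are just the ±1°C figures from this thread):

# Worst-case stacking: if the master can be off by up to 1 degC and the
# thermometer checked against it by up to another 1 degC, the combined
# worst-case error is the sum of the two tolerances.
master_tolerance_c = 1.0
sub_master_tolerance_c = 1.0
worst_case_error_c = master_tolerance_c + sub_master_tolerance_c
print(worst_case_error_c)   # 2.0 degC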



Well, the master thermometer indicates 100°C when boiling. Should be OK?



See the 1st query in Post 4.


What is the purpose of the process? Heating the water to 100°C, or vaporisation to generate steam?

The boiling phenomenon depends on the purity of the water and the atmospheric pressure. Assuming it is pure/distilled water, the temperature of the liquid water never reaches or exceeds 100°C when the outside pressure is less than 1 atm (a rough numerical illustration follows below).

If you need to boil pure water at 100°C or above, then increasing the pressure is the only option, e.g. by using a lid.

If the purity doesn't matter, adding some salt helps to raise the boiling point.
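To put rough numbers on the pressure dependence described above, a sketch using the Antoine equation for water (constants valid roughly 1-100°C); this is a textbook approximation, not part of any calibration procedure:

import math

# Antoine equation for water: log10(P_mmHg) = A - B / (C + T_degC),
# with constants valid roughly between 1 and 100 degC.
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(pressure_mmhg: float) -> float:
    # Solve the Antoine equation for the temperature at which the vapour
    # pressure equals the ambient pressure, i.e. the boiling point.
    return B / (A - math.log10(pressure_mmhg)) - C

print(round(boiling_point_c(760.0), 1))   # ~100.0 degC at standard pressure
print(round(boiling_point_c(707.0), 1))   # ~98.0 degC at reduced pressure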

