Calibration Impact


Let us never forget the importance of calibration: the validity of our decisions and product conformity depend upon it!

 

When a calibration program is successful, we accept conforming product and reject nonconforming product.

 

When it’s not, we may accept nonconforming product and reject conforming product!

 

Like any process, however, the calibration process must be managed and controlled effectively and efficiently, with informed decisions made from objective data.

 

It’s not just about “sending it out for calibration again!”

 

We must ensure that we maintain appropriate control over accuracy and that our measurement equipment remains reliable, so that it does not negatively impact product integrity.

 

I’ve chosen to use some real-world examples and basic fundamentals for a better general understanding.

Theoretical concepts, uncertainties, Gage R&R, etc., all have merit, but unfortunately they have been overused, misused, and abused.

 

Let’s use the example of a manufacturing facility that has 100 0-1” micrometers in use in its manufacturing environment.

They had chosen a 1-year calibration interval, and at the time of calibration they find that fewer than 90 of those micrometers are within acceptable calibration limits.

(The as-found condition, prior to any corrections!)

 

So if fewer than 90 micrometers are found within limits, our reliability is less than 90%, right?

The calibration process has failed miserably!

 

The “interval” of 1 year was neither reliable nor adequate, based on the results data!
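
The arithmetic here can be put into a quick sketch. The count of 87 in-tolerance micrometers is an illustrative assumption (the text only says fewer than 90), and the 90% target is one common choice; set your own per policy.

```python
# As-found reliability check for a calibration interval (illustrative sketch).

def as_found_reliability(in_tolerance: int, total: int) -> float:
    """Fraction of instruments found within tolerance at calibration."""
    return in_tolerance / total

# Assume 87 of the 100 micrometers were within limits as found:
reliability = as_found_reliability(87, 100)
target = 0.90  # common reliability target; choose per your quality policy

print(f"Observed as-found reliability: {reliability:.0%}")
if reliability < target:
    print("Interval inadequate: shorten it and investigate the failures.")
```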

 

In addition, if any of the micrometers have exceeded the calibration tolerance by more than 50%, they are “significantly out of tolerance,” and we must now assess the potential impact to product.
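
That 50% screen can be written as a small helper. A minimal sketch, assuming errors and tolerances are expressed as ± values; the function name and example numbers are mine:

```python
# "Significantly out of tolerance" screen, per the 50%-beyond-tolerance rule.

def significantly_oot(as_found_error: float, tolerance: float,
                      threshold: float = 0.50) -> bool:
    """True if the as-found error exceeds the calibration tolerance
    by more than `threshold` (default 50%) beyond the limit."""
    return abs(as_found_error) > tolerance * (1.0 + threshold)

# A micrometer with a +/-.0002" calibration tolerance:
print(significantly_oot(0.00035, 0.0002))  # beyond .0003 -> assess product impact
print(significantly_oot(0.00025, 0.0002))  # out of tolerance, but not significantly
```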

 

 

Let’s use another example: a manufacturing facility has a 24” height gage.

 

They had been calibrating the height gage themselves, using an 81-piece standard gage block set!

They had been measuring and accepting product in the 20” range, but the largest calibration standard used was 4”.

Their customer subsequently rejected product having a length of 19.750 ±.005.

 

This condition was unknown prior to the rejection! The device was subsequently found to have a major accuracy issue above 12”; it was significantly out of tolerance, and it certainly did impact the product!
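
The general lesson: the calibration must cover the range over which the device is actually used. A minimal sketch of that check, with the gage block sizes and function name being my illustrative choices:

```python
# Verify that calibration points cover the device's range of use (sketch).

def calibrated_to(cal_points: list[float]) -> float:
    """Largest point at which the device was actually verified."""
    return max(cal_points)

# The height gage was verified only up to 4" but used near 20":
cal_points = [1.0, 2.0, 3.0, 4.0]  # assumed gage block sizes, inches
use_max = 20.0                     # largest length measured on product

if calibrated_to(cal_points) < use_max:
    print("Accuracy above", calibrated_to(cal_points), "inches is unverified!")
```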

 

One more example: a supplier using a micrometer with an accuracy of ±.0002 to calibrate their plug gages to a tolerance of ±.0002.

This is an accuracy-ratio issue: we are at 1:1, not at 4:1.

 

Always keep in mind that the accuracy ratio of the standard used to calibrate other standards or measuring devices should be at least 4:1, and preferably 10:1.

 

In turn, the measuring device should also have an acceptable accuracy ratio to the product tolerance: 4:1 to 10:1.
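
Both ratio checks reduce to the same division. A sketch, comparing like quantities (± tolerance to ± accuracy), matching how the 1:1 plug gage case above is counted; the function name is mine:

```python
# Accuracy-ratio (TAR) check: tolerance being verified vs standard's accuracy.

def accuracy_ratio(item_tolerance: float, standard_accuracy: float) -> float:
    """Ratio of the tolerance being verified to the accuracy of the
    standard or device used to verify it (same form, e.g. both +/-)."""
    return item_tolerance / standard_accuracy

# Plug gages toleranced to +/-.0002, calibrated with a +/-.0002 micrometer:
ratio = accuracy_ratio(0.0002, 0.0002)
print(f"{ratio:.0f}:1")  # 1:1 -- far below the 4:1 minimum
print("OK" if ratio >= 4.0 else "Standard inadequate for this calibration")
```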

 

 

So please don’t even think of using a micrometer to check a product with less than .0008 total tolerance, or a 6” caliper for a product with a dimensional characteristic of less than .004 total tolerance.
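
Turned around, the 4:1 rule gives the smallest product tolerance a device should be applied to. The device accuracies below (±.0001 for a 0-1” micrometer, ±.0005 for a 6” caliper) are assumed typical values chosen to be consistent with the .0008 and .004 figures above; substitute your instruments’ actual specifications.

```python
# Minimum total product tolerance for a device, from the 4:1 rule (sketch).
# Device accuracies here are assumed typical values, not from the text.

def min_total_tolerance(accuracy_plus_minus: float, ratio: float = 4.0) -> float:
    """Smallest total (bilateral) product tolerance the device should check."""
    return 2.0 * accuracy_plus_minus * ratio

print(min_total_tolerance(0.0001))  # 0-1" micrometer at +/-.0001 -> .0008 total
print(min_total_tolerance(0.0005))  # 6" caliper at +/-.0005 -> .004 total
```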