I got stumped today.
We have a part with a thickness of 0.035 +/- 0.002"
Our incoming inspection measured multiple points on the surface of the piece using two calibrated height gauges. One height gauge is accurate to 0.001" and the other one is accurate to 0.00012".
With the first height gauge, the measurements all fall within the tolerance range.
With the second height gauge, a few measurements fall outside the range (e.g., 0.0374").
This brings up the question: when evaluating dimensions on a part, how accurate does the measurement tool need to be? I found some information regarding a precision-to-tolerance (P/T) ratio. I also found a general rule of thumb that the tool should be accurate to within 10% of the tolerance range; the range here is 0.004", so that works out to 0.0004". Our QA guys want some 'evidence' to back this up though. Thoughts?
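For what it's worth, here is a quick sanity check of that 10% rule in Python. This is just a sketch, not official guidance: it assumes the rule is applied to the full 0.004" tolerance band and treats each gauge's stated accuracy as its measurement error.

    nominal = 0.035                 # part thickness, inches
    tol = 0.002                     # +/- tolerance, inches
    band = 2 * tol                  # total tolerance band = 0.004"

    gauges = {"gauge 1": 0.001, "gauge 2": 0.00012}   # stated accuracies, inches

    for name, accuracy in gauges.items():
        ratio = accuracy / band    # fraction of the band the gauge consumes
        verdict = "OK" if ratio <= 0.10 else "too coarse for this tolerance"
        print(f'{name}: {accuracy}" is {ratio:.1%} of the 0.004" band -> {verdict}')

By that arithmetic, the 0.001" gauge consumes 25% of the band, so readings near the limits can't distinguish a good part from a bad one, while the 0.00012" gauge consumes only 3% and comfortably meets the 10:1 rule.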