Q: I am looking for some information regarding blueprint tolerances and measurement tools used to measure certain features.
For example, can the same type of tolerance be applied over the length of 450 mm as it could be for a distance of 3 mm? Is there additional measurement error or gage error that needs to be applied for longer distances? If one uses a 1” micrometer for measuring a feature, does it make a difference in the measurement error by using the top end of the instrument versus using it to measure just very small features?
A: Thank you for your questions about measurement tolerances. First of all, since your questions were multi-layered, my answers will be as well. Nonetheless, I think I should be able to help you.
As for using the same type of tolerance for a dimension of 450 mm and a dimension of 3 mm, there is more than one answer. We’re talking about 17.7165 inches vs. .118 inches. The three F’s must be considered first: Form, Fit, and Function. In other words, where will this product be used? If this is for a medical product, or for anything whatsoever where safety is a factor, the design engineer will most likely use a tighter tolerance. So both dimensions could be ± .001 or a more liberal ± .010. The difference between the two sizes would just change the way they are manufactured. For example, a 17.717-inch product with a tolerance of ± .030 could probably be rough machined or even made on a burn table. If the size or location of the smaller dimension is critical, you would machine it with appropriate equipment and hold a tighter tolerance. OK, enough Manufacturing 101 lingo.
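To make the arithmetic above concrete, here is a small illustrative sketch (the helper function and its name are my own, not part of the answer) that converts both nominal sizes to inches and applies the same symmetric tolerances to each:

```python
# Illustrative only: convert nominal sizes from mm to inches and compute
# the resulting limits for a symmetric +/- tolerance given in inches.
MM_PER_INCH = 25.4

def limits_in_inches(nominal_mm, tol_in):
    """Return (lower, upper) limits in inches for a nominal size in mm
    with a symmetric +/- tolerance in inches."""
    nominal_in = nominal_mm / MM_PER_INCH
    return (nominal_in - tol_in, nominal_in + tol_in)

for nominal_mm in (450.0, 3.0):
    for tol in (0.001, 0.010):  # the "tight" and "liberal" tolerances above
        lo, hi = limits_in_inches(nominal_mm, tol)
        print(f"{nominal_mm} mm +/- {tol} in -> {lo:.4f} to {hi:.4f} in")
```

The point is that the tolerance itself can be identical for both sizes; only the manufacturing method needed to hold it changes.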
With regard to measurement error, larger/longer dimensions can introduce the possibility of increased measurement error. However, if a qualified and experienced individual is doing the measurement, that should not be a major factor; the same basic skills and standards apply. The type of measurement equipment can make a difference, though. If you use a dial caliper, you can probably rely on it to be accurate within .001-.002 inches. If you use a 0-1 inch micrometer, you should be able to trust its accuracy within .0001 inch.
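One way to reason about whether an instrument is good enough for a given tolerance is to compare the tolerance band to the instrument's accuracy. The 10:1 ratio used below is a common rule of thumb, not something the answer above prescribes; the numbers simply reuse the caliper and micrometer accuracies just mentioned:

```python
# Hypothetical helper: ratio of the total tolerance band to the
# instrument's accuracy. A common rule of thumb wants this >= 10.
def accuracy_ratio(total_tolerance_in, instrument_error_in):
    """Return tolerance band divided by instrument accuracy."""
    return total_tolerance_in / instrument_error_in

# Checking a +/- .001 in feature (total band .002 in):
caliper = accuracy_ratio(0.002, 0.002)      # dial caliper, worst case
micrometer = accuracy_ratio(0.002, 0.0001)  # 0-1 in micrometer
print(caliper, micrometer)
```

On those assumed numbers, the caliper's ratio is only 1:1 for a tight tolerance, while the micrometer's is about 20:1, which is why the choice of equipment matters more than the length of the dimension.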
A qualified metrologist and/or a quality technician would know to check a micrometer at numerous points over its measuring range. Measurement error should not increase significantly from one end to the other. If it does, there is something wrong with the calibration or with the tool itself.
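The check described above can be sketched as a simple comparison of readings against reference standards (e.g., gage blocks) at several points across the range. The readings and the .0001 inch acceptance limit here are made-up numbers for illustration:

```python
# Illustrative sketch: flag calibration points where the micrometer
# reading deviates from the reference size by more than the limit.
def out_of_cal(points, limit=0.0001):
    """Return the reference sizes (in inches) where
    |reading - reference| exceeds the acceptance limit."""
    return [ref for ref, reading in points if abs(reading - ref) > limit]

checks = [
    (0.1000, 0.10005),  # near the bottom of the range
    (0.5000, 0.50002),  # mid-range
    (0.9000, 0.90015),  # near the top -- this one exceeds the limit
]
print(out_of_cal(checks))  # -> [0.9]
```

If points at one end of the range consistently fail while the rest pass, that is the sign of a calibration or tool problem described above, not an inherent penalty for measuring at the top of the instrument's range.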
I know the above may come across as general answers, but I am confident you will find the specifics you need in them as well.
ASQ Senior Member, CQT, CQI