Gage R&R (gauge repeatability and reproducibility) is a measurement systems analysis technique that uses an analysis of variance (ANOVA) random-effects model to assess a measurement system. The evaluation is not limited to gages: it applies to all types of measuring instruments, test methods, and other measurement systems. A Gage R&R study quantifies the inherent variation in the measurement system, but not its accuracy. A Gage R&R study is a critical step in manufacturing Six Sigma projects, and it measures three things:

Repeatability – variation from the measurement instrument
Reproducibility – variation from the individuals using the instrument
Overall Gage R&R – the combined effect of (1) and (2)
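The three components above can be sketched numerically. The following is a simplified illustration (not the full AIAG procedure) using simulated data from a hypothetical study with 5 parts, 3 operators, and 2 trials each; the part values, operator biases, and noise levels are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated study: 5 parts x 3 operators x 2 trials (values in mm).
# True part diameters near 4.21 mm, plus operator bias and gage noise.
parts = 4.21 + rng.normal(0, 0.002, size=5)          # part-to-part variation
op_bias = rng.normal(0, 0.0005, size=3)              # reproducibility source
data = (parts[:, None, None] + op_bias[None, :, None]
        + rng.normal(0, 0.0004, size=(5, 3, 2)))     # repeatability noise

# Repeatability (EV): pooled standard deviation of the repeat trials
# within each part/operator cell.
ev = np.sqrt(np.mean(np.var(data, axis=2, ddof=1)))

# Reproducibility (AV): spread of the operator averages, with the
# repeatability contribution subtracted out (floored at zero).
op_means = data.mean(axis=(0, 2))                    # one mean per operator
n_per_op = data.shape[0] * data.shape[2]             # readings per operator
av_sq = np.var(op_means, ddof=1) - ev**2 / n_per_op
av = np.sqrt(max(av_sq, 0.0))

# Overall Gage R&R: combined effect of repeatability and reproducibility.
grr = np.sqrt(ev**2 + av**2)
print(f"EV={ev:.5f}  AV={av:.5f}  GRR={grr:.5f} mm")
```

The key point the sketch shows is that the overall Gage R&R is the root-sum-of-squares of the repeatability and reproducibility components, since the two variance sources are assumed independent.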
The overall Gage R&R is normally expressed as a percentage of the tolerance for the characteristic being studied, and a value of 20% or less is considered acceptable in most cases. Example: for a 4.20 mm to 4.22 mm specification (0.02 mm total tolerance) on a shaft diameter, an acceptable Gage R&R value would be 20 percent of 0.02 mm (0.004 mm) or less.

Conducting a Gage R&R Study
GR&R studies can be conducted on both variable gaging (gaging that produces measurement data) and attribute gaging (gaging that produces a go/no-go result). Before conducting a Gage R&R study, the following steps and precautions should be taken:

Calibrate the gage – ensure that the gage is calibrated through its operating range; keep in mind that Gage R&R and gage accuracy are two different things.

Check the gage resolution – the gage should have sufficient resolution to distinguish between several values within the tolerance range of the feature being measured. As a general rule, the gage should be able to distinguish at least ten readings within the tolerance range. See distinct categories for more information.

Collect samples to be measured – it's important to collect samples (parts, materials, or whatever is being measured) that represent the majority of the...
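The resolution rule of thumb above can be checked with a one-line calculation; the gage resolution used here (0.001 mm) is a hypothetical value for illustration, combined with the 0.02 mm tolerance from the earlier shaft-diameter example.

```python
# Rule of thumb: the gage should distinguish at least ten readings
# across the tolerance range of the feature being measured.
tolerance = 0.02     # mm, from the 4.20-4.22 mm shaft-diameter spec
resolution = 0.001   # mm, hypothetical gage resolution

distinct_readings = tolerance / resolution
adequate = distinct_readings >= 10
print(f"{distinct_readings:.0f} distinct readings -> adequate: {adequate}")
```

With a 0.001 mm resolution the gage resolves 20 steps across the tolerance, so it passes the ten-reading rule; a 0.005 mm resolution would resolve only 4 and fail it.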