In many of today’s manufacturing enterprises, the CMM has supplanted traditional hard gaging systems.
Process control and quality assurance in modern manufacturing operations depend increasingly upon the performance of coordinate measuring machines (CMMs). Over the past 25 years, CMMs have replaced traditional methods of inspection that employed gages and fixtures, and have reduced the time and manpower required for quality-control operations.
CMMs allow you to inspect standard geometrical dimensions, as well as parts with special features such as gears, camshafts, and airfoil shapes. In a traditional manufacturing environment, each of these special inspections would require a single-purpose testing machine.
Product quality doesn’t depend only on the quality of the machine tools used for manufacturing; it also depends on the accuracy and repeatability of measuring and inspection devices. A low-cost, low-performance machining center used in combination with a high-precision CMM can still guarantee product quality, because only parts within tolerance can pass the CMM’s inspection. Conversely, an expensive, high-quality machining center operated in combination with a low-cost, low-accuracy measuring device cannot guarantee quality products. A certain percentage of out-of-tolerance parts will always pass the low-accuracy CMM’s inspection, and likewise, a certain percentage of parts within the tolerance range will be rejected. Consequently, selecting the right CMM is critical.
The first important criterion is the CMM’s minimum required measuring range. This range usually depends on the dimensions of the part to be measured, but is often more complex than that. If the configuration of the part and the inspection routine require the use of probe extensions and fixtures, the actual minimum required measuring range could be considerably larger than workpiece dimensions.
As a guideline to properly sizing your CMM, consider choosing a machine whose X, Y, and Z measuring ranges are twice the width, length, and height of the largest part you need to measure.
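This rule of thumb is easy to encode. The sketch below is a hypothetical sizing helper, not from any manufacturer's catalog; the function name and the example part dimensions are invented for illustration.

```python
def required_cmm_range(part_lwh_mm):
    """Apply the 2x sizing guideline: each measuring range (X, Y, Z)
    should be twice the corresponding dimension of the largest part."""
    length, width, height = part_lwh_mm
    return (2 * length, 2 * width, 2 * height)

# Example: a 400 x 250 x 150 mm part suggests an 800 x 500 x 300 mm machine.
print(required_cmm_range((400, 250, 150)))  # -> (800, 500, 300)
```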
The second selection criterion is the minimum required uncertainty. Uncertainties and test procedures for CMMs are described in ISO 10360. Some CMM manufacturers do not conform to ISO 10360, instead using other performance standards, such as CMMA, VDI/VDE 2617, ASME B89, and JIS, which are less demanding to comply with.
To compare CMMs from different manufacturers, however, make sure to compare “like” specifications. Most CMM manufacturers already offer their specifications in a variety of formats to support their international customer base. In addition, if you’re an international manufacturer, it may be prudent to request the CMMs’ specifications in the ISO 10360-2 format, because it’s become the world standard. This will allow you to not only compare CMM performance between competitors, but to compare the new machine to existing machines installed throughout the world.
In force since 1994, ISO 10360-2 specifies three uncertainties: volumetric length measuring uncertainty (MPEE); volumetric probing uncertainty (MPEP); and volumetric scanning error (MPETHP). (MPE is the acronym for Maximum Permissible Error.)
A set of five calibrated gage blocks is used to verify a CMM’s MPEE. Measurements are taken in seven different locations (position and direction) within the CMM’s measuring volume for the test. For each of the seven locations, the length of each of the five gage blocks is measured three times for a total of 105 measurements. All 105 measurements must be within the stated tolerance specified by the manufacturer.
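The pass/fail logic of this test can be sketched as follows. This is a minimal illustration of the counting and acceptance rule described above, assuming errors are judged against a single stated MPEE value; the measurement data are invented.

```python
def iso_length_test_passes(measurements, mpe_e_um):
    """measurements: 105 (measured_mm, calibrated_mm) pairs --
    5 gage blocks x 7 locations x 3 repeats. The test passes only
    if every single error is within the stated MPE_E (in um)."""
    assert len(measurements) == 5 * 7 * 3  # all 105 measurements count
    return all(abs(m - c) * 1000.0 <= mpe_e_um for m, c in measurements)

# Illustrative data: 105 near-perfect measurements of a 100 mm block.
data = [(100.0, 100.0)] * 105
print(iso_length_test_passes(data, mpe_e_um=5.0))  # -> True
```

Note that a single out-of-tolerance measurement fails the whole test; there is no averaging.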
A precision sphere between 10 and 50 mm in diameter, certified for form and diameter, is used to verify a CMM’s probing uncertainty (MPEP). The test consists of measuring 25 equally spaced points on the sphere. MPEP is computed by adding the absolute values of the minimum and maximum deviations from the radial form; this total measured form deviation is the volumetric probing uncertainty. The result is reported in micrometers (µm), and all 25 probings must be used in the calculation.
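The computation reduces to a range calculation over the 25 radial deviations. A minimal sketch, with invented deviation values:

```python
def mpe_p_um(radial_deviations_um):
    """MPE_P: sum of the absolute values of the most negative and most
    positive radial deviations from the fitted sphere (in um)."""
    assert len(radial_deviations_um) == 25  # all 25 probings must be used
    return abs(min(radial_deviations_um)) + abs(max(radial_deviations_um))

# Illustrative deviations: worst inward -1.5 um, worst outward +1.5 um.
deviations = [-1.5] + [0.0] * 23 + [1.5]
print(mpe_p_um(deviations))  # -> 3.0
```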
Scanning performance is calculated by scanning a high-precision sphere on four exactly defined lines at a speed of 10 mm/sec. The total form deviation on all four lines is the MPETHP value.
These tests are very specific both in definition and execution. It’s important to remember that a CMM’s uncertainty under actual operating conditions can be larger than stated on the manufacturer’s specifications, because of the use of probe extensions, long or slender probes, rotary tables, articulating probe heads, temperature changes, and airborne contaminants in the shop.
For example, MPEE and MPEP as specified are determined by one stylus fixed directly in the probe head with no extensions and no rotation. However, most workpieces require complex probe configurations. A workpiece might require the use of several styli, extensions, rotations of the probe, and perhaps a probe change during the inspection program. Because of these differences, the generally accepted practice is to apply a ratio of uncertainty to tolerance when calculating a required CMM specification. This ratio may vary widely depending on the factors described above, the complexity of the measurement task, and the process. Typical ratios range from 1:3 to 1:20, with 1:5 and 1:10 being the most common. To maintain a 1:5 ratio of uncertainty to part tolerance, the CMM data sheet specification should be five times more accurate than the tightest tolerance inspected.
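Converting a part tolerance into a required data-sheet specification is then simple division. The function below is a hypothetical helper for that arithmetic; the tolerance values in the example are invented.

```python
def required_spec_um(tightest_tolerance_um, ratio=5):
    """For a 1:ratio rule of uncertainty to tolerance, the CMM's
    data-sheet uncertainty must be 'ratio' times tighter than the
    tightest tolerance inspected."""
    return tightest_tolerance_um / ratio

# A 25 um tolerance at a 1:5 ratio requires a spec of 5 um or better;
# the same tolerance at a more conservative 1:10 ratio requires 2.5 um.
print(required_spec_um(25))            # -> 5.0
print(required_spec_um(25, ratio=10))  # -> 2.5
```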
On almost all workpieces, CMMs must inspect three groups of features: diameters and distances, position tolerances, and form tolerances. An analysis of the required uncertainty must be performed for each group.
- For diameter and distance tolerances, refer to the part drawing and locate the diameters or distances with the tightest tolerances. Because of the length dependency of volumetric uncertainty, a larger tolerance on a very long feature may present more difficulty than a very tight tolerance on a small feature.
- Because position tolerances usually define a tolerance diameter, only the radius is used to determine the deviation from the nominal center.
- Form tolerances include callouts for roundness, flatness, straightness, cylindricity, and profile form.
Environmental conditions impact the uncertainty of every CMM. Consequently, CMM manufacturers usually specify the temperature range, temperature variation per hour, temperature variation per day, and temperature variation per meter within which a particular CMM achieves its performance specifications. These variables must be considered when selecting an appropriate CMM.
In addition, the level of floor vibration is important to optimizing a CMM’s performance. Most manufacturers specify the maximum vibration the machine can withstand while still meeting stated specifications. Optional active and passive vibration-damping systems are also available that allow the machine to be installed in less-than-friendly environments and still perform to the published specs. If you suspect vibration is an issue, have a complete seismic vibration study performed at the preferred installation site.
All CMM manufacturers provide software for basic measuring routines. Some also provide software for parts with more complex geometries. Be sure you understand the complexity of the application, and select the software package that not only provides the results you need, but is easy to use.
Throughput requirements are also a consideration. The more parts a CMM can inspect per day, the lower the inspection cost per part. Acceleration and the number of probing points per minute are the factors that determine overall throughput. Special fixturing arrangements, such as pallet inspection of parts, can increase throughput.
Performance standards provide the means for manufacturers to rate a CMM. These standards are useful when comparing different brands of machines, to help determine how well the machine will measure parts, and to check that the machine works properly.
There are different standards of measurement or calibration, however, and these differing standards often cause confusion. Three primary standards are used to verify the accuracy of measuring machine performance: ASME B89.4.1, VDI/VDE 2617, and ISO 10360.
Differences between standards lie chiefly in the number of tests used to evaluate CMMs, and the way in which performance specifications are written. To evaluate length-measuring performance, the B89 standard uses multiple tests; VDI/VDE 2617 uses three tests and ISO 10360 uses three tests, one being for the probe. To represent a performance range, B89 specifications use a single number. For example, a specific CMM might have a B89 volumetric-performance specification of 0.010 mm/325 mm. The number after the slash is the length of the ball bar measured. This spec means that the range of measured lengths with the ball bar in many positions is no greater than 10 µm. VDI/VDE and ISO specifications represent length-measuring performance as a formula. A CMM’s volumetric performance can be stated in the VDI/VDE format as U3 = 4 + 5L/1000. This notation means that over the same measured 325-mm length, there could be an error no larger than ±6 (actually 5.625) µm.
The VDI/VDE and ISO standards use measurements made on a calibrated step gage, or an equivalent set of gage blocks. In the VDI/VDE standard, the gage is measured in three positions: axial (U1), planar (U2), and volumetric (U3). Differences between the measured lengths and the calibrated lengths of the gage are compared against the formula U = a + b × L/1000 for the VDI/VDE specification.
The “a” term represents the error when measuring something of zero length, while the “b” and “L” terms divided by 1000 represent the increase in error with measured length. The formula describes a line that starts at the “a” value for zero measured length (4 µm in the equation above) and rises with a slope defined by the “b” term: the number of micrometers the error grows for every 1000 mm of measured length “L.” So the volumetric-accuracy formula U3 = 4 + 5L/1000 means that the error for zero measured length is 4 µm, and for every additional meter of length measured it grows by 5 µm. The specification is commonly abbreviated as U3 = 4 + 5L.
The measurement approach is the same for the ISO standard, but the formula changes to MPEE = a + L/k, where “k” equals 1000 divided by the “b” value of the VDI/VDE formula. There are no individual axial and planar specifications; both are included in the volumetric “E” specification.
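The two notations describe the same line, which is easy to verify numerically with the article’s own example (a = 4, b = 5, L = 325 mm). A quick sketch:

```python
def u_vdi(a_um, b_um_per_m, length_mm):
    """VDI/VDE form: U = a + b * L / 1000, L in mm, result in um."""
    return a_um + b_um_per_m * length_mm / 1000.0

def mpe_e_iso(a_um, k, length_mm):
    """ISO form: MPE_E = a + L / k, where k = 1000 / b."""
    return a_um + length_mm / k

# Both forms give the same error limit for a 325 mm measured length.
print(u_vdi(4, 5, 325))             # -> 5.625
print(mpe_e_iso(4, 1000 / 5, 325))  # -> 5.625
```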
The basic test of CMM performance under the B89 standard includes five measurements:
- Multiple measurements of the position of a fixed ball. The range (largest minus smallest) is the machine’s repeatability.
- Measurements with a step gage or laser in each axial direction, which determines the machine’s linear accuracy.
- Measurements of a ball bar at multiple positions and orientations in the machine’s working volume. This value is the machine’s volumetric performance.
- Measurement of the ball bar in four diagonal positions in vertical planes. In each position, the ball bar is measured with two right-angle probe-offsets, and the difference in measured lengths is determined. The differences are compared with an offset-probe performance specification.
- Measurement of the length of a short gage block in four orientations. The measurement is compared with a bidirectional accuracy-measuring capability specification.
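The first of these tests, repeatability, is simply the range of repeated readings of the fixed ball’s position. A minimal sketch with invented readings:

```python
def repeatability_um(readings_um):
    """B89 repeatability: largest minus smallest repeated reading."""
    return max(readings_um) - min(readings_um)

# Illustrative repeated position readings, in um.
readings = [10.0, 10.5, 9.5, 10.25]
print(repeatability_um(readings))  # -> 1.0
```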
The most important subject now under discussion in both US and ISO standards committees is that while these performance tests provide an overall characterization of machine quality, they don’t give the user enough information about how accurately the machine can measure a feature. Technical standards committees around the world are working to determine how to characterize what is called “task specific measurement uncertainty” as a way of describing how accurately the machine can perform a real measurement task.