Tolerance and precision are two very important concepts in manufacturing. Tolerance is the maximum allowable deviation within a manufacturing process at which the part will still be of acceptable quality. Dimensional tolerances are set in the design phase to establish the upper and lower limits of critical dimensions that still allow a part to perform its intended function. Precision is a measure of the consistency with which parts can be manufactured; it encompasses both the precision/accuracy of the measuring instrument and the “precision” of the manufacturing process in question, e.g., CNC machining.
This article will take a look at the definitions of tolerance and precision, why they are important, how they are determined, and what factors affect them.
What Is Precision?
In manufacturing, “precision” can refer to two distinct concepts. Instrumentation precision refers to the closeness of two or more measurements to each other when the same object is measured under the same conditions. It is a measure of the variability in the measurement process. High precision indicates that the measurement process produces very similar results for repeated measurements, showing a low level of random error.
Manufacturing process precision refers to the consistency with which a manufacturing process can produce the same outcomes. Any process has a degree of variability, in which the outcomes are slightly different for the same inputs. Process precision quantifies this variability: higher process precision means that a manufacturing process produces parts that are very similar to each other.
How Does Precision Work?
Precision as a metric in manufacturing exists because there is random error in manufacturing machinery and processes, and in the instruments that measure the outputs of these processes. This random error means that there will be dimensional variations within any group of manufactured parts. Process precision measures how consistent a specific manufacturing machine or process is. A low-precision machine or process will introduce a high amount of deviation between parts. High precision means that the process produces consistent results, with low deviation between parts.
Instrument precision and process precision both play an important role in gauging the overall precision of a manufacturing process. High instrument precision is required to be sure that you are getting consistent measurements with which to make informed decisions. In general, instrument precision should be around ten times higher than process precision to be able to accurately measure the deviations in manufacturing.
What Is the Use of Precision?
Precision is especially valuable in batch or mass production, where the same component or product is produced numerous times and each part is nominally the same. Minimizing variability is essential for producing components with a high degree of consistency; it ensures that each item produced is virtually identical to all the others, enhancing the overall quality of the manufacturing output.
How To Measure Precision
Precision can be measured using the following steps:
Determine the instrument precision of the measuring process.
Manufacture a batch of test parts.
Measure the critical dimensions of a statistical sampling of the parts from that batch, and record the data.
Find the mean values of those critical dimensions. The mean value for a dimension is calculated by dividing the sum of all the measurements of a particular dimension by the number of measurements of that dimension.
Determine the deviation of each of the measurements from the mean for those measurements.
The standard deviation can then be calculated by using the following formula:
σ = √( Σ(xᵢ − x̄)² / N )
where xᵢ is each individual measurement, x̄ is the mean measurement value, and N is the total number of parts measured. Standard deviation is commonly used as a measure of precision, with lower values indicating higher precision.
Mean absolute deviation is another measure used as an indicator of precision. This value does not get affected drastically by outliers. It can be calculated by subtracting the mean value from the measured value for each dimension to get the absolute deviations. The absolute deviations can then be summed, and divided by the total number of parts measured to give the mean absolute deviation.
Precision is reported as a ± range. For example, a precision of ±2.5 mm means that the measured dimensions stay within 2.5 mm of the mean value. A minimal sketch of these calculations is shown below.
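The following Python sketch illustrates the steps above on a hypothetical batch of measurements (the values, and the 3-sigma convention used for the reported ± range, are assumptions for illustration only):

```python
# Hypothetical measurements of one critical dimension from a sample of parts (mm)
measurements = [25.02, 24.98, 25.01, 24.97, 25.03, 25.00, 24.99, 25.02]

n = len(measurements)
mean = sum(measurements) / n

# Standard deviation (population form, dividing by N, as in the formula above)
std_dev = (sum((x - mean) ** 2 for x in measurements) / n) ** 0.5

# Mean absolute deviation: less sensitive to outliers than the standard deviation
mad = sum(abs(x - mean) for x in measurements) / n

print(f"mean = {mean:.3f} mm")
print(f"standard deviation = {std_dev:.4f} mm")
print(f"mean absolute deviation = {mad:.4f} mm")
# Reporting precision as a ± range; using 3x the standard deviation is one common
# convention, assumed here for illustration
print(f"precision ≈ ±{3 * std_dev:.3f} mm")
```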
Are Precision and Accuracy the Same?
No, the terms “precision” and “accuracy,” even though they are sometimes used interchangeably, do not have the same meaning. Accuracy describes how close a measured value is to the true or expected value. In other words, it is a measure of correctness, indicating the extent to which a measurement agrees with a reference or accepted value. Precision, on the other hand, refers to the consistency of repeated measurements, showing how close these measurements are to each other, regardless of whether they are close to the true value. High precision means that the measurements are very similar to each other but does not necessarily mean they are accurate (close to the true value). Accuracy and precision are independent of one another.
In industries like manufacturing and science, it’s important to have results that have both high precision and high accuracy. Figure 1 below shows the difference between precision and accuracy in a bullseye measurement:
Figure 1: Visual representation of accuracy and precision. Image Credit: https://www.antarcticglaciers.org/glacial-geology/dating-glacial-sediments-2/precision-and-accuracy-glacial-geology/
As can be seen in the figure, accurate measurements are centered around the bullseye, whereas precise measurements are closely grouped, regardless of whether or not they are centered on the target. Measurements that are both accurate and precise are closely grouped and lie within the bullseye.
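A small numerical example can make the distinction concrete. The sketch below, with made-up measurement values and a hypothetical true dimension, uses bias as a proxy for accuracy and standard deviation as a proxy for precision:

```python
# Hypothetical example: measurements can be precise but not accurate, or vice versa
true_value = 10.00  # mm, the "bullseye"

precise_not_accurate = [10.42, 10.41, 10.43, 10.42, 10.40]  # tight cluster, offset from truth
accurate_not_precise = [9.70, 10.35, 9.95, 10.28, 9.78]     # centered on truth, widely scattered

def bias_and_spread(data, truth):
    mean = sum(data) / len(data)
    bias = mean - truth                                                # accuracy: closeness to truth
    spread = (sum((x - mean) ** 2 for x in data) / len(data)) ** 0.5  # precision: consistency
    return bias, spread

for name, data in [("precise, not accurate", precise_not_accurate),
                   ("accurate, not precise", accurate_not_precise)]:
    bias, spread = bias_and_spread(data, true_value)
    print(f"{name}: bias = {bias:+.3f} mm, spread = {spread:.3f} mm")
```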
What Is the Importance of Precision in Manufacturing?
Precision is a measure of the consistency with which parts are manufactured. High precision means that a manufacturer can be confident that any parts made within a given batch will be consistent and similar to each other. It is a very important metric and target in manufacturing to ensure consistent, high-quality output.
What Is Tolerance?
Tolerance is the allowable amount of deviation from the design during the manufacturing process within which the part is still considered acceptable. Any manufacturing process or machinery will produce a certain amount of variation, as nothing is perfect. The allowable deviations from the design are usually given as a range around the nominal dimension, i.e., a tolerance of ±1 mm means that a deviation of up to 1 mm in either direction for any measurement would still deliver acceptable results.
In the product design phase, it is crucial to establish tolerances that will still allow the product to function correctly, even if a critical part dimension is at the maximum (or minimum) value for the tolerance. At the same time, the tolerance must take into account the planned process and the economic burden of a tolerance that is tighter than necessary. In essence, tolerances provide a manufacturer with guidance on the acceptable range of results for a part dimension.
How Does Tolerance Work?
Tolerances are an integral part of the manufacturing process because no tool, machine, or material is perfect. Even the best-made machine with ideal calibration is subject to variations that result in deviations from the intended design. These possible deviations must be taken into account in the design phase to ensure the quality of the product is acceptable.
Tolerances are used as outer boundaries for acceptable dimensions of a part. Manufacturers use a range of process capability tracking and analysis tools to make sure that the products produced stay well within the outer boundaries established by the print tolerances.
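Process capability indices such as Cp and Cpk are one widely used way to track this. The article does not prescribe a specific tool, so the sketch below, with hypothetical specification limits and measurements, is offered purely as an illustration:

```python
# Illustrative process capability check (hypothetical limits and data).
# Cp compares the tolerance band to the process spread; Cpk also accounts for centering.
import statistics

lower_spec = 24.90   # mm, lower tolerance limit
upper_spec = 25.10   # mm, upper tolerance limit

measurements = [25.02, 24.98, 25.01, 24.97, 25.03, 25.00, 24.99, 25.02]  # mm

mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)  # sample standard deviation

cp = (upper_spec - lower_spec) / (6 * sigma)                    # potential capability
cpk = min(upper_spec - mean, mean - lower_spec) / (3 * sigma)   # actual capability

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # values of 1.33 or higher are a common benchmark
```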
What Is the Use of Tolerance?
Tolerances establish how much variation is acceptable for a given dimension on a given part. It is a vital metric to consider when designing and manufacturing a product, to ensure that the part is manufactured with acceptable quality.
How To Measure Tolerance
Tolerances are set during the design phase as the maximum allowable deviation from the specified geometries in the design. As such, they are typically not measured. To verify whether a manufacturing process can meet the required tolerances, the following procedure can be followed:
A batch of nominally identical test parts is produced.
The dimensions of these parts are then measured.
The deviations of the parts from the design are checked to verify whether they fall within the tolerance range.
If some of the parts are out of specification, the tolerances may be revised, or the manufacturing process may be improved. A simple version of this conformance check is sketched below.
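As a minimal sketch of the check described in the last step, assuming a hypothetical design dimension of 25.00 mm with a ±0.10 mm tolerance:

```python
# Hypothetical conformance check: design dimension of 25.00 mm with a ±0.10 mm tolerance
nominal = 25.00    # mm, design dimension
tolerance = 0.10   # mm, allowable deviation in either direction

measured = [25.04, 24.93, 25.12, 24.98, 25.01, 24.88]  # mm, one measurement per part

out_of_spec = [x for x in measured if abs(x - nominal) > tolerance]

print(f"{len(out_of_spec)} of {len(measured)} parts out of tolerance: {out_of_spec}")
```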
What Is the Importance of Tolerances in Manufacturing?
Tolerances are important in manufacturing to establish the boundaries within which parts are still considered acceptable, even when the inevitable variations occur in manufacturing. Acceptable tolerance ranges should be specified as part of the design process. These tolerances describe the maximum amount of deviation at which the part would still be acceptable. At the manufacturing stage, these specified tolerance ranges must be taken into account: only manufacturing processes and machinery capable of achieving both the precision and the target values required by the established tolerances should be used.
What Are the Factors Affecting Precision and Tolerance in the Manufacturing Process?
Precision and tolerance within the manufacturing process are affected by several factors, including:
1. Material Properties
Material properties such as hardness, coefficient of thermal expansion, and tensile strength affect how materials behave when being processed. Variation in mechanical or physical properties makes it harder to get consistent results from a manufacturing operation, which can negatively affect both precision and accuracy. For example, if the raw material being processed is harder than expected, the selected press may not have enough power to form it accurately to the intended dimensions. This is why raw materials must also meet specifications with tolerances. A raw material might be specified with an allowable range of hardness to avoid variability in process outcomes.
2. Environmental Conditions
Environmental conditions, specifically temperature and humidity, can play a role in setting tolerance levels, and in the actual precision of a manufacturing process. Temperature fluctuations make materials expand and contract, and humidity can also affect material properties. Maintaining consistent environmental conditions can help improve the consistency between parts. This improves overall precision, which aids in meeting established tolerances.
3. Design Considerations
The design of a part affects how easy or difficult it is to machine or manufacture. The choice of material, its thickness, and the manufacturing processes used all play a big role in how consistently parts can be produced. Tolerances should never be set any tighter than proper part fit and function requires. If a process is producing out-of-tolerance parts, the original tolerances could be reviewed to see if they could be relaxed to accommodate the current process. Another possibility might be to redesign the part to make it easier to meet the required dimensions. If none of these options is a possibility, it may be necessary to change processes or use more capable machinery to meet the requirements.
4. Supply Chain Factors
The main supply chain factor that affects precision and tolerance is material quality. The quality of material that is to be used in the manufacturing process has a direct effect on the quality of the manufactured part. Any variations in the properties of the raw materials can affect how the material behaves during the fabrication process, leading to variations that lower the overall manufacturing precision.
5. Fixturing and Workholding
How a part is fixtured or held in place during processing plays a vital role in maintaining part precision and accuracy, and meeting established tolerances. Workpieces must consistently be held in the correct orientation and position to achieve repeatable results. Any deviations in the position of the part during processing can translate into dimensional variation in the final part. Likewise, if workpieces are not fixed securely enough, any vibrations or shifting will result in inaccurate parts, which makes it challenging to achieve high precision and tight tolerances.
6. Machine Capabilities and Calibration
Machines inherently produce random errors while processing a workpiece, as no machine can function with perfect consistency at all times. These random errors translate into deviations in the manufactured parts. Higher quality machines tend to produce more consistent results, with less deviation, improving overall manufacturing precision. Calibration may help minimize deviation across parts but is mainly used to increase accuracy, which can help in achieving tight tolerances.
7. Operator Skill and Training
Operator skill and training play a crucial role in maintaining high precision and meeting tight tolerances in manufacturing. For manual manufacturing processes, the operator directly controls the outcome of the process. Highly skilled operators are more likely to achieve consistent results, increasing precision. Training can further improve their skills and familiarity with specific machinery and processes.
8. Process Stability and Variation Reduction
Process stability and variation reduction refer to methods of improving the consistency of outcomes of manufacturing processes. By reducing variability, the consistency of the manufactured parts is increased, which in turn increases precision. Many methods exist to help increase process stability, such as feed-forward control, which adjusts the process in advance to compensate for expected deviations; feedback control, in which deviations measured in one part are corrected in subsequent parts; and reducing variation in the inputs to the manufacturing process.
9. Quality Control and Inspection Techniques
Quality control and inspection techniques play a vital role in upholding high precision levels. By consistently maintaining high-quality control standards, any potential defects or unacceptable deviations can be spotted, and the process quickly adjusted to avoid making more out-of-tolerance parts. Statistical process control procedures and tools can be used to monitor the quality of the manufacturing process, making informed adjustments to maintain high levels of precision.
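As a concrete illustration of statistical process control, the sketch below computes 3-sigma control limits for a simple individuals chart from hypothetical measurements and flags any points that fall outside them (real SPC tooling typically uses moving ranges or subgroup statistics and applies additional run rules):

```python
# Illustrative Shewhart-style individuals chart with 3-sigma control limits
# (hypothetical data; production SPC software applies additional run rules)
import statistics

dimension = [25.01, 25.00, 24.99, 25.02, 25.03, 24.98, 25.07, 25.00]  # mm, in production order

center = statistics.mean(dimension)
sigma = statistics.stdev(dimension)

ucl = center + 3 * sigma   # upper control limit
lcl = center - 3 * sigma   # lower control limit

flagged = [(i, x) for i, x in enumerate(dimension) if not lcl <= x <= ucl]
print(f"center = {center:.3f} mm, control limits = [{lcl:.3f}, {ucl:.3f}] mm")
print("points outside control limits:", flagged or "none")
```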
To learn more, see our full guide on Quality Control.
10. Tooling and Cutting Parameters
Tooling and cutting parameters can significantly impact precision if selected incorrectly. Parameters such as feed rates, depth of cut, and tool geometry all have optimal ranges of operation for a given process operation. Using cutters or tooling outside of these optimal ranges can lead to inferior part quality and significant variation between parts, affecting precision and the ability to meet established tolerances.
To learn more, see our guide on Tooling and Cutting Parameters.
What Are the Best Practices for Material Handling To Ensure Consistent Precision and Meet Tight Tolerances in Production?
Adopting established best practices for material handling is crucial in ensuring consistent precision and meeting tight tolerances in a mass production setting.
This begins with material quality control, which verifies that all materials meet the required standards and specifications before they enter the production line. Proper storage is also important: materials must be protected from environmental damage, contamination, and deformation, with specific attention to controlling temperature and humidity for sensitive materials. Equally important is the training of personnel in correct material handling techniques to prevent damage and identify potential issues early. Regular calibration of measuring tools and equipment used in production also ensures ongoing precision. Following these best practices can significantly contribute to achieving and maintaining the high levels of precision and tight tolerances required in modern manufacturing processes.
What Are the Common Challenges in Achieving Precision and Meeting Tolerances in Manufacturing?
There are several challenges in achieving high precision and meeting tight tolerance levels in manufacturing, including:
Variability in the quality and properties of the raw materials can cause precision and accuracy to suffer. Finding suppliers who can deliver materials of a consistent quality can be a challenge.
Manufacturing machinery requires calibration to operate at optimal levels. However, calibration costs downtime at the very least, leading to lower productivity levels. Finding the balance between prioritizing precision and productivity can be challenging.
Keeping precision high enough to meet tight tolerances often involves spending money on high-quality machinery, operator training, and automation tools. Costs can easily overrun in the pursuit of these two metrics.
Some part designs require machining that is difficult to perform at a consistently high-quality level. Sometimes there is no practical workaround for this, and precision suffers as a result.
What Is the Difference Between Accuracy and Tolerance?
Accuracy is a measure of how close a given dimension is to the intended dimension. In the context of manufacturing, this would be how close the dimensions of a manufactured part are to the design specification.
Tolerances in manufacturing refer to the allowable deviation in measurements from a designed part to a manufactured part. Tolerances specified in a design would refer to the amount of deviation that would still result in an acceptable part.
What Is the Difference Between Precision and Tolerance?
Precision in the context of manufacturing is a measure of the consistency with which a part can be manufactured.
Tolerance in manufacturing refers to the allowable deviation in measurements from a designed part to a manufactured part. Tolerances specified in a design would refer to the amount of deviation that would still result in an acceptable part.
Precision levels determine whether tolerances can be met. Low manufacturing precision may lead to a large number of out-of-specification parts. The precision level of the manufacturing process should be considered during the design phase when setting tolerances. If tolerances cannot be met with a particular manufacturing process, the tolerances may be widened, the part may need to be redesigned, or higher-quality manufacturing processes and machinery may need to be considered.
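To make the relationship between precision and tolerance concrete: if process variation is assumed to be roughly normally distributed and centered on the nominal dimension, the expected fraction of out-of-tolerance parts can be estimated from the process standard deviation and the tolerance band. The numbers below are hypothetical, and this is only a rough model, not a substitute for measured capability data:

```python
# Rough estimate of the out-of-tolerance fraction, assuming normally distributed
# process variation centered on the nominal dimension (hypothetical values)
from statistics import NormalDist

process_sigma = 0.05   # mm, process standard deviation (a measure of precision)
tolerance = 0.10       # mm, allowable deviation on either side of nominal

deviation = NormalDist(mu=0.0, sigma=process_sigma)
in_spec = deviation.cdf(tolerance) - deviation.cdf(-tolerance)

print(f"expected in-spec fraction ≈ {in_spec:.4f}")        # ≈ 0.9545 for a ±2-sigma tolerance
print(f"expected out-of-spec fraction ≈ {1 - in_spec:.4%}")
```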
XTJ is a leading OEM manufacturer dedicated to providing one-stop manufacturing solutions from prototype to production. We are proud to be an ISO 9001 certified quality management system company, and we are determined to create value in every customer relationship. We do that through collaboration, innovation, process improvements, and exceptional workmanship.