
7

Process Capability and Sigma Level

Process Capability: Basic Concepts

Having established the measurement system capability, the next logical step in a Six Sigma project is to determine the initial process capability. Process capability evaluation compares two values by taking their ratio: the process tolerance and the process variation.

Process tolerance is derived from the customer's requirements. Thus, it is sometimes called the ‘voice of customer’ (VOC). Process variation, on the other hand, is a result of our ability to manage and control the process. It is sometimes called the ‘voice of process’ (VOP). Tolerance for the process is usually specified by the designer and obviously must relate to the customer's requirements. Complex products contain a large number of components and sub-assemblies. The customer does not specify the tolerance of each component but is interested in the performance and reliability of the end product. For example, a customer for a car would expect good appearance, a comfortable and safe ride, fuel economy, low maintenance, etc. He or she will never specify tolerances for the pistons, the steering or the starter motor. It is the designer's responsibility to understand these unspecified requirements and convert them into internal specifications. When these specifications are met, the product should meet the customer's requirements. To quantify process variation, the ‘standard deviation’ (SD) is used.

To evaluate process capability, the process must be stable and predictable. In simple language, the process must be working as designed and documented. Thus, it is expected that all the preliminary work of setting up process parameters is completed and we are ready to run the process. We also need to ensure that the process is not changed or disturbed during the capability study.

Output data from many processes tend to fit a normal distribution. Once we have verified that the data fits a normal distribution reasonably well, we can use the mathematical formulae to interpret process variation and process capability. An important property of the normal distribution is that 99.73 percent of data points lie within ±3 standard deviations (see Figure 7.1). Hence, the spread of ±3σ, or 6σ in total, is a measure of process variation, called the voice of process. The ratio of the tolerance to 6σ is called the process capability index Cp. Thus:

Cp = Tolerance / (6σ) = (USL − LSL) / (6σ)

Note: If the standard deviation is calculated by the root-mean-square (overall) formula, Cp and Cpk are referred to as Pp and Ppk in the SPC manual published by AIAG and also in the SigmaXL and Minitab software programs. More information is available in Chapter 17 on SPC.

 

Based on the above formula, we can easily conclude that Cp must be ‘sufficiently’ greater than 1 to assure that ‘most’ parts conform to specifications (AIAG 2005). Figure 7.2 should prove useful in understanding the concept and interpretation of Cp.

 

 

Figure 7.1 VOC and VOP

 

Figure 7.3 shows the estimated defect level in terms of parts per million (PPM) as a function of Cp when the process mean coincides with the mean of the specification limits. The PPM levels can be easily calculated using Z values.

Based on the values of Cp and PPM shown in Figure 7.3, we can conclude that Cp values of more than 1.33 are desirable if we want to assure a PPM level below 100. The graph is plotted assuming a normal distribution and a process mean matching the tolerance mean. The minimum capability index is a management policy decision and relates to customer expectations. Until the 1980s, most engineering and automotive manufacturers considered 1.33 an acceptable Cp value. Later, for companies aspiring to Six Sigma, the Cp target value became 2.
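
As a check on these figures, the PPM level for a centered, normally distributed process can be computed directly from Cp using the standard normal tail area. The short Python sketch below (the function name is illustrative) reproduces the commonly quoted values.

    # Out-of-spec PPM for a centered, normally distributed process as a
    # function of Cp (process mean at the middle of the tolerance).
    from scipy.stats import norm

    def ppm_for_cp(cp):
        # Each specification limit lies 3*Cp standard deviations from the mean,
        # so the out-of-spec fraction is the sum of the two tails.
        return 2 * norm.sf(3 * cp) * 1e6

    for cp in (0.67, 1.00, 1.33, 1.67, 2.00):
        print(f"Cp = {cp:.2f}  ->  {ppm_for_cp(cp):,.2f} PPM")

    # Cp = 1.00 gives about 2,700 PPM, while Cp = 1.33 gives roughly 63 PPM,
    # consistent with the guideline that Cp > 1.33 keeps PPM below 100.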

 

 

Figure 7.2 Interpretation of process capability index Cp

 

 

Figure 7.3 Capability index Cp vs defect level PPM

The Effect of Process Centering

Cp indicates inherent process capability and does not consider the effect of process centering. Cpk is calculated considering process centering as follows:

CPU = (USL − x̄) / (3s)
CPL = (x̄ − LSL) / (3s)
Cpk = CPU or CPL, whichever is lower

In the above equations, x̄ and s are the approximate values (estimates) of μ and σ.

Figure 7.4 illustrates the effect of process shift on Cpk. The dark area shows the probability of rejection.

 

 

Figure 7.4 The effect of process centering on Cpk

 

It is usually easier to improve Cpk if Cp is satisfactory. Imagine that clock ‘A’ runs slow. Such a clock will not be useful as we will not know the correct time at any instant (except twice a day, if one wants to argue). However, if clock ‘B’ always shows 10 minutes more than the actual time, we can easily adjust it. Thus, we would prefer clock B; improving clock A is much more difficult. Similarly, it is much easier to correct a process with a good Cp but a low Cpk, as it only needs adjustment. Adjustment of a process is usually the responsibility of the operator or supervisor. It is more difficult to improve a process with a low Cp, and this is the responsibility of the management.

Process shift can occur due to various reasons. Depending on the type of industry, these can include tool wear, changes in the concentration of a chemical bath, recalibration of a measuring instrument, operator bias in process setting, environmental changes, heating of the equipment after running for some time, etc. Considering that many of these causes can occur while running the process, a Cp value greater than 1.33 is usually preferred to achieve low PPM levels. The Six Sigma philosophy considers that a process shift to the extent of 1.5σ can occur. In a process that has achieved the Six Sigma level, σ is reduced to such an extent that (USL − LSL) = 12σ. Thus

Cp = (USL − LSL) / (6σ) = 12σ / (6σ) = 2

Figure 7.5 Capability Index for a process at Six Sigma level

 

Even when such a process shifts to the extent of 1.5σ, the Cpk will be 1.5. This can easily be seen from the equations if the process mean shifts towards the lower specification limit (LSL). For a centered Six Sigma process, USL − μ = μ − LSL = 6σ, and

Cpk = CPU or CPL, whichever is lower, where CPU = (USL − μ) / (3σ) and CPL = (μ − LSL) / (3σ).

If the process shifts towards the LSL by 1.5σ,

CPL = (6σ − 1.5σ) / (3σ) = 1.5 and CPU = (6σ + 1.5σ) / (3σ) = 2.5

Cpk = 2.5 or 1.5, whichever is lower, i.e., Cpk = 1.5.

Similarly, the Cpk will be 1.5 if the process mean shifts towards the USL (See Figure 7.5).

Application Example of Process Capability Calculation

A process is running with a grand mean of 24.9 mm and standard deviation of 0.15 mm. Specification limits are 25.5 and 24.5. Find Cp and Cpk. What action should be taken?

Cp = (USL − LSL) / (6σ) = (25.5 − 24.5) / (6 × 0.15) = 1.0 / 0.9 = 1.11

CPU = (25.5 − 24.9) / (3 × 0.15) = 1.33
CPL = (24.9 − 24.5) / (3 × 0.15) = 0.89
Cpk = 1.33 or 0.89, whichever is lower = 0.89

The Cpk of 0.89 is lower than the Cp of 1.11 because the process mean is off-center; the first action is therefore to adjust the process so that its mean moves towards the tolerance mean of 25.0 mm. In addition, Cp needs to be improved to at least 1.33, preferably more, to minimize the possibility of producing defective parts.
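
For reference, the same arithmetic can be scripted. A minimal Python sketch (the function name is illustrative) reproduces the values calculated above.

    def capability(mean, sd, lsl, usl):
        cp = (usl - lsl) / (6 * sd)
        cpu = (usl - mean) / (3 * sd)
        cpl = (mean - lsl) / (3 * sd)
        return cp, min(cpu, cpl)        # Cpk is the lower of CPU and CPL

    cp, cpk = capability(mean=24.9, sd=0.15, lsl=24.5, usl=25.5)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # Cp = 1.11, Cpk = 0.89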

Process Capability Evaluation of Normally Distributed Data

Preliminary Capability Study Without Plotting Control Charts

This is like taking an ‘instantaneous snapshot’ of the process. Preliminary process capability can be estimated from the measurement of a sufficient number of parts. The important question here is: how many parts should we measure? In statistics, larger sample sizes always provide better estimates; however, experts usually recommend a minimum of 50 pieces. The data is then collected and analyzed, and the overall standard deviation is calculated. Prior to using the data for capability calculations, it is necessary to review the histogram and the normal probability plot. A more objective method of validating normality is to perform a goodness-of-fit test such as the Anderson-Darling test. For more information on goodness-of-fit tests, see Chapter 10. If the data can be considered normally distributed, we can perform capability calculations using the normal distribution.

Stages of Preliminary Capability Study

The process of performing a preliminary capability study involves the following steps:

  1. Select a characteristic for the study.
  2. Define and document the process.
  3. Perform a measurement system study.
  4. Collect data of about 50 to 100 parts produced without any changes in process parameters.
  5. Verify that the data is normally distributed. This is usually done using software (a scripted version is sketched after this list).
    • The shape of the histogram is a good indicator
    • Draw a normal probability plot
    • Perform a goodness of fit test
  6. If the data is normal, calculate the process capability indices Cp and Cpk.
  7. If Cp is low, plan for improvement. If Cp is good but Cpk is low, adjust the process so that the process mean matches the mean of the tolerance.
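
For readers who prefer scripting, here is a rough Python sketch of steps 5 and 6, using scipy's Anderson-Darling test (which reports a test statistic and critical values rather than a p-value). The file and column names are taken from the spring force example that follows; treat this as an illustration, not a substitute for the graphical checks.

    import pandas as pd
    from scipy import stats

    # Step 5: goodness-of-fit check on the collected data.
    force = pd.read_excel("spring force.xls")["Force"]
    ad = stats.anderson(force, dist="norm")
    print("A-D statistic:", ad.statistic)
    print("5% critical value:", ad.critical_values[2])   # levels are 15, 10, 5, 2.5, 1 percent
    looks_normal = ad.statistic < ad.critical_values[2]

    # Step 6: if the data looks normal, compute Cp and Cpk from the overall mean and SD.
    if looks_normal:
        mean, sd = force.mean(), force.std(ddof=1)
        lsl, usl = 145, 155
        cp = (usl - lsl) / (6 * sd)
        cpk = min(usl - mean, mean - lsl) / (3 * sd)
        print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")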

Application Example of Preliminary Capability Using Software

The design specification of a spring requires the force to be controlled between 145 and 155 newtons. Data on 50 pieces were collected; this is available in the Minitab worksheet file ‘spring force.mtw’ or the Excel file ‘spring force.xls’. Calculate the process capability. A partial listing of the data is shown in Table 7.1.

First check whether the data is normal. If yes, we can calculate the process capability.

To judge normality, plot the histogram of the data.

The histogram is shown in Figure 7.6. This looks ‘reasonably’ normal.

 

Table 7.1

Force
151.381
147.952
150.929
151.483
150.067
151.522
149.381
151.788
150.251
152.448
152.876

 

Figure 7.6 The histogram of spring force data

 

Now perform a normality test. Normal probability plotting and interpretation of p-value was explained in Chapter 4 on basic statistical concepts. Figure 7.7 shows the normal probability plot of the spring force data.

 

 

Figure 7.7 Normal probability plot of spring force data

 

As the p-value is 0.512, i.e., greater than 0.05, the data can be considered normally distributed. The mean is 151 and the standard deviation is 1.279. Thus, the Cp and Cpk values are

Cp = (155 − 145) / (6 × 1.279) = 1.30

CPU = (155 − 151) / (3 × 1.279) = 1.04
CPL = (151 − 145) / (3 × 1.279) = 1.56
Cpk = 1.04 or 1.56, whichever is lower = 1.04

Process Capability for Only Upper or Lower Specification

There are some characteristics that have only upper or lower specification limit.

  • Upper specification only (lower is better): surface roughness, runout, length of a call at a call center, cycle time of a process
  • Lower specification only (larger is better): strength of a machine part, hours to failure.

In such cases, the value of Cp does not exist. Only values of CPU or CPL and Cpk exist.

Application Example

A call center, for example, analyzes the data of 150 calls and finds that the mean call length is 75 seconds with standard deviation of 20 seconds. Service level agreement with the customer is for a maximum call length of 120 seconds. If the data is normally distributed, what is the process capability?

CPU = (USL − x̄) / (3s) = (120 − 75) / (3 × 20) = 45 / 60 = 0.75

Cpk = CPU = 0.75

As Cpk is < 1, the call center must take immediate corrective action to improve the capability.
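
For illustration, the calculation can be scripted as shown below; the estimated fraction of calls exceeding the USL assumes normality, as stated in the problem.

    from scipy.stats import norm

    mean, sd, usl = 75, 20, 120
    cpu = (usl - mean) / (3 * sd)          # only CPU exists; Cpk = CPU here
    print(f"CPU = Cpk = {cpu:.2f}")        # 0.75

    # Implied fraction of calls longer than 120 seconds, assuming normality
    p_long = norm.sf((usl - mean) / sd)    # z = 2.25
    print(f"Estimated calls beyond USL: {p_long:.2%}")   # about 1.2 percent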

Long-Term Process Capability Using Control Charting

When long-term process capability is to be evaluated, the use of control charts is recommended to continuously monitor and assess the stability of the processes. This procedure is explained in Chapter 17 on statistical process control.

Process Capability of Non-normal Data

How Normal Is Normal?

While many statistical methods are based on the assumption of normal distribution, we should understand that there are often situations where normal distribution is not appropriate. In such cases, if normal distribution is ‘force-fitted’ on the data, the process capability values can be misleading and result in incorrect estimates of defect level. Techniques to handle such data are discussed in this section.

Examples of such situations are given below:

  • Data belongs to multiple streams such as suppliers, machines, spindles, dies, patterns, etc. In such cases, we can separate the data by the cause or variable that is causing the non-normality. After the data is separated, each group may be normal and we can use procedures based on the normal distribution.
  • Processes that inherently do not produce normally distributed data. In this situation we have two alternatives: (i) use another appropriate distribution, or (ii) transform the data so that it fits a normal distribution.

Analyzing Non-Normal Data

The above options are explained in the following examples.

Application Example

Consider an example of transportation time. Data on the time to transport parts from location A to B is collected for 200 trips. The data is available in the Excel worksheet ‘transport time.xls’. We plot a histogram and a normal probability plot to see whether this data is normal. These are shown in Figure 7.8.

The p-value is < 0.05 and therefore the data cannot be considered as normal. However, as we get more information, we realize that the data is from two different routes with different traffic conditions. Figure 7.9 shows separate histograms for the two routes. Now observe the probability plots of the data for the two routes.

Minitab commands are > Graph > Probability Plot > Select Multiple. Select Time as Graph variable and Route as Categorical variable. See the dialogue box in Figure 7.10 for guidance.

 

 

Figure 7.8 Histogram and normal probability plot of transportation time

 

 

Figure 7.9 Histograms separated by route

 

Minitab output is shown in Figure 7.11.

We can now get a better understanding of the data from the probability plot shown in Figure 7.11. As the p-values are > 0.05, we can conclude that both groups are reasonably normal. We can then calculate the capability for each route using the normal distribution.

If graphical analysis and probability plotting show that the data is inherently not normal and does not have any specific causes or process streams, we need to use methods of non-normal data analysis. Usually there are two options:

  1. Transform the data so that the transformed values are normally distributed.

  2. Fit a different distribution such as Weibull, Lognormal, etc.

 

Figure 7.10 Minitab dialogue box for multiple probability plot

 

 

Figure 7.11 The probability plot for the two different routes

Transforming Non-Normal Data

Transformation of data means analyzing some function of the response instead of the response itself.

There are two types of transformations supported by most software programs:

  1. Box-Cox transformation and
  2. Johnson transformation.

The Box-Cox Transformation

The Box-Cox transformation uses the function Yt = Y^λ, where Yt is the transformed value of Y and λ can take different values. Typical values of λ and the corresponding transformation functions are:

λ = 2: Yt = Y² (square)
λ = 0.5: Yt = √Y (square root)
λ = 0: Yt = ln(Y) (natural logarithm)
λ = −0.5: Yt = 1/√Y (reciprocal square root)
λ = −1: Yt = 1/Y (reciprocal)

Intermediate values of λ are also possible.

Application Example

Data of repair time in a shop for 500 machine stoppages is collected. Data is in Excel file ‘repair time.xls’ or Minitab worksheet file ‘repair time.mtw’. Evaluate the process capability, given that the upper specification on repair time is 150 minutes.

First, verify whether the data is normally distributed. The histogram of the data, shown in Figure 7.12, is clearly skewed to the right (positively skewed) and does not exhibit normality. The normal probability plot of the data shows a p-value of < 0.005. Thus, we can conclude that the data is not normal.

 

 

Figure 7.12 Histogram and normal probability plot of repair time

Using the Box-Cox Transformation

Minitab commands are: > Stat > Quality Tools > Individual Distribution Identification. Choose Box-Cox transformation and select optimal lambda.

SigmaXL > Process Capability > Non-normal > Choose Box Cox Transformation and select optimal lambda.

The software output suggests a value of λ = 0.5. With this transformation, the p-value for the Anderson-Darling normality test is 0.765, so the transformation can be used to estimate process capability. The output from the SigmaXL software (see Figure 7.13) shows a histogram of the original data and a normal probability plot of the transformed data. The normal probability plot and the fit look much better. The estimated percent defective above the USL is 11.56. Note that the specification limit must also be transformed using the same transformation function. Here, the transformed upper specification limit is √150 ≈ 12.25.
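
A rough Python analogue of this analysis is sketched below; the Excel column name 'Repair Time' is an assumption. Note that scipy's Box-Cox uses the form (Y^λ − 1)/λ rather than the plain power Y^λ, but the two differ only by a fixed rescaling, so the estimated fraction beyond the (identically transformed) USL is the same.

    import pandas as pd
    from scipy import stats

    repair_time = pd.read_excel("repair time.xls")["Repair Time"].to_numpy()

    # Box-Cox transformation with lambda estimated by maximum likelihood
    # (the text reports an optimal lambda of 0.5 from Minitab/SigmaXL).
    transformed, lam = stats.boxcox(repair_time)
    print(f"estimated lambda = {lam:.2f}")

    # The specification limit must be transformed with the same function.
    usl_t = (150 ** lam - 1) / lam

    # Estimate the fraction beyond the transformed USL, assuming the
    # transformed data is approximately normal.
    mean_t, sd_t = transformed.mean(), transformed.std(ddof=1)
    print(f"Estimated fraction above USL: {stats.norm.sf(usl_t, mean_t, sd_t):.2%}")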

The Johnson Transformation

The Johnson transformation system was developed by Norman L. Johnson in 1949 (AIAG 2005). Depending on the type of variable, the system is given by one of three families:

SB (bounded): Z = γ + η ln[(X − ε) / (λ + ε − X)]
SL (lognormal): Z = γ + η ln(X − ε)
SU (unbounded): Z = γ + η sinh⁻¹[(X − ε) / λ]

where γ and η are shape parameters, λ is a scale parameter and ε is a location parameter.

Luckily, there is no need to perform any calculation in this case! We can use Minitab or SigmaXL. Readers can try the above example using the Johnson transformation system and compare the estimated defect level with Box-Cox transformation.
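
As a rough Python alternative (a sketch, not Minitab's Johnson transformation, which selects the best-fitting family among SB, SL and SU), one can fit a Johnson SU distribution directly to the repair time data and read off the tail area beyond the upper specification; the Excel column name is an assumption.

    import pandas as pd
    from scipy import stats

    repair_time = pd.read_excel("repair time.xls")["Repair Time"]

    # Fit the unbounded Johnson (SU) family by maximum likelihood and estimate
    # the fraction of repair times beyond the 150-minute specification.
    params = stats.johnsonsu.fit(repair_time)
    print(f"Estimated fraction above USL: {stats.johnsonsu.sf(150, *params):.2%}")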

 

 

Figure 7.13 Process capability analysis using Box-Cox transformation

 

 

Figure 7.14 Using Johnson Transformation in Minitab

 

Minitab Commands: > Stat > Quality Tools > Johnson Transformation. Select ‘Repair Time’ and store the transformed data in the column name ‘transformed’ as shown in the dialogue box in Figure 7.14.

SigmaXL commands are: > Process Capability > Non-normal > Process Capability Combination report. Uncheck the control chart options and select Johnson transformation. The p-value of normality test of the transformed data is 0.888. In SigmaXL, there is a good option of ‘Automatic best fit’ in the above command. Try using it and find out which is the best fit option.

One of the issues with transformations is that we are not able to visualize the relationship with the original data directly. With software programs we are better off, as we can still see the specification limits on the output.

Process Capability Using Other Distributions

There are other mathematical distributions, such as the lognormal or Weibull, which can be used for non-normal data. Interestingly, the lognormal distribution is actually a transformation using the logarithm of the original data; this is already included in the Box-Cox transformation with λ = 0. The Weibull distribution is most often used in the analysis of life data in reliability engineering. This distribution is usually defined by two parameters: the shape parameter β and the scale parameter θ. Figure 7.15 shows the effect of these two parameters on the Weibull distribution.

In capability analysis using the Weibull distribution, the objective is to assess whether the data fits a Weibull distribution reasonably well and, if it does, to estimate the above parameters. The parameters are then used to estimate the defect level beyond the specification. Figure 7.16 illustrates the calculation of the defect level once the distribution parameters have been estimated.
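
As an illustration of this idea, the sketch below fits a 2-parameter Weibull to the repair time data of the earlier example and estimates the fraction beyond its upper specification of 150 minutes; the Excel column name is an assumption, and the result can be compared with the Minitab Weibull output discussed later (Figure 7.19).

    import pandas as pd
    from scipy import stats

    repair_time = pd.read_excel("repair time.xls")["Repair Time"]

    # Fixing the location at zero gives the 2-parameter Weibull
    # (shape parameter beta and scale parameter theta).
    beta, loc, theta = stats.weibull_min.fit(repair_time, floc=0)
    print(f"shape (beta) = {beta:.2f}, scale (theta) = {theta:.1f}")

    p_above = stats.weibull_min.sf(150, beta, loc=0, scale=theta)
    print(f"Estimated fraction above USL: {p_above:.2%}")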

With the use of software, we can choose a best-fit distribution for a given set of data. Details of these distributions are not included in the scope of this book. For more information on distributions refer to Statistical Quality Control (Montgomery 2001).

 

 

Figure 7.15 The effect of shape and scale parameter on Weibull distribution

 

 

Figure 7.16 Weibull distribution calculation of area and defect level

 

Minitab commands are > Stat > Quality Tools > Individual Distribution Identification (See Figure 7.17).

SigmaXL version 6.0 also supports distribution identification. The commands are:> SigmaXL > Process Capability > Non-normal > Distribution Fitting. We can choose a few distributions or allow the choice of all distributions and transformations.

Minitab output in the session window is shown in Table 7.2. From the list of distributions, choose the one that gives the minimum Anderson-Darling (AD) statistic and a larger p-value. In the case of 3-parameter distributions, also consider the likelihood-ratio test (LRT) p-value in the session window: the 3-parameter distribution should be considered a better fit than the corresponding 2-parameter distribution when the LRT p-value is less than the significance level, i.e., < 0.05 for a 95 percent confidence level.

Process knowledge and past experience should be used along with the statistical guidelines given in Table 7.2. In this case, we could choose the Johnson transformation (AD = 0.199), the 2-parameter Weibull (AD = 0.327) or the Box-Cox transformation (AD = 0.404). While the Johnson transformation gives the lowest AD value, it is sometimes not preferred because its complex transformation function makes it difficult to relate the result to the actual data.

If we choose Weibull distribution, then capability analysis can be performed using the following commands:

Minitab:> Stat > Quality Tools > Process Capability > Process Capability Analysis > Non-normal. Choose to fit data with Weibull distribution and choose benchmark Z's (sigma level) in the options. (See the dialogue box in Figure 7.18 for guidance).

 

 

Figure 7.17 Identifying best-fit distribution in Minitab

 

 

Figure 7.18 Capability analysis with non-normal distribution in Minitab

 

Table 7.2 Minitab output for distribution fitting

 

See Figure 7.19 for Minitab graphical output.

Minitab output shows the percentage above the USL as 11.97, which corresponds to a probability of 0.1197. To find an equivalent Z-value, we look for the Z for which the area to its right under the standard normal curve is 0.1197. This benchmark Z-value is 1.18, which can easily be converted to a sigma level of 2.68 by adding the 1.5 sigma shift.
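
The same conversion can be scripted in a few lines of Python, for reference:

    from scipy.stats import norm

    p_beyond_usl = 0.1197
    z_bench = norm.isf(p_beyond_usl)    # Z with this much area to its right
    print(f"Benchmark Z = {z_bench:.2f}, sigma level = {z_bench + 1.5:.2f}")  # 1.18 and 2.68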

 

 

Figure 7.19 Process capability with Weibull distribution

Exercise for Practice

A service center has collected data on waiting time for 100 service calls. This data is available in the file ‘callcenter.MTW’ or ‘Callcenter.xls’.

  • Does this data fit normal distribution?
  • If not, what is the appropriate transformation?
  • Perform process capability analysis if the customers are not willing to wait for more than 5 minutes.

Process Capability of Attribute Data

In case we are unable to use variable data, we have to use attribute data. Examples are: pass/fail, accept/reject, leak/no leak, etc. The process capability of attribute data is applicable only in a limited sense. It is typically expressed as:

  • A proportion or percent defective, using p-charts (binomial distribution)
  • The number of defects per unit, using u-charts (Poisson distribution)

Procedures require control charting using attribute control charts, eliminating assignable causes and then estimating quality levels. Control charts are discussed in Chapter 17, which is on statistical process control.

Application Example of Percent Defective

A company manufactures toothpaste. Some caps cannot be fitted on the tubes due to poor quality of threading. During the course of a week, 20,000 caps out of 3,50,000 could not be fitted.

The total average defect level is 5.714 percent. This corresponds to 0.0571 defects per opportunity (DPO) and 57,143 defects per million opportunities (DPMO). As shown in Table 7.3, this data can be readily converted into a sigma level of 3.08 using the sigma calculator template provided in the CD.

 

Table 7.3 Sigma level of tube caps

Opportunities for defect per unit: 1
Defects: 20000
Sample Size: 350000
DPO: 0.05714
DPMO: 57142.9
Sigma Level: 3.08

Application Example of Number of Defects per Unit (DPU)

A vehicle manufacturer performs final inspection of vehicles. The data of defects and the number of vehicles inspected is available in Minitab worksheet file ‘vehicledefects.MTW’. There are in all 10 opportunities for defects in any vehicle.

The total number of vehicles produced is 1585 and the number of defects observed is 1126. Therefore, the DPU = 1126/1585 = 0.71.

Using the sigma calculator, this is converted to a sigma level of 2.97 (See Table 7.4).

 

Table 7.4 Sigma level of vehicles

Opportunities for defect per unit: 10
Defects: 1126
Sample Size: 1585
DPO: 0.07104
DPMO: 71041
Sigma Level: 2.97
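
For reference, a minimal Python version of the sigma calculator used in Tables 7.3 and 7.4 (the function name is illustrative):

    from scipy.stats import norm

    def sigma_level(defects, units, opportunities_per_unit):
        dpo = defects / (units * opportunities_per_unit)
        dpmo = dpo * 1e6
        return dpo, dpmo, norm.isf(dpo) + 1.5   # benchmark Z plus the 1.5 sigma shift

    print(sigma_level(20000, 350000, 1))   # ~ (0.0571, 57143, 3.08) -- Table 7.3
    print(sigma_level(1126, 1585, 10))     # ~ (0.0710, 71041, 2.97) -- Table 7.4
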
Summary

Process capability is an essential step in every Six Sigma project to establish the baseline. Process capability index Cp is a ratio of the ‘voice of customer’ (VOC) to the ‘voice of process’ (VOP). Cp indicates the basic or inherent capability whereas Cpk indicates the capability considering centering of the process. Cp and Cpk values of >1.33 have been conventionally considered acceptable by the industry. For a process to achieve the Six Sigma level, the values should be: Cp > 2 and Cpk > 1.5. Process capability of non-normal data can be estimated using transformation to normal distribution or other distributions such as Weibull. Process capability of attribute data relates to the defect level in terms of proportion defective or PPM, DPU or DPMO which can be converted into Sigma Level.

Measure Phase Checklist

The following checklist can be useful to review whether the necessary activities for the measure phase have been completed and whether the project is ready to be moved to the Analyze phase.

Checkpoints for Completion of Measure Phase

  • Process Map is completed with Xs and Ys identified.
  • Value Stream Map (VSM) is completed, if applicable (for lean projects).
  • MSA is complete and meets the norms.
  • Initial Process Capability and Sigma Level are evaluated to understand the gap between desired and actual results.
  • Cause and Effect Matrix is completed and KPIVs are ranked by importance.
References

Automotive Industry Action Group (AIAG). (2005). Statistical Process Control (SPC).

Breyfogle III, Forrest W. (1999). Implementing Six Sigma: Smarter Solutions Using Statistical Methods. New York, NY: John Wiley & Sons.

Montgomery, Douglas C. (2001). Statistical Quality Control. New York, NY: John Wiley & Sons.