Academic Editor: Yudong Zhang
School of Mechanical Engineering, Southwest Jiaotong University, Chengdu 610031, China
Received 4 July 2014; Revised 31 October 2014; Accepted 5 November 2014; Published 5 October 2015
1. Introduction
In today's manufacturing and service industries, control charts are particularly important tools for improving product quality and monitoring production processes. Various kinds of control charts have been developed for different quality attributes and control targets. Recognizing control chart patterns (CCPs) is one of the most prevalent techniques for detecting process disturbances, equipment malfunctions, or other special events. In general, six basic CCPs are commonly exhibited by control charts: normal (NOR), cyclic (CYC), increasing trend (IT), decreasing trend (DT), upward shift (US), and downward shift (DS). Figure 1 shows these six types of control chart patterns [1]. Over the past two decades, attention has been given to improving the recognition accuracy of these basic CCPs using normalized original data. Automatic CCP recognition has been an active research area over the last decade but has not yet been fully realized.
Figure 1: Six basic control chart patterns: (a) normal (NOR), (b) cyclic (CYC), (c) increasing trend (IT), (d) decreasing trend (DT), (e) upward shift (US), and (f) downward shift (DS).
[figure omitted; refer to PDF]
There are numerous research papers on CCP recognition. Most of the previous studies are concerned with the recognition of single abnormal CCPs [2-4]. In practice, however, the observed process data may follow mixture CCPs, which combine two or three basic patterns. Compared to the basic patterns, the mixture patterns are more difficult to recognize and cause serious degradation of recognition performance, so identifying mixture patterns effectively is a challenging task. Only a few studies have reported on mixture pattern recognition [5-8]. Guh and Tannock [5] use a back-propagation neural network to recognize mixture CCPs. Yang and Yang [6] propose an efficient statistical correlation coefficient method for the recognition of mixture CCPs. Chen et al. [7] integrate a wavelet method and a back-propagation neural network for online recognition of mixture CCPs. Lu et al. [8] propose a hybrid system that uses independent component analysis and a support vector machine to recognize mixture CCPs.
Feature extraction plays an important role in CCP recognition. Most existing studies use the normalized original data as the inputs. These raw inputs are high dimensional and not very effective for complicated recognition problems; a smaller data size can lead to faster training and higher efficiency. Accordingly, researchers have proposed various methods to extract features for CCP recognition [2, 9, 10]. Ranaee et al. [2] use both shape features and statistical features as the data inputs, and the results show that this method works well for control chart recognition. Hassan et al. [9] introduce feature-based control chart pattern recognition with six statistical features (mean, variance, skewness, mean-square value, autocorrelation, and cusum), intended to improve the performance of CCP recognizers through a smaller feature set. Gauri and Chakraborty [10] present improved feature extraction from a large number of potentially useful features using a CART-based approach. Other feature extraction methods have been proposed for eliminating duplicated information, such as independent component analysis (ICA) [11], Fisher discriminant analysis (FDA) [12], and principal component analysis (PCA) [13, 14]. The feature extraction efforts cited above have not converged on a suitable set of features. In this paper, thirteen features, consisting of both statistical and shape features of the CCPs, are chosen. This feature-based representation acts as a dimensionality reduction step that suppresses noise and correlated measurements and maps the measurement data into a smaller, more informative subspace.
Traditionally, CCPs were analyzed and interpreted manually. From the end of the 1980s, expert systems were employed for control chart pattern recognition [15, 16]. With the development of computer technology, machine learning techniques have been widely adopted in automatic process monitoring. In particular, artificial neural networks (ANNs) are the most frequently used technique in control chart pattern recognition [17-20]. The use of ANNs has overcome some drawbacks of the traditional expert system method. ANNs typically utilize a multilayer perceptron with back-propagation training to classify unnatural patterns and show higher accuracy. In subsequent studies, many other methods, such as decision trees, fuzzy clustering, and wavelet analysis, have been combined with ANNs to recognize CCPs [19, 20].
However, ANNs also suffer from several weaknesses, such as the need for a large amount of training data, limited generalization ability, the risk of model overfitting, difficulty in obtaining stable solutions, and a tendency to become trapped in local extrema. The application of ANNs is limited by these weaknesses. The support vector machine (SVM), based on statistical learning theory, has been proposed for recognizing CCPs because of its excellent performance in practical applications. It uses the principle of structural risk minimization, which gives it better generalization ability with small samples and is superior to the empirical risk minimization principle used by artificial neural networks [7, 21, 22]. The main difficulty in setting up an SVM model is selecting the kernel function and its parameter values; the penalty parameter and the kernel function parameter should therefore be optimized.
The purpose of this study is to develop an intelligent hybrid CCP recognition model for mixture CCPs that improves recognition accuracy. The paper considers the six basic and four mixture CCPs, uses their statistical and shape features as inputs, and adopts a multiclass support vector machine (MSVM) as the classifier. A genetic algorithm (GA) is chosen as the optimization tool for the MSVM parameters. This model improves CCP recognition performance.
2. Modeling for Control Chart Patterns Recognition
The aim of this model is to recognize CCPs effectively and automatically. Figure 2 shows the schematic diagram representing the procedure of the CCPs recognition, in which three modules are in series: feature extraction, classifier, and parameters optimization (F_MSVM_GA).
Figure 2: Flow chart of F_MSVM_GA model.
[figure omitted; refer to PDF]
In the feature extraction module, statistical and shape features of the observation data are used as the inputs of the classifier. Every control chart pattern has different properties, and the features represent the properties of the various CCPs; if effective features are chosen to reflect a pattern, the abnormal patterns are easier to recognize. Using the original data as inputs usually produces large input vectors that are not very effective for complicated recognition problems, so statistical and shape features of the CCPs are used here to obtain suitable input data. In the classifier module, an MSVM classifier is developed for recognizing the basic and mixture patterns. To achieve satisfactory recognition performance, the MSVM classifier must be properly designed, trained, and tested; however, using an MSVM raises some difficulties, such as selecting the kernel function type and the most appropriate hyperparameter values for the training and testing stages. Therefore, in the parameter optimization module, a genetic algorithm is applied to find the optimal hyperparameter values, that is, the kernel parameter and the classifier (penalty) parameter.
2.1. Statistical and Shape Features
The patterns are described by the original data, from which statistical and shape features can be extracted. This reduces the amount of data while retaining the useful information. In this paper, eight statistical features and five shape features are chosen to characterize the patterns; the thirteen features are described below [2].
2.1.1. Statistical Features
They are as follows: mean, standard deviation, mean-square value, autocorrelation, positive cusum, negative cusum, skewness, and kurtosis.
(1) Mean. The mean of the sample is around zero for the normal and cyclic patterns, while for the other patterns it differs from zero; it is therefore a good candidate for differentiating the normal and cyclic patterns from the others.
(2) Standard Deviation. The standard deviation of the sample data; each pattern behaves differently on this feature.
(3) Mean-Square Value. The mean of the squared sample values.
(4) Average Autocorrelation. The average degree of correlation between successive values within each sample.
(5) Positive Cusum. The cumulative sum of the deviations of the data points that lie above the sample mean.
(6) Negative Cusum. The cumulative sum of the deviations of the data points that lie below the sample mean.
(7) Skewness. It provides information about the degree of asymmetry of the distribution.
(8) Kurtosis. It measures the relative peakedness or flatness of the distribution.
[equations omitted; refer to PDF]
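For concreteness, the following minimal Python sketch computes these eight statistical features for one sample. Because the defining equations are not reproduced in this version of the text, the particular forms used here (notably the lag-1 autocorrelation and the cusum sums of deviations from the sample mean) are common textbook choices and should be read as assumptions rather than the exact definitions of [2].

```python
import numpy as np

def statistical_features(x):
    """Eight statistical features of one control chart sample (1-D array).

    The autocorrelation and cusum forms are assumed; the source omits
    the exact equations.
    """
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    std = x.std(ddof=1)                          # sample standard deviation
    mean_square = np.mean(x ** 2)                # mean-square value
    xc = x - mean
    autocorr = np.sum(xc[:-1] * xc[1:]) / np.sum(xc ** 2)   # lag-1 autocorrelation
    pos_cusum = xc[xc > 0].sum()                 # cumulative deviation above the mean
    neg_cusum = xc[xc < 0].sum()                 # cumulative deviation below the mean
    skewness = np.mean(xc ** 3) / std ** 3
    kurtosis = np.mean(xc ** 4) / std ** 4
    return np.array([mean, std, mean_square, autocorr,
                     pos_cusum, neg_cusum, skewness, kurtosis])
```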
2.1.2. Shape Features
They are as follows: slope, N1, N2, APML, and APLS.
(1) Slope. The slope of the least-squares line fitted to the pattern: it is around zero for the normal and cyclic patterns, while for the other patterns it deviates clearly from zero; it therefore helps differentiate the normal and cyclic patterns from the others.
(2) N1. The number of mean-line crossings: it is almost zero for the shift and trend patterns but very high for the normal pattern, with the cyclic pattern in between; these differences distinguish the normal and cyclic patterns from the shift and trend patterns.
(3) N2. The number of least-squares-line crossings: this feature is highest for the normal and trend patterns, intermediate for the shift patterns, and lowest for the cyclic pattern, so it can separate the normal and trend patterns from the others.
(4) APML. The area between the pattern and its mean line: this feature is lowest for the normal pattern and therefore differentiates the normal pattern from the others.
(5) APLS. The area between the pattern and its least-squares line: the normal and trend patterns have lower values than the shift and cyclic patterns, so it distinguishes the normal and trend patterns from the shift and cyclic patterns.
[equations omitted; refer to PDF]
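A companion sketch for the five shape features is given below. The least-squares fit, crossing counts, and absolute-deviation areas implement the verbal definitions above; since the exact expressions are omitted in this text, these forms are likewise assumptions.

```python
import numpy as np

def shape_features(x):
    """Five shape features: slope, N1, N2, APML, APLS (assumed forms)."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x))
    slope, intercept = np.polyfit(t, x, 1)       # least-squares line
    ls_line = slope * t + intercept
    mean = x.mean()
    n1 = np.count_nonzero(np.diff(np.sign(x - mean)) != 0)      # mean-line crossings
    n2 = np.count_nonzero(np.diff(np.sign(x - ls_line)) != 0)   # LS-line crossings
    apml = np.abs(x - mean).sum()                # area between pattern and mean line
    apls = np.abs(x - ls_line).sum()             # area between pattern and LS line
    return np.array([slope, n1, n2, apml, apls])
```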
2.2. Support Vector Machine
The basic SVM was invented by Vapnik and coworkers at AT&T Bell Laboratories. It is built on the VC dimension theory and the structural risk minimization principle of statistical learning theory, so it achieves the best trade-off between model complexity and learning ability from the limited sample information. The basic SVM deals with two-class problems, but it can be extended to the multiclass SVM [7, 21-23].
An SVM performs classification by constructing an optimal separating hyperplane (OSH), which maximizes the margin between the nearest data points belonging to the two classes. Suppose that the training set {(x_i, y_i)}, i = 1, ..., n, with x_i in R^d and y_i in {-1, +1}, can be separated by the hyperplane w·x + b = 0, where n is the number of sample observations, d is the dimension of each observation, w is the weight vector, and b is the bias. If this hyperplane separates the two classes with maximal margin width 2/||w||, the points lying on the margin boundaries being called support vectors, the SVM solves the following optimization problem:
$$\min_{w,b}\ \frac{1}{2}\|w\|^{2} \quad \text{s.t.} \quad y_i\left(w \cdot x_i + b\right) \ge 1, \quad i = 1, \dots, n. \qquad (14)$$
This is a convex quadratic programming (QP) problem, which is solved by introducing Lagrange multipliers α_i. For input data with a high noise level, a soft-margin SVM is obtained by introducing the nonnegative slack variables ξ_i, and (14) is transformed into the following constrained form:
$$\min_{w,b,\xi}\ \frac{1}{2}\|w\|^{2} + C\sum_{i=1}^{n}\xi_i \quad \text{s.t.} \quad y_i\left(w \cdot x_i + b\right) \ge 1 - \xi_i,\ \ \xi_i \ge 0, \quad i = 1, \dots, n. \qquad (15)$$
In (15), C is the penalty factor; it determines the degree to which errors are penalized. It can be viewed as a tuning parameter that controls the trade-off between maximizing the margin and minimizing the classification error.
An MSVM method is adopted in the classifier stage. Two decomposition methods are common: one-against-all (OAA) and one-against-one (OAO). For an N-class pattern recognition problem, OAA constructs N independent SVMs, each trained to separate one class of samples from all the others. After all the SVMs are trained, a testing sample is input to all of them; if the sample belongs to class k, ideally only the SVM trained to separate class k from the others gives a positive response. In the OAO method, N(N-1)/2 SVMs are constructed, each trained to separate one class from another, and the decision for a testing sample is based on the voting results of these SVMs. In this paper, OAO is adopted for pattern recognition [24].
In nonlinearly separable cases, where the data cannot be linearly separated in the input space, the SVM uses the kernel method to transform the original input space into a high dimensional feature space in which an optimal linear separating hyperplane can be found. Although several types of kernel function exist, the most widely used is the radial basis function (RBF), defined as
$$K(x_i, x_j) = \exp\left(-\gamma\,\|x_i - x_j\|^{2}\right),$$
where γ is the kernel parameter.
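As an illustration of this configuration, the sketch below trains an RBF-kernel multiclass SVM with scikit-learn, whose SVC classifier uses the one-against-one decomposition for multiclass problems. The random feature matrix stands in for the 13-feature vectors, and the C and gamma values are arbitrary examples; in the proposed model they are selected by the GA.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Placeholder data: X would hold the 13 statistical/shape features of each
# sample and y the pattern label (10 classes); random values are used here.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 13))
y = rng.integers(0, 10, size=1000)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)

# RBF-kernel MSVM; scikit-learn's SVC applies one-against-one voting for
# multiclass problems.  C and gamma are example values only.
clf = SVC(kernel="rbf", C=10.0, gamma=0.1)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```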
The main difficulty with the MSVM is selecting the penalty parameter C and the kernel parameter γ. The GA is used to search for the best values of these parameters for the MSVM classifier.
2.3. Genetic Algorithms (GA)
GA is a powerful tool in the field of global optimization. It has better search efficiency, robustness, and parallelism than traditional optimization algorithms. Genetic algorithms belong to the larger class of evolutionary algorithms, which generate solutions to optimization problems using techniques inspired by natural evolution, such as inheritance, mutation, selection, and crossover.
In this paper, the OAO scheme and the RBF kernel function are adopted for the MSVM, whose performance is mainly affected by the settings of the two parameters C and γ. The GA is used to search for the best values of these parameters for the MSVM classifier. Each chromosome has two dimensions, C and γ, and the recognition accuracy on the training set is selected as the fitness function. The steps are as follows [25].
Step 1.
Set the GA parameters, such as the population size, the number of generations, the crossover and mutation probabilities, and the parameter ranges.
Step 2.
Encode the parameters to be optimized and initialize the population.
Step 3.
Decode the chromosomes of the population, train the MSVM on the training set with the decoded parameters, and use the training-set recognition rate as the fitness function of the GA, which guides the search toward the optimal MSVM parameters (C and γ).
Step 4.
Perform the genetic operations (selection, crossover, and mutation): each chromosome is selected, crossed over, and mutated according to its fitness, so that low-fitness chromosomes are eliminated and high-fitness chromosomes survive. The new population consists of the outstanding members of the previous generation and their offspring and is therefore better than the previous generation. The GA iterates until a predetermined stopping criterion is met.
Step 5.
Obtain the optimal parameters: decode the best chromosome, train the support vector machine classifier on the training data with the optimal parameters, and ultimately obtain the optimized support vector machine classifier.
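The following Python sketch mirrors Steps 1-5 with a simple real-coded GA over the pair (C, γ). The tournament selection, arithmetic crossover, Gaussian mutation, and elitism used here are implementation choices of this sketch rather than details given in the paper; the fitness is the training-set recognition rate, as stated in Step 3, and the parameter ranges match those of Section 3.2.

```python
import numpy as np
from sklearn.svm import SVC

def ga_tune_svm(X_train, y_train, pop_size=20, generations=100,
                crossover_p=0.4, mutation_p=0.01,
                c_range=(0.1, 100.0), g_range=(0.01, 10.0), seed=0):
    """Search for (C, gamma) maximizing training-set accuracy of an RBF SVM."""
    rng = np.random.default_rng(seed)
    lo = np.array([c_range[0], g_range[0]])
    hi = np.array([c_range[1], g_range[1]])
    pop = rng.uniform(lo, hi, size=(pop_size, 2))         # Step 2: initial population

    def fitness(ind):
        clf = SVC(kernel="rbf", C=ind[0], gamma=ind[1]).fit(X_train, y_train)
        return clf.score(X_train, y_train)                 # Step 3: recognition rate

    for _ in range(generations):
        fit = np.array([fitness(ind) for ind in pop])
        # Step 4: selection (binary tournament), crossover, and mutation
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        winners = np.where(fit[idx[:, 0]] >= fit[idx[:, 1]], idx[:, 0], idx[:, 1])
        parents = pop[winners]
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < crossover_p:                  # arithmetic crossover
                a = rng.random()
                children[i] = a * parents[i] + (1 - a) * parents[i + 1]
                children[i + 1] = a * parents[i + 1] + (1 - a) * parents[i]
        mutate = rng.random(children.shape) < mutation_p    # Gaussian mutation
        children = np.clip(children + mutate * rng.normal(0.0, 0.1 * (hi - lo),
                                                          children.shape), lo, hi)
        children[0] = pop[np.argmax(fit)]                   # elitism
        pop = children

    fit = np.array([fitness(ind) for ind in pop])
    best_c, best_gamma = pop[np.argmax(fit)]                # Step 5: best chromosome
    return best_c, best_gamma
```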
3. Simulation and Results Analysis
3.1. Data Generation
In order to analyze CCP recognition, the Monte Carlo method is used to generate the sample data. The following equation is applied to generate the data points of the six basic patterns, with the parameters shown in Table 1 [8]:
$$y(t) = \mu + r(t)\,\sigma + d(t),$$
where y(t) is the value of the sample data at time t, μ is the process mean, r(t) is a random value drawn from the standard normal distribution, σ is the standard deviation of the normal distribution, and d(t) is the abnormal (disturbance) term determined by the pattern type. We chose μ = 0 and σ = 1 and use the 40 data points of an observation window as the inputs of the feature extraction module; 100 samples are generated for every pattern. (A generation sketch with assumed disturbance parameters is given after Table 1.)
Table 1: The parameters of six basic patterns.
Control chart patterns | Pattern equations | Parameters |
Normal (NOR) | [figure omitted; refer to PDF] | Mean [figure omitted; refer to PDF] , standard deviation [figure omitted; refer to PDF] |
Cyclic (CYC) | [figure omitted; refer to PDF] | Period [figure omitted; refer to PDF] , amplitude [figure omitted; refer to PDF] |
Increasing (decreasing) trend (IT/DT) | [figure omitted; refer to PDF] | Gradient [figure omitted; refer to PDF] |
Increasing (decreasing) shift (US/DS) | [figure omitted; refer to PDF] | Shift magnitude [figure omitted; refer to PDF] , shift position [figure omitted; refer to PDF] |
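The sketch below implements the generation equation for the six basic patterns. Because the exact parameter values of Table 1 are not legible in this version of the text, the cycle amplitude and period, trend gradient, and shift magnitude and position used here are typical values from the CCP-simulation literature and are assumptions.

```python
import numpy as np

def basic_ccp(pattern, n=40, mu=0.0, sigma=1.0, rng=None):
    """Generate one basic pattern via y(t) = mu + r(t)*sigma + d(t).

    Disturbance parameters (amplitude, period, gradient, shift size and
    position) are assumed typical values; Table 1 is not legible here.
    """
    rng = rng or np.random.default_rng()
    t = np.arange(n)
    y = mu + rng.standard_normal(n) * sigma       # normal base series r(t)*sigma
    if pattern == "CYC":
        y += 2.0 * np.sin(2 * np.pi * t / 8)      # cycle: amplitude 2, period 8
    elif pattern == "IT":
        y += 0.1 * t                               # increasing trend, gradient 0.1
    elif pattern == "DT":
        y -= 0.1 * t                               # decreasing trend
    elif pattern == "US":
        y += 2.0 * (t >= 20)                       # upward shift of 2 at point 20
    elif pattern == "DS":
        y -= 2.0 * (t >= 20)                       # downward shift
    return y                                       # "NOR" returns the base series

samples = {p: basic_ccp(p, rng=np.random.default_rng(1))
           for p in ("NOR", "CYC", "IT", "DT", "US", "DS")}
```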
However, in practice the observed process data may follow mixture control chart patterns, which combine two or three basic patterns. Figure 3 shows four kinds of mixture CCPs, formed from the cyclic, increasing trend, and decreasing shift patterns. Because the increasing/decreasing trend and upward/downward shift patterns behave symmetrically, only the increasing trend and the downward shift are used in the mixtures. Sample data for the mixture CCPs are generated with the parameters listed in Table 2 (a generation sketch follows Table 2). The six basic CCPs (see Figure 1) and four mixture CCPs (see Figure 3) are used for training and testing the proposed F_MSVM_GA method in this study.
Table 2: The parameters of mixture patterns.
Control chart patterns | Pattern equations | Parameters |
Cyclic + increasing trend (CT) | [figure omitted; refer to PDF] | [figure omitted; refer to PDF] , [figure omitted; refer to PDF] , [figure omitted; refer to PDF] |
Cyclic + decreasing shift (CS) | [figure omitted; refer to PDF] | [figure omitted; refer to PDF] , [figure omitted; refer to PDF] , [figure omitted; refer to PDF] , [figure omitted; refer to PDF] |
Increasing trend + decreasing shift (TS) | [figure omitted; refer to PDF] | [figure omitted; refer to PDF] , [figure omitted; refer to PDF] , [figure omitted; refer to PDF] |
Cyclic + increasing trend + decreasing shift (CTS) | [figure omitted; refer to PDF] | [figure omitted; refer to PDF] , [figure omitted; refer to PDF] , [figure omitted; refer to PDF] , [figure omitted; refer to PDF] , [figure omitted; refer to PDF] |
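A matching sketch for the mixture patterns of Table 2 superimposes the cyclic, increasing trend, and decreasing shift disturbances on a single noise base, mirroring the CT, CS, TS, and CTS combinations. As with the basic-pattern generator above, the disturbance parameter values are assumed.

```python
import numpy as np

def mixture_ccp(components, n=40, mu=0.0, sigma=1.0, rng=None):
    """Superimpose basic disturbances ('CYC', 'IT', 'DS') on one noise base.

    Parameter values are assumed, as in the basic-pattern generator.
    """
    rng = rng or np.random.default_rng()
    t = np.arange(n)
    y = mu + rng.standard_normal(n) * sigma
    if "CYC" in components:
        y += 2.0 * np.sin(2 * np.pi * t / 8)       # cyclic disturbance
    if "IT" in components:
        y += 0.1 * t                                # increasing trend disturbance
    if "DS" in components:
        y -= 2.0 * (t >= 20)                        # decreasing shift disturbance
    return y

ct = mixture_ccp(("CYC", "IT"))            # CT: cyclic + increasing trend
cs = mixture_ccp(("CYC", "DS"))            # CS: cyclic + decreasing shift
ts = mixture_ccp(("IT", "DS"))             # TS: trend + shift
cts = mixture_ccp(("CYC", "IT", "DS"))     # CTS: cyclic + trend + shift
```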
Table 3: Parameters in GA.
Parameters name | Value |
Maximum number of iterations | 100 |
Population size | 20 |
Crossover probability | 0.4 |
Mutation probability | 0.01 |
Figure 3: Four mixture CCPs: (a) mix-cyclic + trend (CT), (b) mix-cyclic + shift (CS), (c) mix-trend + shift (TS), and (d) mix-cyclic + trend + shift (CTS).
[figure omitted; refer to PDF]
3.2. Parameters of MSVM and GA
The performance of the MSVM is influenced by its parameters. As analyzed in Section 2.2, the MSVM with the RBF kernel function is chosen in this study. The related parameters C and γ for this kernel were varied within the fixed ranges [0.1, 100] and [0.01, 10], respectively, so as to cover strong or weak regularization of the classifier and wide or narrow kernels. In the GA optimization module, the coefficients whose values can be adjusted to produce better training performance are summarized in Table 3.
4. Performance Analyses
In this section, we measure the performance of the proposed recognizer. For this purpose, we generated 100 samples of each of the 10 patterns, every sample containing the 40 data points of an observation window. About 50% of the samples are used for training the classifier and the rest for testing. The testing samples are used to estimate the recognition performance for each pattern and then to compute the average recognition accuracy over all CCPs. Several experiments are carried out to verify the effectiveness of the proposed model.
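The per-pattern and average recognition accuracies reported in Tables 4-7 correspond to the row-normalized diagonal of a confusion matrix, as in the short sketch below; the helper name and the placeholder labels are illustrative only.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def per_pattern_accuracy(y_true, y_pred, n_classes=10):
    """Per-pattern recognition accuracy and its average over all patterns."""
    cm = confusion_matrix(y_true, y_pred, labels=list(range(n_classes)))
    per_class = cm.diagonal() / cm.sum(axis=1)
    return per_class, per_class.mean()

# Illustrative usage with placeholder labels (50 test samples per pattern).
y_true = np.repeat(np.arange(10), 50)
y_pred = y_true.copy()
print(per_pattern_accuracy(y_true, y_pred))
```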
4.1. Performance of Recognizer in Optimization
First, we apply the MSVM classifier to the extracted features. Table 4 reports the recognition accuracy (RA) of the proposed F_MSVM_GA model, which uses the 13 statistical and shape features and the GA optimization algorithm. To demonstrate the superior performance of the proposed F_MSVM_GA scheme, an MSVM using the 13 features as inputs but without GA optimization (called F_MSVM) is also constructed; its results are shown in Table 5.
Table 4: Recognition accuracies with the proposed F_MSVM_GA model.
True pattern | Identified pattern (%) | |||||||||
NOR | CYC | IT | DT | US | DS | CT | CS | TS | CTS | |
NOR | 100 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
CYC | 0 | 98 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
IT | 0 | 0 | 98 | 0 | 0 | 0 | 2 | 0 | 0 | 0 |
DT | 0 | 0 | 0 | 100 | 0 | 0 | 0 | 0 | 0 | 0 |
US | 0 | 0 | 0 | 0 | 100 | 0 | 0 | 0 | 0 | 0 |
DS | 0 | 0 | 0 | 0 | 0 | 98 | 0 | 2 | 0 | 0 |
CT | 0 | 0 | 2 | 0 | 0 | 0 | 98 | 0 | 0 | 0 |
CS | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0 |
TS | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 96 | 2 |
CTS | 0 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 88 |
| ||||||||||
Average | 97.6 |
Table 5: Recognition accuracies with the F_MSVM model.
True pattern | Identified pattern (%) | |||||||||
NOR | CYC | IT | DT | US | DS | CT | CS | TS | CTS | |
NOR | 100 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
CYC | 2 | 86 | 0 | 0 | 0 | 0 | 0 | 0 | 6 | 6 |
IT | 0 | 0 | 96 | 0 | 0 | 0 | 4 | 0 | 0 | 0 |
DT | 0 | 0 | 0 | 100 | 0 | 0 | 0 | 0 | 0 | 0 |
US | 0 | 0 | 0 | 0 | 100 | 0 | 0 | 0 | 0 | 0 |
DS | 0 | 0 | 0 | 2 | 0 | 96 | 0 | 2 | 0 | 0 |
CT | 0 | 0 | 6 | 0 | 0 | 0 | 94 | 0 | 0 | 0 |
CS | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 98 | 0 | 0 |
TS | 6 | 0 | 12 | 0 | 0 | 0 | 0 | 0 | 82 | 0 |
CTS | 0 | 34 | 0 | 0 | 0 | 0 | 0 | 0 | 4 | 62 |
| ||||||||||
Average | 91.4 |
As reported in Tables 4 and 5, the average recognition accuracies of F_MSVM_GA and F_MSVM are 97.6% and 91.4%, respectively. The proposed F_MSVM_GA model has better recognition performance for the mixture CCPs, especially for TS and CTS. The genetic algorithm searches for the combination of MSVM classifier parameters that maximizes the fitness, thereby improving the recognition rate on the testing samples.
4.2. Performance of Recognizer in Different Features
Feature extraction can lead to faster training and higher efficiency in CCP recognition. Thirteen statistical and shape features are utilized as the inputs in this paper. To demonstrate their effectiveness, an MSVM classifier using the original 40 data points as inputs (called D_MSVM) is constructed. Table 6 shows its recognition accuracy on the basic and mixture CCPs.
Table 6: Recognition accuracies with the D_MSVM model.
True pattern | Identified pattern (%) | |||||||||
NOR | CYC | IT | DT | US | DS | CT | CS | TS | CTS | |
NOR | 70 | 4 | 2 | 24 | 0 | 0 | 0 | 0 | 0 | 0 |
CYC | 2 | 76 | 2 | 0 | 0 | 0 | 14 | 0 | 0 | 6 |
IT | 14 | 0 | 80 | 0 | 2 | 0 | 4 | 0 | 0 | 0 |
DT | 0 | 4 | 0 | 88 | 0 | 0 | 0 | 0 | 2 | 6 |
US | 0 | 0 | 0 | 0 | 98 | 0 | 2 | 0 | 0 | 0 |
DS | 0 | 0 | 0 | 2 | 0 | 92 | 0 | 4 | 2 | 0 |
CT | 2 | 22 | 2 | 0 | 2 | 0 | 72 | 0 | 0 | 0 |
CS | 0 | 0 | 0 | 2 | 0 | 2 | 0 | 74 | 0 | 22 |
TS | 0 | 0 | 0 | 14 | 0 | 12 | 0 | 0 | 64 | 10 |
CTS | 0 | 4 | 0 | 12 | 0 | 2 | 0 | 16 | 0 | 66 |
| ||||||||||
Average | 78.0 |
The average recognition accuracies of D_MSVM (78.0%) and F_MSVM (91.4%) show that the feature extraction method plays an important role in improving recognition accuracy. The results also show that mixture control chart patterns are difficult to recognize because of their complex structure, but the performance improves considerably once the statistical and shape feature extraction is applied.
4.3. Comparison of the Proposed Method with BP Method
Artificial neural networks (ANNs) are the most frequently used technique in control chart pattern recognition. They typically employ a multilayer perceptron trained with back-propagation to classify abnormal patterns. Back-propagation (BP) is a common method of training artificial neural networks, used in conjunction with an optimization method such as gradient descent. In this comparison, the number of input neurons is set to 50 and the number of hidden layers to 5. Table 7 shows the performance results.
Table 7: Recognition accuracies with the BP model.
True pattern | Identified pattern (%) | |||||||||
NOR | CYC | IT | DT | US | DS | CT | CS | TS | CTS | |
NOR | 90 | 8 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 |
CYC | 4 | 88 | 4 | 2 | 2 | 0 | 0 | 0 | 0 | 0 |
IT | 0 | 0 | 56 | 32 | 10 | 2 | 0 | 0 | 0 | 0 |
DT | 0 | 6 | 24 | 32 | 12 | 26 | 0 | 0 | 0 | 0 |
US | 0 | 0 | 8 | 28 | 34 | 30 | 0 | 0 | 0 | 0 |
DS | 0 | 0 | 0 | 2 | 10 | 74 | 10 | 2 | 2 | 0 |
CT | 0 | 0 | 0 | 0 | 8 | 24 | 66 | 0 | 0 | 2 |
CS | 0 | 0 | 0 | 0 | 0 | 0 | 22 | 78 | 0 | 0 |
TS | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 10 | 84 | 4 |
CTS | 0 | 2 | 0 | 2 | 6 | 2 | 0 | 14 | 16 | 58 |
| ||||||||||
Average | 65.2 |
Compared with the MSVM models, the accuracy of the BP method is only 65.2%, much lower than that of the MSVM-based methods. The reason is that the BP neural network depends strongly on the quantity and quality of the sample data, but only 50 training samples per pattern are used in this study, so the task is a small-sample, noisy problem.
We have also compared the proposed model with the other approaches. The comparison is shown in Figure 4, where the six basic CCPs and the four mixture CCPs are numbered from 1 to 10.
Figure 4: Mixture CCPs recognition accuracy results of the four methods: (a) F_MSVM_GA, (b) F_MSVM, (c) D_MSVM, and (d) BP.
[figure omitted; refer to PDF]
5. Conclusion
Control charts are among the most useful tools in statistical process control, and mixture control chart patterns appear more and more widely in manufacturing and service processes. Recognizing mixture CCPs plays an important role in detecting abnormal quality problems. In this study, a hybrid method integrating statistical and shape feature extraction, MSVM, and GA is presented for recognizing mixture CCPs. The proposed method first uses statistical and shape features to obtain effective input data; the combination of MSVM and GA is then applied to recognize the mixture patterns, with the GA optimizing the penalty and kernel parameters of the MSVM. Six basic CCPs and four mixture CCPs are used to evaluate the performance of the proposed method. The simulation results indicate that the intelligent hybrid method achieves the highest average recognition accuracy among the tested methods.
Future work will focus on the following aspects: (1) comparing the statistical and shape feature extraction method with other feature extraction methods, (2) comparing GA with other intelligent algorithms, including particle swarm optimization, simulated annealing, and ant colony optimization, (3) investigating the fundamental principles of mixture CCPs with the help of mathematicians, and (4) seeking an economic interpretation of the method with the help of economists.
Acknowledgments
This work is financially supported by National Natural Science Foundation of China (NSFC) under Grant no. 51175442 and the Fundamental Research Funds for the Central Universities under Grant no. 2682014BR022.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
[1] D. C. Montgomery Introduction to Statistical Quality Control , John Wiley & Sons, New York, NY, USA, 2001.
[2] V. Ranaee, A. Ebrahimzadeh, R. Ghaderi, "Application of the PSOSVM model for recognition of control chart patterns," ISA Transactions , vol. 49, no. 4, pp. 577-586, 2010.
[3] R.-S. Guh, "A hybrid learning-based model for on-line detection and analysis of control chart patterns," Computers and Industrial Engineering , vol. 49, no. 1, pp. 35-62, 2005.
[4] S. K. Gauri, S. Chakraborty, "Feature-based recognition of control chart patterns," Computers & Industrial Engineering , vol. 51, no. 4, pp. 726-742, 2006.
[5] R.-S. Guh, J. D. T. Tannock, "Recognition of control chart concurrent patterns using a neural network approach," International Journal of Production Research , vol. 37, no. 8, pp. 1743-1765, 1999.
[6] J. H. Yang, M. S. Yang, "A control chart pattern recognition system using a statistical correlation coefficient method," Computers and Industrial Engineering , vol. 48, no. 2, pp. 205-221, 2005.
[7] Z. Chen, S. Lu, S. Lam, "A hybrid system for SPC concurrent pattern recognition," Advanced Engineering Informatics , vol. 21, no. 3, pp. 303-310, 2007.
[8] C.-J. Lu, Y. E. Shao, P.-H. Li, "Mixture control chart patterns recognition using independent component analysis and support vector machine," Neurocomputing , vol. 74, no. 11, pp. 1908-1914, 2011.
[9] A. Hassan, M. S. Nabi Baksh, A. M. Shaharoun, H. Jamaluddin, "Improved SPC chart pattern recognition using statistical features," International Journal of Production Research , vol. 41, no. 7, pp. 1587-1603, 2003.
[10] S. K. Gauri, S. Chakraborty, "Recognition of control chart patterns using improved selection of features," Computers and Industrial Engineering , vol. 56, no. 4, pp. 1577-1588, 2009.
[11] M. Kano, S. Tanaka, S. Hasebe, I. Hashimoto, H. Ohno, "Monitoring independent components for fault detection," AIChE Journal , vol. 49, no. 4, pp. 969-976, 2003.
[12] Q. P. He, S. J. Qin, J. Wang, "A new fault diagnosis method using fault directions in Fisher discriminant analysis," AIChE Journal , vol. 51, no. 2, pp. 555-571, 2005.
[13] G. N. Costache, P. Corcoran, P. Puslecki, "Combining PCA-based datasets without retraining of the basis vector set," Pattern Recognition Letters , vol. 30, no. 16, pp. 1441-1447, 2009.
[14] H. Wang, Z. Song, P. Li, "Fault detection behavior and performance analysis of principal component analysis based process monitoring methods," Industrial & Engineering Chemistry Research , vol. 41, no. 10, pp. 2455-2464, 2002.
[15] J. A. Swift, J. H. Mize, "Out-of-control pattern recognition and analysis for quality control charts using LISP-based systems," Computers & Industrial Engineering , vol. 28, no. 1, pp. 81-91, 1995.
[16] C. S. Cheng, N. F. Hubele, "Design of a knowledge-based expert system for statistical process control," Computers and Industrial Engineering , vol. 22, no. 4, pp. 501-517, 1992.
[17] H. B. Hwarng, N. F. Hubele, "Back-propagation pattern recognizers for X-bar control charts: methodology and performance," Computers and Industrial Engineering , vol. 24, no. 2, pp. 219-235, 1993.
[18] R.-S. Guh, Y.-R. Shiue, "On-line identification of control chart patterns using self-organizing approaches," International Journal of Production Research , vol. 43, no. 6, pp. 1225-1254, 2005.
[19] C. H. Wang, R. S. Guo, M. H. Chiang, J. Y. Wong, "Decision tree based control chart pattern recognition," International Journal of Production Research , vol. 46, no. 17, pp. 124-134, 2008.
[20] C.-H. Wang, W. Kuo, "Identification of control chart patterns using wavelet filtering and robust fuzzy clustering," Journal of Intelligent Manufacturing , vol. 18, no. 3, pp. 343-350, 2007.
[21] S.-Y. Yang, D.-H. Wu, H.-T. Su, "Abnormal pattern recognition method for control chart based on principal component analysis and support vector machine," Journal of System Simulation , vol. 18, no. 5, pp. 1314-1318, 2006.
[22] C. Wu, L. Zhao, "Control chart pattern recognition based on wavelet analysis and SVM," China Mechanical Engineering , vol. 21, no. 13, pp. 1572-1576, 2010.
[23] V. Ranaee, A. Ebrahimzadeh, "Control chart pattern recognition using a novel hybrid intelligent method," Applied Soft Computing Journal , vol. 11, no. 2, pp. 2676-2686, 2011.
[24] Y. Zhang, L. Wu, "Classification of fruits using computer vision and a multiclass support vector machine," Sensors , vol. 12, no. 9, pp. 12489-12505, 2012.
[25] Y. Zhang, S. Wang, G. Ji, "A rule-based model for bankruptcy prediction based on an improved genetic ant colony algorithm," Mathematical Problems in Engineering , vol. 2013, 2013.
Copyright © 2015 Min Zhang and Wenming Cheng. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract
Control charts have been widely utilized for monitoring process variation in numerous applications. Abnormal patterns exhibited by control charts imply certain potentially assignable causes that may deteriorate the process performance. Most of the previous studies are concerned with the recognition of single abnormal control chart patterns (CCPs). This paper introduces an intelligent hybrid model for recognizing mixture CCPs that comprises three main parts: feature extraction, classification, and parameter optimization. In the feature extraction part, statistical and shape features of the observation data are used as inputs to provide effective data for the classifier. A multiclass support vector machine (MSVM) is applied to recognize the mixture CCPs. Finally, a genetic algorithm (GA) is utilized to optimize the MSVM classifier by searching for the best values of the MSVM and kernel function parameters. The performance of the hybrid approach is evaluated by simulation experiments, and the results demonstrate that the proposed approach is able to effectively recognize mixture CCPs.