Article

Pattern Recognition of Different Window Size Control Charts Based on Convolutional Neural Network and Information Fusion

Beijing Key Laboratory of Advanced Manufacturing Technology, Beijing University of Technology, Chaoyang District, Beijing 100124, China
*
Author to whom correspondence should be addressed.
Symmetry 2020, 12(9), 1472; https://doi.org/10.3390/sym12091472
Submission received: 15 August 2020 / Revised: 6 September 2020 / Accepted: 7 September 2020 / Published: 8 September 2020

Abstract

Control charts are an important tool for statistical process control (SPC). A control chart is a graph with control limits used to analyze and judge whether a process is in a stable state; the monitored process fluctuates, often asymmetrically, in the symmetrical coordinate system. Fast and accurate identification of control chart patterns is of great significance to actual production. Existing control chart pattern recognition (CCPR) methods can only recognize a control chart with a fixed window size, and cannot adjust to different window sizes according to actual production needs. In order to solve these problems and improve quality management in the manufacturing process, a new CCPR method is proposed based on a convolutional neural network (CNN) and information fusion. After feature learning, the CNN extracts the best feature set from the control chart, while at the same time expert features (one shape feature and four statistical features) are fused in to complete the CCPR. In this paper, control charts of 10 different window sizes are generated by the Monte Carlo simulation method, the various data patterns are drawn into images, and the CCPR model is set up. Finally, simulation experiments and a real example validate its feasibility and effectiveness. The simulation results demonstrate that the CNN-based recognition method can be used for pattern recognition of control charts with different window sizes, and that its recognition accuracy is higher than that of traditional methods; the method based on information fusion performs better still. The real example shows that the method has potential for solving the pattern recognition problem of control charts with different window sizes.

1. Introduction

Control charts, as important tools in statistical process control (SPC), are used to monitor whether a process is in control. They have long been used for quality monitoring in the manufacturing process [1]. If only random causes affect the operation, the production process is considered natural, or normal [2]. In this state, the control chart fluctuates randomly in the symmetrical coordinate system. A traditional control chart easily detects abnormities that exceed the control limits, but identifying abnormal patterns within the limits usually requires human judgment, which is easily affected by various factors. Therefore, it is urgent to develop a simple, automatic and accurate control chart pattern recognition (CCPR) method. The rise of intelligent manufacturing puts forward intelligent requirements for quality control in the manufacturing process. Machine learning technology has been introduced into quality monitoring, and pattern recognition of control charts with a machine learning model has become an effective means to realize it [3].
Machine learning has been widely used in various fields since it was proposed, and it has also achieved relatively mature results in CCPR. Applied research on machine learning algorithms in the CCPR field dates back to the 1980s, and some achievements have been realized so far. For example, the back propagation neural network (BPNN) has been used to identify abnormal patterns in control charts, which shows that BPNN has certain value for further research in this field; its disadvantages are that the parameters are difficult to adjust and training easily falls into a local minimum [4,5]. In addition, other scholars have applied the probabilistic neural network (PNN), radial basis function (RBF) networks and other traditional machine learning algorithms to this field, showing that these algorithms have advantages in accuracy, efficiency and robustness over BPNN [6,7,8,9,10]. Relatively speaking, though, the recognition rate is still not high enough. Zan et al. proposed a fuzzy adaptive resonance theory mapping (ARTMAP) neural network with incremental learning ability to identify control chart patterns (CCPs), and also used an unsupervised algorithm to achieve CCP clustering analysis [2]. The recognition rate of this method is higher than that of the traditional multilayer perceptron (MLP) method, but the clustering result exceeds the actual number of pattern types, so it is not very convenient to apply [11]. As part of a new generation of machine learning methods, the support vector machine (SVM) has been used in CCPR and has achieved excellent results. Later, a CCPR method was proposed that combined a fuzzy support vector machine (FSVM) with a hybrid kernel function and a genetic algorithm (GA) [12]. This method has better recognition accuracy and performance than MLP, SVM and PNN.
A common problem in CCPR is the form of the input data, that is, the representation [13,14,15]. The first form is the raw data. However, using unprocessed CCP data directly creates a problem of high input dimension: high-dimensional data enlarge the classifier, which reduces recognition efficiency and accuracy in complex problems [16]. The second form is a feature set, which consists of shape features [17,18,19,20], statistical features [21] and wavelet analysis features [22,23]. Most related studies indicate that a CCP classifier using a feature set as input performs better than one using raw data as input [24]. In addition, choosing an appropriate feature set is of great significance for improving recognition accuracy. Therefore, some scholars have researched the characteristics of CCPs, for example, using time-domain features, shape features and wavelet decomposition methods to extract and fuse features from CCPs, which demonstrates the importance of feature extraction and selection.
After the deep neural network (DNN) was proposed, it was applied widely in various fields. Compared with the traditional artificial neural network (ANN), DNNs have unique advantages in feature learning [25,26]. As a representative DNN, the convolutional neural network (CNN) is widely used in various fields [27,28], and it has also achieved significant results in CCPR [29]. However, existing studies can only identify control charts with a fixed window size, whereas the window sizes used by enterprises to discriminate control chart abnormalities are generally inconsistent; this makes the generalization of recognition systems poor, as each system can only meet specific needs. The CCPR method based on CNN and information fusion proposed in this paper can better solve these problems by identifying control charts of different window sizes. On the one hand, it can adapt to the needs of different enterprises or different systems, especially when used in combination with the Nelson criteria. On the other hand, for a process in which the amount of data gradually accumulates, it can adapt more flexibly to the data grouping.
In this paper, to realize the pattern recognition of control charts with different window sizes, a CCPR method based on CNN and information fusion is proposed. Information fusion, also known as data fusion, sensor information fusion or multi-sensor information fusion [30], can integrate data or information from different sources, make full use of all kinds of information, and make the final decision more comprehensive and accurate [31,32]. The proposed method takes full advantage of the CNN's feature learning capabilities as well as expert features. The image is input into the CNN to extract features; meanwhile, the extracted expert features are input into the fully connected layer, so as to complete the information fusion. The network is optimized to obtain the recognition model, which can then be applied to actual production.
The second part of this article introduces the data generation method and the CNN prototype. The third part introduces the recognition algorithm in detail. The fourth part validates the algorithm. Finally, the fifth part summarizes the conclusions.

2. Methodology

2.1. Simulation Method of Control Chart

Throughout processing and production, due to the influence of many factors such as personnel, machines, materials, processing methods and the environment, there are often fluctuations in the data stream of product quality characteristics, and abnormal fluctuations are often symmetrical, periodic or cyclical. According to the distribution characteristics of these data streams, they can be divided into six patterns, as shown in Figure 1: the normal (NOR), upward-shift (US), downward-shift (DS), upward-trend (UT), downward-trend (DT) and cycle (CYC) patterns [33]. Except for the NOR pattern, each abnormal pattern corresponds to some abnormal change in the manufacturing process. In the practical application of CCPR, a moving window is generally used for anomaly monitoring. Every time a new product is produced, the window is moved back once and the new data point is added to the recognition window. If an exception is detected in a window at a certain time, an alarm is raised and the specific abnormal pattern is output. Next, the control chart simulation method and process used in this paper are introduced in detail.
The data of six basic modes of control chart are obtained by Monte Carlo simulation method, and the formula is as follows:
y(t) = μ + x(t) + d(t)
where y(t) is the quality characteristic value, and the range of t is set according to the needs of the production site, so as to generate data for control charts of different window sizes; μ is the mean value of the statistic under controlled conditions; x(t) is the random interference caused by accidental factors at time t, with x(t) ~ N(0, σ²), where σ is the standard deviation; d(t) is the abnormal interference value; and t is the sampling time.
The calculation method of d(t) for each mode is as follows.
The following formula gives the NOR pattern:
d(t) = 0
The following formula gives the US and DS patterns:
d(t) = v × s
where v is a parameter that determines the position of the shift (before shift: v = 0; after shift: v = 1), and s is the shift amplitude.
The following formula gives the UT and DT patterns:
d(t) = v × d × t
where v is a parameter that determines the position of the trend (before trend: v = 0; after trend: v = 1), and d is the slope.
The following formula gives the CYC pattern:
d(t) = v × a × sin(2πt⁄ω)
where a is the fluctuation amplitude, and ω is the fluctuation period.
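The simulation formulas above can be sketched in Python (the paper's own implementation uses Matlab/LabVIEW; the function name, default parameter values and the sign handling that distinguishes DS/DT from US/UT are illustrative assumptions):

```python
import math
import random

def generate_ccp(pattern, window=30, mu=30.0, sigma=0.05,
                 start_frac=0.25, s=2.0, d=0.2, a=2.0, omega=6):
    """Monte Carlo simulation of one control chart pattern:
    y(t) = mu + x(t) + d(t), with x(t) ~ N(0, sigma^2).
    s, d and a are expressed in units of sigma, matching the parameter
    ranges of Table 1 (the defaults here are illustrative)."""
    t0 = int(start_frac * window)            # abnormality starting position
    series = []
    for t in range(window):
        x = random.gauss(0.0, sigma)         # random interference x(t)
        v = 1 if t >= t0 else 0              # v = 0 before, 1 after the change
        if pattern == "NOR":                 # d(t) = 0
            dt = 0.0
        elif pattern in ("US", "DS"):        # d(t) = v * s (sign gives DS)
            dt = (1 if pattern == "US" else -1) * v * s * sigma
        elif pattern in ("UT", "DT"):        # d(t) = v * d * t
            dt = (1 if pattern == "UT" else -1) * v * d * sigma * t
        elif pattern == "CYC":               # d(t) = v * a * sin(2*pi*t/omega)
            dt = v * a * sigma * math.sin(2 * math.pi * t / omega)
        else:
            raise ValueError("unknown pattern: %s" % pattern)
        series.append(mu + x + dt)
    return series
```

Drawing each generated series as a fixed-size image then yields one training sample per window, regardless of the window size chosen.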

2.2. CNN Model

CNN is a typical DNN. As shown in Figure 2, a CNN consists of three types of layers: convolutional layers, pooling layers, and a fully connected layer with an activation function. In other words, a CNN consists of a feature extractor (convolutional and pooling layers) and a classifier (fully connected layer).
The weights and biases in the convolutional layer are called the convolution kernel in CNN. The convolutional layer is given by the following formula:
x_j^l = f( Σ_{i=1}^{D_{l−1}} x_i^{l−1} ∗ ω_{ij}^l + b_j^l ),  j = 1, 2, …, D_l
where ∗ is the convolution operation, l is the layer index, D_l is the number of convolution kernels in layer l, and ω^l is a convolution kernel of dimension r × c, with r the height and c the width; x_j^l is the jth output feature map, b_j^l denotes the bias, and f denotes the activation function, which is given by the following formula:
f(x) = max(0, x)
In the pooling layer, the downsampling is completed, which can quickly reduce the dimensions of the feature map. The pooling layer is expressed by the following formula:
x_j^l = downmax(x_j^{l−1}),  j = 1, 2, …, D_l
where x_j^l is the jth output subsample map, and downmax denotes max pooling.
The output of the convolution layer is used as the input of the pooling layer. In maximum pooling, the maximum value of the subsampling area is regarded as a new feature, reflecting the most significant feature of the subsampling area.
The dimension of the output subsample map of the lth pooling layer is Rl × Cl, and it is given by the following formula:
R_l × C_l = (R_{l−1} / 2) × (C_{l−1} / 2)
Feature maps are unrolled and stitched together in the fully connected layer. As shown in Figure 2, M is the number of feature maps finally extracted by CNN. It is expressed by the following formula:
M = R_{l−1} × C_{l−1} × D_{l−1}
In the fully connected layer, the neurons are completely connected to all neurons in the previous layer, and the calculation formula is as follows:
O = f(ω_o · f_v + b_o)
where f_v denotes the input vector, b_o denotes the bias, and ω_o denotes the weight matrix.
The last layer of the CNN is the output layer, which contains N neurons, representing the number of pattern types that need to be identified. In this paper, there are six CCPs.
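The dimension bookkeeping in the two formulas above (each pooling layer halving the map size, then the last maps unrolling into M features) can be checked with a small sketch. It assumes 2 × 2 max pooling and size-preserving convolutions; the function name and example sizes below are illustrative:

```python
def flattened_feature_count(r0, c0, kernels_per_layer):
    """Apply R_l x C_l = (R_{l-1}/2) x (C_{l-1}/2) once per pooling
    layer (integer division), then unroll the final D feature maps
    into M = R x C x D fully connected inputs."""
    r, c = r0, c0
    for _ in kernels_per_layer:
        r, c = r // 2, c // 2    # each 2x2 max pooling halves both sides
    return r * c * kernels_per_layer[-1]
```

For example, a 32 × 32 input passed through three convolution–pooling stages with 8, 16 and 32 kernels leaves 4 × 4 maps, so M = 4 × 4 × 32 = 512.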

3. Proposed Method

In the traditional CCPR method, features are usually extracted manually from the raw control chart data, so as to complete CCPR for a single window size, and the adaptability is insufficient. Therefore, a CNN method for CCPR with different window sizes is proposed, which uses CNN feature learning to extract features from the control chart image. This method makes full use of the CNN's feature learning function and of expert features: while the image is input into the CNN, information fusion is used to feed the expert features extracted from the data into the fully connected layer, where they are fused with the features extracted by the CNN to obtain the recognition model. Different from the traditional CCPR method, the CNN input is an image, and the output is a pattern type. Since control charts of different window sizes can be drawn as images of the same size, this method can identify all of them.
The block diagram of the proposed research is shown in Figure 3. The structure of the proposed CCPR method is shown in Figure 4; the proposed CNN is composed of three convolutional layers, three pooling layers and a fully connected layer. The convolutional and pooling layers complete the feature extraction, while the fully connected layer achieves classification. The main steps of the proposed method are as follows:
Step 1:
The characteristic datasets of six CCPs with different window sizes are generated by the Monte Carlo simulation method, including the training set and test set.
Step 2:
The obtained data of various modes are drawn into control chart images corresponding to different window sizes.
Step 3:
The expert features of the generated data are extracted.
Step 4:
The images of the training set are input into the CNN; at the same time, the expert features are input into the fully connected layer and fused with the features extracted by the CNN. The weights and biases of the network are optimized by training, and the recognition model is derived.
Step 5:
The test set is used to verify the performance of the proposed method.
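Step 4's fusion of CNN-extracted features with expert features at the fully connected layer can be sketched as follows (Python for illustration, though the paper implements the network in Matlab; the function name and the placeholder weights/biases are hypothetical, standing in for the trained parameters):

```python
import math

def fuse_and_classify(cnn_features, expert_feats, weights, biases):
    """Feature-level information fusion: the features unrolled from the
    CNN's last pooling layer are concatenated with the expert features,
    and the joint vector feeds a fully connected output layer with six
    neurons (one per CCP)."""
    fv = list(cnn_features) + list(expert_feats)        # fused feature vector
    logits = [sum(w * x for w, x in zip(row, fv)) + b   # fully connected layer
              for row, b in zip(weights, biases)]
    m = max(logits)                                     # softmax (stable form)
    exps = [math.exp(z - m) for z in logits]
    probs = [e / sum(exps) for e in exps]
    return probs.index(max(probs)), probs
```

The concatenation means the output layer sees both learned and expert evidence, which is the mechanism the recognition results in Section 4.5 rely on.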

4. Simulation Experiments

The CNN is programmed in Matlab (MathWorks, Natick, MA, USA) and runs on a personal computer with a 3.10 GHz CPU and 16 GB of memory. A series of simulation experiments verifies the feasibility and effectiveness of this method. The correct recognition rate (CRR) is used as the evaluation criterion, and the confusion matrix intuitively expresses the evaluation results. Finally, comparison with existing methods proves the superiority of the proposed one.

4.1. CCP Parameters

Here, we consider six common CCPs. Each CCP contains datasets of 10 different window sizes, ranging from 25 to 34. Control charts with different window sizes are shown in Figure 5. The dataset, generated using LabVIEW, contains 6 × 3500 × 10 samples; 85% of the samples are used for training, and the rest for testing. Table 1 gives the specific simulation parameters of each pattern. The starting position of each unnatural pattern is between 20% and 35% of the window size. The parameter ranges are set strictly according to the actual situation. In order to simulate the actual production situation more realistically, the mean μ is set to 30 and the standard deviation σ to 0.05. The slope d varies within [0.1σ, 0.3σ], the shift magnitude s within [1.5σ, 3σ], and the cyclic amplitude a within [1.5σ, 4σ]. The period ω varies within the set {4, 5, 6, 7, 8}.

4.2. Data Pre-Processing

After using LabVIEW to obtain the dataset, the data needs to be preprocessed first. This is mainly divided into two steps. The first step is to extract expert features from the data, thus providing the data for MLP and information fusion. The second step is to convert the data into images and provide the network input for CNNs.
The expert features extracted in the first step include four statistical features and one shape feature, namely the mean, standard deviation, skewness, kurtosis and slope. These five expert features were selected experimentally rather than arbitrarily. Taking the mean as an example, the extracted expert feature is shown in Figure 6; the other features are similar and are not described one by one.
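The five expert features named above can be computed, for one window of data, roughly as follows (a hedged sketch: the paper does not give its exact estimator formulas, so standard population moments and a least-squares slope are assumed):

```python
import statistics

def expert_features(y):
    """Return (mean, standard deviation, skewness, kurtosis, slope)
    for one window of control chart data y(0..n-1)."""
    n = len(y)
    mean = statistics.fmean(y)
    std = statistics.pstdev(y)
    # third and fourth standardized population moments
    skew = sum((v - mean) ** 3 for v in y) / (n * std ** 3)
    kurt = sum((v - mean) ** 4 for v in y) / (n * std ** 4)
    # slope of the least-squares line fitted to (t, y(t))
    t_mean = (n - 1) / 2
    slope = (sum((t - t_mean) * (v - mean) for t, v in enumerate(y))
             / sum((t - t_mean) ** 2 for t in range(n)))
    return mean, std, skew, kurt, slope
```

Note that all five features are defined for any window length, which is what allows them to be fused across the different window sizes considered here.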
Compared to the first step, the second step is relatively simple: the control chart data of all the different window sizes are converted into images of the same size. After the data are preprocessed, they can be directly input into the CNN for identification.

4.3. Structural Parameters of CNN Performance

The important CNN structural parameters are mainly the dimension of the convolution kernel (r) and the number of convolution kernels (D). In order to determine the optimal structural parameters of the CNN, its performance under different structural parameters is compared experimentally in this paper. We performed two experiments on the same dataset. The first comparison experiment determines the optimal convolution kernel dimension (r); the best results are shown in bold in Table 2. The second comparison experiment determines the best number of convolution kernels (D); the optimal results are also shown in bold in Table 2. The structural parameters shown in bold have the highest CRRs and shorter training times, so the table represents the CNN structure with the best performance. The final network parameters are shown in Table 3.
In the experiments determining the CNN's structural parameters, the results show that the recognition rate obtained with the same set of structural parameters does not change much over multiple training runs, although each training time differs slightly. In terms of the number of feature maps, the experimental results show that if the number of feature maps D is too small, the recognition rate is greatly reduced; when the number exceeds a certain value, the CRR no longer changes significantly as D increases, but the training time increases significantly. This indicates that the features learned by the current network are sufficient to map the original data. Therefore, in order to obtain a recognition model with better performance, we need to guarantee not only the recognition rate but also the recognition efficiency. In this paper, we choose the most appropriate convolution kernel dimension (r) and number of convolution kernels (D) through experiments.

4.4. Performance Comparison between MLP and CNN

To further evaluate the recognition performance of the CNN-based CCPR method, it is compared with the MLP method. As an excellent machine learning method, MLP is often used in CCPR and has been favored by many scholars [34]. In the verification experiment, the CNN input is the image, and the MLP input is the feature set. The feature set includes statistical features (mean, standard deviation, skewness and kurtosis) and shape features (S, NC1, NC2, APML, APSL), which have been proven to be very effective in CCPR [35]. Among them, NC1, NC2, APML and APSL cannot be used for control charts with different window sizes, so only the remaining five features are used as the MLP feature set. The MLP is a classic three-layer feedforward network, which contains 5 neurons receiving the feature set, 15 hidden neurons and 6 output neurons corresponding to the 6 typical CCPs. The confusion matrix is used to evaluate the MLP and CNN performance. The diagonal elements of the confusion matrix represent the correct recognition percentages [36]. The CRR is equal to the average of the correct recognition rates of the six patterns. The results are shown in Figure 7.
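The CRR defined above (the average of the per-pattern correct recognition rates on the confusion matrix diagonal) can be computed with a short sketch; the function name and the count-based row layout (rows = true patterns) are assumptions:

```python
def correct_recognition_rate(confusion):
    """Average the diagonal correct recognition rates of a confusion
    matrix whose rows are true patterns and entries are sample counts."""
    rates = []
    for i, row in enumerate(confusion):
        total = sum(row)
        rates.append(row[i] / total if total else 0.0)
    return sum(rates) / len(rates)
```

Averaging per-pattern rates, rather than pooling all samples, keeps the CRR insensitive to class imbalance between the six patterns.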
Figure 7 shows the misclassifications of the two recognition algorithms. With MLP and the feature set, the total recognition rate is 94.4%, and the recognition results of the NOR and CYC patterns are especially good, with respective recognition rates of 99.2% and 98.9%. In contrast, the CNN's total recognition rate is 95%, and its recognition rates for the UT, DT, US and DS patterns are higher than those of MLP. The experimental results show that although the CNN's recognition rate is slightly higher than that of MLP, the two methods have their own advantages in identifying different patterns.
The experimental results show that, in the process of CCPR, the CNN obtains the feature set by performing successive convolution and pooling transformations on the CCP image, which not only reduces the workload, but can also be easily applied to other fields. The expert features used in the MLP method are more sensitive to certain CCPs, such as the NOR and CYC patterns. These features are used in the next section.

4.5. Recognition Results of Information Fusion

In order to further improve the recognition rate, the CNN's feature learning capabilities and the advantages of expert features are fully utilized. The training set images are input into the CNN for feature learning. At the same time, the expert features extracted from the data are input into the fully connected layer, and information fusion is used to merge them with the features extracted by the CNN. Then, the network weights and biases are optimized to obtain the recognition model. The recognition result on the test set is represented by the confusion matrix, as shown in Figure 8.
It can easily be seen from Figure 8 that the recognition rate of the CCPR method based on CNN and information fusion is 97.08%. Compared with the CNN-and-image and MLP-and-feature-set recognition methods, the recognition rates of the NOR and CYC patterns are further improved, while the recognition rates of the UT, DT, US and DS patterns are maintained. The information fusion method makes full use of the CNN's feature learning ability and of the advantages of expert features, and the effectiveness of the proposed CCPR method is further verified.
From the above analysis, we can draw the following conclusions: firstly, the CNN does not need manual feature extraction and is not burdened by high-dimensional data processing, which makes model training and recognition faster and improves accuracy; secondly, thanks to the popularity of deep learning and parallel accelerated computing, the method is more efficient in model training and application, and can better adapt to the needs of the future big data environment; finally, the information fusion method makes full use of the CNN's advantages and of expert features, makes the proposed method more conducive to actual quality control, and improves the automation degree and intelligence level of enterprise quality management.

4.6. Comparison of CNN with Other Classification Methods

In order to further evaluate the effectiveness of the method proposed in this paper, it is compared with methods from the relevant literature. As shown in Table 4, CCPR uses many classifiers, including MLP, PNN, fuzzy ARTMAP, SVM and RBF networks, whose input is either the raw data or a feature set. According to the results shown in Table 4, using feature sets as input is generally more effective than using raw data; in particular, when shape and statistical features are combined, the performance is most prominent.
Compared with these methods, the method proposed in this paper not only has superiority in processing high-dimensional data, but can also identify control charts with different window sizes, which the other methods cannot even attempt. Its recognition rate holds up for both simulated data and actual production data, and there is still room for improvement in subsequent research.

4.7. A Real Example

In order to further illustrate the proposed CCPR method based on CNN and information fusion, the diameter of the through-hole on an axle table is considered as a key quality characteristic, which can be measured with a digital caliper. The specification of the hole diameter in the finishing operation is φ750 (+0.185, +0.235) mm.
In the production process, the diameter of the through-hole of the processed part is measured at certain intervals and used as control chart data of different window lengths. After the data are obtained, the same preprocessing as in the simulation experiments is performed: expert features are extracted from the data, and the data are converted into images. After ensuring that the machining process is in a controlled state, the proposed CCPR method is applied to these images using the recognition model obtained from the simulation experiments.
The recognition results show a downward-trend and an upward-shift pattern. The corresponding control charts are shown in Figure 9. Through analysis, it is found that the reason for the downward-trend pattern is likely to be the reduction in the hole diameter due to the wear in the cutting tool, and the reason for the upward-shift pattern is likely to be the overlapping caused by different product batches in actual production. This shows that the CCPR method based on CNN and information fusion proposed in this paper can identify abnormal changes in processing.

5. Conclusions

This paper proposes a CCPR method based on CNN and information fusion. After the Monte Carlo method is used to simulate the characteristics of the various control chart patterns, the corresponding pattern data are generated and drawn into images. The convolutional neural network model directly classifies and recognizes the image patterns, finally yielding the best recognition model, which can be used for abnormality recognition of control charts in actual production and production monitoring. This method takes advantage of the CNN's feature learning capabilities and of expert features. Because the image is used as the CNN input, control charts with different window sizes can be identified. The results show that the recognition accuracy of this method is better than that of the traditional MLP method and of the method using feature learning only. The final recognition accuracy is as high as 97.08%.
With the popularity of the industrial internet of things and the rapid development of computer processing performance, the advantages of big data have become apparent, and making full and effective use of it has become a top priority. The proposed CCPR method provides a new, feasible approach for further research in this field. It avoids the problem of extracting various complex features, is more conducive to actual quality control, and will help improve the automation and intelligence level of enterprise quality management in the future, while also improving economic benefits. How to apply deep learning to online intelligent monitoring and diagnosis of product quality processes will be the next research direction.

Author Contributions

Conceptualization, T.Z. and Z.S.; methodology, T.Z.; software, Z.L.; validation, Z.S.; formal analysis, X.G.; investigation, X.G.; resources, M.W.; data curation, Z.L.; writing—original draft preparation, Z.S.; writing—review and editing, Z.S.; visualization, D.C.; supervision, D.C.; project administration, M.W.; funding acquisition, M.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by China Scholarship Council, grant number 201806545032 and National Natural Science Foundation of China, grant numbers 51975020, 51875008 and 51575014.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hadian, H.; Rahimifard, A. Multivariate statistical control chart and process capability indices for simultaneous monitoring of project duration and cost. Comput. Ind. Eng. 2019, 130, 788–797.
  2. Zan, T.; Wang, M.; Fei, R.Y. Pattern Recognition for Control Charts Using AR Spectrum and Fuzzy ARTMAP Neural Network. Adv. Mater. Res. 2010, 97, 3696–3702.
  3. Shewhart, M. Interpreting statistical process control (SPC) charts using machine learning and expert system techniques. In Proceedings of the Aerospace & Electronics Conference, Dayton, OH, USA, 18–22 May 1992.
  4. Guh, R.S.; Hsieh, Y.C. A Neural Network-Based Model for Abnormal Pattern Recognition of Control Charts; Pergamon Press, Inc.: Oxford, UK, 1999.
  5. Addeh, J.; Ebrahimzadeh, A.; Ranaee, V. Control chart pattern recognition using adaptive back-propagation artificial neural networks and efficient features. In Proceedings of the 2nd International Conference on Control, Instrumentation and Automation, Shiraz, Iran, 27–29 December 2011; pp. 742–746.
  6. Cheng, Z.; Ma, Y. A research about pattern recognition of control chart using probability neural network. In Proceedings of the ISECS International Colloquium on Computing, Communication, Control, & Management, Guangzhou, China, 3–4 August 2008.
  7. Shao, Y.E.; Chiu, C.-C. Applying emerging soft computing approaches to control chart pattern recognition for an SPC–EPC process. Neurocomputing 2016, 201, 19–28.
  8. Cheng, C.-S.; Huang, K.-K.; Chen, P.-W. Recognition of control chart patterns using a neural network-based pattern recognizer with features extracted from correlation analysis. Pattern Anal. Appl. 2015, 18, 75–86.
  9. Haghtalab, S.; Xanthopoulos, P.; Madani, K. A robust unsupervised consensus control chart pattern recognition framework. Expert Syst. Appl. 2015, 42, 6767–6776.
  10. Tang, H.; Xiao, W.; Liu, H.; Sebe, N. Fast and robust dynamic hand gesture recognition via key frames extraction and feature fusion. Neurocomputing 2019, 331, 424–433.
  11. Addeh, A.; Maghsoudi, B.M. Control chart patterns detection using COA based trained MLP neural network and shape features. Comput. Res. Prog. Appl. Sci. Eng. 2016, 2, 5–8.
  12. Zhou, X.; Jiang, P.; Wang, X. Recognition of control chart patterns using fuzzy SVM with a hybrid kernel function. J. Intell. Manuf. 2018, 29, 51–67.
  13. Wang, C.-H.; Kuo, W. Identification of control chart patterns using wavelet filtering and robust fuzzy clustering. J. Intell. Manuf. 2007, 18, 343–350.
  14. Pham, D.T.; Öztemel, E. Control chart pattern recognition using learning vector quantization networks. Int. J. Prod. Res. 1994, 32, 721–729.
  15. Hassan, A.; Baksh, M.S.N.; Shaharoun, A.M.; Jamaluddin, H. Improved SPC chart pattern recognition using statistical features. Int. J. Prod. Res. 2003, 41, 1587–1603.
  16. Ranaee, V.; Ebrahimzadeh, A. Control chart pattern recognition using a novel hybrid intelligent method. Appl. Soft Comput. 2011, 11, 2676–2686.
  17. Ranaee, V.; Ebrahimzadeh, A. Control chart pattern recognition using neural networks and efficient features: A comparative study. Pattern Anal. Appl. 2013, 16, 321–332.
  18. Bag, M.; Gauri, S.K.; Chakraborty, S. An expert system for control chart pattern recognition. Int. J. Adv. Manuf. Technol. 2012, 62, 291–301.
  19. Gauri, S.K.; Chakraborty, S. Recognition of control chart patterns using improved selection of features. Comput. Ind. Eng. 2009, 56, 1577–1588.
  20. Pelegrina, G.D.; Duarte, L.T.; Jutten, C. Blind source separation and feature extraction in concurrent control charts pattern recognition: Novel analyses and a comparison of different methods. Comput. Ind. Eng. 2016, 92, 105–114.
  21. Addeh, J.; Khormali, A.; Golilarz, N.A. Control chart pattern recognition using RBF neural network with new training algorithm and practical features. ISA Trans. 2018, 79, 202–216.
  22. Jin, J.; Shi, J. Automatic feature extraction of waveform signals for in-process diagnostic performance improvement. J. Intell. Manuf. 2001, 12, 257–268.
  23. Hachicha, W.; Ghorbel, A. A survey of control-chart pattern-recognition literature (1991–2010) based on a new conceptual classification scheme. Comput. Ind. Eng. 2012, 63, 204–222. [Google Scholar] [CrossRef]
  24. Zan, T.; Liu, Z.; Su, Z.; Wang, M.; Gao, X.; Chen, D. Statistical Process Control with Intelligence Based on the Deep Learning Model. Appl. Sci. 2020, 10, 308. [Google Scholar] [CrossRef] [Green Version]
  25. Timmermans, A.; Hulzebosch, A. Computer vision system for on-line sorting of pot plants using an artificial neural network classifier. Comput. Electron. Agric. 1996, 15, 41–55. [Google Scholar] [CrossRef]
  26. Liao, Y.; Zeng, X.; Li, W. Wavelet transform based convolutional neural network for gearbox fault classification. In Proceedings of the Prognostics and System Health Management Conference, Harbin, China, 9–12 July 2017; pp. 1–6. [Google Scholar]
  27. Xia, M.; Li, T.; Xu, L.; Liu, L.; Silva, C.W.D. Fault diagnosis for rotating machinery using multiple sensors and convolutional neural networks. IEEE/ASME Trans. Mechatron. 2017, 23, 101–110. [Google Scholar] [CrossRef]
  28. Xie, Y.; Zhang, T. Fault Diagnosis for Rotating Machinery Based on Convolutional Neural Network and Empirical Mode Decomposition. Shock. Vib. 2017, 2017, 1–12. [Google Scholar] [CrossRef]
  29. Zan, T.; Liu, Z.; Wang, H.; Wang, M.; Gao, X. Control chart pattern recognition using the convolutional neural network. J. Intell. Manuf. 2019, 31, 703–716. [Google Scholar] [CrossRef]
  30. Zeng, H.; Wang, Q.; Liu, J. Multi-Feature Fusion Based on Multi-View Feature and 3D Shape Feature for Non-Rigid 3D Model Retrieval. IEEE Access 2019, 7, 41584–41595. [Google Scholar] [CrossRef]
  31. Lu, J.; Ma, C.-X.; Zhou, Y.-R.; Luo, M.-X.; Zhang, K.-B. Multi-Feature Fusion for Enhancing Image Similarity Learning. IEEE Access 2019, 7, 167547–167556. [Google Scholar] [CrossRef]
  32. Liu, A.; Yang, Y.; Sun, Q.; Xu, Q. A Deep Fully Convolution Neural Network for Semantic Segmentation Based on Adaptive Feature Fusion. In Proceedings of the 2018 5th International Conference on Information Science and Control Engineering (ICISCE), Zhengzhou, China, 20–22 July 2018; pp. 16–20. [Google Scholar]
  33. Montgomery, D.C. Introduction to Statistical Quality Control; Wiley: Hoboken, NJ, USA, 2009. [Google Scholar]
  34. Al-Assaf, Y. Recognition of control chart patterns using multiresolution wavelets analysis and neural networks. Comput. Ind. Eng. 2004, 47, 17–29. [Google Scholar] [CrossRef]
  35. Guh, R.S.; Zorriassatine, F.; Tannock, J.D.; O’Brien, C. On-line con trol chart pattern detection and discrimination—A neural network approach. Artif. Intell. Eng. 1999, 13, 413–425. [Google Scholar] [CrossRef]
  36. Sokolova, M.; Lapalme, G. A systematic analysis of performance measures for classification tasks. Inf. Process. Manag. 2009, 45, 427–437. [Google Scholar] [CrossRef]
Figure 1. Six patterns of control charts.
Figure 2. The structure of convolutional neural network (CNN).
Figure 3. The proposed research block diagram.
Figure 4. Structure of pattern recognition method for control chart.
Figure 5. Control charts with different window sizes.
Figure 6. Extracted mean features.
Figure 7. The confusion matrix of (a) the MLP and feature set, and (b) the CNN and image.
Figure 8. The CCPR confusion matrix for CNN and information fusion.
Figure 9. The control chart of (a) the downward-trend pattern and (b) the upward-shift pattern.
Table 1. The specific simulation parameters of each pattern.

| Pattern | Mathematical Expression | Parameter Value |
|---------|-------------------------|-----------------|
| NOR | y(t) = μ + x(t) | μ = 30, σ = 0.05 |
| UT | y(t) = μ + x(t) + v × d × t | d ∈ [0.1σ, 0.3σ] |
| DT | y(t) = μ + x(t) + v × d × t | d ∈ [−0.1σ, −0.3σ] |
| US | y(t) = μ + x(t) + v × s | s ∈ [1.5σ, 3σ] |
| DS | y(t) = μ + x(t) + v × s | s ∈ [−1.5σ, −3σ] |
| CYC | y(t) = μ + x(t) + v × a × sin(2πt⁄ω) | a ∈ [1.5σ, 4σ], ω ∈ {4, 5, 6, 7, 8} |
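The equations in Table 1 translate directly into a Monte Carlo generator. The sketch below is a minimal illustration, not the authors' code: it assumes the common-cause term x(t) is white noise N(0, σ²), that the switch v = 1 (abnormal component active over the whole window, a simplification), and that disturbance magnitudes are drawn uniformly from the listed ranges.

```python
import numpy as np

def simulate_pattern(pattern, n, mu=30.0, sigma=0.05, rng=None):
    """Sketch of the six control-chart patterns of Table 1.

    Assumptions (not fixed by the table): x(t) ~ N(0, sigma^2),
    v = 1 for the whole window, and d, s, a sampled uniformly
    from the listed intervals.
    """
    rng = rng if rng is not None else np.random.default_rng()
    t = np.arange(1, n + 1)
    y = mu + rng.normal(0.0, sigma, n)            # NOR: y(t) = mu + x(t)
    if pattern == "UT":                            # upward trend
        y += rng.uniform(0.1, 0.3) * sigma * t
    elif pattern == "DT":                          # downward trend
        y -= rng.uniform(0.1, 0.3) * sigma * t
    elif pattern == "US":                          # upward shift
        y += rng.uniform(1.5, 3.0) * sigma
    elif pattern == "DS":                          # downward shift
        y -= rng.uniform(1.5, 3.0) * sigma
    elif pattern == "CYC":                         # cyclic pattern
        a = rng.uniform(1.5, 4.0) * sigma
        omega = rng.choice([4, 5, 6, 7, 8])
        y += a * np.sin(2.0 * np.pi * t / omega)
    return y
```

Each call with a window size n ∈ {10, …, 60} yields one series that can then be rendered as a control-chart image for the CNN.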
Table 2. Performance of CNNs with different numbers and dimensions of convolution kernels.

| | Number | D2, D3 | r2 × c2 | D4, D5 | r4 × c4 | D6, D7 | r6 × c6 | CRR (%) | Time (s/Epoch) |
|--|--------|--------|---------|--------|---------|--------|---------|---------|----------------|
| Experiment 1 | 1 | 6 | 3 × 3 | 12 | 3 × 3 | 20 | 3 × 3 | 90.81 | 45.1502 |
| | 2 | 6 | 3 × 3 | 12 | 3 × 3 | 20 | 5 × 5 | 93.38 | 41.2235 |
| | 3 | 6 | 3 × 3 | 12 | 5 × 5 | 20 | 3 × 3 | 92.77 | 41.5635 |
| | 4 | 6 | 5 × 5 | 12 | 3 × 3 | 20 | 3 × 3 | 85.03 | 40.4361 |
| | 5 | 6 | 3 × 3 | 12 | 5 × 5 | 20 | 5 × 5 | 74.14 | 41.5799 |
| | 6 | 6 | 5 × 5 | 12 | 3 × 3 | 20 | 5 × 5 | 95 | 41.1486 |
| | 7 | 6 | 5 × 5 | 12 | 5 × 5 | 20 | 3 × 3 | 80.78 | 41.0586 |
| | 8 | 6 | 5 × 5 | 12 | 5 × 5 | 20 | 5 × 5 | 92.91 | 41.0406 |
| Experiment 2 | 9 | 2 | 5 × 5 | 4 | 3 × 3 | 6 | 5 × 5 | 16.67 | 32.2071 |
| | 10 | 4 | 5 × 5 | 8 | 3 × 3 | 12 | 5 × 5 | 92.54 | 36.3599 |
| | 12 | 6 | 5 × 5 | 12 | 3 × 3 | 20 | 5 × 5 | 95 | 41.1486 |
| | 13 | 12 | 5 × 5 | 24 | 3 × 3 | 24 | 5 × 5 | 93.58 | 50.2526 |
| | 14 | 24 | 5 × 5 | 24 | 3 × 3 | 24 | 5 × 5 | 93.78 | 63.7782 |
Table 3. The architecture of CNN.

| Layer | Layer Type | Output Shape | Kernel Size | Number of Kernels |
|-------|------------|--------------|-------------|-------------------|
| 0 | Input layer | 60 × 60 | - | - |
| 1 | Convolutional | 60 × 60 | 5 × 5 | 6 |
| | Pooling | 15 × 15 | 4 × 4 | - |
| 2 | Convolutional | 15 × 15 | 3 × 3 | 12 |
| | Pooling | 7 × 7 | 2 × 2 | - |
| 3 | Convolutional | 7 × 7 | 5 × 5 | 20 |
| | Pooling | 3 × 3 | 2 × 2 | - |
| 4 | Fully connected | M = 180 | - | - |
| | Output layer | N = 6 | - | - |
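The output shapes in Table 3 are internally consistent if one assumes "same"-padded convolutions (spatial size preserved) and non-overlapping pooling with floor division; those conventions are an inference from the table, not stated in it. A quick sanity check of the arithmetic:

```python
def conv_same(size):
    """'Same'-padded convolution preserves spatial size."""
    return size

def pool(size, k):
    """Non-overlapping k x k pooling with floor division."""
    return size // k

s = 60                        # input layer: 60 x 60 chart image
s = pool(conv_same(s), 4)     # conv 5x5 (6 kernels) -> 60, pool 4x4 -> 15
s = pool(conv_same(s), 2)     # conv 3x3 (12 kernels) -> 15, pool 2x2 -> 7
s = pool(conv_same(s), 2)     # conv 5x5 (20 kernels) -> 7, pool 2x2 -> 3
flat = s * s * 20             # 3 x 3 x 20 = 180
assert flat == 180            # matches the fully connected size M = 180
```

The flattened 180-dimensional vector feeds the fully connected layer, whose six outputs correspond to the six chart patterns.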
Table 4. Performance comparison of different classification methods.

| Reference | Input Representation | Classifier | CRR (%) |
|-----------|----------------------|------------|---------|
| (Guh and Tannock 1999) | Raw data | MLP | 94.38 |
| (Hassan et al., 2003) | Feature set | MLP | 96.80 |
| (Cheng and Ma 2008) | Raw data | PNN | 95.58 |
| (Zan et al., 2010) | Autoregressive (AR) spectrum | Fuzzy ARTMAP | 95 |
| (Ranaee and Ebrahimzadeh 2013) | Shape and feature set | MLP | 99.15 |
| (Zhou and Wang 2018) | Shape and feature set | FSVM | 99.28 |
| (Addeh et al., 2018) | Shape and feature set | Bees-RBF | 99.63 |
| This work | Images | CNN | 95 |
| This work | Images and information fusion | CNN | 97.08 |
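The CRR figures in Table 4 are overall correct recognition rates, i.e. the proportion of test samples falling on the diagonal of a confusion matrix such as those in Figures 7 and 8 (see [36] for standard metric definitions). A minimal sketch, using an illustrative made-up 3-pattern matrix rather than the paper's data:

```python
import numpy as np

def crr(confusion):
    """Correct recognition rate (%): diagonal count / total count."""
    confusion = np.asarray(confusion, dtype=float)
    return 100.0 * np.trace(confusion) / confusion.sum()

# Hypothetical confusion matrix: rows = true pattern, columns = predicted.
cm = [[98, 1, 1],
      [2, 97, 1],
      [0, 3, 97]]
print(round(crr(cm), 2))  # 97.33
```

Per-class recall (each diagonal entry divided by its row sum) can be read off the same matrix when a per-pattern breakdown is needed.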

Share and Cite

Zan, T.; Su, Z.; Liu, Z.; Chen, D.; Wang, M.; Gao, X. Pattern Recognition of Different Window Size Control Charts Based on Convolutional Neural Network and Information Fusion. Symmetry 2020, 12, 1472. https://doi.org/10.3390/sym12091472