# Evolutionary Optimization of Spiking Neural P Systems for Remaining Useful Life Prediction


## Abstract


## 1. Introduction

- We use a modified version of the NEAT algorithm to successfully evolve SN P systems for RUL prediction.
- We obtain better performance than an MLP on the CMAPSS dataset, while also reducing the number of parameters.
- We obtain better-than-random performance on the newer N-CMAPSS dataset while significantly reducing the number of parameters w.r.t. the state of the art.

## 2. Background

#### 2.1. Spiking Neural P Systems

- $O=\{a\}$ is the singleton alphabet, where $a$ represents a spike.
- ${\sigma}_{1},\dots ,{\sigma}_{m}$ denote neurons. Each neuron has the form ${\sigma}_{i}=({n}_{i},{R}_{i})$, $1\le i\le m$, where ${n}_{i}$ is the number of spikes initially present in ${\sigma}_{i}$ and ${R}_{i}=\{{r}_{1},\dots ,{r}_{N}\}$ is a finite set of rules in the neuron. Each rule ${r}_{i}$ takes one of two forms: a firing rule $$E/{a}^{c}\to {a}^{p};d,\quad p\le c,$$ or a forgetting rule $$E/{a}^{f}\to \lambda .$$
- $syn$ is the set of synapses, a subset of $\{1,\dots ,m\}\times \{1,\dots ,m\}$. Each synapse has the form $(i,j,{z}_{i,j})$, where $1\le i,j\le m$, $i\ne j$ are the indexes of the two connected neurons and ${z}_{i,j}$ is the integer weight on that connection. Equivalently, $syn$ can be stored as an $m\times m$ matrix whose $(i,j)$ entry holds the weight ${z}_{i,j}$.
- ${I}_{in}$ is the set of input neurons, i.e., a subset of $\{{\sigma}_{1},\dots ,{\sigma}_{m}\}$ disjoint from ${I}_{out}$.
- ${I}_{out}$ is the set of output neurons, i.e., a subset of $\{{\sigma}_{1},\dots ,{\sigma}_{m}\}$ disjoint from ${I}_{in}$.
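To make these definitions concrete, the following is a minimal Python sketch of one synchronous step of an SN P system. It is our simplification, not the paper's implementation: each neuron carries one firing rule $(c,p,d)$ and one forgetting rule $f$, the regular expression $E$ is reduced to a threshold check, and the delay $d$ is ignored.

```python
def snps_step(spikes, rules, syn):
    """One synchronous step of a simplified SN P system.

    spikes: current spike count of each neuron.
    rules:  one (c, p, d, f) tuple per neuron -- hypothetical encoding with a
            single firing rule (consume c spikes, emit p; delay d ignored here)
            and a single forgetting rule (erase f spikes).
    syn:    weighted synapses as (i, j, z) triples.
    """
    m = len(spikes)
    incoming = [0] * m            # spikes arriving at each neuron this step
    remaining = list(spikes)
    for i, (c, p, d, f) in enumerate(rules):
        if spikes[i] >= c:        # firing rule applicable: consume c, emit p
            remaining[i] -= c
            for (src, dst, z) in syn:
                if src == i:
                    incoming[dst] += p * z
        elif spikes[i] == f:      # forgetting rule a^f -> lambda
            remaining[i] -= f
    return [remaining[i] + incoming[i] for i in range(m)]
```

For instance, with two neurons connected by a synapse of weight 2, a neuron holding 2 spikes and rule $(c{=}2, p{=}1)$ fires and delivers 2 weighted spikes to its successor.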

#### Applications of SN P Systems

#### 2.2. NEAT

#### 2.3. RUL Prediction

## 3. Proposed Method

- c: number of spikes needed by the firing rule;
- p: number of spikes produced by the firing rule;
- d: refractory period of the firing rule;
- f: number of spikes required by the forgetting rule.

**Constraint 1—Firing**: This constraint states that the number of spikes produced in output must be less than or equal to the number of spikes that triggered the rule, i.e., $p\le c$. If this condition is violated, we set $p=c$.

**Constraint 2—Forgetting**: Since we use only two rules, i.e., a firing and a forgetting rule, we must ensure that the forgetting rule requires fewer spikes than the firing rule; otherwise, it may never be applied. For this reason, we add the constraint $f<c$. If this constraint is violated, we set $f=c-1$.
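The two repairs above can be sketched as a small function applied to each evolved rule; the function name and tuple encoding are ours, chosen for illustration.

```python
def repair_rule(c, p, d, f):
    """Repair an evolved rule so it satisfies the two constraints above."""
    if p > c:        # Constraint 1: cannot emit more spikes than were consumed
        p = c
    if f >= c:       # Constraint 2: forgetting must need fewer spikes than firing
        f = c - 1
    return c, p, d, f
```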

#### 3.1. Input Features

- n: number of real values in the input time series (i.e., the size of the input vector);
- w: size of the integer vector resulting from the SFA (i.e., number of selected Fourier coefficients);
- s: number of discretization levels for each Fourier coefficient (i.e., number of symbols).
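A rough sketch of the SFA pipeline that these parameters describe is given below. It is a simplification under stated assumptions: we bin coefficients uniformly over their observed range, whereas the original SFA of Schäfer and Högqvist uses data-driven binning (Multiple Coefficient Binning); the function name is ours.

```python
import cmath

def sfa_transform(series, w, s):
    """Map a real-valued time series of length n to w integer symbols in [0, s).

    Keeps the first w//2 complex DFT coefficients (w real values: real and
    imaginary parts), then quantizes each value into one of s symbols.
    """
    n = len(series)
    coeffs = []
    for k in range(w // 2):
        c = sum(series[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        coeffs.extend([c.real, c.imag])
    # Uniform binning over the observed range (assumption; SFA proper
    # learns bin edges per coefficient from training data).
    lo, hi = min(coeffs), max(coeffs)
    width = (hi - lo) / s or 1.0
    return [min(int((v - lo) / width), s - 1) for v in coeffs]
```

With $n=8$, $w=4$, and $s=4$, an input vector of 8 reals is compressed to 4 symbols, each drawn from an alphabet of 4.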

#### 3.2. Fitness Evaluation

## 4. Experimental Setup

#### 4.1. Datasets

#### 4.1.1. CMAPSS

#### 4.1.2. N-CMAPSS

#### 4.2. Compared Algorithms

#### 4.3. Computational Setup and Data Preparation

#### 4.4. NEAT Configurations

## 5. Numerical Results

## 6. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## References

- Song, T.; Pan, L.; Wu, T.; Zheng, P.; Wong, M.L.D.; Rodriguez-Paton, A. Spiking Neural P Systems With Learning Functions. IEEE Trans. NanoBiosci. **2019**, 18, 176–190.
- Paun, G. Computing with Membranes. J. Comput. Syst. Sci. **2000**, 61, 108–143.
- Ionescu, M.; Păun, G.; Yokomori, T. Spiking neural P systems. Fundam. Inform. **2006**, 71, 279–308.
- Păun, G.; Pérez-Jiménez, M.J.; Rozenberg, G. Spike trains in spiking neural P systems. Int. J. Found. Comput. Sci. **2006**, 17, 975–1002.
- Martín-Vide, C.; Pazos, J.; Păun, G.; Rodríguez-Patón, A. A New Class of Symbolic Abstract Neural Nets: Tissue P Systems. In International Computing and Combinatorics Conference (COCOON); Ibarra, O.H., Zhang, L., Eds.; Springer: Berlin/Heidelberg, Germany, 2002; pp. 290–299.
- Pan, L.; Zeng, X. A note on small universal spiking neural P systems. In Proceedings of the International Workshop on Membrane Computing (WMC), Curtea de Arges, Romania, 24–27 August 2009; Springer: Berlin/Heidelberg, Germany, 2009; pp. 436–447.
- Wang, J.; Hoogeboom, H.J.; Pan, L.; Păun, G.; Pérez-Jiménez, M.J. Spiking neural P systems with weights. Neural Comput. **2010**, 22, 2615–2646.
- Wang, X.; Song, T.; Gong, F.; Zheng, P. On the computational power of spiking neural P systems with self-organization. Sci. Rep. **2016**, 6, 27624.
- Dong, J.; Stachowicz, M.; Zhang, G.; Cavaliere, M.; Rong, H.; Paul, P. Automatic Design of Spiking Neural P Systems Based on Genetic Algorithms. Int. J. Unconv. Comput. **2021**, 16, 201–216.
- Casauay, L.J.P.; Cabarle, F.G.C.; Macababayao, I.C.H.; Adorna, H.N.; Zeng, X.; Martínez-Del-Amor, M.Á.; Cruz, R.T.D.L. A Framework for Evolving Spiking Neural P Systems. Int. J. Unconv. Comput. **2021**, 16, 271–298.
- Custode, L.L.; Mo, H.; Iacca, G. Neuroevolution of Spiking Neural P Systems. In Applications of Evolutionary Computation; 2022; to appear.
- Stanley, K.O.; Miikkulainen, R. Evolving neural networks through augmenting topologies. Evol. Comput. **2002**, 10, 99–127.
- Sateesh Babu, G.; Zhao, P.; Li, X.L. Deep Convolutional Neural Network Based Regression Approach for Estimation of Remaining Useful Life. In Database Systems for Advanced Applications; Lecture Notes in Computer Science; Navathe, S.B., Wu, W., Shekhar, S., Du, X., Wang, X.S., Xiong, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2016; Volume 9642, pp. 214–228.
- Zheng, S.; Ristovski, K.; Farahat, A.; Gupta, C. Long Short-Term Memory Network for Remaining Useful Life estimation. In Proceedings of the 2017 IEEE International Conference on Prognostics and Health Management (ICPHM), Dallas, TX, USA, 19–21 June 2017; pp. 88–95.
- Chen, Z.; Wu, M.; Zhao, R.; Guretno, F.; Yan, R.; Li, X. Machine Remaining Useful Life Prediction via an Attention-Based Deep Learning Approach. IEEE Trans. Ind. Electron. **2021**, 68, 2521–2531.
- Mo, H.; Custode, L.L.; Iacca, G. Evolutionary neural architecture search for remaining useful life prediction. Appl. Soft Comput. **2021**, 108, 107474.
- Mo, H.; Lucca, F.; Malacarne, J.; Iacca, G. Multi-Head CNN-LSTM with Prediction Error Analysis for Remaining Useful Life Prediction. In Proceedings of the 2020 27th Conference of Open Innovations Association (FRUCT), Trento, Italy, 7–9 September 2020; pp. 164–171.
- Ye, Z.; Yu, J. Health condition monitoring of machines based on long short-term memory convolutional autoencoder. Appl. Soft Comput. **2021**, 107, 107379.
- Mo, H.; Iacca, G. Multi-Objective Optimization of Extreme Learning Machine for Remaining Useful Life Prediction. In Applications of Evolutionary Computation; 2022; to appear.
- Ma, T.; Hao, S.; Wang, X.; Rodriguez-Paton, A.; Wang, S.; Song, T. Double Layers Self-Organized Spiking Neural P Systems With Anti-Spikes for Fingerprint Recognition. IEEE Access **2019**, 7, 177562–177570.
- Peng, H.; Wang, J.; Pérez-Jiménez, M.J.; Wang, H.; Shao, J.; Wang, T. Fuzzy reasoning spiking neural P system for fault diagnosis. Inf. Sci. **2013**, 235, 106–116.
- Wang, T.; Zhang, G.; Zhao, J.; He, Z.; Wang, J.; Perez-Jimenez, M.J. Fault Diagnosis of Electric Power Systems Based on Fuzzy Reasoning Spiking Neural P Systems. IEEE Trans. Power Syst. **2015**, 30, 1182–1194.
- Chen, H.; Ishdorj, T.-O.; Paun, G.; Pérez-Jiménez, M.d.J. Spiking neural P systems with extended rules. In Proceedings of the 4th Brainstorming Week on Membrane Computing (BWMC), Sevilla, Spain, 30 January–3 February 2006; Fénix Editora; Volume I, pp. 241–265.
- Ishdorj, T.-O.; Leporati, A. Uniform solutions to SAT and 3-SAT by spiking neural P systems with pre-computed resources. Nat. Comput. **2008**, 7, 519–534.
- Leporati, A.; Gutiérrez-Naranjo, M.A. Solving Subset Sum by spiking neural P systems with pre-computed resources. Fundam. Inform. **2008**, 87, 61–77.
- Zhang, G.; Rong, H.; Neri, F.; Pérez-Jiménez, M.J. An optimization spiking neural P system for approximately solving combinatorial optimization problems. Int. J. Neural Syst. **2014**, 24, 1440006.
- Qi, F.; Liu, M. Optimization spiking neural P system for solving TSP. In Proceedings of the International Conference on Machine Learning and Intelligent Communications (MLICOM), Shenzhen, China, 26–27 September 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 668–676.
- Ionescu, M.; Sburlan, D. Some applications of spiking neural P systems. Comput. Inform. **2008**, 27, 515–528.
- Hamabe, R.; Fujiwara, A. Asynchronous SN P systems for logical and arithmetic operations. In Proceedings of the International Conference on Foundations of Computer Science (FCS), Las Vegas, NV, USA, 16–19 July 2012; The Steering Committee of the World Congress in Computer Science; p. 1.
- Song, T.; Zheng, P.; Wong, M.D.; Wang, X. Design of logic gates using spiking neural P systems with homogeneous neurons and astrocytes-like control. Inf. Sci. **2016**, 372, 380–391.
- Peng, X.W.; Fan, X.P.; Liu, J.X. Performing balanced ternary logic and arithmetic operations with spiking neural P systems with anti-spikes. Adv. Mater. Res. **2012**, 505, 378–385.
- Tu, M.; Wang, J.; Peng, H.; Shi, P. Application of Adaptive Fuzzy Spiking Neural P Systems in Fault Diagnosis of Power Systems. Chin. J. Electron. **2014**, 23, 87–92.
- Díaz-Pernil, D.; Peña-Cantillana, F.; Gutiérrez-Naranjo, M.A. A parallel algorithm for skeletonizing images by using spiking neural P systems. Neurocomputing **2013**, 115, 81–91.
- Song, T.; Pang, S.; Hao, S.; Rodríguez-Patón, A.; Zheng, P. A parallel image skeletonizing method using spiking neural P systems with weights. Neural Process. Lett. **2019**, 50, 1485–1502.
- Schäfer, P.; Högqvist, M. SFA: A symbolic Fourier approximation and index for similarity search in high dimensional datasets. In Proceedings of the 15th International Conference on Extending Database Technology (EDBT'12), Berlin, Germany, 27–30 March 2012; ACM Press: New York, NY, USA, 2012; p. 516.
- Saxena, A.; Goebel, K.; Simon, D.; Eklund, N. Damage propagation modeling for aircraft engine run-to-failure simulation. In Proceedings of the 2008 International Conference on Prognostics and Health Management, Denver, CO, USA, 6–9 October 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 1–9.
- Arias Chao, M.; Kulkarni, C.; Goebel, K.; Fink, O. Aircraft Engine Run-to-Failure Dataset under Real Flight Conditions for Prognostics and Diagnostics. Data **2021**, 6, 5.
- Louen, C.; Ding, S.X.; Kandler, C. A new framework for remaining useful life estimation using Support Vector Machine classifier. In Proceedings of the 2013 Conference on Control and Fault-Tolerant Systems (SysTol), Nice, France, 9–11 October 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 228–233.
- Arias Chao, M.; Kulkarni, C.; Goebel, K.; Fink, O. Fusing physics-based and deep learning models for prognostics. Reliab. Eng. Syst. Saf. **2022**, 217, 107961.
- McIntyre, A.; Kallada, M.; Miguel, C.G.; da Silva, C.F. Neat-Python. Available online: https://github.com/CodeReclaimers/neat-python (accessed on 10 November 2021).

**Figure 2.** Illustration of the Symbolic Fourier Approximation process. ${X}_{i}$ denotes a selected Fourier coefficient, and ${o}_{i}$ represents each output symbol from the SFA.

**Figure 3.** Fitness across generations (mean ± std. dev. across 10 independent runs) for SNPS (5) on CMAPSS (FD001).

**Figure 4.** Fitness across generations (mean ± std. dev. across 10 independent runs) for SNPS (4) on N-CMAPSS (DS02).

**Figure 6.** Trade-off between test RMSE and number of trainable parameters for the methods considered in the experimentation on CMAPSS (FD001).

**Figure 7.** Trade-off between test RMSE and number of trainable parameters for the methods considered in the experimentation on N-CMAPSS (DS02).

| | FD001 | FD002 | FD003 | FD004 |
|---|---|---|---|---|
| Number of engines in training set | 100 | 260 | 100 | 249 |
| Number of engines in test set | 100 | 259 | 100 | 248 |
| Max/min cycles in training set | 362/128 | 378/128 | 525/145 | 543/128 |
| Max/min cycles in test set | 303/31 | 367/21 | 475/38 | 486/19 |
| Operating conditions | 1 | 6 | 1 | 6 |
| Fault modes | 1 | 1 | 2 | 2 |

**Table 2.** Overview of each unit in the DS02 sub-dataset of N-CMAPSS w.r.t. the number of samples ${m}_{i}$ (in millions), the end-of-life time ${t}_{EOL}$, and the failure modes.

| Training Set (${D}_{train}$) | | | | Test Set (${D}_{test}$) | | | |
|---|---|---|---|---|---|---|---|
| Unit | ${m}_{i}$ (M) | ${t}_{EOL}$ | Failure Mode | Unit | ${m}_{i}$ (M) | ${t}_{EOL}$ | Failure Mode |
| ${u}_{2}$ | 0.85 | 75 | HPT | ${u}_{11}$ | 0.66 | 59 | HPT + LPT |
| ${u}_{5}$ | 1.03 | 89 | HPT | ${u}_{14}$ | 0.16 | 76 | HPT + LPT |
| ${u}_{10}$ | 0.95 | 82 | HPT | ${u}_{15}$ | 0.43 | 67 | HPT + LPT |
| | | | | ${u}_{16}$ | 0.77 | 63 | HPT + LPT |
| | | | | ${u}_{18}$ | 0.89 | 71 | HPT + LPT |
| | | | | ${u}_{20}$ | 0.77 | 66 | HPT + LPT |

| Method | Description |
|---|---|
| MLP [13] | 1 hidden layer |
| CNN [13] | 2 convolutional layers, 2 pooling layers, and 1 fully connected layer |
| LSTM [14] | 2 LSTM layers and 2 fully connected layers |
| SVM [38] | SVM classifier and Weibull distribution |

| Parameter | Value | Parameter | Value |
|---|---|---|---|
| Population size | 30 | Generations | 300 |
| Initialization weight | $\sim\mathcal{N}(0,1)$ | Weight range | [−1, 1] |
| Mutation power | $\sim\mathcal{N}(0,3)$ | Mutation rate | 0.2 |
| Replacement rate | 0.1 | Add connection rate | 0.5 |
| Remove connection rate | 0.5 | Add node rate | 0.5 |
| Remove node rate | 0.5 | Toggle "enable" rate | 0.1 |
| Max stagnation period | 20 | Hidden neurons | 10 |
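The parameters in the table above map naturally onto a neat-python configuration file (the library cited in the references). The fragment below is a sketch of that mapping, not the authors' actual config: key names follow neat-python's `[NEAT]`, `[DefaultGenome]`, and `[DefaultStagnation]` sections, and any value not listed in the table (e.g., `fitness_criterion`) is an assumption.

```ini
[NEAT]
pop_size            = 30
fitness_criterion   = min      ; assumption: fitness is an error to minimize
; the 300 generations are passed to Population.run(), not set in the config

[DefaultGenome]
num_hidden          = 10
weight_init_mean    = 0.0
weight_init_stdev   = 1.0      ; initialization weight ~ N(0, 1)
weight_min_value    = -1
weight_max_value    = 1
weight_mutate_power = 3.0      ; mutation power ~ N(0, 3)
weight_mutate_rate  = 0.2
weight_replace_rate = 0.1
conn_add_prob       = 0.5
conn_delete_prob    = 0.5
node_add_prob       = 0.5
node_delete_prob    = 0.5
enabled_mutate_rate = 0.1      ; toggle "enable" rate

[DefaultStagnation]
max_stagnation      = 20
```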

**Table 6.** Mean, std. dev., and max. value for each evolved parameter in the tested NEAT configurations.

| | c (Mean) | c (Std. Dev.) | c (Max) | p (Mean) | p (Std. Dev.) | p (Max) | d (Mean) | d (Std. Dev.) | d (Max) | f (Mean) | f (Std. Dev.) | f (Max) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| SNPS (1) | 100 | 100 | 500 | 100 | 100 | 500 | 10 | 10 | 100 | 1 | 2 | 200 |
| SNPS (2) | 100 | 1 | 200 | 100 | 100 | 200 | 1 | 1 | 10 | 1 | 2 | 200 |
| SNPS (3) | 100 | 1 | 200 | 100 | 1 | 200 | 1 | 1 | 10 | 1 | 2 | 200 |
| SNPS (4) | 5 | 2 | 500 | 5 | 2 | 500 | 1 | 1 | 10 | 1 | 2 | 200 |
| SNPS (5) | 5 | 2 | 500 | 5 | 2 | 500 | 1 | 1 | 10 | 1 | 1 | 5 |
| SNPS (6) | 5 | 2 | 10 | 5 | 2 | 10 | 1 | 1 | 10 | 5 | 2 | 10 |

**Table 7.** Summary of the comparative analysis based on the test results on CMAPSS (FD001). The symbol "-" indicates data not available. For our SNPS methods, we report the RMSE in terms of mean ± std. dev. across 10 independent runs. For the remaining methods, we report the results from the original papers (std. dev. not provided). Boldface indicates the best value per column.

| Methods | Test RMSE | s-Score ($\times {10}^{3}$) | Trainable Parameters | Test Execution Time (ms) |
|---|---|---|---|---|
| MLP [13] | 37.36 ± 0.00 | 6.45 | 801 | 93 |
| CNN [13] | 18.45 ± 0.00 | 1.29 | 6815 | 151 |
| LSTM [14] | **16.14 ± 0.00** | **0.34** | 14,681 | 968 |
| SVM [38] | 29.82 ± 0.00 | - | - | - |
| SNPS (1) | 29.27 ± 3.76 | 3.28 | 263 | 25 |
| SNPS (2) | 31.55 ± 3.61 | 8.83 | 365 | 27 |
| SNPS (3) | 29.28 ± 2.99 | 4.36 | **60** | **7** |
| SNPS (4) | 20.90 ± 1.52 | 0.81 | 334 | 28 |
| SNPS (5) | 20.32 ± 1.54 | 0.54 | 232 | 25 |
| SNPS (6) | 20.79 ± 1.58 | 1.57 | 285 | 26 |
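For reference, the two error metrics reported in these tables can be computed as follows. We assume the s-score is the standard asymmetric PHM08 scoring function commonly used with CMAPSS, which penalizes late predictions more heavily than early ones; the formula is not restated in this excerpt, so treat the sketch as illustrative.

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between true and predicted RUL values."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
                     / len(y_true))

def s_score(y_true, y_pred):
    """Asymmetric PHM08 score (assumed definition): for error d = pred - true,
    late predictions (d > 0) contribute exp(d/10) - 1, early ones exp(-d/13) - 1."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        d = p - t
        total += math.exp(d / 10) - 1 if d > 0 else math.exp(-d / 13) - 1
    return total
```

Under this definition, overestimating RUL by 10 cycles costs more than underestimating it by 10, reflecting that late maintenance is riskier than early maintenance.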

**Table 8.** Summary of the comparative analysis based on the test results on N-CMAPSS (DS02). The symbol "-" indicates data not available. For our SNPS methods, we report the RMSE in terms of mean ± std. dev. across 10 independent runs. For the remaining methods, we report the results from the original papers (std. dev. not provided). Boldface indicates the best value per column.

| Methods | Test RMSE | s-Score ($\times {10}^{3}$) | Trainable Parameters | Test Execution Time (ms) |
|---|---|---|---|---|
| MLP [39] | 8.34 ± 0.00 | 20.41 | 94,701 | 1223 |
| CNN [39] | **7.22 ± 0.00** | **1.14** | 5722 | 1178 |
| SNPS (1) | 19.72 ± 0.44 | 8.39 | 218 | 19 |
| SNPS (2) | 20.11 ± 0.62 | 6.53 | 180 | 17 |
| SNPS (3) | 19.23 ± 0.57 | 7.92 | 83 | 12 |
| SNPS (4) | 18.14 ± 0.52 | 6.39 | **56** | **11** |
| SNPS (5) | 18.25 ± 0.59 | 7.69 | 105 | 15 |
| SNPS (6) | 18.45 ± 0.57 | 6.48 | 98 | 14 |
| ${R}_{rnd}$ | 26.85 ± 0.00 | 276.76 | - | - |
| ${R}_{\mu}$ | 18.97 ± 0.00 | 63.98 | - | - |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Custode, L.L.; Mo, H.; Ferigo, A.; Iacca, G.
Evolutionary Optimization of Spiking Neural P Systems for Remaining Useful Life Prediction. *Algorithms* **2022**, *15*, 98.
https://doi.org/10.3390/a15030098

**AMA Style**

Custode LL, Mo H, Ferigo A, Iacca G.
Evolutionary Optimization of Spiking Neural P Systems for Remaining Useful Life Prediction. *Algorithms*. 2022; 15(3):98.
https://doi.org/10.3390/a15030098

**Chicago/Turabian Style**

Custode, Leonardo Lucio, Hyunho Mo, Andrea Ferigo, and Giovanni Iacca.
2022. "Evolutionary Optimization of Spiking Neural P Systems for Remaining Useful Life Prediction" *Algorithms* 15, no. 3: 98.
https://doi.org/10.3390/a15030098