# Distributed Deep Fusion Predictor for a Multi-Sensor System Based on Causality Entropy


## Abstract


## 1. Introduction

## 2. Related Work

### 2.1. The Methods for Prediction

### 2.2. The Method to Calculate Causality and Correlation

### 2.3. The Bayesian Deep Learning Network

### 2.4. Innovation

## 3. Distributed Deep Fusion Predictor

### 3.1. Series Causality Entropy

### 3.2. Bayesian LSTM as the Sub-Predictor

### 3.3. Model Framework

## 4. Experiments

### 4.1. Dataset

### 4.2. Experimental Setup

### 4.3. Case 1

### 4.4. Case 2

### 4.5. Case 3

## 5. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest


**Figure 4.** The difference between the normal LSTM network and the Bayesian LSTM network. (**a**) The parameters in the LSTM; (**b**) an example of the parameters in the normal LSTM; (**c**) an example of the parameters in the Bayesian LSTM.

**Figure 6.** The prediction results for the temperature. The upper panel shows the prediction for the first 200 hours, which is part of the lower panel covering about 21 days. In the lower panel, the sensor fails for two hours, during which its measurements are zero. The prediction effectively overcomes the sensor failure and gives a daily temperature trend consistent with the historical data.

**Figure 7.** Comparison of prediction performance with two inputs. The input variables are historical temperature paired with humidity, wind force, and wind direction, respectively. When the inputs are the historical temperature and humidity, the lowest RMSE, MSE, and MAE and the highest R are obtained.

**Figure 8.** Comparison of prediction performance with multiple inputs. With two input variables rather than one, the RMSE, MSE, and MAE decrease and R increases, i.e., performance improves. However, as more input variables are added, performance degrades: with historical temperature, humidity, and wind force the prediction worsens, and with all four input variables the performance is the worst.

**Figure 9.** Comparison of prediction performance with different sub-predictors. The proposed model with the Bayesian LSTM performs best, obtaining the lowest RMSE (2.374), MSE, and MAE and the highest R.

| Measurement | Series Causality Coefficient (SCC) |
|---|---|
| Temperature | 0.3673 |
| Humidity | 0.3259 |
| Wind force | 0.1750 |
| Wind direction | 0.1318 |
| Rainfall | 0 |
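The coefficients in the table sum exactly to 1, which suggests that raw causality scores are normalized across the candidate inputs before being used as fusion weights. A minimal sketch of such a normalization; the raw scores below are hypothetical values chosen only to reproduce the table, not the paper's actual intermediate results:

```python
def normalize_scores(raw):
    """Normalize raw causality scores so they sum to 1 (SCC-style weights)."""
    total = sum(raw.values())
    return {name: round(score / total, 4) for name, score in raw.items()}

# Hypothetical raw causality scores; only their ratios matter after normalization.
raw = {
    "Temperature": 3.673,
    "Humidity": 3.259,
    "Wind force": 1.750,
    "Wind direction": 1.318,
    "Rainfall": 0.0,
}
scc = normalize_scores(raw)
print(scc["Temperature"])  # 0.3673
```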

| Step | Optimization Process |
|---|---|
| 0 | Set the scale parameter $\alpha$ with $\alpha \in (0, 1)$. |
| 1 | Sample the random variable $\epsilon$ as $\epsilon \sim \mathcal{N}(0, 1)$. |
| 2 | Set the initial values of the optimized parameters $(\mu, \rho)$. |
| 3 | Sample all the parameters as $\theta = \mu + \log(1 + \exp(\rho)) \otimes \epsilon$. |
| 4 | Set the cost function as $Loss = \log q(\theta \mid \mu, \rho) - \log P(\theta) - \log P(D \mid \theta)$. |
| 5 | Calculate the gradient with respect to the mean over the training data $D$: $\Delta\mu = \frac{\partial Loss}{\partial \theta} + \frac{\partial Loss}{\partial \mu}$. |
| 6 | Calculate the gradient with respect to the standard-deviation parameter over the training data $D$: $\Delta\rho = \frac{\partial Loss}{\partial \theta}\,\frac{\epsilon}{1 + \exp(-\rho)} + \frac{\partial Loss}{\partial \rho}$. |
| 7 | Update the parameters $(\mu, \rho)$ as follows: $\mu \leftarrow \mu - \alpha\Delta\mu$, $\rho \leftarrow \rho - \alpha\Delta\rho$. |
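The steps above are the reparameterization trick of Bayes by Backprop applied to the Bayesian LSTM's weights. A minimal NumPy sketch on a toy one-weight regression; the toy data and the flat-prior simplification (dropping the $\log q - \log P(\theta)$ terms in step 4) are assumptions made here for brevity, not the paper's setup:

```python
import numpy as np

def fit_bayesian_weight(steps=500, alpha=0.05, seed=0):
    """Learn a Gaussian posterior N(mu, softplus(rho)^2) over one weight
    using the table's sampling/gradient steps (flat prior assumed)."""
    rng = np.random.default_rng(seed)
    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = 2.0 * x                        # toy data: the true weight is 2.0
    mu, rho = 0.0, 0.0                 # Step 2: initial (mu, rho)
    for _ in range(steps):
        eps = rng.standard_normal()    # Step 1: eps ~ N(0, 1)
        sigma = np.log1p(np.exp(rho))  # softplus keeps the std positive
        theta = mu + sigma * eps       # Step 3: reparameterized sample
        # Step 4 (likelihood term only): dLoss/dtheta of 0.5*sum((theta*x - y)^2)
        d_theta = np.sum((theta * x - y) * x)
        d_mu = d_theta                               # Step 5: gradient w.r.t. mu
        d_rho = d_theta * eps / (1.0 + np.exp(-rho))  # Step 6: gradient w.r.t. rho
        mu -= alpha * d_mu             # Step 7: gradient-descent updates
        rho -= alpha * d_rho
    return mu

print(fit_bayesian_weight())  # the posterior mean approaches the true weight 2.0
```

Because `theta` is a deterministic function of `(mu, rho)` and the noise `eps`, the gradients in steps 5 and 6 flow through the sampling step, which is what makes the weight distribution trainable.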

| Layers | Design Details | Experiment Setup |
|---|---|---|
| Bayesian LSTM | Number of layers: 1; number of neurons: 24; sampling number: 4 | Batch size: 30; epochs: 100; learning rate: 0.001 |
| MLP | Number of layers: 1; number of neurons: 24 | |

| Input Data | RMSE | MSE | MAE | R |
|---|---|---|---|---|
| Historical temperature and humidity | 3.203 | 10.260 | 2.000 | 0.940 |
| Historical temperature and wind force | 3.244 | 10.525 | 2.108 | 0.937 |
| Historical temperature and wind direction | 3.400 | 11.559 | 2.250 | 0.932 |

| Input Data | RMSE | MSE | MAE | R |
|---|---|---|---|---|
| Historical temperature | 3.508 | 12.305 | 2.331 | 0.930 |
| Historical temperature and humidity | 3.203 | 10.260 | 2.000 | 0.940 |
| Historical temperature, humidity, and wind force | 3.235 | 10.465 | 2.014 | 0.938 |
| Historical temperature, humidity, wind force, and wind direction | 3.230 | 10.430 | 2.032 | 0.938 |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Jin, X.-B.; Yu, X.-H.; Su, T.-L.; Yang, D.-N.; Bai, Y.-T.; Kong, J.-L.; Wang, L.
Distributed Deep Fusion Predictor for a Multi-Sensor System Based on Causality Entropy. *Entropy* **2021**, *23*, 219.
https://doi.org/10.3390/e23020219
