# Measuring Uncertainty in the Negation Evidence for Multi-Source Information Fusion


## Abstract


## 1. Introduction

## 2. Preliminaries

#### 2.1. Dempster–Shafer Evidence Theory

#### 2.2. The Negation Evidence

#### 2.3. Belief Entropy

## 3. An Improved Multi-Source Information Fusion Method Based on Measuring the Uncertainty of Negation Evidence

#### 3.1. Improved Belief Entropy of Negation Evidence

#### 3.2. Multi-Source Information Fusion Considering the Uncertainty of Negation Evidence

- **Step 1.** Model uncertain information with the original BPA in Dempster–Shafer evidence theory.
- **Step 2.** Calculate the negation of the BPAs and measure the uncertainty of the negation evidence with the proposed belief entropy. For the $i$th BOE ($i=1,2,\cdots ,m$), the corresponding uncertainty degree under the belief entropy of negation evidence $E_n$ is calculated as follows:
  $$E_n(m_i)=-\sum_{A\subseteq X}\overline{m}_i(A)\log_2\frac{\overline{m}_i(A)}{2^{|A|}-1}, \qquad \overline{m}_i(A)=\frac{1-m_i(A)}{n-1},$$
  where $n$ is the number of focal elements of $m_i$.
- **Step 3.** Construct the weight factor of each BOE based on the uncertainty measure results. There may be conflict among the different sources of evidence; the weight factor, derived from the uncertainty measure, balances the different information sources. The relative weight factor of the $i$th BOE ($i=1,2,\cdots ,m$) among all $m$ available BOEs, denoted $w_i$, is defined as follows:
  $$w_i=\frac{E_n(m_i)}{\sum_{j=1}^{m}E_n(m_j)}.$$
- **Step 4.** Modify the evidence based on the weight factors from the belief entropy of negation evidence. Using the weight factor of each BOE, the weighted mass function of each proposition is calculated for the final data fusion. For each proposition $A$ in the BOE, the weighted mass function is calculated as follows:
  $$m_w(A)=\sum_{i=1}^{m}w_i\,m_i(A).$$
- **Step 5.** Fuse the evidence with Dempster's rule of combination. The BPAs of the multi-source information have now been measured and modified with the proposed measure, and are ready for fusion with Dempster's rule of combination. For each proposition $A$ in the BOE, the combined result of the modified evidence is obtained by applying Dempster's rule of combination $(m-1)$ times ($m\ge 2$):
  $$m(A)=\Big(\underbrace{m_w\oplus m_w\oplus \cdots \oplus m_w}_{m\ \text{times}}\Big)(A).$$
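Steps 2–4 can be sketched in Python. This is an illustrative sketch, not the authors' reference implementation: focal elements are modeled as `frozenset`s, the helper names (`negation_entropy`, `weight_factors`, `weighted_mass`) are my own, and $n$ in the negation formula is taken as the number of focal elements of each BOE, which is an assumption of this sketch.

```python
import math

def negation(bpa):
    """Negation of a BPA: m_bar(A) = (1 - m(A)) / (n - 1),
    where n is taken as the number of focal elements (assumption)."""
    n = len(bpa)
    return {A: (1.0 - v) / (n - 1) for A, v in bpa.items()}

def negation_entropy(bpa):
    """Belief entropy of the negation evidence:
    E_n(m) = -sum_A m_bar(A) * log2(m_bar(A) / (2**|A| - 1))."""
    return -sum(v * math.log2(v / (2 ** len(A) - 1))
                for A, v in negation(bpa).items() if v > 0)

def weight_factors(bpas):
    """Relative weight of each BOE: w_i = E_n(m_i) / sum_j E_n(m_j)."""
    ents = [negation_entropy(b) for b in bpas]
    total = sum(ents)
    return [e / total for e in ents]

def weighted_mass(bpas):
    """Weighted average mass function: m_w(A) = sum_i w_i * m_i(A)."""
    ws = weight_factors(bpas)
    props = set().union(*bpas)
    return {A: sum(w * b.get(A, 0.0) for w, b in zip(ws, bpas))
            for A in props}
```

For a BPA over two singletons, e.g. $m(A)=0.6$, $m(B)=0.4$, the negation is $(0.4, 0.6)$ and $E_n$ reduces to the Shannon entropy of the negated masses, since $2^{|A|}-1=1$ for singletons; compound focal sets are discounted by the $2^{|A|}-1$ term.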

## 4. Experiment and Discussion

#### 4.1. Experiment with Artificial Data

#### 4.2. Experiment in Fault Diagnosis

**Step 1.** Modeling uncertain information with the BPA in Dempster–Shafer evidence theory.

**Step 2.** Using the proposed belief entropy of negation evidence to measure the uncertainty of the negation of the BPAs.

**Step 3.** Constructing the weight factor of each BOE based on the uncertainty measure results.

**Step 4.** Modifying the evidence based on the weight factors from the belief entropy of negation evidence.

**Step 5.** Fusing the evidence with Dempster’s rule of combination.
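The five steps can be sketched end-to-end in Python. This is an illustrative sketch under stated assumptions, not the authors' code: focal elements are `frozenset`s, $n$ in the negation is taken as the number of focal elements of each BOE, and the function names are hypothetical.

```python
import math

def negation_entropy(bpa):
    """Step 2: E_n(m) = -sum_A m_bar(A) * log2(m_bar(A) / (2**|A| - 1)),
    with m_bar(A) = (1 - m(A)) / (n - 1), n = number of focal elements
    (assumption of this sketch)."""
    n = len(bpa)
    neg = {A: (1.0 - v) / (n - 1) for A, v in bpa.items()}
    return -sum(v * math.log2(v / (2 ** len(A) - 1))
                for A, v in neg.items() if v > 0)

def dempster(m1, m2):
    """Step 5 core: Dempster's rule of combination for two mass functions."""
    out, conflict = {}, 0.0
    for A, va in m1.items():
        for B, vb in m2.items():
            C = A & B
            if C:
                out[C] = out.get(C, 0.0) + va * vb
            else:
                conflict += va * vb   # mass assigned to the empty set
    k = 1.0 - conflict                # normalization factor
    return {C: v / k for C, v in out.items()}

def fuse(bpas):
    """Steps 2-5: entropy-based weights, weighted average mass function,
    then (m - 1) applications of Dempster's rule."""
    m = len(bpas)
    ents = [negation_entropy(b) for b in bpas]               # Step 2
    ws = [e / sum(ents) for e in ents]                       # Step 3
    props = set().union(*bpas)
    mw = {A: sum(w * b.get(A, 0.0) for w, b in zip(ws, bpas))
          for A in props}                                    # Step 4
    fused = mw
    for _ in range(m - 1):                                   # Step 5
        fused = dempster(fused, mw)
    return fused
```

As a sanity check, fusing three identical BOEs with $m(A)=0.6$, $m(B)=0.4$ gives equal weights, so the weighted mass equals the original BPA and the combination sharpens the majority hypothesis to $0.6^3/(0.6^3+0.4^3)\approx 0.7714$.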

#### 4.3. Discussion and Limitation

## 5. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Data Availability Statement

## Conflicts of Interest

## References

1. Zhang, X.; Mahadevan, S. Bayesian neural networks for flight trajectory prediction and safety assessment. Decis. Support Syst. 2020, 131, 113246.
2. Zhou, T.; Zhang, X.; Droguett, E.L.; Mosleh, A. A generic physics-informed neural network-based framework for reliability assessment of multi-state systems. Reliab. Eng. Syst. Saf. 2023, 229, 108835.
3. Dempster, A.P. Upper and Lower Probabilities Induced by a Multi-valued Mapping. Ann. Math. Stat. 1967, 38, 325–339.
4. Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976.
5. Liu, Z.G.; Zhang, Z.W.; Pan, Q.; Ning, L.B. Unsupervised Change Detection From Heterogeneous Data Based on Image Translation. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–13.
6. Su, Z.G.; Denoeux, T. BPEC: Belief-peaks evidential clustering. IEEE Trans. Fuzzy Syst. 2018, 27, 111–123.
7. Jiao, L.; Wang, F.; Liu, Z.G.; Pan, Q. TECM: Transfer learning-based evidential c-means clustering. Knowl.-Based Syst. 2022, 257, 109937.
8. Wu, D.; Liu, Z.; Tang, Y. A new classification method based on the negation of a basic probability assignment in the evidence theory. Eng. Appl. Artif. Intell. 2020, 96, 103985.
9. Jiao, L.; Yang, H.; Liu, Z.G.; Pan, Q. Interpretable fuzzy clustering using unsupervised fuzzy decision trees. Inf. Sci. 2022, 611, 540–563.
10. Zhou, K.; Martin, A.; Pan, Q.; Liu, Z. SELP: Semi-supervised evidential label propagation algorithm for graph data clustering. Int. J. Approx. Reason. 2018, 92, 139–154.
11. Liu, Z.G.; Huang, L.Q.; Zhou, K.; Denoeux, T. Combination of transferable classification with multisource domain adaptation based on evidential reasoning. IEEE Trans. Neural Netw. Learn. Syst. 2020, 32, 2015–2029.
12. Xiao, F. An improved method for combining conflicting evidences based on the similarity measure and belief function entropy. Int. J. Fuzzy Syst. 2018, 20, 1256–1266.
13. Fu, C.; Hou, B.; Chang, W.; Feng, N.; Yang, S. Comparison of Evidential Reasoning Algorithm with Linear Combination in Decision Making. Int. J. Fuzzy Syst. 2020, 22, 686–711.
14. Fu, C.; Chang, W.; Xue, M.; Yang, S. Multiple criteria group decision making with belief distributions and distributed preference relations. Eur. J. Oper. Res. 2019, 273, 623–633.
15. Fei, L.; Lu, J.; Feng, Y. An extended best-worst multi-criteria decision-making method by belief functions and its applications in hospital service evaluation. Comput. Ind. Eng. 2020, 142, 106355.
16. Zhang, L.; Ding, L.; Wu, X.; Skibniewski, M.J. An improved Dempster–Shafer approach to construction safety risk perception. Knowl.-Based Syst. 2017, 132, 30–46.
17. Zhao, F.J.; Zhou, Z.J.; Hu, C.H.; Chang, L.L.; Zhou, Z.G.; Li, G.L. A New Evidential Reasoning-Based Method for Online Safety Assessment of Complex Systems. IEEE Trans. Syst. Man Cybern. Syst. 2018, 48, 954–966.
18. Zheng, X.; Deng, Y. Dependence assessment in human reliability analysis based on evidence credibility decay model and IOWA operator. Ann. Nucl. Energy 2018, 112, 673–684.
19. Su, X.; Mahadevan, S.; Xu, P.; Deng, Y. Dependence Assessment in Human Reliability Analysis Using Evidence Theory and AHP. Risk Anal. 2015, 35, 1296–1316.
20. Liu, T.; Deng, Y.; Chan, F. Evidential supplier selection based on DEMATEL and game theory. Int. J. Fuzzy Syst. 2018, 20, 1321–1333.
21. Chen, L.; Deng, Y. A new failure mode and effects analysis model using Dempster–Shafer evidence theory and grey relational projection method. Eng. Appl. Artif. Intell. 2018, 76, 13–20.
22. Wu, D.; Tang, Y. An improved failure mode and effects analysis method based on uncertainty measure in the evidence theory. Qual. Reliab. Eng. Int. 2020, 36, 1786–1807.
23. Chen, L.; Diao, L.; Sang, J. A novel weighted evidence combination rule based on improved entropy function with a diagnosis application. Int. J. Distrib. Sens. Netw. 2019, 15, 1550147718823990.
24. Xu, X.; Weng, X.; Xu, D.; Xu, H.; Hu, Y.; Li, J. Evidence updating with static and dynamical performance analyses for industrial alarm system design. ISA Trans. 2020, 99, 110–122.
25. Xiong, L.; Su, X.; Qian, H. Conflicting evidence combination from the perspective of networks. Inf. Sci. 2021, 580, 408–418.
26. Su, X.; Li, L.; Qian, H.; Mahadevan, S.; Deng, Y. A new rule to combine dependent bodies of evidence. Soft Comput. 2019, 23, 9793–9799.
27. Xiao, F. Multi-sensor data fusion based on the belief divergence measure of evidences and the belief entropy. Inf. Fusion 2019, 46, 23–32.
28. Zhang, J.; Deng, Y. A method to determine basic probability assignment in the open world and its application in data fusion and classification. Appl. Intell. 2017, 46, 934–951.
29. Jing, M.; Tang, Y. A new base basic probability assignment approach for conflict data fusion in the evidence theory. Appl. Intell. 2021, 51, 1056–1068.
30. Jiroušek, R.; Shenoy, P.P. A new definition of entropy of belief functions in the Dempster–Shafer theory. Int. J. Approx. Reason. 2018, 92, 49–65.
31. Jiroušek, R.; Shenoy, P.P. On properties of a new decomposable entropy of Dempster–Shafer belief functions. Int. J. Approx. Reason. 2020, 119, 260–279.
32. Nascimento, J.; Ferreira, F.; Aguiar, V.; Guedes, I.; Costa Filho, R.N. Information measures of a deformed harmonic oscillator in a static electric field. Phys. A Stat. Mech. Appl. 2018, 499, 250–257.
33. Srivastava, A.; Kaur, L. Uncertainty and negation-Information theoretic applications. Int. J. Intell. Syst. 2019, 34, 1248–1260.
34. Ostovare, M.; Shahraki, M.R. Evaluation of hotel websites using the multicriteria analysis of PROMETHEE and GAIA: Evidence from the five-star hotels of Mashhad. Tour. Manag. Perspect. 2019, 30, 107–116.
35. George, T.; Pal, N.R. Quantification of conflict in Dempster–Shafer framework: A new approach. Int. J. Gen. Syst. 1996, 24, 407–423.
36. Song, Y.; Wang, X.; Lei, L.; Yue, S. Uncertainty measure for interval-valued belief structures. Measurement 2015, 80, 241–250.
37. Deng, Y. Deng entropy. Chaos Solitons Fractals 2016, 91, 549–553.
38. Jiang, W.; Hu, W. An improved soft likelihood function for Dempster–Shafer belief structures. Int. J. Intell. Syst. 2018, 33, 1264–1282.
39. Jiang, W. A correlation coefficient for belief functions. Int. J. Approx. Reason. 2018, 103, 94–106.
40. Zhou, Q.; Deng, Y. Fractal-based belief entropy. Inf. Sci. 2022, 587, 265–282.
41. Deng, Y. Uncertainty measure in evidence theory. Sci. China Inf. Sci. 2020, 63, 1–19.
42. Yager, R.R. On the maximum entropy negation of a probability distribution. IEEE Trans. Fuzzy Syst. 2014, 23, 1899–1902.
43. Yin, L.; Deng, X.; Deng, Y. The negation of a basic probability assignment. IEEE Trans. Fuzzy Syst. 2018, 27, 135–143.
44. Deng, X.; Jiang, W. On the negation of a Dempster–Shafer belief structure based on maximum uncertainty allocation. Inf. Sci. 2020, 516, 346–352.
45. Tang, Y.; Wu, D.; Liu, Z. A new approach for generation of generalized basic probability assignment in the evidence theory. Pattern Anal. Appl. 2021, 24, 1007–1023.
46. Deng, Y.; Shi, W.; Zhu, Z.; Liu, Q. Combining belief functions based on distance of evidence. Decis. Support Syst. 2004, 38, 489–493.
47. Zhang, Z.; Liu, T.; Chen, D.; Zhang, W. Novel algorithm for identifying and fusing conflicting data in wireless sensor networks. Sensors 2014, 14, 9562–9581.
48. Yuan, K.; Xiao, F.; Fei, L.; Kang, B.; Deng, Y. Conflict management based on belief function entropy in sensor fusion. SpringerPlus 2016, 5, 638.
49. Jiang, W.; Xie, C.; Zhuang, M.; Shou, Y.; Tang, Y. Sensor Data Fusion with Z-Numbers and Its Application in Fault Diagnosis. Sensors 2016, 16, 1509.
50. Tang, Y.; Zhou, D.; Xu, S.; He, Z. A weighted belief entropy-based uncertainty measure for multi-sensor data fusion. Sensors 2017, 17, 928.

**Figure 1.** The flowchart of multi-source information fusion based on the belief entropy of negation evidence.

| BPA | m(A) | m(B) | m(C) | m(A,C) |
|---|---|---|---|---|
| $m_1(\cdot)$ | 0.41 | 0.29 | 0.30 | 0 |
| $m_2(\cdot)$ | 0 | 0.90 | 0.10 | 0 |
| $m_3(\cdot)$ | 0.58 | 0.07 | 0 | 0.35 |
| $m_4(\cdot)$ | 0.55 | 0.10 | 0 | 0.35 |
| $m_5(\cdot)$ | 0.60 | 0.10 | 0 | 0.30 |

| Method | m(A) | m(B) | m(C) | m(A,C) |
|---|---|---|---|---|
| Deng et al.'s method [46] | 0.9820 | 0.0039 | 0.0107 | 0.0034 |
| Zhang et al.'s method [47] | 0.9820 | 0.0033 | 0.0115 | 0.0032 |
| Yuan et al.'s method [48] | 0.9886 | 0.0002 | 0.0072 | 0.0039 |
| Xiao's method [27] | 0.9905 | 0.0002 | 0.0061 | 0.0043 |
| Proposed method | 0.9863 | 0.0013 | 0.0086 | 0.0038 |

| BPA | Freq1 {F2} | Freq1 {F3} | Freq1 {F1,F2} | Freq1 {F1,F2,F3} | Freq2 {F2} | Freq2 {F1,F2,F3} | Freq3 {F1} | Freq3 {F2} | Freq3 {F1,F2} | Freq3 {F1,F2,F3} |
|---|---|---|---|---|---|---|---|---|---|---|
| $m_{s_1}(\cdot)$ | 0.8176 | 0.0003 | 0.1553 | 0.0268 | 0.6229 | 0.3771 | 0.3666 | 0.4563 | 0.1185 | 0.0586 |
| $m_{s_2}(\cdot)$ | 0.5658 | 0.0009 | 0.0646 | 0.3687 | 0.7660 | 0.2341 | 0.2793 | 0.4151 | 0.2652 | 0.0404 |
| $m_{s_3}(\cdot)$ | 0.2403 | 0.0004 | 0.0141 | 0.7452 | 0.8598 | 0.1402 | 0.2897 | 0.4331 | 0.2470 | 0.0302 |

| Evidence | Freq1 | Freq2 | Freq3 |
|---|---|---|---|
| $E_n(m_{s_1})$ | 3.1726 | 2.7405 | 3.3109 |
| $E_n(m_{s_2})$ | 3.0142 | 2.9352 | 3.2634 |
| $E_n(m_{s_3})$ | 2.6191 | 2.9985 | 3.2789 |

| Evidence | Freq1 | Freq2 | Freq3 |
|---|---|---|---|
| $w_{s_1}$ | 0.3603 | 0.3131 | 0.3360 |
| $w_{s_2}$ | 0.3423 | 0.3398 | 0.3312 |
| $w_{s_3}$ | 0.2974 | 0.3471 | 0.3328 |

| BPA | Freq1 {F2} | Freq1 {F3} | Freq1 {F1,F2} | Freq1 {F1,F2,F3} | Freq2 {F2} | Freq2 {F1,F2,F3} | Freq3 {F1} | Freq3 {F2} | Freq3 {F1,F2} | Freq3 {F1,F2,F3} |
|---|---|---|---|---|---|---|---|---|---|---|
| $m_w(\cdot)$ | 0.5597 | 0.0005 | 0.0823 | 0.3575 | 0.7538 | 0.2462 | 0.3121 | 0.4349 | 0.2098 | 0.0431 |

| Result | Freq1 {F2} | Freq1 {F3} | Freq1 {F1,F2} | Freq1 {F1,F2,F3} | Freq2 {F2} | Freq2 {F1,F2,F3} | Freq3 {F1} | Freq3 {F2} | Freq3 {F1,F2} | Freq3 {F1,F2,F3} |
|---|---|---|---|---|---|---|---|---|---|---|
| Fusion result | 0.9146 | 0.0002 | 0.0394 | 0.0458 | 0.9851 | 0.0149 | 0.3353 | 0.6316 | 0.0329 | 0.0002 |

| Method | Freq1 {F2} | Freq1 {F3} | Freq1 {F1,F2} | Freq1 {F1,F2,F3} | Freq2 {F2} | Freq2 {F1,F2,F3} | Freq3 {F1} | Freq3 {F2} | Freq3 {F1,F2} | Freq3 {F1,F2,F3} |
|---|---|---|---|---|---|---|---|---|---|---|
| Jiang et al.'s method [49] | 0.8861 | 0.0002 | 0.0582 | 0.0555 | 0.9621 | 0.0371 | 0.3384 | 0.5904 | 0.0651 | 0.0061 |
| Tang et al.'s method [50] | 0.8891 | 0.0003 | 0.0427 | 0.0679 | 0.9784 | 0.0216 | 0.3318 | 0.6332 | 0.0349 | 0.0001 |
| Proposed method | 0.9146 | 0.0002 | 0.0394 | 0.0458 | 0.9852 | 0.0149 | 0.3353 | 0.6316 | 0.0329 | 0.0002 |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Tang, Y.; Chen, Y.; Zhou, D.
Measuring Uncertainty in the Negation Evidence for Multi-Source Information Fusion. *Entropy* **2022**, *24*, 1596.
https://doi.org/10.3390/e24111596
