Article

A Novel Evidence Combination Method Based on Improved Pignistic Probability

School of Automation, Chongqing University, Chongqing 400044, China
*
Author to whom correspondence should be addressed.
Entropy 2023, 25(6), 948; https://doi.org/10.3390/e25060948
Submission received: 19 April 2023 / Revised: 3 June 2023 / Accepted: 7 June 2023 / Published: 16 June 2023

Abstract
Evidence theory is widely used to fuse uncertain information, but the fusion of conflicting evidence remains an open question. To solve the problem of conflicting evidence fusion in single-target recognition, we propose a novel evidence combination method based on an improved pignistic probability function. First, the improved pignistic probability function redistributes the probability of each multi-subset proposition according to the weights of the single-subset propositions in a basic probability assignment (BPA), which reduces both the computational complexity and the information loss of the conversion. Second, a combination of the Manhattan distance and the evidence angle is proposed to extract the certainty of each piece of evidence and the mutual support between pieces of evidence; entropy is then used to calculate the uncertainty of the evidence, and the weighted average method is used to correct and update the original evidence. Finally, the Dempster combination rule is used to fuse the updated evidence. On highly conflicting examples containing single-subset and multi-subset propositions, our approach achieved better convergence than the Jousselme distance method, the combination of the Lance distance and reliability entropy, and the combination of the Jousselme distance and an uncertainty measure, improving the average accuracy by 0.51% and 2.43%.

1. Introduction

As an uncertain reasoning method, evidence theory [1,2] requires weaker conditions than Bayesian probability theory, yet it can express "uncertainty" and "ignorance" directly. The primary data required in evidence theory are more intuitive and easier to obtain than in probabilistic reasoning, and knowledge and data from different experts or data sources can be integrated quickly to describe uncertainty flexibly. It has been widely used in supplier selection [3,4], target recognition [5,6], decision making [7,8], reliability analysis [9,10,11,12,13], optimization in uncertain environments [14,15,16], etc. However, because evidence sources are not always reliable, evidence fusion plays a crucial role in applications of DS evidence theory. The Dempster fusion rule was established on the multiplication principle, and when pieces of evidence conflict it can produce counterintuitive outcomes, known as the "Zadeh paradox".
To address the issue of conflicting evidence fusion, scholars have presented many outstanding studies. When the original evidence conflicts, traditional DS theory cannot be applied directly and must be improved. In recent years, researchers have improved evidence theory from two directions. The first is to modify the fusion rule of DS evidence theory. Sun introduced the concept of credibility, assumed the credibility of all evidence to be equal, and modified the Dempster rule by weighted summation [17]. Yang established a unique evidential reasoning (ER) rule that combines multiple pieces of independent evidence with weight and reliability, enhancing the Dempster rule by determining how to combine completely reliable evidence and by analyzing significant or complete conflicts through new reliability perturbations [18]. Deng proposed generalized evidence theory (GET) and defined the generalized basic probability assignment (GBPA); he established a model to deal with uncertain information, provided a generalized combination rule (GCR) for combining GBPAs, and constructed a generalized conflict model to measure the conflict between evidence [19]. The second direction is to modify the original evidence. Smets proposed the pignistic probability transformation, which adopts an equal-distribution strategy so that the transformed mass function satisfies the probability axioms [20]. Based on a geometric interpretation of evidence theory, Jousselme defined the Jousselme evidence distance to describe the differences between pieces of evidence through distance information [21]. Murphy argued that the evidence should be weighted averaged, which better handles the normalization problem [22].
Tang proposed a new multi-sensor data fusion method based on weighted confidence entropy, which measures the uncertainty of evidence through mass functions and an identification framework to reduce the loss of evidence information [23].
In this field, domestic and foreign scholars have achieved excellent results within a certain range. However, in single-target recognition there is only one true result, i.e., m(A1); multi-subset BPAs such as m(A1 ∪ A2) mean that uncertainty remains in the fusion outcome, which reduces the probability of correct target recognition. At present, no relevant research has addressed this issue. In this article, a method for recognizing single targets through conflicting evidence fusion is proposed. The method consolidates multi-subset propositions in the framework of evidence theory into single-subset propositions, while incorporating the evidence distance, the evidence angle, and entropy to enhance the accuracy of target fusion. First, the pignistic probability function is improved to transform each piece of original evidence into single-subset propositions, so that the Dempster rule cannot yield a recognition result containing multi-subset propositions. Then, a combination of the Manhattan distance and evidence angle measurements is proposed to extract the degree of evidence certainty and obtain the mutual support between pieces of evidence. Furthermore, entropy is introduced to calculate the uncertainty of the evidence, and the original evidence is corrected and updated according to the resulting coefficients. Finally, the Dempster combination rule is used to fuse the updated evidence.

2. Materials and Methods

2.1. Dempster Rule

The research objects in a scientific theory compose a nonempty set, which is called a domain. Let U be a domain representing all possible values of X, with all elements in U mutually exclusive; then, U is called the recognition framework of X.
Definition 1.
Let U be a recognition framework; then, a function m: 2^U → [0, 1] is a basic probability assignment if it satisfies the following conditions:
(1) m(∅) = 0;
(2) $\sum_{A \subseteq U} m(A) = 1$.
Then, m(A) is the basic probability assignment (BPA) of A, also called the mass function, and it represents the degree of trust in A. If m(A) > 0, A is called a focal element.
Suppose m1 and m2 are two basic probability assignments on the same recognition framework U, with focal elements A1, A2, …, Ak and B1, B2, …, Br, respectively. The conflict factor K is defined as
$$K = \sum_{A_i \cap B_j = \emptyset} m_1(A_i)\, m_2(B_j) < 1$$
Then,
$$m(C) = \begin{cases} \dfrac{\sum_{A_i \cap B_j = C} m_1(A_i)\, m_2(B_j)}{1 - K}, & C \subseteq U,\; C \neq \emptyset \\ 0, & C = \emptyset \end{cases}$$
where i = 1, 2, …, k; j = 1, 2, …, r; and K reflects the degree of conflict between the evidence. The quantity 1/(1 − K) is called the normalization factor. The Dempster rule reallocates the conflicting mass to the remaining focal elements in proportion to their combined masses.
Define the system recognition framework as U = {A1, A2, …, AM}, the N pieces of evidence as E1, E2, …, EN, and the mass function corresponding to each piece of evidence as m1, m2, …, mN.
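As a minimal sketch of the Dempster rule above (not the authors' code), focal elements can be represented as frozensets of hypothesis labels; the two BPAs `m1` and `m2` below are illustrative values, not taken from the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) with the Dempster rule."""
    combined = {}
    conflict = 0.0  # the conflict factor K
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b  # intersection of the two focal elements
        if c:
            combined[c] = combined.get(c, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty intersection
    if conflict >= 1.0:
        raise ValueError("total conflict (K = 1): the Dempster rule is undefined")
    # normalize by 1 - K
    return {c: w / (1.0 - conflict) for c, w in combined.items()}

# Illustrative BPAs over the frame U = {A1, A2, A3}
m1 = {frozenset({"A1"}): 0.6, frozenset({"A1", "A2"}): 0.3, frozenset({"A3"}): 0.1}
m2 = {frozenset({"A1"}): 0.5, frozenset({"A2"}): 0.3, frozenset({"A3"}): 0.2}
fused = dempster_combine(m1, m2)
```

Here K = 0.44, so the surviving masses are divided by 0.56; the fused masses again sum to one.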

2.2. Improved Pignistic Probability Function

In single-target recognition, the fusion result should identify one specific target. In the framework of DS evidence theory, when evidence contains multi-subset propositions, the fusion result also contains multi-subset propositions, which increases the computational complexity. This work improves the pignistic probability function so that multi-subset propositions are transformed into single-subset propositions, with the BPA of each multi-subset proposition distributed according to the weights of the single-subset propositions it contains. Because the weights are allocated from the information each single-subset proposition provides in the evidence itself, this reduces the computational complexity and the information loss incurred when the pignistic probability function [24] converts a BPA to a single-subset BPA. The improved pignistic probability conversion function is as follows:
$$\mathrm{BetP}_m(A_i) = \frac{1}{1 - m(\emptyset)} \left[ m(A_i) + \sum_{A_i \subset A_j \subseteq U} \frac{m(A_i)}{\sum_{B \subset A_j,\, |B| = 1} m(B)}\, m(A_j) \right]$$
where Ai is a single-subset proposition of the original evidence; Aj is a multi-subset proposition with Ai ⊂ Aj; B ranges over the single-subset propositions contained in Aj; ∅ is the empty set, with m(∅) ≠ 1; and |Aj| denotes the number of elements contained in proposition Aj.
After the improved pignistic probability conversion, each BPA is converted into a single-subset BPA, m1, m2, …, mN, where
$$m_i = \{\mathrm{BetP}_{m_i}(A_1), \mathrm{BetP}_{m_i}(A_2), \ldots, \mathrm{BetP}_{m_i}(A_M)\}$$
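The conversion above can be sketched as follows (an assumption-laden reading of the formula, not the authors' code): each multi-subset mass is split over its singletons in proportion to their own singleton masses, with an even split assumed as a fallback when no singleton carries mass. The example input is evidence E5 from Table 4:

```python
def improved_betp(m, frame):
    """Improved pignistic transform: distribute each multi-subset mass over its
    singletons in proportion to the singleton masses of the same evidence."""
    empty = m.get(frozenset(), 0.0)
    betp = {x: m.get(frozenset({x}), 0.0) for x in frame}
    for focal, mass in m.items():
        if len(focal) <= 1:
            continue  # singletons keep their own mass
        denom = sum(m.get(frozenset({x}), 0.0) for x in focal)
        for x in focal:
            if denom > 0:
                betp[x] += mass * m.get(frozenset({x}), 0.0) / denom
            else:
                # assumption: fall back to an even split when no singleton has mass
                betp[x] += mass / len(focal)
    return {x: v / (1.0 - empty) for x, v in betp.items()}

# Evidence E5 from Table 4: m(A1) = 0.60, m(A3) = 0.10, m(A1 ∪ A3) = 0.30
m5 = {frozenset({"A1"}): 0.60, frozenset({"A3"}): 0.10,
      frozenset({"A1", "A3"}): 0.30}
betp5 = improved_betp(m5, ["A1", "A2", "A3"])
# ≈ {A1: 0.8571, A2: 0.0, A3: 0.1429}, matching the converted row for E5 in Table 5
```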

2.3. Evidence Support Based on the Manhattan Distance

The distance between pieces of evidence [25] can effectively measure the degree of support between them. At present, scholars have proposed a variety of distance measures, including the Lance distance, the Jousselme distance, and the Mahalanobis distance. However, the Lance distance does not take into account the correlation between indicators; the Jousselme distance is affected by the dispersion of the basic probability distribution of the evidence [26]; and the Mahalanobis distance requires computing a covariance matrix, which is computationally expensive. The Manhattan distance introduced in this paper measures the similarity between pieces of evidence by accumulating the differences of the single-subset BPA components, and it has low computational complexity.
The Manhattan distance between two pieces of evidence is calculated as:
$$d(E_i, E_j) = \sum_{k=1}^{M} \left| \mathrm{BetP}_{m_i}(A_k) - \mathrm{BetP}_{m_j}(A_k) \right|$$
where i, j = 1, 2, …, N. The Manhattan distance between each pair of pieces of evidence is calculated to obtain the distance matrix D:
$$D = \begin{bmatrix} d(E_1, E_1) & d(E_2, E_1) & \cdots & d(E_N, E_1) \\ d(E_1, E_2) & d(E_2, E_2) & \cdots & d(E_N, E_2) \\ \vdots & \vdots & \ddots & \vdots \\ d(E_1, E_N) & d(E_2, E_N) & \cdots & d(E_N, E_N) \end{bmatrix}$$
The distance between evidence is negatively correlated with the support.
The support of each piece of evidence E1, E2, …, EN is calculated as:
$$SUP_i = \frac{1}{N-1} \sum_{k=1,\, k \neq i}^{N} \frac{1}{D_{ik}}$$
The support degree is obtained from the Manhattan distances between pieces of evidence. The support is normalized to obtain the support coefficient of the evidence Cor_d(Ei):
$$\mathrm{Cor\_d}(E_i) = \frac{SUP_i}{\sum_{j=1}^{N} SUP_j}$$
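The two formulas above can be sketched as follows (a sketch, not the authors' code); the input BPAs are the converted single-subset BPAs of the example in Section 3.1, and identical pieces of evidence (zero distance) would need an extra guard:

```python
def manhattan(p, q):
    """Manhattan distance between two single-subset BPAs."""
    return sum(abs(a - b) for a, b in zip(p, q))

def support_coefficients(bpas):
    """Cor_d: normalized support built from the reciprocals of pairwise distances."""
    n = len(bpas)
    d = [[manhattan(bpas[i], bpas[j]) for j in range(n)] for i in range(n)]
    # SUP_i averages 1/D_ik over the other N-1 pieces of evidence
    sup = [sum(1.0 / d[i][k] for k in range(n) if k != i) / (n - 1)
           for i in range(n)]
    total = sum(sup)
    return [s / total for s in sup]

# Converted single-subset BPAs from Section 3.1
bpas = [[0.90, 0.00, 0.10], [0.00, 0.01, 0.99], [0.50, 0.20, 0.30],
        [0.98, 0.01, 0.01], [0.90, 0.05, 0.05]]
cor_d = support_coefficients(bpas)
# cor_d ≈ [0.314, 0.042, 0.077, 0.241, 0.326]; the paper reports {0.3139, 0.0419, ...}
```

The conflicting evidence E2 receives the smallest support coefficient, as intended.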

2.4. Evidence Similarity Based on Evidence Angle

The angle between two pieces of evidence can be used to characterize the consistency between them, and the result can be used to measure their similarity. The formula for the evidence angle [27] is as follows:
$$\cos(E_i, E_j) = \frac{m_i \cdot m_j}{\|m_i\| \, \|m_j\|} = \frac{\sum_{k=1}^{M} \mathrm{BetP}_{m_i}(A_k)\, \mathrm{BetP}_{m_j}(A_k)}{\sqrt{\sum_{k=1}^{M} [\mathrm{BetP}_{m_i}(A_k)]^2} \, \sqrt{\sum_{k=1}^{M} [\mathrm{BetP}_{m_j}(A_k)]^2}}$$
The larger the value of cos(Ei, Ej), the more consistent the two pieces of evidence are, indicating a higher similarity between them. The evidence angles between all pairs of evidence form the angle matrix Ang, from which the similarity of each piece of evidence is calculated:
$$SIM_i = \frac{1}{N-1} \sum_{k=1,\, k \neq i}^{N} Ang_{ik}$$
The similarity between pieces of evidence is measured based on the evidence angle. The similarity is normalized to obtain the similarity coefficient of the evidence Cor_A(Ei):
$$\mathrm{Cor\_A}(E_i) = \frac{SIM_i}{\sum_{j=1}^{N} SIM_j}$$
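The angle-based similarity can be sketched in the same style (a sketch, not the authors' code), again using the converted BPAs of the Section 3.1 example as input:

```python
import math

def cos_angle(p, q):
    """Cosine of the angle between two single-subset BPAs (evidence angle)."""
    dot = sum(a * b for a, b in zip(p, q))
    norm_p = math.sqrt(sum(a * a for a in p))
    norm_q = math.sqrt(sum(b * b for b in q))
    return dot / (norm_p * norm_q)

def similarity_coefficients(bpas):
    """Cor_A: normalized similarity from the pairwise evidence-angle matrix."""
    n = len(bpas)
    ang = [[cos_angle(bpas[i], bpas[j]) for j in range(n)] for i in range(n)]
    # SIM_i averages the off-diagonal entries of row i
    sim = [sum(ang[i][k] for k in range(n) if k != i) / (n - 1)
           for i in range(n)]
    total = sum(sim)
    return [s / total for s in sim]

bpas = [[0.90, 0.00, 0.10], [0.00, 0.01, 0.99], [0.50, 0.20, 0.30],
        [0.98, 0.01, 0.01], [0.90, 0.05, 0.05]]
cor_a = similarity_coefficients(bpas)
# cor_a ≈ [0.239, 0.054, 0.244, 0.228, 0.235]; the paper reports {0.2391, 0.0541, ...}
```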

2.5. Evidence Uncertainty Based on Entropy

In evidence theory, the amount of information that a piece of evidence carries can be measured by information entropy: the higher the information entropy, the more dispersed the evidence and the higher its uncertainty. The information entropy is calculated as follows:
$$H(E_i) = -\sum_{A_k \in E_i} m_i(A_k) \log_2 m_i(A_k)$$
The uncertainty coefficient of each piece of evidence Cor_S(Ei), i = 1, 2, …, N, is calculated by:
$$\mathrm{Cor\_S}(E_i) = e^{H(E_i)}$$
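The entropy-based coefficient can be sketched as follows (a sketch, not the authors' code; zero-mass terms are skipped, since 0·log 0 is taken as 0). The inputs are converted BPAs from the Section 3.1 example:

```python
import math

def uncertainty_coefficient(bpa):
    """Cor_S = e^H: Shannon entropy (base 2) of a single-subset BPA, exponentiated."""
    h = -sum(p * math.log2(p) for p in bpa if p > 0)  # convention: 0*log(0) = 0
    return math.exp(h)

# The widely spread BPA of E3 yields the largest uncertainty coefficient,
# while the concentrated BPA of E4 yields a small one
cor_s_e3 = uncertainty_coefficient([0.50, 0.20, 0.30])
cor_s_e4 = uncertainty_coefficient([0.98, 0.01, 0.01])
```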

2.6. Evidence Fusion Based on the Dempster Rule

The evidence fusion coefficient integrating the Manhattan distance, the evidence angle, and the entropy is:
$$\mathrm{Cor}(E_i) = \mathrm{Cor\_S}(E_i) \times \mathrm{Cor\_d}(E_i) \times \mathrm{Cor\_A}(E_i)$$
The fusion coefficient Cor(Ei) is normalized to obtain the final evidence fusion coefficient Cor_fusion(Ei). The single-subset BPAs {m1, m2, …, mN} are then combined into the corrected evidence:
$$m' = \sum_{i=1}^{N} \mathrm{Cor\_fusion}(E_i) \times m_i$$
All the initial evidence is replaced with m′, and the modified evidence is fused N − 1 times with the Dempster rule:
$$m_{fus} = \underbrace{(\cdots((m' \oplus m') \oplus m') \oplus \cdots \oplus m')}_{N-1}$$
The flow graph of the method proposed in this paper is shown in Figure 1.
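The whole correction-and-fusion pipeline can be sketched end to end as follows (a sketch, not the authors' code). The coefficient values are the rounded ones reported for the single-subset example in Section 3.1, and only singleton BPAs are assumed, so the Dempster rule reduces to an elementwise product followed by renormalization:

```python
def normalize(xs):
    s = sum(xs)
    return [x / s for x in xs]

def weighted_average(bpas, weights):
    """m' = sum_i Cor_fusion(E_i) * m_i, componentwise."""
    return [sum(w * bpa[k] for w, bpa in zip(weights, bpas))
            for k in range(len(bpas[0]))]

def dempster_singleton(p, q):
    """Dempster rule for single-subset BPAs: elementwise product, renormalized."""
    return normalize([a * b for a, b in zip(p, q)])

# Converted BPAs and per-evidence coefficients from Section 3.1 (rounded values)
bpas = [[0.90, 0.00, 0.10], [0.00, 0.01, 0.99], [0.50, 0.20, 0.30],
        [0.98, 0.01, 0.01], [0.90, 0.05, 0.05]]
cor_d = [0.3139, 0.0419, 0.0770, 0.2414, 0.3258]
cor_a = [0.2391, 0.0541, 0.2439, 0.2278, 0.2351]
cor_s = [1.60, 1.08, 4.44, 1.17, 1.77]

# Multiply the three coefficients and normalize -> Cor_fusion
cor_fusion = normalize([d * a * s for d, a, s in zip(cor_d, cor_a, cor_s)])
m_corrected = weighted_average(bpas, cor_fusion)
# close to the paper's {0.8253, 0.0595, 0.1154}; small differences come from rounding

# Fuse the corrected evidence N - 1 times
m_fus = m_corrected
for _ in range(len(bpas) - 1):
    m_fus = dempster_singleton(m_fus, m_corrected)
```

After four fusions the mass of A1 approaches 0.9999, in line with Table 2.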

3. Results

In the following, we verify the effectiveness of the method through two conflicting examples: those that only contain single-subset propositions and those that contain multiple-subset propositions.

3.1. An Example of Single-Subset Proposition Conflicting Evidence

An example of single-subset proposition conflicting evidence can be found in reference [28]. An evidence recognition framework is assumed and there are five independent pieces of evidence. The corresponding BPA is shown in Table 1 [28].

3.1.1. Improved Pignistic Probability Function

This example contains only single-subset propositions, so the BPAs converted by the improved pignistic probability function equal the originals: m1 = {0.90, 0, 0.10}, m2 = {0, 0.01, 0.99}, m3 = {0.50, 0.20, 0.30}, m4 = {0.98, 0.01, 0.01}, m5 = {0.90, 0.05, 0.05}.

3.1.2. Calculate Fusion Coefficient

Applying Formulas (5)–(8) yields the distance matrix and the support of each piece of evidence:
$$D = \begin{bmatrix} 0 & 1.80 & 0.80 & 0.18 & 0.10 \\ 1.80 & 0 & 1.38 & 1.96 & 1.88 \\ 0.80 & 1.38 & 0 & 0.96 & 0.80 \\ 0.18 & 1.96 & 0.96 & 0 & 0.16 \\ 0.10 & 1.88 & 0.80 & 0.16 & 0 \end{bmatrix}$$
The support based on the Manhattan distance between evidence is:
$$SUP_i = \{4.3425, 0.5800, 1.0650, 3.3400, 4.5075\}$$
The evidence support coefficient is:
$$\mathrm{Cor\_d}(E_i) = \{0.3139, 0.0419, 0.0770, 0.2414, 0.3258\}$$
Applying Formulas (9)–(11) yields the angle matrix and the similarity of each piece of evidence:
$$Ang = \begin{bmatrix} 1 & 0.11 & 0.86 & 0.99 & 1.00 \\ 0.11 & 1 & 0.49 & 0.01 & 0.06 \\ 0.86 & 0.49 & 1 & 0.82 & 0.85 \\ 0.99 & 0.01 & 0.82 & 1 & 1.00 \\ 1.00 & 0.06 & 0.85 & 1.00 & 1 \end{bmatrix}$$
$$SIM_i = \{0.7400, 0.1675, 0.7550, 0.7050, 0.7275\}$$
The evidence similarity coefficient is:
$$\mathrm{Cor\_A}(E_i) = \{0.2391, 0.0541, 0.2439, 0.2278, 0.2351\}$$
Applying Formulas (12) and (13) yields the evidence uncertainty coefficient:
$$\mathrm{Cor\_S}(E_i) = \{1.60, 1.08, 4.44, 1.17, 1.77\}$$
The support coefficient and the similarity coefficient represent the certainty of the evidence, and the uncertainty coefficient represents its uncertainty. The final evidence fusion coefficient obtained by applying Formulas (14) and (15) is:
$$\mathrm{Cor\_fusion}(E_i) = \{0.2960, 0.0059, 0.2055, 0.1585, 0.3342\}$$
The corrected evidence
$$m' = \sum_{i=1}^{N} \mathrm{Cor\_fusion}(E_i) \times m_i = \{0.8253, 0.0595, 0.1154\}$$
replaces the initial evidence.

3.1.3. Evidence Fusion Based on the Dempster Rule

Formula (16) is applied for fusion four times, and the fusion results are shown in Table 2 and Figure 2. A comparison with other methods is shown in Table 3 and Figure 3.
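As a quick numerical check (a sketch, not the authors' code), iterating the Dempster rule on the corrected single-subset evidence reproduces the values of Table 2:

```python
def dempster_singleton(p, q):
    """Dempster rule restricted to single-subset BPAs: product + renormalize."""
    prod = [a * b for a, b in zip(p, q)]
    s = sum(prod)
    return [v / s for v in prod]

m_prime = [0.8253, 0.0595, 0.1154]  # corrected evidence m' from Section 3.1.2
rows, result = [], m_prime
for _ in range(4):  # four fusions, as in Table 2
    result = dempster_singleton(result, m_prime)
    rows.append([round(v, 4) for v in result])
# rows[0] ≈ [0.9758, 0.0051, 0.0191] and rows[3] ≈ [0.9999, 0.0, 0.0001],
# matching the first and fourth rows of Table 2
```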

3.2. An Example of Multi-Subset Proposition Conflicting Evidence

Suppose there is a multi-sensor-based target recognition system with the recognition framework U = {A1, A2, A3}, in which A1 is the real target. There are five independent sensors, and their recognition results are shown in Table 4.

3.2.1. Improved Pignistic Probability Function

According to Formula (3), the converted BPAs are as shown in Table 5.

3.2.2. Calculate Fusion Coefficient Cor(Ei)

As calculated by Formulas (4)–(15), the coefficients are shown in Table 6.
We obtain the final corrected BPA:
$$m' = \sum_{i=1}^{N} \mathrm{Cor\_fusion}(E_i) \times m_i = \{0.8284, 0.0873, 0.0841\}$$

3.2.3. Evidence Fusion Based on the Dempster Rule

Evidence fusion was performed four times by the Dempster rule, and the fusion results are shown in Table 7 and Figure 4. A comparison with other methods is shown in Table 8 and Figure 5.

4. Discussion

As the comparison results above show, directly applying the Dempster fusion rule leads to counterintuitive results.
The fusion results of the single- and multi-subset conflicting evidence examples are discussed in this section. As Table 2 and Table 7 and Figure 2 and Figure 4 show, our proposed method has a good fusion effect: the BPA of the true target reaches 0.9999 and 1.0000 at the fourth fusion, while the BPAs of the other propositions decrease as the number of fusions increases. This shows that the method proposed in this paper can effectively extract the characteristics of the evidence.
Analyzing Table 3 and Table 8 and Figure 3 and Figure 5, when the number of pieces of evidence is two or three, our method is not as effective as Chen's method and Zhao's method: with so little evidence, the input data are insufficient to extract multiple features from each evidence source. However, as the number of evidence sources increases, our method's accuracy improves rapidly. The tables and figures show that our proposed method achieves higher accuracy and a better effect after three or more fusion processes, and for four or more fusion processes it also performs better on the multi-subset conflict example.
Experiments have shown that the method proposed in this article can effectively extract mutually supportive features between various evidence sources when there are sufficient evidence sources and can achieve good results.

5. Conclusions

In this article, considering evidence characteristics and information richness, we proposed a novel evidence combination method based on an improved pignistic probability function to solve the problem of highly conflicting evidence fusion in DS evidence theory. Experiments show that the method achieves good results on single-target recognition problems and improves the fusion accuracy of the evidence theory framework in target fusion recognition. Evidence theory has a strong ability to handle uncertainty problems, and our future work will further investigate how it can be extended and applied in the real world.

Author Contributions

Conceptualization, X.S. and F.L.; methodology, P.Q.; validation, L.Y.; data curation, G.H.; writing—original draft preparation, F.L.; writing—review and editing, X.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study did not require ethical approval.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
BPA	basic probability assignment

References

  1. Dempster, A.P. Upper and Lower Probabilities Induced by a Multivalued Mapping. Ann. Math. Stat. 1967, 38, 325–339.
  2. Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976; Volume 25, pp. 10–40.
  3. Dou, Z.; Xu, X.; Lin, Y.; Zhou, R. Application of D-S evidence fusion method in the fault detection of temperature sensor. Math. Probl. Eng. 2014, 2014, 395057.
  4. Deng, X.; Hu, Y.; Deng, Y.; Mahadevan, S. Supplier selection using AHP methodology extended by D numbers. Expert Syst. Appl. 2014, 41, 156–167.
  5. Xu, X.; Li, S.; Song, X.; Wen, C.; Xu, D. The optimal design of industrial alarm systems based on evidence theory. Control Eng. Pract. 2016, 46, 142–156.
  6. Chen, Y.; Cremers, A.B.; Cao, Z. Interactive color image segmentation via iterative evidential labeling. Inf. Fusion 2014, 20, 292–304.
  7. Suh, D.; Yook, J. A method to determine basic probability assignment in context awareness of a moving object. Int. J. Distrib. Sens. Netw. 2013, 9, 972641.
  8. Zadeh, L.A. A simple view of the Dempster-Shafer theory of evidence and its implication for the rule of combination. AI Mag. 1986, 7, 85–90.
  9. Leung, Y.; Ji, N.-N.; Ma, J.-H. An integrated information fusion approach based on the theory of evidence and group decision-making. Inf. Fusion 2013, 14, 410–422.
  10. Khamseh, S.A.; Sedigh, A.K.; Moshiri, B.; Fatehi, A. Control performance assessment based on sensor fusion techniques. Control Eng. Pract. 2016, 49, 14–28.
  11. Niu, D.; Wei, Y.; Shi, Y.; Karimi, H.R. A novel evaluation model for hybrid power system based on vague set and Dempster-Shafer evidence theory. Math. Probl. Eng. 2012, 2012, 784389.
  12. Zhou, Q.; Zhou, H.; Zhou, Q.; Yang, F.; Luo, L.; Li, T. Structural damage detection based on posteriori probability support vector machine and Dempster-Shafer evidence theory. Appl. Soft Comput. 2015, 36, 368–374.
  13. Xu, J.; Zhong, Z.; Xu, L. ISHM-oriented adaptive fault diagnostics for avionics based on a distributed intelligent agent system. Int. J. Syst. Sci. 2015, 46, 2287–2302.
  14. Deng, Y.; Liu, Y.; Zhou, D. An improved genetic algorithm with initial population strategy for symmetric TSP. Math. Probl. Eng. 2015, 2015, 212794.
  15. Du, W.-B.; Gao, Y.; Liu, C.; Zheng, Z.; Wang, Z. Adequate is better: Particle swarm optimization with limited-information. Appl. Math. Comput. 2015, 268, 832–838.
  16. Deng, X.; Jiang, W. On the negation of a Dempster-Shafer belief structure based on maximum uncertainty allocation. Inf. Sci. 2020, 516, 346–352.
  17. Sun, Q.; Ye, X.-Q.; Gu, W.-K. A New Combination Rules of Evidence Theory. Acta Electron. Sin. 2000, 28, 117.
  18. Yang, J.; Xu, D. Evidential reasoning rule for evidence combination. Artif. Intell. 2013, 205, 1–29.
  19. Deng, Y. Generalized evidence theory. Appl. Intell. 2015, 43, 530–543.
  20. Smets, P.; Kennes, R. The transferable belief model. Artif. Intell. 1994, 66, 191–234.
  21. Jousselme, A.L.; Grenier, D.; Bossé, E. A new distance between two bodies of evidence. Inf. Fusion 2001, 2, 91–101.
  22. Murphy, C.K. Combining belief functions when evidence conflicts. Decis. Support Syst. 2000, 29, 1–9.
  23. Tang, Y.; Zhou, D.; Xu, S. A Weighted Belief Entropy-based Uncertainty Measure for Multi-sensor Data Fusion. Sensors 2017, 17, 928.
  24. Smets, P. The combination of evidence in the transferable belief model. IEEE Trans. Pattern Anal. Mach. Intell. 1990, 12, 447–458.
  25. Jousselme, A.L.; Maupin, P. Distances in evidence theory: Comprehensive survey and generalizations. Int. J. Approx. Reason. 2012, 53, 118–145.
  26. Mao, Y.F.; Zhang, G.D.L.; Wang, L. Measurement of evidence conflict based on overlapping degree. Control Decis. 2017, 32, 293–298.
  27. Chen, L.; Diao, L.; Sang, J. A New Method to Handle Conflict when Combining Evidences Using Entropy Function and Evidence Angle with an Effective Application in Fault Diagnosis. Math. Probl. Eng. 2020, 2020, 3564365.
  28. Wang, X.; Di, P.; Yin, D. Conflict Evidence Fusion Method Based on Lance Distance and Credibility Entropy. Syst. Eng. Electron. 2022, 44, 592–602.
  29. Deng, Y.; Shi, W.-K.; Zhu, Z.-F. Efficient combination approach of conflict evidence. J. Infrared Millim. Waves 2004, 23, 27–32.
  30. Lei, C.; Ling, D.; Jun, S. Weighted Evidence Combination Rule Based on Evidence Distance and Uncertainty Measure: An Application in Fault Diagnosis. Math. Probl. Eng. 2018, 2018, 1–10.
  31. Xiao, F. Multi-sensor data fusion based on the belief divergence measure of evidences and the belief entropy. Inf. Fusion 2018, 46, 23–32.
  32. Zhao, K.Y.; Sun, R.T.; Li, L. An improved evidence fusion algorithm in multi-sensor systems. Appl. Intell. 2021, 51, 7614–7624.
Figure 1. The flow graph of the proposed method.
Figure 2. Fusion results of the single-subset proposition conflicting example.
Figure 3. Comparison of different methods on the fusion of several pieces of single-subset proposition evidence.
Figure 4. Fusion results of the multi-subset proposition conflicting example.
Figure 5. Comparison of different methods on the fusion of several pieces of multi-subset proposition evidence.
Table 1. Single-subset proposition conflicting evidence.

Evidence   m(A1)   m(A2)   m(A3)
E1         0.90    0       0.10
E2         0       0.01    0.99
E3         0.50    0.20    0.30
E4         0.98    0.01    0.01
E5         0.90    0.05    0.05
Table 2. Fusion results of the single-subset proposition conflicting example.

Fusion Times   m(A1)    m(A2)    m(A3)
First          0.9758   0.0051   0.0191
Second         0.9969   0.0004   0.0027
Third          0.9996   0.0000   0.0004
Fourth         0.9999   0.0000   0.0001
Table 3. Comparison of evidence fusion with different methods (columns give the result after fusing the first two, three, four, and five pieces of evidence).

Approach          BPA     m1⊕m2    m1⊕m2⊕m3   m1⊕…⊕m4   m1⊕…⊕m5
Dempster–Shafer   m(A1)   0        0          0         0
                  m(A2)   0        0          0         0
                  m(A3)   1        1          1         1
Murphy [22]       m(A1)   0.4054   0.5055     0.8930    0.9834
                  m(A2)   0.0001   0.0000     0.0001    0.0000
                  m(A3)   0.5946   0.4945     0.1069    0.0166
Deng [29]         m(A1)   0.4054   0.7211     0.9910    0.9996
                  m(A2)   0.0001   0.0040     0.0001    0.0000
                  m(A3)   0.5946   0.2749     0.0089    0.0003
Wang [28]         m(A1)   0.5745   0.8382     0.9558    0.9968
                  m(A2)   0.0033   0.0142     0.0010    0.0001
                  m(A3)   0.4223   0.1476     0.0431    0.0031
Chen [30]         m(A1)   0.4054   0.7211     0.9910    0.9996
                  m(A2)   0.0001   0.0040     0.0001    0.0000
                  m(A3)   0.5946   0.2749     0.0089    0.0003
Xiao [31]         m(A1)   0.2790   0.5763     0.9397    0.9963
                  m(A2)   0.0001   0.0065     0.0004    0.0000
                  m(A3)   0.7210   0.4173     0.0599    0.0037
Zhao [32]         m(A1)   0.4571   0.7178     0.9792    0.9991
                  m(A2)   0.0000   0.0046     0.0001    0.0000
                  m(A3)   0.5429   0.2775     0.0207    0.0009
Ours              m(A1)   0.5784   0.8406     0.9962    0.9999
                  m(A2)   0.0000   0.0187     0.0002    0.0000
                  m(A3)   0.4216   0.1407     0.0036    0.0001
Table 4. Multi-subset proposition conflicting evidence.

Evidence   m(A1)   m(A2)   m(A3)   m(A1∪A3)
E1         0.41    0.29    0.30    0.00
E2         0.00    0.90    0.10    0.00
E3         0.58    0.07    0.00    0.35
E4         0.55    0.10    0.00    0.35
E5         0.60    0.00    0.10    0.30
Table 5. Converted BPAs.

Evidence   m(A1)    m(A2)   m(A3)
E1         0.41     0.29    0.30
E2         0.00     0.90    0.10
E3         0.93     0.07    0.00
E4         0.90     0.10    0.00
E5         0.8571   0.00    0.1429
Table 6. Fusion coefficients of the multi-subset proposition evidence.

Coefficient      E1       E2       E3       E4       E5
Cor_d(Ei)        0.0440   0.0221   0.2924   0.3212   0.3203
Cor_A(Ei)        0.2352   0.0629   0.2336   0.2376   0.2307
Cor_S(Ei)        4.7894   1.5984   1.4470   1.5984   1.8072
Cor_fusion(Ei)   0.1221   0.0054   0.2433   0.3004   0.3287
Table 7. Fusion results of the multi-subset proposition conflicting example.

Fusion Times   m(A1)    m(A2)    m(A3)
First          0.9790   0.0109   0.0101
Second         0.9978   0.0012   0.0010
Third          0.9998   0.0001   0.0001
Fourth         1.0000   0.0000   0.0000
Table 8. Comparison of evidence fusion with different methods (columns give the result after fusing the first two, three, four, and five pieces of evidence).

Approach          BPA        m1⊕m2    m1⊕m2⊕m3   m1⊕…⊕m4   m1⊕…⊕m5
Dempster–Shafer   m(A1)      0        0          0         0
                  m(A2)      0.8969   0.6350     0.3320    0
                  m(A3)      0.1031   0.3650     0.6680    1
Murphy [22]       m(A1)      0.0964   0.4939     0.8362    0.9613
                  m(A2)      0.8119   0.4180     0.1147    0.0147
                  m(A3)      0.0917   0.0792     0.0410    0.0166
                  m(A1∪A3)   0.0000   0.0090     0.0081    0.0032
Deng [29]         m(A1)      0.0000   0.6019     0.9329    0.9802
                  m(A2)      0.8969   0.2908     0.0225    0.0009
                  m(A3)      0.1031   0.0991     0.0354    0.0154
                  m(A1∪A3)   0.0000   0.0082     0.0092    0.0035
Chen [30]         m(A1)      0.0000   0.7985     0.9629    0.9855
                  m(A2)      0.8969   0.1060     0.0043    0.0001
                  m(A3)      0.1031   0.0752     0.0190    0.0096
                  m(A1∪A3)   0.0000   0.0203     0.0139    0.0048
Xiao [31]         m(A1)      0.1420   0.6391     0.9400    0.9816
                  m(A2)      0.7412   0.2462     0.0165    0.0006
                  m(A3)      0.1168   0.1072     0.0341    0.0141
                  m(A1∪A3)   0.0000   0.0075     0.0093    0.0037
Zhao [32]         m(A1)      0.1046   0.6945     0.9355    0.9817
                  m(A2)      0.7989   0.1902     0.0163    0.0000
                  m(A3)      0.0965   0.1062     0.0409    0.0147
                  m(A1∪A3)   0.0000   0.0091     0.0073    0.0036
Ours              m(A1)      0.2678   0.6714     0.9983    1.0000
                  m(A2)      0.5551   0.2205     0.0015    0.0000
                  m(A3)      0.1771   0.1080     0.0001    0.0000

Shi, X.; Liang, F.; Qin, P.; Yu, L.; He, G. A Novel Evidence Combination Method Based on Improved Pignistic Probability. Entropy 2023, 25, 948. https://doi.org/10.3390/e25060948
