Article

Distance-Based Knowledge Measure and Entropy for Interval-Valued Intuitionistic Fuzzy Sets

1 School of Mathematics and Statistics, Beihua University, Jilin 132000, China
2 School of Mathematics and Information Science, Shaanxi Normal University, Xi’an 710062, China
* Author to whom correspondence should be addressed.
Mathematics 2023, 11(16), 3468; https://doi.org/10.3390/math11163468
Submission received: 9 July 2023 / Revised: 4 August 2023 / Accepted: 8 August 2023 / Published: 10 August 2023

Abstract

Knowledge measures and uncertainty measures for interval-valued intuitionistic fuzzy sets (IvIFSs) have attracted much attention. However, uncertainty is often measured through the entropy of IvIFSs, which cannot adequately reflect the amount of knowledge an IvIFS conveys. In this paper, we not only extend the axiomatic definition of the knowledge measure of IvIFSs to a more general level but also establish a new knowledge measure function based on a distance function combined with the technique for order preference by similarity to ideal solution (TOPSIS). Further, we investigate the properties of the proposed knowledge measure through mathematical analysis and numerical examples. In addition, we create an entropy function by calculating the distance from an IvIFS to the most fuzzy point and prove that it satisfies the axiomatic definition. Finally, the proposed entropy is applied to a multi-attribute group decision-making problem with interval-valued intuitionistic fuzzy information. Experimental results demonstrate the effectiveness and practicability of the proposed entropy measure.

1. Introduction

The intuitionistic fuzzy set (IFS) proposed by Atanassov [1] has been extended to the interval-valued intuitionistic fuzzy set (IvIFS). The reason for this extension is that membership and non-membership degrees are sometimes subject to cognitive limitations and cannot be represented exactly by a single number but are instead assumed to lie in a certain interval, which brings additional uncertainty to IFSs. An IvIFS can thus describe fuzzy phenomena and their essence at a finer level. Because of the increasing complexity, scale and diversification of modern social development and the increasingly complex decision-making problems faced by social organizations and enterprises, it is often more appropriate to use IvIFSs to describe uncertainty. Therefore, it is crucial to construct entropy and knowledge measures for IvIFSs.
Currently, entropy measures of IvIFSs are widely applied in various areas, such as medical diagnosis, pattern recognition, investment decision making and other fields [2,3,4,5,6]. Pratiksha et al. [7] not only presented a new axiomatic definition of entropy based on the concept of probability and distance for IvIFSs by considering the degree of hesitancy but also proposed some entropies and derived the relationship between the distance, entropy and similarity measures of IvIFSs. In 2018, Liu and Jiang [8] constructed a new distance measure for interval-valued intuitionistic fuzzy sets based on the distance of interval numbers, which takes into account the advantages of integers within intervals and has a clear physical meaning. In the same year, the axiomatic definition of the entropy of the IvIFS was given, and the entropy of an IvIFS was built with the distance function by Che et al. [9]. In addition, the researchers in [10] developed entropy and distance measures for IvIFSs using an exponential function and applied the proposed entropy to a multi-attribute decision-making environment. Zhang et al. [11] discussed multi-attribute decision-making based on entropy in an interval-valued intuitionistic fuzzy environment. In addition, many scholars have introduced various entropies for IvIFSs [12,13,14,15,16,17,18].
In the latest research, some knowledge measures of IvIFSs have been investigated. Nguyen [19] developed a new knowledge measure for IvIFSs and demonstrated its practicality by proposing decision cases. Based on the combination of the normalized Hamming distance and TOPSIS, a novel knowledge measure was developed by Guo et al. [20]. Some researchers advocated constructing knowledge measures based on distance [21]. Some scholars constructed an IvIFS knowledge measure and applied it to uncertain decision-making [22]. In addition, references [23,24,25,26] explored the application of IvIFSs to decision-making problems by using a score function and a variable-weight hybrid approach. In recent years, the theory and application [27,28,29] of other extended forms of intuitionistic fuzzy sets have also been extensively studied, such as intuitionistic interval type 3 fuzzy sets, intuitionistic hesitant fuzzy soft sets and so on. From the above theories, we can find that there is no axiomatic model of the knowledge measure. Inspired by this fact, in this paper, we construct a new knowledge measure model and entropy for IvIFSs. In comparison with others, several illustrative examples are presented to assess how reasonable and accurate the new knowledge measure and entropy are.
The outline of this paper is as follows: Section 2 mainly reviews some basic concepts of IvIFSs, the axiomatic definition of entropy and some existing entropies and knowledge measures. In Section 3, we develop an innovative knowledge measure definition for IvIFSs and construct the total expression of the knowledge measure for IvIFSs. In addition, we also propose a new entropy measure based on the distance function. Section 4 shows the application of the developed technique to multi-attribute group decision-making, “MAGDM”, under uncertainty. The study is then summarized in Section 5.

2. Preliminaries

2.1. Interval-Valued Intuitionistic Fuzzy Set

This section introduces the definition and related properties of interval-valued intuitionistic fuzzy sets, which lays a theoretical foundation for the subsequently described research. Further relevant details can be found in [1,7,18].
Definition 1
([1]). An intuitionistic fuzzy set (IFS) A on X can be defined in the following form:
$$A = \{\langle x, \mu_A(x), \vartheta_A(x)\rangle \mid x \in X\},$$
where $\mu_A : X \to [0,1]$ and $\vartheta_A : X \to [0,1]$ are the membership and the non-membership functions, respectively, satisfying $0 \le \mu_A(x) + \vartheta_A(x) \le 1$ for all $x \in X$.
Definition 2
([1]). An interval-valued intuitionistic fuzzy set (IvIFS)  A ˜ over a universe of discourse X is defined by
$$\tilde A = \{\langle x, \tilde\mu_{\tilde A}(x), \tilde\vartheta_{\tilde A}(x)\rangle \mid x \in X\},$$
where $\tilde\mu_{\tilde A}(x) = [\mu^L_{\tilde A}(x), \mu^U_{\tilde A}(x)] \subseteq [0,1]$ and $\tilde\vartheta_{\tilde A}(x) = [\vartheta^L_{\tilde A}(x), \vartheta^U_{\tilde A}(x)] \subseteq [0,1]$, with $\mu^U_{\tilde A}(x) + \vartheta^U_{\tilde A}(x) \le 1$ satisfied for any $x \in X$. For the interval hesitation margin $\tilde\pi_{\tilde A}(x) = [\pi^L_{\tilde A}(x), \pi^U_{\tilde A}(x)] \subseteq [0,1]$, we have $\pi^L_{\tilde A}(x) = 1 - \mu^U_{\tilde A}(x) - \vartheta^U_{\tilde A}(x)$ and $\pi^U_{\tilde A}(x) = 1 - \mu^L_{\tilde A}(x) - \vartheta^L_{\tilde A}(x)$.
Specifically, if $\mu^L_{\tilde A}(x) = \mu^U_{\tilde A}(x)$ and $\vartheta^L_{\tilde A}(x) = \vartheta^U_{\tilde A}(x)$, then the IvIFS $\tilde A$ reduces to an IFS. In addition, $(\tilde\mu_{\tilde A}(x), \tilde\vartheta_{\tilde A}(x))$ is named an interval-valued intuitionistic fuzzy value (IvIFV). Furthermore, reference [7] pointed out that the IvIFS $[\frac13,\frac13,\frac13]_x = \langle[\frac13,\frac13],[\frac13,\frac13],[\frac13,\frac13]\rangle$ represents the most fuzzy point.
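As a quick illustration of Definition 2, the interval hesitation margin can be computed directly from the membership and non-membership bounds. The sketch below is not from the paper; it assumes an IvIFV $\langle[\mu^L,\mu^U],[\vartheta^L,\vartheta^U]\rangle$ is stored as a plain 4-tuple.

```python
# Illustrative sketch (assumed representation): an IvIFV <[muL, muU], [vL, vU]>
# is stored as a 4-tuple. The hesitation interval follows Definition 2:
# pi_L = 1 - muU - vU and pi_U = 1 - muL - vL.

def hesitation(muL, muU, vL, vU):
    return (1.0 - muU - vU, 1.0 - muL - vL)

# The most fuzzy point <[1/3, 1/3], [1/3, 1/3]> has hesitation [1/3, 1/3].
print(hesitation(1/3, 1/3, 1/3, 1/3))
```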
For convenience, we denote $IvIFS(X)$ as the set containing all interval-valued intuitionistic fuzzy sets on $X$.
Definition 3
([1]). For any $\tilde A, \tilde B \in IvIFS(X)$, where $\tilde A = \{\langle x, [\mu^L_{\tilde A}(x), \mu^U_{\tilde A}(x)], [\vartheta^L_{\tilde A}(x), \vartheta^U_{\tilde A}(x)]\rangle \mid x \in X\}$ and $\tilde B = \{\langle x, [\mu^L_{\tilde B}(x), \mu^U_{\tilde B}(x)], [\vartheta^L_{\tilde B}(x), \vartheta^U_{\tilde B}(x)]\rangle \mid x \in X\}$, the following conditions apply:
(1) $\tilde A \subseteq \tilde B$ iff $\mu^L_{\tilde A}(x) \le \mu^L_{\tilde B}(x)$, $\mu^U_{\tilde A}(x) \le \mu^U_{\tilde B}(x)$, $\vartheta^L_{\tilde A}(x) \ge \vartheta^L_{\tilde B}(x)$ and $\vartheta^U_{\tilde A}(x) \ge \vartheta^U_{\tilde B}(x)$ for any $x \in X$;
(2) $\tilde A = \tilde B$ iff $\tilde A \subseteq \tilde B$ and $\tilde B \subseteq \tilde A$, i.e., $\mu^L_{\tilde A}(x) = \mu^L_{\tilde B}(x)$, $\mu^U_{\tilde A}(x) = \mu^U_{\tilde B}(x)$, $\vartheta^L_{\tilde A}(x) = \vartheta^L_{\tilde B}(x)$ and $\vartheta^U_{\tilde A}(x) = \vartheta^U_{\tilde B}(x)$;
(3) $\tilde A^c = \{\langle x, \tilde\vartheta_{\tilde A}(x), \tilde\mu_{\tilde A}(x)\rangle \mid x \in X\}$.
Definition 4
([18]). A real function $D : IvIFS(X) \times IvIFS(X) \to [0,1]$ is called the distance of an IvIFS if $D$ satisfies the following properties for any $\tilde A, \tilde B$ and $\tilde C \in IvIFS(X)$:
(D1) $D(\tilde A, \tilde B) = 0$ iff $\tilde A = \tilde B$;
(D2) $D(\tilde A, \tilde B) = D(\tilde B, \tilde A)$;
(D3) $D(\tilde A, \tilde C) \le D(\tilde A, \tilde B) + D(\tilde B, \tilde C)$;
(D4) If $\tilde A \subseteq \tilde B \subseteq \tilde C$, then $\max\{D(\tilde A, \tilde B), D(\tilde B, \tilde C)\} \le D(\tilde A, \tilde C)$.
Definition 5
([7]). For any $\tilde A, \tilde B \in IvIFS(X)$, a mapping $E : IvIFS(X) \to [0,1]$ is called the entropy of $IvIFS(X)$ if $E$ satisfies the following conditions:
(E1) $E(\tilde A) = 0$ iff $\tilde A$ is a crisp set;
(E2) $E(\tilde A) = 1$ iff all three interval descriptions of the IvIFS satisfy $\tilde\mu_{\tilde A}(x) = \tilde\vartheta_{\tilde A}(x) = \tilde\pi_{\tilde A}(x) = [\frac13,\frac13]$ for all $x \in X$;
(E3) If $D(\tilde A, [\frac13,\frac13,\frac13]_x) \ge D(\tilde B, [\frac13,\frac13,\frac13]_x)$, then $E(\tilde A) \le E(\tilde B)$, where $D$ is a distance measure;
(E4) $E(\tilde A) = E(\tilde A^c)$.

2.2. Some Existing Entropy and Knowledge Measures for IvIFSs

An IvIFS describes fuzzy information through membership, non-membership and hesitation, which can effectively prevent the loss of information. Therefore, there are many research results on the entropy and knowledge measures of IvIFSs. Some related typical formulas are given below.
In the early stage, refs. [12,16] were devoted to studying the entropy of IvIFSs with membership and non-membership; the specific forms are as follows:
$$E_{ZM}(\tilde A) = \frac{1}{n}\sum_{i=1}^{n}\bigl(1-\tilde\mu_{\tilde A}(x_i)-\tilde\vartheta_{\tilde A}(x_i)\bigr)\,e^{\,\tilde\mu_{\tilde A}(x_i)+\tilde\vartheta_{\tilde A}(x_i)},\quad(1)$$
where $\tilde\mu_{\tilde A}(x_i) = [\mu^L_{\tilde A}(x_i), \mu^U_{\tilde A}(x_i)]$, $\tilde\vartheta_{\tilde A}(x_i) = [\vartheta^L_{\tilde A}(x_i), \vartheta^U_{\tilde A}(x_i)]$ and $i = 1, 2, \ldots, n$.
$$E_{Z2}(\tilde A) = 1-\frac{1}{2n}\sum_{i=1}^{n}\Bigl(\bigl|\mu^U_{\tilde A}(x_i)-0.5\bigr|+\bigl|\mu^L_{\tilde A}(x_i)-0.5\bigr|+\bigl|\vartheta^L_{\tilde A}(x_i)-0.5\bigr|+\bigl|\vartheta^U_{\tilde A}(x_i)-0.5\bigr|\Bigr).\quad(2)$$
In addition, while some entropies are built directly on the membership and non-membership degrees, others are built on trigonometric functions of them [3,14]:
$$E_{Y}(\tilde A) = \frac{1}{n}\sum_{i=1}^{n}\frac{1}{\sqrt{2}-1}\left(\sqrt{2}\cos\frac{\mu^L_{\tilde A}(x_i)+\mu^U_{\tilde A}(x_i)-\vartheta^L_{\tilde A}(x_i)-\vartheta^U_{\tilde A}(x_i)}{8}\pi-1\right).\quad(3)$$
$$E_{C}(\tilde A) = \frac{1}{n}\sum_{i=1}^{n}\cot\left(\frac{\pi}{4}+\frac{\bigl|\mu^L_{\tilde A}(x_i)-\vartheta^L_{\tilde A}(x_i)+\mu^U_{\tilde A}(x_i)-\vartheta^U_{\tilde A}(x_i)\bigr|}{4\bigl(4-\mu^L_{\tilde A}(x_i)-\mu^U_{\tilde A}(x_i)-\vartheta^L_{\tilde A}(x_i)-\vartheta^U_{\tilde A}(x_i)\bigr)}\pi\right).\quad(4)$$
It is not difficult to find that the above-mentioned entropies do not consider hesitation. In order to prevent the loss of information, some researchers have discussed some new entropy measures that include the membership degree, non-membership degree and hesitation degree [4,5,13,15]. The specific forms are as follows:
$$E_{W}(\tilde A) = \frac{1}{n}\sum_{i=1}^{n}\frac{\min\bigl(\mu^L_{\tilde A}(x_i),\vartheta^L_{\tilde A}(x_i)\bigr)+\min\bigl(\mu^U_{\tilde A}(x_i),\vartheta^U_{\tilde A}(x_i)\bigr)+\pi^L_{\tilde A}(x_i)+\pi^U_{\tilde A}(x_i)}{\max\bigl(\mu^L_{\tilde A}(x_i),\vartheta^L_{\tilde A}(x_i)\bigr)+\max\bigl(\mu^U_{\tilde A}(x_i),\vartheta^U_{\tilde A}(x_i)\bigr)+\pi^L_{\tilde A}(x_i)+\pi^U_{\tilde A}(x_i)}.\quad(5)$$
$$E_{IN}(\tilde A) = 1-\frac{1}{n}\sum_{i=1}^{n}\frac{\bigl|\mu^L_{\tilde A}(x_i)-\vartheta^L_{\tilde A}(x_i)\bigr|+\bigl|\mu^U_{\tilde A}(x_i)-\vartheta^U_{\tilde A}(x_i)\bigr|}{2+\pi^U_{\tilde A}(x_i)+\pi^L_{\tilde A}(x_i)+\min\bigl(\mu^U_{\tilde A}(x_i)+\mu^L_{\tilde A}(x_i),\,\vartheta^U_{\tilde A}(x_i)+\vartheta^L_{\tilde A}(x_i)\bigr)}.\quad(6)$$
$$E_{J}(\tilde A) = \frac{1}{n}\sum_{i=1}^{n}\frac{2+\pi^L_{\tilde A}(x_i)+\pi^U_{\tilde A}(x_i)-\bigl|\mu^L_{\tilde A}(x_i)-\vartheta^L_{\tilde A}(x_i)\bigr|-\bigl|\mu^U_{\tilde A}(x_i)-\vartheta^U_{\tilde A}(x_i)\bigr|}{2+\pi^L_{\tilde A}(x_i)+\pi^U_{\tilde A}(x_i)}.\quad(7)$$
$$E_{WZ}(\tilde A) = \frac{1}{n}\sum_{i=1}^{n}\cos\frac{\mu^L_{\tilde A}(x_i)-\vartheta^L_{\tilde A}(x_i)+\mu^U_{\tilde A}(x_i)-\vartheta^U_{\tilde A}(x_i)}{2\bigl(2+\pi^L_{\tilde A}(x_i)+\pi^U_{\tilde A}(x_i)\bigr)}\pi.\quad(8)$$
In addition to entropy, the knowledge measure also has the ability to describe the fuzzy degree of objects. In 2018, Guo and Zhang [22] defined the knowledge measure of an IvIFS as follows:
$$K_{GZ}(\tilde A) = 1-\frac{1}{2n}\sum_{i=1}^{n}\left(1-\frac{1}{2}\Bigl(\bigl|\mu^L_{\tilde A}(x_i)-\vartheta^L_{\tilde A}(x_i)\bigr|+\bigl|\mu^U_{\tilde A}(x_i)-\vartheta^U_{\tilde A}(x_i)\bigr|\Bigr)\right)\times\left(1+\frac{1}{2}\bigl(\pi^L_{\tilde A}(x_i)+\pi^U_{\tilde A}(x_i)\bigr)\right).\quad(9)$$
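To make the recalled formulas concrete, the sketch below implements two of them, $E_W$ and $E_{Z2}$. This is an illustration, not code from the paper; the representation of an IvIFS as a list of `(muL, muU, vL, vU)` tuples is our own assumption.

```python
# Assumed representation: an IvIFS over X = {x_1, ..., x_n} is a list of
# 4-tuples (muL, muU, vL, vU); pi bounds follow Definition 2.

def entropy_W(A):
    # E_W: ratio of min/max membership terms plus hesitation bounds.
    total = 0.0
    for muL, muU, vL, vU in A:
        piL, piU = 1 - muU - vU, 1 - muL - vL
        total += (min(muL, vL) + min(muU, vU) + piL + piU) / \
                 (max(muL, vL) + max(muU, vU) + piL + piU)
    return total / len(A)

def entropy_Z2(A):
    # E_Z2: one minus the mean absolute deviation of all four bounds from 0.5.
    s = sum(abs(muU - 0.5) + abs(muL - 0.5) + abs(vL - 0.5) + abs(vU - 0.5)
            for muL, muU, vL, vU in A)
    return 1 - s / (2 * len(A))

# A maximally fuzzy element gives entropy 1; a crisp one gives 0.
print(entropy_W([(0.5, 0.5, 0.5, 0.5)]), entropy_W([(1, 1, 0, 0)]))
```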

3. Construct Knowledge Measure and Entropy with Distance D for IvIFSs

3.1. Construct Knowledge Measure with Distance for IvIFSs

In this section, we use the idea of TOPSIS to construct the knowledge measure for IvIFSs. Firstly, following [20], we define the two positive ideal solutions $\tilde P_1 = \langle x,[1,1],[0,0]\rangle$ and $\tilde P_2 = \langle x,[0,0],[1,1]\rangle$, as well as the negative ideal solution $\tilde N = \langle x,[0,0],[0,0]\rangle$. Secondly, the distances from a single element $\tilde A = \{\langle x,[\mu^L_{\tilde A}(x),\mu^U_{\tilde A}(x)],[\vartheta^L_{\tilde A}(x),\vartheta^U_{\tilde A}(x)]\rangle \mid x \in X\} \in IvIFS(X)$ to the positive and negative ideal solutions are calculated, and a parameter $\theta$ is introduced to account for the relative weight of the positive and negative ideal solutions and the influence of human factors, thereby constructing the total expression of the knowledge measure for IvIFSs.
Definition 6.
Let $\tilde A \in IvIFS(X)$. The function $K : IvIFS(X) \to [0,1]$ is called the knowledge measure for $IvIFS(X)$ based on the distance $D$ if $K$ meets the following conditions:
(K1) $K(\tilde A) = 1$ iff $\tilde A$ is a crisp set;
(K2) $K(\tilde A) = 0$ iff $\pi^L_{\tilde A}(x) = \pi^U_{\tilde A}(x) = 1$ for all $x \in X$;
(K3) $K(\tilde A) \ge K(\tilde B)$ if $\tilde A$ contains more knowledge than $\tilde B$, i.e., $D(\tilde A, [0,0,1]_x) \ge D(\tilde B, [0,0,1]_x)$, where $[0,0,1]_x = \langle[0,0],[0,0],[1,1]\rangle$;
(K4) $K(\tilde A) = K(\tilde A^c)$.
Theorem 1.
The mapping $K : IvIFS(X) \to [0,1]$ is a knowledge measure of an IvIFS, where $K$ is defined by
$$K(\tilde A) = \frac{\theta D(\tilde A, \tilde N)}{(1-\theta)D(\tilde A, \tilde P_1)D(\tilde A, \tilde P_2)+\theta D(\tilde A, \tilde N)},\quad(10)$$
where $\theta \in [0,1]$ refers to people's subjective attitude coefficient, $D(\tilde A, \tilde N)$ represents the distance between the single element $\tilde A$ and the negative ideal solution $\tilde N$, and $D(\tilde A, \tilde P_1)$ and $D(\tilde A, \tilde P_2)$ denote the distances between $\tilde A$ and the positive ideal solutions $\tilde P_1$ and $\tilde P_2$, respectively.
Proof. 
Next, we only need to prove that $K(\cdot)$ satisfies conditions (K1)–(K4) in Definition 6.
(K1) Referring to the definition of the distance measure, it is easy to see that
$$K(\tilde A) = 1 \iff (1-\theta)D(\tilde A, \tilde P_1)D(\tilde A, \tilde P_2) = 0,$$
and then we obtain
$$D(\tilde A, \tilde P_1) = 0 \ \text{or}\ D(\tilde A, \tilde P_2) = 0.$$
This means that $\tilde A = \tilde P_1$ or $\tilde A = \tilde P_2$. Thus, $\tilde A$ is a crisp set.
(K2) From the definition of an IvIFS and its distance,
$$K(\tilde A) = 0 \iff D(\tilde A, \tilde N) = 0.$$
It is obvious that $\tilde A = \tilde N$. Then, we obtain $\pi^L_{\tilde A}(x) = \pi^U_{\tilde A}(x) = 1$ for all $x \in X$.
(K3) The derivative of Equation (10) with respect to $D(\tilde A, \tilde N)$ is
$$\frac{\partial K(\tilde A)}{\partial D(\tilde A,\tilde N)} = \frac{\theta\bigl[(1-\theta)D(\tilde A,\tilde P_1)D(\tilde A,\tilde P_2)+\theta D(\tilde A,\tilde N)\bigr]-\theta^2 D(\tilde A,\tilde N)}{\bigl[(1-\theta)D(\tilde A,\tilde P_1)D(\tilde A,\tilde P_2)+\theta D(\tilde A,\tilde N)\bigr]^2} = \frac{\theta(1-\theta)D(\tilde A,\tilde P_1)D(\tilde A,\tilde P_2)}{\bigl[(1-\theta)D(\tilde A,\tilde P_1)D(\tilde A,\tilde P_2)+\theta D(\tilde A,\tilde N)\bigr]^2} > 0.$$
The derivative of Equation (10) with respect to $D(\tilde A, \tilde P_1)$ is
$$\frac{\partial K(\tilde A)}{\partial D(\tilde A,\tilde P_1)} = -\frac{\theta(1-\theta)D(\tilde A,\tilde N)D(\tilde A,\tilde P_2)}{\bigl[(1-\theta)D(\tilde A,\tilde P_1)D(\tilde A,\tilde P_2)+\theta D(\tilde A,\tilde N)\bigr]^2} < 0.$$
The derivative of Equation (10) with respect to $D(\tilde A, \tilde P_2)$ is
$$\frac{\partial K(\tilde A)}{\partial D(\tilde A,\tilde P_2)} = -\frac{\theta(1-\theta)D(\tilde A,\tilde N)D(\tilde A,\tilde P_1)}{\bigl[(1-\theta)D(\tilde A,\tilde P_1)D(\tilde A,\tilde P_2)+\theta D(\tilde A,\tilde N)\bigr]^2} < 0.$$
Then, we find that $K(\tilde A)$ increases monotonically with respect to $D(\tilde A,\tilde N)$ and decreases monotonically with respect to $D(\tilde A,\tilde P_1)$ and $D(\tilde A,\tilde P_2)$.
If $\tilde A$ contains more knowledge than $\tilde B$, we have
$$D(\tilde A,\tilde N) \ge D(\tilde B,\tilde N),\quad D(\tilde A,\tilde P_1) \le D(\tilde B,\tilde P_1),\quad D(\tilde A,\tilde P_2) \le D(\tilde B,\tilde P_2).$$
Finally, by the monotonicity above, we obtain $K(\tilde A) \ge K(\tilde B)$.
(K4) According to the definition of the complement of the IvIFS $\tilde A$, we have
$$D(\tilde A,\tilde N) = D(\tilde A^c,\tilde N),\quad D(\tilde A,\tilde P_1) = D(\tilde A^c,\tilde P_2),\quad D(\tilde A,\tilde P_2) = D(\tilde A^c,\tilde P_1),$$
so we can deduce $K(\tilde A) = K(\tilde A^c)$.
Let $\tilde A \in IvIFS(X)$ and let the distance in Equation (10) be the Hamming distance; then the distances between the IvIFS $\tilde A$ and the positive and negative ideal solutions are as follows:
$$D(\tilde A,\tilde N) = \frac{1}{4n}\sum_{i=1}^{n}\bigl[\mu^L_{\tilde A}(x_i)+\mu^U_{\tilde A}(x_i)+\vartheta^L_{\tilde A}(x_i)+\vartheta^U_{\tilde A}(x_i)\bigr],$$
$$D(\tilde A,\tilde P_1) = \frac{1}{4n}\sum_{i=1}^{n}\bigl[2-\mu^L_{\tilde A}(x_i)-\mu^U_{\tilde A}(x_i)+\vartheta^L_{\tilde A}(x_i)+\vartheta^U_{\tilde A}(x_i)\bigr],$$
$$D(\tilde A,\tilde P_2) = \frac{1}{4n}\sum_{i=1}^{n}\bigl[\mu^L_{\tilde A}(x_i)+\mu^U_{\tilde A}(x_i)+2-\vartheta^L_{\tilde A}(x_i)-\vartheta^U_{\tilde A}(x_i)\bigr].$$
Thus, we easily obtain
$$K(\tilde A) = \frac{\theta D(\tilde A,\tilde N)}{(1-\theta)D(\tilde A,\tilde P_1)D(\tilde A,\tilde P_2)+\theta D(\tilde A,\tilde N)} = \frac{\theta\sum_{i=1}^{n}\bigl[\tilde\mu_{\tilde A}(x_i)+\tilde\vartheta_{\tilde A}(x_i)\bigr]}{\frac{1}{4n}(1-\theta)\sum_{i=1}^{n}\bigl[2-\tilde\mu_{\tilde A}(x_i)+\tilde\vartheta_{\tilde A}(x_i)\bigr]\sum_{i=1}^{n}\bigl[\tilde\mu_{\tilde A}(x_i)+2-\tilde\vartheta_{\tilde A}(x_i)\bigr]+\theta\sum_{i=1}^{n}\bigl[\tilde\mu_{\tilde A}(x_i)+\tilde\vartheta_{\tilde A}(x_i)\bigr]},\quad(11)$$
where here $\tilde\mu_{\tilde A}(x_i) = \mu^L_{\tilde A}(x_i)+\mu^U_{\tilde A}(x_i)$ and $\tilde\vartheta_{\tilde A}(x_i) = \vartheta^L_{\tilde A}(x_i)+\vartheta^U_{\tilde A}(x_i)$ are shorthand sums.
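The Hamming-distance instantiation above can be sketched in code; the list-of-tuples representation and the function names below are our own assumptions, and the formula follows our reading of Theorem 1, not verbatim code from the paper.

```python
# Knowledge measure of Theorem 1 with the Hamming distance. An IvIFS is
# assumed to be a list of (muL, muU, vL, vU) tuples.

def hamming_to_ideals(A):
    n = len(A)
    dN  = sum(muL + muU + vL + vU for muL, muU, vL, vU in A) / (4 * n)
    dP1 = sum(2 - muL - muU + vL + vU for muL, muU, vL, vU in A) / (4 * n)
    dP2 = sum(muL + muU + 2 - vL - vU for muL, muU, vL, vU in A) / (4 * n)
    return dN, dP1, dP2

def knowledge(A, theta=0.3):
    dN, dP1, dP2 = hamming_to_ideals(A)
    denom = (1 - theta) * dP1 * dP2 + theta * dN
    return theta * dN / denom if denom > 0 else 0.0

# Sanity checks against the axioms: a crisp set has K = 1, and the totally
# unknown element N = <[0,0],[0,0]> has K = 0.
print(knowledge([(1, 1, 0, 0)]), knowledge([(0, 0, 0, 0)]))
```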
When the knowledge measure is instead built from the Euclidean distance, the distances between $\tilde A$ and the positive and negative ideal solutions are as follows:
$$D(\tilde A,\tilde N) = \Bigl\{\frac{1}{4n}\sum_{i=1}^{n}\bigl[\mu^L_{\tilde A}(x_i)^2+\mu^U_{\tilde A}(x_i)^2+\vartheta^L_{\tilde A}(x_i)^2+\vartheta^U_{\tilde A}(x_i)^2\bigr]\Bigr\}^{1/2},$$
$$D(\tilde A,\tilde P_1) = \Bigl\{\frac{1}{4n}\sum_{i=1}^{n}\bigl[(\mu^L_{\tilde A}(x_i)-1)^2+(\mu^U_{\tilde A}(x_i)-1)^2+\vartheta^L_{\tilde A}(x_i)^2+\vartheta^U_{\tilde A}(x_i)^2\bigr]\Bigr\}^{1/2},$$
$$D(\tilde A,\tilde P_2) = \Bigl\{\frac{1}{4n}\sum_{i=1}^{n}\bigl[\mu^L_{\tilde A}(x_i)^2+\mu^U_{\tilde A}(x_i)^2+(\vartheta^L_{\tilde A}(x_i)-1)^2+(\vartheta^U_{\tilde A}(x_i)-1)^2\bigr]\Bigr\}^{1/2}.$$
We can then obtain
$$K(\tilde A) = \frac{\theta D(\tilde A,\tilde N)}{(1-\theta)D(\tilde A,\tilde P_1)D(\tilde A,\tilde P_2)+\theta D(\tilde A,\tilde N)} = \frac{\theta\bigl\{\sum_{i=1}^{n}M(x_i)\bigr\}^{1/2}}{\bigl(\frac{1}{4n}\bigr)^{1/2}(1-\theta)\bigl\{\sum_{i=1}^{n}\bigl[2+M(x_i)-2\tilde\mu_{\tilde A}(x_i)\bigr]\bigr\}^{1/2}\bigl\{\sum_{i=1}^{n}\bigl[2+M(x_i)-2\tilde\vartheta_{\tilde A}(x_i)\bigr]\bigr\}^{1/2}+\theta\bigl\{\sum_{i=1}^{n}M(x_i)\bigr\}^{1/2}},\quad(12)$$
where $M(x_i) = \mu^L_{\tilde A}(x_i)^2+\mu^U_{\tilde A}(x_i)^2+\vartheta^L_{\tilde A}(x_i)^2+\vartheta^U_{\tilde A}(x_i)^2$. □

3.2. Construct Entropy Measure with Distance for IvIFSs

Entropy is an important tool for measuring the uncertainty of fuzzy variables and processing fuzzy information. An IvIFS describes a set whose elements cannot be clearly classified as belonging or not belonging to a given set. In this section, we mainly take advantage of the distance measure and an abstract function to construct the entropy of IvIFSs.
Theorem 2.
Let $H : [0,1] \to [0,1]$ be a strictly monotonically decreasing real function. A mapping $E : IvIFS(X) \to [0,1]$ is called the entropy of an IvIFS and is defined as:
$$E(\tilde A) = \frac{H\bigl(3D(\tilde A,[\tfrac13,\tfrac13,\tfrac13]_x)\bigr)-H(1)}{H(0)-H(1)}.\quad(13)$$
Proof. 
For any $\tilde A \in IvIFS(X)$, we shall demonstrate that $E(\tilde A)$ satisfies the four conditions in Definition 5.
(E1) From the definition of the entropy $E(\tilde A)$ and the distance $D$, it is obvious that
$$E(\tilde A) = 0 \iff 3D(\tilde A,[\tfrac13,\tfrac13,\tfrac13]_x) = 1 \iff D(\tilde A,[\tfrac13,\tfrac13,\tfrac13]_x) = \tfrac13,$$
and then we obtain $\tilde A = \langle[1,1],[0,0],[0,0]\rangle$ or $\tilde A = \langle[0,0],[1,1],[0,0]\rangle$. Thus, $\tilde A$ is a crisp set.
(E2) According to the definition of distance, it is evident that
$$E(\tilde A) = 1 \iff 3D(\tilde A,[\tfrac13,\tfrac13,\tfrac13]_x) = 0 \iff \tilde A = [\tfrac13,\tfrac13,\tfrac13]_x,$$
and therefore $\tilde\mu_{\tilde A}(x) = \tilde\vartheta_{\tilde A}(x) = \tilde\pi_{\tilde A}(x) = [\tfrac13,\tfrac13]$.
(E3) Since the function $H(\cdot)$ is strictly monotonically decreasing, it is easy to see that if
$$D(\tilde A,[\tfrac13,\tfrac13,\tfrac13]_x) \ge D(\tilde B,[\tfrac13,\tfrac13,\tfrac13]_x),$$
then $H\bigl(3D(\tilde A,[\tfrac13,\tfrac13,\tfrac13]_x)\bigr) \le H\bigl(3D(\tilde B,[\tfrac13,\tfrac13,\tfrac13]_x)\bigr)$. Thus, we deduce $E(\tilde A) \le E(\tilde B)$.
(E4) According to the definition of the complement of the IvIFS $\tilde A$, we have
$$D(\tilde A,[\tfrac13,\tfrac13,\tfrac13]_x) = D(\tilde A^c,[\tfrac13,\tfrac13,\tfrac13]_x),$$
and it is clear that $E(\tilde A) = E(\tilde A^c)$. □
It is noteworthy that different entropy formulas can be generated from the theorem by choosing different strictly monotonically decreasing functions, such as (1) $H(x) = 1-x$; (2) $H(x) = \frac{1-x}{1+x}$; (3) $H(x) = 1-xe^{x-1}$; (4) $H(x) = 1-x^2$.
Let $H(x) = 1-x$ and let $D$ be any of the above-mentioned distance measures between IvIFSs; then, for any $\tilde A \in IvIFS(X)$, the entropy of an IvIFS is as follows:
$$E(\tilde A) = 1-3D(\tilde A,[\tfrac13,\tfrac13,\tfrac13]_x).\quad(14)$$
It is not difficult to find that Equation (14) is consistent with the entropy defined in reference [7]. Therefore, the general entropy expression in Equation (13) proposed in this paper can not only cover entropies constructed in other studies but also generate further entropies for IvIFSs. For example, let $H(x) = 1-x^2$ and let the distance in Equation (13) be the Hamming distance; then, the specific expression is as follows:
$$E(\tilde A) = 1-9\Bigl\{\frac{1}{4n}\sum_{i=1}^{n}\Bigl[\bigl|\mu^L_{\tilde A}(x_i)-\tfrac13\bigr|+\bigl|\mu^U_{\tilde A}(x_i)-\tfrac13\bigr|+\bigl|\vartheta^L_{\tilde A}(x_i)-\tfrac13\bigr|+\bigl|\vartheta^U_{\tilde A}(x_i)-\tfrac13\bigr|\Bigr]\Bigr\}^2.\quad(15)$$
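For concreteness, the Hamming-distance instantiation of Equation (15) can be evaluated as follows; the list-of-tuples representation is an assumption of this sketch, not the paper's notation.

```python
# Entropy of Equation (15): H(x) = 1 - x**2 composed with 3 times the
# Hamming distance to the most fuzzy point <[1/3, 1/3], [1/3, 1/3]>.

def entropy15(A):
    n = len(A)
    d = sum(abs(muL - 1/3) + abs(muU - 1/3) + abs(vL - 1/3) + abs(vU - 1/3)
            for muL, muU, vL, vU in A) / (4 * n)
    return 1 - 9 * d ** 2

# The most fuzzy point attains the maximum entropy 1, and the measure is
# invariant under complement (swapping membership and non-membership).
print(entropy15([(1/3, 1/3, 1/3, 1/3)]))  # 1.0
print(abs(entropy15([(0.2, 0.3, 0.4, 0.5)])
          - entropy15([(0.4, 0.5, 0.2, 0.3)])) < 1e-12)  # True
```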
Next, we compare our proposed entropy $E$ and knowledge measure $K$ with the existing entropy measures $E_Y$, $E_W$, $E_{ZM}$, $E_{IN}$, $E_C$, $E_J$, $E_{Z2}$ and $E_{WZ}$ and the knowledge measure $K_{GZ}$ based on a specific numerical example.
Example 1.
We consider six single-element IvIFSs given by
$$\tilde A_1 = \langle x,[0.7,0.7],[0.2,0.2]\rangle,\quad \tilde A_2 = \langle x,[0.5,0.5],[0.3,0.3]\rangle,$$
$$\tilde A_3 = \langle x,[0.5,0.5],[0.5,0.5]\rangle,\quad \tilde A_4 = \langle x,[0.5,0.5],[0.4,0.4]\rangle,$$
$$\tilde A_5 = \langle x,[0.6,0.6],[0.2,0.2]\rangle,\quad \tilde A_6 = \langle x,[0.4,0.4],[0.4,0.4]\rangle,$$
and compare the recalled measures from Equations (1)–(9) with the proposed knowledge measure from Equation (11) (with $\theta = 0.3$) and the proposed entropy from Equation (15). The calculated results of the specific measures are summarized in Table 1.
It is worth noting that $E_Y$, $E_W$, $E_{IN}$, $E_C$, $E_J$ and $E_{WZ}$ return the same value for the two different sets $\tilde A_3$ and $\tilde A_6$, which is counterintuitive. In addition, $E_{ZM}$ gives the same result for the sets $\tilde A_1$ and $\tilde A_4$, as well as for $\tilde A_2$, $\tilde A_5$ and $\tilde A_6$, so this entropy also fails in this case. At the same time, $E_{Z2}$ has the same results for $\tilde A_2$ and $\tilde A_6$. However, there is no counterintuitive case in the results of $E$, $K_{GZ}$ and $K$, which shows their superiority over the other entropy formulas. Moreover, our knowledge measure is consistent with the results of $K_{GZ}$, which illustrates the effectiveness of the entropy and knowledge measures proposed in this paper.
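The distinction claimed above can be checked numerically. The sketch below (again with our assumed tuple representation) evaluates the proposed $E$ of Equation (15) and $K$ of Equation (11) on $\tilde A_3$ and $\tilde A_6$, the pair that several recalled entropies cannot separate.

```python
# A3 = <[0.5,0.5],[0.5,0.5]> and A6 = <[0.4,0.4],[0.4,0.4]> as assumed
# (muL, muU, vL, vU) tuples; entropy15 follows Equation (15) and
# knowledge follows our reading of Equation (11).

def entropy15(A):
    n = len(A)
    d = sum(abs(muL - 1/3) + abs(muU - 1/3) + abs(vL - 1/3) + abs(vU - 1/3)
            for muL, muU, vL, vU in A) / (4 * n)
    return 1 - 9 * d ** 2

def knowledge(A, theta=0.3):
    n = len(A)
    dN  = sum(muL + muU + vL + vU for muL, muU, vL, vU in A) / (4 * n)
    dP1 = sum(2 - muL - muU + vL + vU for muL, muU, vL, vU in A) / (4 * n)
    dP2 = sum(muL + muU + 2 - vL - vU for muL, muU, vL, vU in A) / (4 * n)
    denom = (1 - theta) * dP1 * dP2 + theta * dN
    return theta * dN / denom if denom > 0 else 0.0

A3, A6 = [(0.5, 0.5, 0.5, 0.5)], [(0.4, 0.4, 0.4, 0.4)]
print(entropy15(A3), entropy15(A6))  # ~0.75 vs ~0.96: E separates them
print(knowledge(A3), knowledge(A6))  # distinct values: K separates them too
```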
Example 2.
We consider the real cloud service selection problem from reference [26]. Suppose there are four cloud service alternatives: SAP Sales on Demand $(x_1)$, Salesforce Sales Cloud $(x_2)$, Microsoft Dynamic CRM $(x_3)$ and Oracle Cloud CRM $(x_4)$. The experts' evaluation of the four schemes takes into account the following five attributes: performance $(a_1)$, payment $(a_2)$, reputation $(a_3)$, scalability $(a_4)$ and security $(a_5)$. The experts' comprehensive evaluation of the five attributes is listed in Table 2. The weighting vector of $a_j\ (j = 1,2,3,4,5)$ is $\omega = (0.3822, 0.1911, 0.05, 0.1411, 0.2355)^T$. The goal is to select the best cloud service.
Step 1 The attributes $a_j\ (j = 1,2,3,4,5)$ are all beneficial, so the attribute values in $R$ do not need to be normalized.
Step 2 The interval-valued intuitionistic fuzzy weighted averaging (IIFWA) operator from [2] is used to obtain the comprehensive evaluation value $z_i$ by aggregating the individual evaluation values $z_{ij}$ with the attribute weight vector $\omega = (0.3822, 0.1911, 0.05, 0.1411, 0.2355)^T$:
$$z_1 = \langle[0.439, 0.599],[0.183, 0.337]\rangle,\quad z_2 = \langle[0.413, 0.556],[0.226, 0.370]\rangle,$$
$$z_3 = \langle[0.524, 0.687],[0.158, 0.279]\rangle,\quad z_4 = \langle[0.261, 0.385],[0.426, 0.580]\rangle.$$
Step 3 Based on the overall collective evaluation values $z_i$ of the alternatives, we calculate the knowledge measure of the IvIFSs using Equation (11) with $\theta = 0.3$. Thus, we obtain:
$$K(z_1) = 0.3113,\quad K(z_2) = 0.3269,\quad K(z_3) = 0.3165,\quad K(z_4) = 0.3699.$$
These knowledge measures of the IvIFSs are ranked as follows: $K(z_4) > K(z_2) > K(z_3) > K(z_1)$. Therefore, according to the knowledge measure, we can derive the ranking order of the alternatives: $x_4 \succ x_2 \succ x_3 \succ x_1$. Thus, the best alternative is $x_4$, Oracle Cloud CRM.

4. Illustrative Example

From past research, it is not difficult to find that entropy can be used to derive attribute weights from the information content of the criteria when dealing with multi-attribute decision-making (MADM) problems; this yields the attribute entropy weight. In the context of IvIFSs, the knowledge measure has this property as well: the more knowledge an attribute carries, the greater the importance weight associated with it. We call this the knowledge weight of the attributes. Next, this paper provides a realistic example adapted from [17] to illustrate the application of the proposed entropy formula to MAGDM with IvIFSs. A city is planning to build a municipal library, and the Urban Development Commissioner faces the question of what kind of air-conditioning system should be installed in the library. The contractor provides four feasible alternatives $A_i\ (i = 1,2,3,4)$. At the same time, five attributes $C_j\ (j = 1,2,3,4,5)$ need to be considered comprehensively: $C_1$—performance; $C_2$—maintainability; $C_3$—flexibility; $C_4$—cost; and $C_5$—safety. A committee of four experts $d_k\ (k = 1,2,3,4)$ has evaluated the options. Suppose that the weight vector of the experts is $\lambda = (0.3, 0.2, 0.3, 0.2)^T$ and that the importance weights of the attributes are completely unknown. Let the attribute weight vector be $\omega = (\omega_1, \omega_2, \omega_3, \omega_4, \omega_5)^T$, where $\sum_{j=1}^{5}\omega_j = 1$. The individual opinions of $d_k$ on $A_i$ with respect to $C_j$ are provided as interval-valued intuitionistic fuzzy decision matrices, denoted by $\tilde R^{(k)} = (\tilde r^{(k)}_{ij})_{4\times 5}$, where $\tilde r^{(k)}_{ij} = (\tilde\mu^{(k)}_{ij}, \tilde\vartheta^{(k)}_{ij})$ with $\tilde\mu^{(k)}_{ij} = [\mu^{L(k)}_{ij}, \mu^{U(k)}_{ij}]$ and $\tilde\vartheta^{(k)}_{ij} = [\vartheta^{L(k)}_{ij}, \vartheta^{U(k)}_{ij}]$ expressed as IvIFVs, $i = 1,2,3,4$, $j = 1,2,3,4,5$, $k = 1,2,3,4$.
The decision matrices given by the four experts are shown in Table 3, Table 4, Table 5 and Table 6.
Here, we use the entropy given by Equation (15) to derive the importance weights of the attributes. To further evaluate the alternatives, all the above opinions are aggregated.
Step 1 All personal opinions are aggregated to form a comprehensive evaluation decision matrix $\tilde R = (\tilde r_{ij})_{4\times 5}$ using the IIFWA operator from [2], where
$$\tilde r_{ij} = IIFWA_\lambda\bigl(\tilde r^{(1)}_{ij}, \tilde r^{(2)}_{ij}, \tilde r^{(3)}_{ij}, \tilde r^{(4)}_{ij}\bigr) = \Bigl\langle\Bigl[1-\prod_{k=1}^{4}\bigl(1-\mu^{L(k)}_{ij}\bigr)^{\lambda_k},\ 1-\prod_{k=1}^{4}\bigl(1-\mu^{U(k)}_{ij}\bigr)^{\lambda_k}\Bigr],\ \Bigl[\prod_{k=1}^{4}\bigl(\vartheta^{L(k)}_{ij}\bigr)^{\lambda_k},\ \prod_{k=1}^{4}\bigl(\vartheta^{U(k)}_{ij}\bigr)^{\lambda_k}\Bigr]\Bigr\rangle,$$
with $\pi^L_{\tilde r_{ij}} = \prod_{k=1}^{4}\bigl(1-\mu^{U(k)}_{ij}\bigr)^{\lambda_k} - \prod_{k=1}^{4}\bigl(\vartheta^{U(k)}_{ij}\bigr)^{\lambda_k}$ and $\pi^U_{\tilde r_{ij}} = \prod_{k=1}^{4}\bigl(1-\mu^{L(k)}_{ij}\bigr)^{\lambda_k} - \prod_{k=1}^{4}\bigl(\vartheta^{L(k)}_{ij}\bigr)^{\lambda_k}$, and $\lambda_k$ represents the weight of $d_k$, $k = 1,2,3,4$.
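The IIFWA aggregation of Step 1 can be sketched as follows; the tuple representation of an IvIFV and the function name are our own assumptions, and the weights are taken to sum to one.

```python
from math import prod

def iifwa(opinions, weights):
    # Weighted-product aggregation of IvIFVs per the IIFWA operator [2]:
    # membership bounds aggregate as 1 - prod(1 - mu)^w, non-membership
    # bounds as prod(v)^w.
    muL = 1 - prod((1 - r[0]) ** w for r, w in zip(opinions, weights))
    muU = 1 - prod((1 - r[1]) ** w for r, w in zip(opinions, weights))
    vL = prod(r[2] ** w for r, w in zip(opinions, weights))
    vU = prod(r[3] ** w for r, w in zip(opinions, weights))
    return muL, muU, vL, vU

# Idempotency check: aggregating four identical opinions with weights that
# sum to one returns (approximately) the opinion itself.
lam = [0.3, 0.2, 0.3, 0.2]
print(iifwa([(0.4, 0.5, 0.2, 0.3)] * 4, lam))  # ~ (0.4, 0.5, 0.2, 0.3)
```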
Thus, according to the IIFWA operator, the group opinion is shown in Table 7.
Step 2 The entropy $E_j$ of each $C_j$ is calculated using Equation (15). In the interval-valued intuitionistic fuzzy environment, when the weights of the criteria are completely unknown, we determine the criterion weights using Equation (16),
$$\omega_j = \frac{1-E_j}{5-\sum_{j=1}^{5}E_j},\quad j = 1,2,3,4,5.\quad(16)$$
The weighting vector of the $C_j$ can then be determined and expressed as
$$\omega = (0.179431, 0.205183, 0.231484, 0.17919, 0.204711)^T.$$
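Equation (16) simply normalizes one minus each attribute entropy; a minimal sketch with hypothetical $E_j$ values (the paper does not list them) is:

```python
def entropy_weights(E):
    # omega_j = (1 - E_j) / (m - sum_j E_j) for m attributes (Equation (16)).
    return [(1 - e) / (len(E) - sum(E)) for e in E]

# Hypothetical entropies for five attributes; by construction the weights
# sum to 1 and the least fuzzy attribute receives the largest weight.
w = entropy_weights([0.55, 0.48, 0.42, 0.56, 0.49])
print(w, sum(w))
```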
Step 3 The interval-valued intuitionistic fuzzy information of each alternative over all attributes is aggregated into an IvIFV:
$$\tilde r_i = IIFWA_\omega\bigl(\tilde r_{i1}, \tilde r_{i2}, \tilde r_{i3}, \tilde r_{i4}, \tilde r_{i5}\bigr) = \Bigl\langle\Bigl[1-\prod_{j=1}^{5}\bigl(1-\mu^{L}_{ij}\bigr)^{\omega_j},\ 1-\prod_{j=1}^{5}\bigl(1-\mu^{U}_{ij}\bigr)^{\omega_j}\Bigr],\ \Bigl[\prod_{j=1}^{5}\bigl(\vartheta^{L}_{ij}\bigr)^{\omega_j},\ \prod_{j=1}^{5}\bigl(\vartheta^{U}_{ij}\bigr)^{\omega_j}\Bigr]\Bigr\rangle,$$
where $i = 1,2,3,4$, $j = 1,2,3,4,5$, and $\pi^L_{\tilde r_i} = \prod_{j=1}^{5}\bigl(1-\mu^{U}_{ij}\bigr)^{\omega_j} - \prod_{j=1}^{5}\bigl(\vartheta^{U}_{ij}\bigr)^{\omega_j}$, $\pi^U_{\tilde r_i} = \prod_{j=1}^{5}\bigl(1-\mu^{L}_{ij}\bigr)^{\omega_j} - \prod_{j=1}^{5}\bigl(\vartheta^{L}_{ij}\bigr)^{\omega_j}$.
Group assessments of the alternatives are listed as follows:
r ˜ 1 = ( [ 0.4347 , 0.5829 ] , [ 0.1944 , 0.3388 ] ) ,
r ˜ 2 = ( [ 0.3555 , 0.4926 ] , [ 0.2899 , 0.4313 ] ) ,
r ˜ 3 = ( [ 0.5249 , 0.7040 ] , [ 0.1500 , 0.2745 ] ) ,
r ˜ 4 = ( [ 0.1709 , 0.2905 ] , [ 0.4985 , 0.6725 ] ) .
Step 4 Referring to [6], a reasonable measure $Z(\tilde r_i)$ of $\tilde r_i$ is calculated by using the following formula:
$$Z(\tilde r_i) = \frac{1}{2}\Bigl(1-\frac{1}{4}\bigl(\pi^L_{\tilde r_i}+\pi^U_{\tilde r_i}\bigr)\Bigr)\Bigl(\mu^L_{\tilde r_i}+\mu^U_{\tilde r_i}+\frac{1}{2}\bigl(\pi^L_{\tilde r_i}+\pi^U_{\tilde r_i}\bigr)\Bigr),$$
where $i = 1,2,3,4$.
The ordering of the schemes depends on the value of $Z(\tilde r_i)$: the larger the value of $Z(\tilde r_i)$, the better the alternative $A_i$.
Based on calculations, we obtain
Z ( r ˜ 1 ) = 0.554340 , Z ( r ˜ 2 ) = 0.473324 ,
Z ( r ˜ 3 ) = 0.631118 , Z ( r ˜ 4 ) = 0.303164 .
Finally, we rank the alternatives with respect to the set of attributes in terms of the values of $Z(\tilde r_i)\ (i = 1,2,3,4)$, that is, $A_3 \succ A_1 \succ A_2 \succ A_4$. Thus, $A_3$ is the best alternative among the four candidates. For convenience, we denote $Z(\tilde r) = (Z(\tilde r_1), Z(\tilde r_2), Z(\tilde r_3), Z(\tilde r_4))$. The ranking in Table 8 is consistent with that in reference [22]; that is to say, the entropy is practical for decision-making problems. Through the above example, we can see that our proposed method for solving the MAGDM problem achieves reasonable results.
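The final ranking step reduces to sorting the alternatives by their scores; using the $Z(\tilde r_i)$ values reported above:

```python
# Rank alternatives by the reported scores Z(r_i); larger is better.
Z = {"A1": 0.554340, "A2": 0.473324, "A3": 0.631118, "A4": 0.303164}
ranking = sorted(Z, key=Z.get, reverse=True)
print(" > ".join(ranking))  # A3 > A1 > A2 > A4
```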

5. Conclusions

This paper proposes a new axiomatic definition of the knowledge measure of IvIFSs and constructs a knowledge measure and an entropy for IvIFSs based on the axiomatic definition of distance. The two general formulas proposed in this paper can generate a series of knowledge measures and entropies. In addition, examples are provided to illustrate the effectiveness of our proposed entropy model. Finally, we illustrate the feasibility of our method by solving a multi-attribute decision-making problem with unknown weights. In the future, we will pay more attention to the transformation between knowledge measures and entropy. In addition, similarity measures for IvIFSs will be constructed, and these information measures will be applied to multi-attribute decision-making problems, data analysis and risk assessment.

Author Contributions

Funding acquisition, C.S. and Y.L.; investigation, C.S., X.L. and Y.L.; methodology, C.S., X.L. and Y.L.; resources, C.S. and Y.L.; validation, C.S. and Y.L.; writing—original draft, C.S. and Y.L.; writing—review and editing, C.S., X.L. and Y.L. All authors have read and agreed to the published version of the manuscript.

Funding

This paper was supported by the National Science Foundation of China (Grant Nos.: 11671244, 12071271), the Natural Science Foundation of Jilin Province (Grant Nos.: YDZJ202201ZYTS320, YDZJ202201ZYTS303, YDZJ202201ZYTS648) and The Sixth Batch of Jilin Province Youth Science and Technology Talent Lifting Project.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  2. Xu, Z.S. Methods for aggregating interval-valued intuitionistic fuzzy information and their application to decision making. Control. Decis. 2007, 22, 215–219. [Google Scholar]
  3. Ye, J. Multicriteria fuzzy decision-making method using entropy weights-based correlation coefficients of interval-valued intuitionistic fuzzy sets. Appl. Math. Model. 2010, 34, 3864–3870. [Google Scholar] [CrossRef]
  4. Wei, C.P.; Wang, P.; Zhang, Y.Z. Entropy, similarity measure for interval-valued intuitionistic fuzzy sets and their application. Inf. Sci. 2011, 181, 4273–4286. [Google Scholar] [CrossRef]
  5. Wei, C.; Zhang, Y. Entropy measure for interval-valued intuitionistic fuzzy sets and their application in group decision-making. Math. Probl. Eng. 2015, 2015, 563745. [Google Scholar] [CrossRef] [Green Version]
  6. Guo, K. Amount of information and attitudinal based method for ranking Atanassov’s intuitionistic fuzzy values. IEEE Trans. Fuzzy Syst. 2014, 22, 177–188. [Google Scholar] [CrossRef]
  7. Tiwari, P.; Gupta, P. Entropy, distance and similarity measures under interval-valued Intuitionistic fuzzy environmen. Informatica 2018, 42, 617–627. [Google Scholar] [CrossRef] [Green Version]
  8. Liu, Y.; Jiang, W. A new distance measure of interval-valued intuitionistic fuzzy sets and its application in decision making. Soft Comput. 2020, 24, 6987–7003. [Google Scholar] [CrossRef]
  9. Li, X.; Suo, C.; Li, Y. Width-based distance measures on interval-valued intuitionistic fuzzy sets. J. Intell. Fuzzy Syst. 2021, 40, 1–13. [Google Scholar] [CrossRef]
  10. Ohlan, A. Novel entropy and distance measures for interval-valued intuitionistic fuzzy sets with application in multi-criteria group decision-making. Int. J. Gen. Syst. 2022, 51, 413–440. [Google Scholar] [CrossRef]
  11. Zhang, Y.J.; Li, P.H.; Wang, Y.Z.; Ma, P.J.; Su, X.H. Multiattribute decision making based on entropy under interval-valued intuitionistic fuzzy environment. Math. Probl. Eng. 2013, 2013, 526871. [Google Scholar] [CrossRef] [Green Version]
  12. Zhang, Y.J.; Ma, P.J.; Su, X.H.; Zhang, C.P. Entropy on interval-valued intuitionistic fuzzy sets and its application in multi-attribute decision making. In Proceedings of the 14th International Conference on Information Fusion, Chicago, IL, USA, 5–8 July 2011; pp. 1121–1140. [Google Scholar]
  13. Sun, M.; Liu, J. New entropy and similarity measures for interval-valued intuitionistic fuzzy sets. J. Inf. Comput. Sci. 2012, 9, 5799–5806. [Google Scholar]
  14. Chen, X.; Yang, L.; Wang, P.; Yue, W. A fuzzy multi-criteria group decision-making method with new entropy of interval-valued intuitionistic fuzzy sets. J. Appl. Math. 2013, 2013, 827268. [Google Scholar]
  15. Jing, L. Entropy and similarity measures for interval-valued intuitionistic fuzzy sets based on intuitionism and fuzziness. Adv. Model. Optim. 2013, 15, 635–643. [Google Scholar]
  16. Zhang, Q.; Xing, H.; Liu, F.; Ye, J.; Tang, P. Some new entropy measures for interval-valued intuitionistic fuzzy sets based on distances and their relationships with similarity and inclusion measures. Inf. Sci. 2014, 283, 55–69. [Google Scholar] [CrossRef]
  17. Park, J.H.; Park, I.Y.; Kwun, Y.C.; Tan, X. Extension of the TOPSIS method for decision making problems under interval-valued intuitionistic fuzzy environment. Appl. Math. Model. 2011, 35, C2544–C2556. [Google Scholar] [CrossRef]
  18. Park, J.H.; Lin, K.M.; Park, J.S.; Kwun, Y.C. Distance between interval-valued intuitionistic fuzzy sets. J. Phys. Conf. Ser. 2008, 96, 012089. [Google Scholar] [CrossRef]
  19. Nguyen, H. A new interval-valued knowledge measure for interval-valued intuitionistic fuzzy sets and application in decision making. Expert Syst. Appl. 2016, 56, 145–155. [Google Scholar] [CrossRef]
  20. Guo, K.H.; Xu, H. A unified framework for knowledge measure with application: From fuzzy sets through interval-valued intuitionistic fuzzy sets. Soft Comput. 2021, 23, 6967–6978. [Google Scholar] [CrossRef]
  21. Wu, X.; Song, Y.F.; Wang, Y.F. Distance-based knowledge measure for intuitionistic fuzzy sets with its application in decision making. Entropy 2021, 23, 1119. [Google Scholar] [CrossRef]
  22. Guo, K.H.; Zang, J. Knowledge measure for interval-valued intuitionistic fuzzy sets and its application to decision making under uncertainty. Soft Comput. 2018, 23, C6967–C6978. [Google Scholar] [CrossRef]
  23. Wei, A.P.; Li, D.F.; Lin, P.P.; Jiang, B.Q. An information-based score function of interval-valued intuitionistic fuzzy sets and its application in multiattribute decision making. Soft Comput. 2021, 25, C1913–C1923. [Google Scholar] [CrossRef]
  24. De, A.; Das, S.; Kar, S. Multiple attribute decision making based on probabilistic interval-valued intuitionistic hesitant fuzzy set and extended TOPSIS method. J. Intell. Fuzzy Syst. 2019, 37, 5229–5248. [Google Scholar] [CrossRef]
  25. Liu, S.; Yu, W.; Chan, F.T.S.; Niu, B. A variable weight-based hybrid approach for multi-attribute group decision making under interval-valued intuitionistic fuzzy sets. Int. J. Intell. Syst. 2020, 36, 1015–1052. [Google Scholar] [CrossRef]
  26. Dugenci, M. A new distance measure for interval valued intuitionistic fuzzy sets and its application to group decision making problems with incomplete weights information. Appl. Soft Comput. 2016, 41, 120–134. [Google Scholar] [CrossRef]
  27. Oscar, C.; Patricia, M. Towards interval type-3 intuitionistic fuzzy sets and systems. Mathematics 2022, 10, 4091. [Google Scholar]
  28. Xue, Z.N.; Sun, B.X.; Hou, H.D.; Pang, W.L.; Zhang, Y.N. Three-way decision models based on multi-granulation rough intuitionistic hesitant fuzzy sets. Cogn. Comput. 2022, 14, 1859–1880. [Google Scholar] [CrossRef]
  29. Jiri, M.; David, H. On unification of methods in theories of fuzzy sets, hesitant fuzzy set, fuzzy soft sets and intuitionistic fuzzy sets. Mathematics 2021, 9, 447. [Google Scholar]
Table 1. Comparison of the entropy measures and the knowledge measures for IvIFSs.

|      | E_Y  | E_W  | E_ZM | E_IN | E_C  | E_J  | E_Z2 | E_WZ | E_K  | K_GZ | K     |
|------|------|------|------|------|------|------|------|------|------|------|-------|
| Ã_1  | 0.74 | 0.38 | 0.11 | 0.62 | 0.46 | 0.55 | 0.5  | 0.82 | 0.44 | 0.31 | 0.347 |
| Ã_2  | 0.96 | 0.71 | 0.24 | 0.87 | 0.77 | 0.83 | 0.8  | 0.97 | 0.78 | 0.34 | 0.353 |
| Ã_3  | 1.00 | 1.00 | 0.00 | 1.00 | 1.00 | 1.00 | 1.0  | 1.00 | 0.63 | 0.50 | 0.514 |
| Ã_4  | 0.99 | 0.83 | 0.11 | 0.93 | 0.87 | 0.91 | 0.9  | 0.99 | 0.74 | 0.42 | 0.435 |
| Ã_5  | 0.83 | 0.50 | 0.24 | 0.71 | 0.58 | 0.67 | 0.6  | 0.89 | 0.55 | 0.28 | 0.313 |
| Ã_6  | 1.00 | 1.00 | 0.24 | 1.00 | 1.00 | 1.00 | 0.8  | 1.00 | 0.85 | 0.40 | 0.370 |
Table 2. Comprehensive evaluation values z_ij of alternatives based on attributes.

|      | a_1 | a_2 | a_3 | a_4 | a_5 |
|------|-----|-----|-----|-----|-----|
| x_1 | ([0.5, 0.7], [0.2, 0.3]) | ([0.3, 0.5], [0.3, 0.4]) | ([0.7, 0.8], [0.1, 0.2]) | ([0.5, 0.7], [0.1, 0.2]) | ([0.2, 0.4], [0.3, 0.5]) |
| x_2 | ([0.4, 0.5], [0.3, 0.4]) | ([0.2, 0.3], [0.2, 0.4]) | ([0.3, 0.4], [0.4, 0.5]) | ([0.2, 0.3], [0.5, 0.6]) | ([0.7, 0.8], [0.1, 0.2]) |
| x_3 | ([0.4, 0.6], [0.3, 0.4]) | ([0.7, 0.8], [0.1, 0.2]) | ([0.6, 0.8], [0.1, 0.1]) | ([0.5, 0.7], [0.2, 0.3]) | ([0.6, 0.7], [0.1, 0.3]) |
| x_4 | ([0.3, 0.5], [0.3, 0.5]) | ([0.1, 0.3], [0.6, 0.8]) | ([0.1, 0.2], [0.6, 0.7]) | ([0.3, 0.4], [0.4, 0.6]) | ([0.2, 0.4], [0.5, 0.6]) |
Table 3. The decision matrix given by the first decision maker R^(1).

|      | C_1 | C_2 | C_3 | C_4 | C_5 |
|------|-----|-----|-----|-----|-----|
| A_1 | ([0.5, 0.6], [0.2, 0.3]) | ([0.3, 0.5], [0.4, 0.5]) | ([0.6, 0.7], [0.2, 0.3]) | ([0.5, 0.7], [0.1, 0.2]) | ([0.1, 0.4], [0.3, 0.5]) |
| A_2 | ([0.3, 0.4], [0.4, 0.6]) | ([0.1, 0.3], [0.2, 0.4]) | ([0.3, 0.4], [0.4, 0.5]) | ([0.2, 0.4], [0.5, 0.6]) | ([0.7, 0.8], [0.1, 0.2]) |
| A_3 | ([0.4, 0.5], [0.3, 0.5]) | ([0.7, 0.8], [0.1, 0.2]) | ([0.5, 0.8], [0.1, 0.2]) | ([0.4, 0.6], [0.2, 0.3]) | ([0.5, 0.6], [0.2, 0.3]) |
| A_4 | ([0.3, 0.5], [0.4, 0.5]) | ([0.1, 0.2], [0.7, 0.8]) | ([0.1, 0.2], [0.5, 0.8]) | ([0.2, 0.3], [0.4, 0.6]) | ([0.2, 0.3], [0.5, 0.6]) |
Table 4. The decision matrix given by the second decision maker R^(2).

|      | C_1 | C_2 | C_3 | C_4 | C_5 |
|------|-----|-----|-----|-----|-----|
| A_1 | ([0.4, 0.5], [0.2, 0.4]) | ([0.3, 0.4], [0.4, 0.6]) | ([0.6, 0.7], [0.1, 0.2]) | ([0.5, 0.6], [0.1, 0.3]) | ([0.1, 0.3], [0.3, 0.5]) |
| A_2 | ([0.3, 0.4], [0.4, 0.5]) | ([0.1, 0.3], [0.3, 0.7]) | ([0.3, 0.4], [0.4, 0.5]) | ([0.2, 0.3], [0.6, 0.7]) | ([0.6, 0.8], [0.1, 0.2]) |
| A_3 | ([0.4, 0.6], [0.3, 0.4]) | ([0.6, 0.8], [0.1, 0.2]) | ([0.7, 0.8], [0.1, 0.2]) | ([0.4, 0.6], [0.3, 0.4]) | ([0.5, 0.6], [0.2, 0.4]) |
| A_4 | ([0.3, 0.4], [0.4, 0.6]) | ([0.1, 0.2], [0.6, 0.8]) | ([0.1, 0.2], [0.7, 0.8]) | ([0.3, 0.4], [0.4, 0.6]) | ([0.2, 0.4], [0.5, 0.6]) |
Table 5. The decision matrix given by the third decision maker R^(3).

|      | C_1 | C_2 | C_3 | C_4 | C_5 |
|------|-----|-----|-----|-----|-----|
| A_1 | ([0.4, 0.7], [0.1, 0.2]) | ([0.3, 0.5], [0.3, 0.4]) | ([0.6, 0.7], [0.1, 0.2]) | ([0.5, 0.6], [0.1, 0.3]) | ([0.3, 0.5], [0.4, 0.5]) |
| A_2 | ([0.4, 0.5], [0.2, 0.4]) | ([0.2, 0.4], [0.4, 0.5]) | ([0.4, 0.5], [0.3, 0.4]) | ([0.1, 0.2], [0.7, 0.8]) | ([0.6, 0.7], [0.2, 0.3]) |
| A_3 | ([0.2, 0.4], [0.3, 0.4]) | ([0.6, 0.8], [0.1, 0.2]) | ([0.5, 0.7], [0.1, 0.3]) | ([0.5, 0.7], [0.2, 0.3]) | ([0.6, 0.8], [0.1, 0.2]) |
| A_4 | ([0.3, 0.4], [0.2, 0.4]) | ([0.1, 0.2], [0.6, 0.8]) | ([0.1, 0.3], [0.5, 0.7]) | ([0.2, 0.3], [0.5, 0.7]) | ([0.1, 0.2], [0.6, 0.8]) |
Table 6. The decision matrix given by the fourth decision maker R^(4).

|      | C_1 | C_2 | C_3 | C_4 | C_5 |
|------|-----|-----|-----|-----|-----|
| A_1 | ([0.6, 0.7], [0.2, 0.3]) | ([0.3, 0.4], [0.3, 0.4]) | ([0.7, 0.8], [0.1, 0.2]) | ([0.5, 0.6], [0.1, 0.3]) | ([0.1, 0.2], [0.5, 0.7]) |
| A_2 | ([0.4, 0.5], [0.4, 0.5]) | ([0.1, 0.2], [0.2, 0.3]) | ([0.3, 0.4], [0.5, 0.6]) | ([0.2, 0.3], [0.4, 0.6]) | ([0.6, 0.7], [0.1, 0.2]) |
| A_3 | ([0.4, 0.5], [0.3, 0.4]) | ([0.6, 0.7], [0.1, 0.3]) | ([0.5, 0.8], [0.1, 0.2]) | ([0.4, 0.5], [0.2, 0.3]) | ([0.5, 0.6], [0.3, 0.4]) |
| A_4 | ([0.3, 0.4], [0.4, 0.5]) | ([0.1, 0.3], [0.6, 0.7]) | ([0.1, 0.2], [0.5, 0.8]) | ([0.2, 0.3], [0.4, 0.5]) | ([0.3, 0.4], [0.5, 0.6]) |
Table 7. Comprehensive decision matrix R.

|      | C_1 | C_2 | C_3 | C_4 | C_5 |
|------|-----|-----|-----|-----|-----|
| A_1 | ([0.5, 0.6], [0.2, 0.3]) | ([0.3, 0.5], [0.4, 0.5]) | ([0.6, 0.7], [0.4, 0.5]) | ([0.5, 0.6], [0.1, 0.3]) | ([0.2, 0.4], [0.4, 0.5]) |
| A_2 | ([0.4, 0.5], [0.3, 0.5]) | ([0.1, 0.3], [0.3, 0.5]) | ([0.3, 0.4], [0.4, 0.5]) | ([0.2, 0.3], [0.6, 0.7]) | ([0.6, 0.8], [0.1, 0.2]) |
| A_3 | ([0.4, 0.5], [0.3, 0.4]) | ([0.6, 0.8], [0.1, 0.2]) | ([0.6, 0.8], [0.1, 0.2]) | ([0.4, 0.6], [0.2, 0.3]) | ([0.5, 0.7], [0.2, 0.3]) |
| A_4 | ([0.3, 0.4], [0.3, 0.5]) | ([0.1, 0.2], [0.6, 0.8]) | ([0.1, 0.2], [0.5, 0.8]) | ([0.2, 0.3], [0.4, 0.6]) | ([0.2, 0.3], [0.5, 0.7]) |
Table 8. Comparison results of methods.

| Method   | Z(r̃)                         | Program Ranking             |
|----------|-------------------------------|-----------------------------|
| Guo [20] | (0.553, 0.474, 0.631, 0.303)  | A_3 ≻ A_1 ≻ A_2 ≻ A_4       |
| Ours     | (0.551, 0.474, 0.640, 0.293)  | A_3 ≻ A_1 ≻ A_2 ≻ A_4       |
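The final ranking in Table 8 is obtained by sorting the comprehensive evaluation values Z(r̃) in descending order. For our method's values, a minimal sketch:

```python
# Sort alternatives by their comprehensive evaluation values Z (Table 8, "Ours").
scores = {"A1": 0.551, "A2": 0.474, "A3": 0.640, "A4": 0.293}
ranking = sorted(scores, key=scores.get, reverse=True)
print(" > ".join(ranking))  # A3 > A1 > A2 > A4
```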

Share and Cite

MDPI and ACS Style

Suo, C.; Li, X.; Li, Y. Distance-Based Knowledge Measure and Entropy for Interval-Valued Intuitionistic Fuzzy Sets. Mathematics 2023, 11, 3468. https://doi.org/10.3390/math11163468
