Article

Two Types of Single Valued Neutrosophic Covering Rough Sets and an Application to Decision Making

1 College of Electrical and Information Engineering, Shaanxi University of Science and Technology, Xi’an 710021, China
2 School of Arts and Sciences, Shaanxi University of Science and Technology, Xi’an 710021, China
* Author to whom correspondence should be addressed.
Symmetry 2018, 10(12), 710; https://doi.org/10.3390/sym10120710
Submission received: 8 November 2018 / Revised: 25 November 2018 / Accepted: 27 November 2018 / Published: 3 December 2018
(This article belongs to the Special Issue Fuzzy Techniques for Decision Making 2018)

Abstract

In this paper, to combine single valued neutrosophic sets (SVNSs) with covering-based rough sets, we propose two types of single valued neutrosophic (SVN) covering rough set models, together with an application to a decision making problem. Firstly, the notion of SVN β-covering approximation space is proposed, and some concepts and properties in it are investigated. Secondly, based on SVN β-covering approximation spaces, two types of SVN covering rough set models are proposed. Then, some properties and the matrix representations of the newly defined SVN covering approximation operators are investigated. Finally, we propose a novel method for decision making (DM) problems based on one of the SVN covering rough set models, and compare the proposed DM method with other methods in an example.

1. Introduction

Rough set theory, as a tool to deal with various types of data in data mining, was proposed by Pawlak [1,2] in 1982. Since then, rough set theory has been extended to generalized rough sets based on other notions such as binary relations, neighborhood systems and coverings.
Covering-based rough sets [3,4,5] were proposed to deal with the type of covering data. In application, they have been applied to knowledge reduction [6,7], decision rule synthesis [8,9], and other fields [10,11,12]. In theory, covering-based rough set theory has been connected with matroid theory [13,14,15,16], lattice theory [17,18] and fuzzy set theory [19,20,21,22].
Zadeh’s fuzzy set theory [23] addresses the problem of how to understand and manipulate imperfect knowledge. It has been used in various applications [24,25,26,27]. Recent investigations have paid increasing attention to combining covering-based rough set theory with fuzzy set theory. Many fuzzy covering rough set models have been proposed by researchers, such as Ma [28] and Yang et al. [20].
Wang et al. [29] presented single valued neutrosophic sets (SVNSs) which can be regarded as an extension of IFSs [30]. Neutrosophic sets and rough sets both can deal with partial and uncertain information. Therefore, it is necessary to combine them. Recently, Mondal and Pramanik [31] presented the concept of rough neutrosophic set. Yang et al. [32] presented a SVN rough set model based on SVN relations. However, SVNSs and covering-based rough sets have not been combined up to now. In this paper, we present two types of SVN covering rough set models. This new combination is a bridge, linking SVNSs and covering-based rough sets.
As we know, the multiple criteria decision making (MCDM) is an important tool to deal with more complicated problems in our real world [33,34]. There are many MCDM methods presented based on different problems or theories. For example, Liu et al. [35] dealt with the challenges of many criteria in the MCDM problem and decision makers with heterogeneous risk preferences. Watróbski et al. [36] proposed a framework for selecting suitable MCDA methods for a particular decision situation. Faizi et al. [37,38] presented an extension of the MCDM method based on hesitant fuzzy theory. Recently, many researchers have studied decision making (DM) problems by rough set models [39,40,41,42]. For example, Zhan et al. [39] applied a type of soft rough model to DM problems. Yang et al. [32] presented a method for DM problems under a type of SVN rough set model. By investigation, we have observed that no one has applied SVN covering rough set models to DM problems. Therefore, we construct the covering SVN decision information systems according to the characterizations of DM problems. Then, we present a novel method to DM problems under one of the SVN covering rough set models. Moreover, the proposed decision making method is compared with other methods, which were presented by Yang et al. [32], Liu [43] and Ye [44].
The rest of this paper is organized as follows. Section 2 reviews some fundamental definitions about covering-based rough sets and SVNSs. In Section 3, some notions and properties of SVN β-covering approximation spaces are studied. In Section 4, we present two types of SVN covering rough set models, based on the SVN β-neighborhoods and the β-neighborhoods, respectively. In Section 5, some new matrices and matrix operations are presented; based on these, the matrix representations of the SVN approximation operators are shown. In Section 6, a novel method for decision making (DM) problems under one of the SVN covering rough set models is proposed, and the proposed DM method is compared with other methods. Section 7 concludes the paper and indicates further work.

2. Basic Definitions

Suppose U is a nonempty and finite set called universe.
Definition 1
(Covering [45,46]). Let U be a universe and $\mathbf{C}$ a family of subsets of U. If none of the subsets in $\mathbf{C}$ is empty and $\bigcup\mathbf{C} = U$, then $\mathbf{C}$ is called a covering of U.
The pair $(U, \mathbf{C})$ is called a covering approximation space.
Definition 2
(Single valued neutrosophic set [29]). Let U be a nonempty fixed set. A single valued neutrosophic set (SVNS) A in U is defined as an object of the following form:
$$A = \{\langle x, T_A(x), I_A(x), F_A(x)\rangle : x \in U\},$$
where $T_A : U \to [0,1]$ is a truth-membership function, $I_A : U \to [0,1]$ is an indeterminacy-membership function and $F_A : U \to [0,1]$ is a falsity-membership function for any $x \in U$. They satisfy $0 \le T_A(x) + I_A(x) + F_A(x) \le 3$ for all $x \in U$. The family of all single valued neutrosophic sets in U is denoted by $SVN(U)$. For convenience, a SVN number is represented by $\alpha = \langle a, b, c\rangle$, where $a, b, c \in [0,1]$ and $a + b + c \le 3$.
Specially, for two SVN numbers $\alpha = \langle a, b, c\rangle$ and $\beta = \langle d, e, f\rangle$, $\alpha \ge \beta$ iff $a \ge d$, $b \le e$ and $c \le f$. Some operations on $SVN(U)$ are listed as follows [29,32]: for any $A, B \in SVN(U)$,
(1) $A \subseteq B$ iff $T_A(x) \le T_B(x)$, $I_B(x) \le I_A(x)$ and $F_B(x) \le F_A(x)$ for all $x \in U$.
(2) $A = B$ iff $A \subseteq B$ and $B \subseteq A$.
(3) $A \cap B = \{\langle x, T_A(x) \wedge T_B(x), I_A(x) \vee I_B(x), F_A(x) \vee F_B(x)\rangle : x \in U\}$.
(4) $A \cup B = \{\langle x, T_A(x) \vee T_B(x), I_A(x) \wedge I_B(x), F_A(x) \wedge F_B(x)\rangle : x \in U\}$.
(5) $A' = \{\langle x, F_A(x), 1 - I_A(x), T_A(x)\rangle : x \in U\}$.
(6) $A \oplus B = \{\langle x, T_A(x) + T_B(x) - T_A(x) \cdot T_B(x), I_A(x) \cdot I_B(x), F_A(x) \cdot F_B(x)\rangle : x \in U\}$.
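For readers who wish to experiment with these operations, the minimal sketch below encodes an SVNS over a finite universe as a list of ⟨T, I, F⟩ triples; the helper names (`svn_subset`, `svn_oplus`, etc.) are ours, not the paper's.

```python
# Sketch of operations (1)-(6) on SVNSs over a finite universe.
# An SVNS is encoded as a list of (T, I, F) triples, one per element of U.

def svn_subset(A, B):  # (1): A ⊆ B iff T_A<=T_B, I_B<=I_A, F_B<=F_A
    return all(ta <= tb and ib <= ia and fb <= fa
               for (ta, ia, fa), (tb, ib, fb) in zip(A, B))

def svn_inter(A, B):   # (3): <T_A∧T_B, I_A∨I_B, F_A∨F_B>
    return [(min(ta, tb), max(ia, ib), max(fa, fb))
            for (ta, ia, fa), (tb, ib, fb) in zip(A, B)]

def svn_union(A, B):   # (4): <T_A∨T_B, I_A∧I_B, F_A∧F_B>
    return [(max(ta, tb), min(ia, ib), min(fa, fb))
            for (ta, ia, fa), (tb, ib, fb) in zip(A, B)]

def svn_compl(A):      # (5): <F_A, 1−I_A, T_A>
    return [(f, 1 - i, t) for (t, i, f) in A]

def svn_oplus(A, B):   # (6): <T_A+T_B−T_A·T_B, I_A·I_B, F_A·F_B>
    return [(ta + tb - ta * tb, ia * ib, fa * fb)
            for (ta, ia, fa), (tb, ib, fb) in zip(A, B)]

A = [(0.6, 0.3, 0.5), (0.4, 0.5, 0.1)]
B = [(0.6, 0.5, 0.5), (0.6, 0.5, 0.4)]
assert svn_inter(A, B) == [(0.6, 0.5, 0.5), (0.4, 0.5, 0.4)]
assert svn_subset(svn_inter(A, B), A)  # an intersection is contained in each operand
assert [tuple(round(v, 2) for v in x) for x in svn_oplus(A, B)] == \
    [(0.84, 0.15, 0.25), (0.76, 0.25, 0.04)]
```

The ⊕ operation of (6) is the one used in Step 3 of the decision making algorithm in Section 6.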

3. Single Valued Neutrosophic β-Covering Approximation Space

In this section, we present the notion of SVN β-covering approximation space. There are two basic concepts in this new approximation space: the SVN β-covering and the SVN β-neighborhood. Then, some of their properties are studied.
Definition 3.
Let U be a universe and $SVN(U)$ be the SVN power set of U. For a SVN number $\beta = \langle a, b, c\rangle$, we call $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$, with $C_i \in SVN(U)$ ($i = 1, 2, \ldots, m$), a SVN β-covering of U if for every $x \in U$ there exists $C_i \in \widehat{C}$ such that $C_i(x) \ge \beta$. We also call $(U, \widehat{C})$ a SVN β-covering approximation space.
Definition 4.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U. For any $x \in U$, the SVN β-neighborhood $\widetilde{N}_x^{\beta}$ of x induced by $\widehat{C}$ is defined as:
$$\widetilde{N}_x^{\beta} = \bigcap\{C_i \in \widehat{C} : C_i(x) \ge \beta\}.$$
Note that $C_i(x)$ is the SVN number $\langle T_{C_i}(x), I_{C_i}(x), F_{C_i}(x)\rangle$ in Definitions 3 and 4. Hence, $C_i(x) \ge \beta$ means $T_{C_i}(x) \ge a$, $I_{C_i}(x) \le b$ and $F_{C_i}(x) \le c$, where $\beta = \langle a, b, c\rangle$.
Remark 1.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U and $\beta = \langle a, b, c\rangle$. For any $x \in U$,
$$\widetilde{N}_x^{\beta} = \bigcap\{C_i \in \widehat{C} : T_{C_i}(x) \ge a,\ I_{C_i}(x) \le b,\ F_{C_i}(x) \le c\}.$$
Example 1.
Let $U = \{x_1, x_2, x_3, x_4, x_5\}$, $\widehat{C} = \{C_1, C_2, C_3, C_4\}$ and $\beta = \langle 0.5, 0.3, 0.8\rangle$. Table 1 shows that $\widehat{C}$ is a SVN β-covering of U.
Then,
$$\widetilde{N}_{x_1}^{\beta} = C_1 \cap C_2,\quad \widetilde{N}_{x_2}^{\beta} = C_1 \cap C_2 \cap C_4,\quad \widetilde{N}_{x_3}^{\beta} = C_3 \cap C_4,\quad \widetilde{N}_{x_4}^{\beta} = C_1 \cap C_4,\quad \widetilde{N}_{x_5}^{\beta} = C_2 \cap C_3 \cap C_4.$$
Hence, all SVN β -neighborhoods are shown in Table 2.
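Example 1 can be checked mechanically. The sketch below is ours (helper names such as `ge`, `selected` and `nbhd` are not from the paper); the membership values are read from Table 1 as they appear in the matrix $M_{\widehat{C}}$ of Example 5.

```python
# Recomputing Example 1. C[j][i] holds the SVN number C_{j+1}(x_{i+1});
# `ge` is the order on SVN numbers, `nbhd` builds the SVN β-neighborhood.

C = [
    [(0.7,0.2,0.5),(0.5,0.3,0.2),(0.4,0.5,0.2),(0.6,0.1,0.7),(0.3,0.2,0.6)],  # C1
    [(0.6,0.2,0.4),(0.5,0.2,0.8),(0.2,0.3,0.6),(0.4,0.5,0.7),(0.7,0.3,0.5)],  # C2
    [(0.4,0.1,0.5),(0.4,0.5,0.4),(0.5,0.2,0.4),(0.3,0.6,0.5),(0.6,0.3,0.5)],  # C3
    [(0.1,0.5,0.6),(0.6,0.1,0.7),(0.6,0.3,0.4),(0.5,0.3,0.2),(0.8,0.1,0.2)],  # C4
]
beta = (0.5, 0.3, 0.8)

def ge(u, v):  # u >= v for SVN numbers: T goes up, I and F go down
    return u[0] >= v[0] and u[1] <= v[1] and u[2] <= v[2]

def selected(x):  # 1-based indices of the C_i with C_i(x) >= beta
    return [j + 1 for j, Cj in enumerate(C) if ge(Cj[x], beta)]

def nbhd(x):  # Ñ_x^β = intersection of the selected C_i: <∧T, ∨I, ∨F>
    sel = [Cj for Cj in C if ge(Cj[x], beta)]
    return [(min(c[y][0] for c in sel), max(c[y][1] for c in sel),
             max(c[y][2] for c in sel)) for y in range(5)]

# Matches Example 1: Ñ_x1 = C1∩C2, Ñ_x2 = C1∩C2∩C4, Ñ_x3 = C3∩C4, ...
assert [selected(x) for x in range(5)] == [[1,2], [1,2,4], [3,4], [1,4], [2,3,4]]
assert nbhd(0)[0] == (0.6, 0.2, 0.5)  # Ñ_x1^β(x1), the first entry of Table 2
```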
In a SVN β -covering approximation space ( U , C ^ ) , we present the following properties of the SVN β -neighborhood.
Theorem 1.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U. Then, the following statements hold:
(1) $\widetilde{N}_x^{\beta}(x) \ge \beta$ for each $x \in U$.
(2) For all $x, y, z \in U$, if $\widetilde{N}_x^{\beta}(y) \ge \beta$ and $\widetilde{N}_y^{\beta}(z) \ge \beta$, then $\widetilde{N}_x^{\beta}(z) \ge \beta$.
(3) For two SVN numbers $\beta_1, \beta_2$, if $\beta_1 \le \beta_2 \le \beta$, then $\widetilde{N}_x^{\beta_1} \subseteq \widetilde{N}_x^{\beta_2}$ for all $x \in U$.
Proof. 
(1) For any $x \in U$, $\widetilde{N}_x^{\beta}(x) = \big(\bigcap_{C_i(x) \ge \beta} C_i\big)(x) = \bigwedge_{C_i(x) \ge \beta} C_i(x) \ge \beta$.
(2) Let $I = \{1, 2, \ldots, m\}$. Since $\widetilde{N}_x^{\beta}(y) \ge \beta$, for any $i \in I$, if $C_i(x) \ge \beta$, then $C_i(y) \ge \beta$. Since $\widetilde{N}_y^{\beta}(z) \ge \beta$, for any $i \in I$, $C_i(z) \ge \beta$ whenever $C_i(y) \ge \beta$. Then, for any $i \in I$, $C_i(x) \ge \beta$ implies $C_i(z) \ge \beta$. Therefore, $\widetilde{N}_x^{\beta}(z) \ge \beta$.
(3) For all $x \in U$, since $\beta_1 \le \beta_2 \le \beta$, $\{C_i \in \widehat{C} : C_i(x) \ge \beta_1\} \supseteq \{C_i \in \widehat{C} : C_i(x) \ge \beta_2\}$. Hence, $\widetilde{N}_x^{\beta_1} = \bigcap\{C_i \in \widehat{C} : C_i(x) \ge \beta_1\} \subseteq \bigcap\{C_i \in \widehat{C} : C_i(x) \ge \beta_2\} = \widetilde{N}_x^{\beta_2}$. □
Proposition 1.
Let $\widehat{C}$ be a SVN β-covering of U. For any $x, y \in U$, $\widetilde{N}_x^{\beta}(y) \ge \beta$ if and only if $\widetilde{N}_y^{\beta} \subseteq \widetilde{N}_x^{\beta}$.
Proof. 
Suppose the SVN number $\beta = \langle a, b, c\rangle$.
($\Rightarrow$): Since $\widetilde{N}_x^{\beta}(y) \ge \beta$,
$$T_{\widetilde{N}_x^{\beta}}(y) = \bigwedge_{C_i(x) \ge \beta} T_{C_i}(y) \ge a,\qquad I_{\widetilde{N}_x^{\beta}}(y) = \bigvee_{C_i(x) \ge \beta} I_{C_i}(y) \le b,\qquad F_{\widetilde{N}_x^{\beta}}(y) = \bigvee_{C_i(x) \ge \beta} F_{C_i}(y) \le c.$$
Then,
$$\{C_i \in \widehat{C} : C_i(x) \ge \beta\} \subseteq \{C_i \in \widehat{C} : C_i(y) \ge \beta\}.$$
Therefore, for each $z \in U$,
$$T_{\widetilde{N}_x^{\beta}}(z) = \bigwedge_{C_i(x) \ge \beta} T_{C_i}(z) \ge \bigwedge_{C_i(y) \ge \beta} T_{C_i}(z) = T_{\widetilde{N}_y^{\beta}}(z),$$
$$I_{\widetilde{N}_x^{\beta}}(z) = \bigvee_{C_i(x) \ge \beta} I_{C_i}(z) \le \bigvee_{C_i(y) \ge \beta} I_{C_i}(z) = I_{\widetilde{N}_y^{\beta}}(z),$$
$$F_{\widetilde{N}_x^{\beta}}(z) = \bigvee_{C_i(x) \ge \beta} F_{C_i}(z) \le \bigvee_{C_i(y) \ge \beta} F_{C_i}(z) = F_{\widetilde{N}_y^{\beta}}(z).$$
Hence, $\widetilde{N}_y^{\beta} \subseteq \widetilde{N}_x^{\beta}$.
($\Leftarrow$): For any $x, y \in U$, since $\widetilde{N}_y^{\beta} \subseteq \widetilde{N}_x^{\beta}$,
$$T_{\widetilde{N}_x^{\beta}}(y) \ge T_{\widetilde{N}_y^{\beta}}(y) \ge a,\quad I_{\widetilde{N}_x^{\beta}}(y) \le I_{\widetilde{N}_y^{\beta}}(y) \le b,\quad F_{\widetilde{N}_x^{\beta}}(y) \le F_{\widetilde{N}_y^{\beta}}(y) \le c.$$
Therefore, $\widetilde{N}_x^{\beta}(y) \ge \beta$. □
The notion of β-neighborhood in a SVN β-covering approximation space is presented in the following definition.
Definition 5.
Let $(U, \widehat{C})$ be a SVN β-covering approximation space and $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$. For each $x \in U$, we define the β-neighborhood $\overline{N}_x^{\beta}$ of x as:
$$\overline{N}_x^{\beta} = \{y \in U : \widetilde{N}_x^{\beta}(y) \ge \beta\}.$$
Note that $\widetilde{N}_x^{\beta}(y)$ is the SVN number $\langle T_{\widetilde{N}_x^{\beta}}(y), I_{\widetilde{N}_x^{\beta}}(y), F_{\widetilde{N}_x^{\beta}}(y)\rangle$ in Definition 5.
Remark 2.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U and $\beta = \langle a, b, c\rangle$. For each $x \in U$,
$$\overline{N}_x^{\beta} = \{y \in U : T_{\widetilde{N}_x^{\beta}}(y) \ge a,\ I_{\widetilde{N}_x^{\beta}}(y) \le b,\ F_{\widetilde{N}_x^{\beta}}(y) \le c\}.$$
Example 2
(Continued from Example 1). Let $\beta = \langle 0.5, 0.3, 0.8\rangle$; then we have
$$\overline{N}_{x_1}^{\beta} = \{x_1, x_2\},\quad \overline{N}_{x_2}^{\beta} = \{x_2\},\quad \overline{N}_{x_3}^{\beta} = \{x_3, x_5\},\quad \overline{N}_{x_4}^{\beta} = \{x_2, x_4\},\quad \overline{N}_{x_5}^{\beta} = \{x_5\}.$$
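The crisp β-neighborhoods above can be recomputed as follows; this is our own sketch (names `ge`, `nbhd`, `crisp_nbhd` are illustrative), with the Table 1 values as printed in Example 5's matrix $M_{\widehat{C}}$.

```python
# Recomputing Example 2: N̄_x^β = {y : Ñ_x^β(y) >= β}.

C = [
    [(0.7,0.2,0.5),(0.5,0.3,0.2),(0.4,0.5,0.2),(0.6,0.1,0.7),(0.3,0.2,0.6)],  # C1
    [(0.6,0.2,0.4),(0.5,0.2,0.8),(0.2,0.3,0.6),(0.4,0.5,0.7),(0.7,0.3,0.5)],  # C2
    [(0.4,0.1,0.5),(0.4,0.5,0.4),(0.5,0.2,0.4),(0.3,0.6,0.5),(0.6,0.3,0.5)],  # C3
    [(0.1,0.5,0.6),(0.6,0.1,0.7),(0.6,0.3,0.4),(0.5,0.3,0.2),(0.8,0.1,0.2)],  # C4
]
beta = (0.5, 0.3, 0.8)
ge = lambda u, v: u[0] >= v[0] and u[1] <= v[1] and u[2] <= v[2]

def nbhd(x):  # SVN β-neighborhood Ñ_x^β, as in Definition 4
    sel = [Cj for Cj in C if ge(Cj[x], beta)]
    return [(min(c[y][0] for c in sel), max(c[y][1] for c in sel),
             max(c[y][2] for c in sel)) for y in range(5)]

def crisp_nbhd(x):  # β-neighborhood of Definition 5, with 1-based labels
    return {y + 1 for y in range(5) if ge(nbhd(x)[y], beta)}

assert [crisp_nbhd(x) for x in range(5)] == [{1,2}, {2}, {3,5}, {2,4}, {5}]
```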
Some properties of the β -neighborhood in a SVN β -covering of U are presented in Theorem 2 and Proposition 2.
Theorem 2.
Let C ^ be a SVN β-covering of U and C ^ = { C 1 , C 2 , , C m } . Then, the following statements hold:
(1) $x \in \overline{N}_x^{\beta}$ for each $x \in U$.
(2) For all $x, y, z \in U$, if $x \in \overline{N}_y^{\beta}$ and $y \in \overline{N}_z^{\beta}$, then $x \in \overline{N}_z^{\beta}$.
Proof. 
(1)
According to Theorem 1 and Definition 5, it is straightforward.
(2) For any $x, y, z \in U$: $x \in \overline{N}_y^{\beta} \Leftrightarrow \widetilde{N}_y^{\beta}(x) \ge \beta \Leftrightarrow \widetilde{N}_x^{\beta} \subseteq \widetilde{N}_y^{\beta}$, and $y \in \overline{N}_z^{\beta} \Leftrightarrow \widetilde{N}_z^{\beta}(y) \ge \beta \Leftrightarrow \widetilde{N}_y^{\beta} \subseteq \widetilde{N}_z^{\beta}$. Hence, $\widetilde{N}_x^{\beta} \subseteq \widetilde{N}_z^{\beta}$. By Proposition 1, we have $\widetilde{N}_z^{\beta}(x) \ge \beta$, i.e., $x \in \overline{N}_z^{\beta}$. □
Proposition 2.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U. Then, for all $x, y \in U$, $x \in \overline{N}_y^{\beta}$ if and only if $\overline{N}_x^{\beta} \subseteq \overline{N}_y^{\beta}$.
Proof. 
($\Rightarrow$): For any $z \in \overline{N}_x^{\beta}$, we know $\widetilde{N}_x^{\beta}(z) \ge \beta$. Since $x \in \overline{N}_y^{\beta}$, $\widetilde{N}_y^{\beta}(x) \ge \beta$. According to (2) in Theorem 1, we have $\widetilde{N}_y^{\beta}(z) \ge \beta$. Hence, $z \in \overline{N}_y^{\beta}$. Therefore, $\overline{N}_x^{\beta} \subseteq \overline{N}_y^{\beta}$.
($\Leftarrow$): According to (1) in Theorem 2, $x \in \overline{N}_x^{\beta}$ for all $x \in U$. Since $\overline{N}_x^{\beta} \subseteq \overline{N}_y^{\beta}$, $x \in \overline{N}_y^{\beta}$. □
The relationship between SVN β -neighborhoods and β -neighborhoods is presented in the following proposition.
Proposition 3.
Let $\widehat{C}$ be a SVN β-covering of U. For any $x, y \in U$, $\widetilde{N}_x^{\beta} \subseteq \widetilde{N}_y^{\beta}$ if and only if $\overline{N}_x^{\beta} \subseteq \overline{N}_y^{\beta}$.
Proof. 
According to Propositions 1 and 2, it is straightforward. □

4. Two Types of Single Valued Neutrosophic Covering Rough Set Models

In this section, we propose two types of SVN covering rough set models on the basis of the SVN β-neighborhoods and the β-neighborhoods, respectively. Then, we investigate the properties of the newly defined lower and upper approximation operators.
Definition 6.
Let $(U, \widehat{C})$ be a SVN β-covering approximation space. For each $A \in SVN(U)$ with $A = \{\langle x, T_A(x), I_A(x), F_A(x)\rangle : x \in U\}$, we define the single valued neutrosophic (SVN) covering upper approximation $\widetilde{C}(A)$ and lower approximation $\underline{C}(A)$ of A as:
$$\widetilde{C}(A) = \left\{\left\langle x, \bigvee_{y \in U}[T_{\widetilde{N}_x^{\beta}}(y) \wedge T_A(y)],\ \bigvee_{y \in U}[I_{\widetilde{N}_x^{\beta}}(y) \wedge I_A(y)],\ \bigwedge_{y \in U}[F_{\widetilde{N}_x^{\beta}}(y) \vee F_A(y)]\right\rangle : x \in U\right\},$$
$$\underline{C}(A) = \left\{\left\langle x, \bigwedge_{y \in U}[F_{\widetilde{N}_x^{\beta}}(y) \vee T_A(y)],\ \bigwedge_{y \in U}[(1 - I_{\widetilde{N}_x^{\beta}}(y)) \vee I_A(y)],\ \bigvee_{y \in U}[T_{\widetilde{N}_x^{\beta}}(y) \wedge F_A(y)]\right\rangle : x \in U\right\}.$$
If $\widetilde{C}(A) \ne \underline{C}(A)$, then A is called the first type of SVN covering rough set.
Example 3
(Continued from Example 1). Let $\beta = \langle 0.5, 0.3, 0.8\rangle$ and
$$A = \frac{\langle 0.6, 0.3, 0.5\rangle}{x_1} + \frac{\langle 0.4, 0.5, 0.1\rangle}{x_2} + \frac{\langle 0.3, 0.2, 0.6\rangle}{x_3} + \frac{\langle 0.5, 0.3, 0.4\rangle}{x_4} + \frac{\langle 0.7, 0.2, 0.3\rangle}{x_5}.$$
Then,
$$\widetilde{C}(A) = \{\langle x_1, 0.6, 0.3, 0.5\rangle, \langle x_2, 0.4, 0.3, 0.6\rangle, \langle x_3, 0.6, 0.5, 0.5\rangle, \langle x_4, 0.5, 0.3, 0.6\rangle, \langle x_5, 0.6, 0.5, 0.5\rangle\},$$
$$\underline{C}(A) = \{\langle x_1, 0.6, 0.5, 0.5\rangle, \langle x_2, 0.6, 0.5, 0.4\rangle, \langle x_3, 0.4, 0.4, 0.5\rangle, \langle x_4, 0.4, 0.5, 0.4\rangle, \langle x_5, 0.6, 0.4, 0.3\rangle\}.$$
Some basic properties of the SVN covering upper and lower approximation operators are proposed in the following proposition.
Proposition 4.
Let C ^ be a SVN β-covering of U. Then, the SVN covering upper and lower approximation operators in Definition 6 satisfy the following properties: for all A , B S V N ( U ) ,
(1) $\widetilde{C}(A') = (\underline{C}(A))'$, $\underline{C}(A') = (\widetilde{C}(A))'$.
(2) If $A \subseteq B$, then $\underline{C}(A) \subseteq \underline{C}(B)$ and $\widetilde{C}(A) \subseteq \widetilde{C}(B)$.
(3) $\underline{C}(A \cap B) = \underline{C}(A) \cap \underline{C}(B)$, $\widetilde{C}(A \cup B) = \widetilde{C}(A) \cup \widetilde{C}(B)$.
(4) $\underline{C}(A \cup B) \supseteq \underline{C}(A) \cup \underline{C}(B)$, $\widetilde{C}(A \cap B) \subseteq \widetilde{C}(A) \cap \widetilde{C}(B)$.
Proof. 
(1)
$$\widetilde{C}(A') = \left\{\left\langle x, \bigvee_{y \in U}[T_{\widetilde{N}_x^{\beta}}(y) \wedge F_A(y)],\ \bigvee_{y \in U}[I_{\widetilde{N}_x^{\beta}}(y) \wedge (1 - I_A(y))],\ \bigwedge_{y \in U}[F_{\widetilde{N}_x^{\beta}}(y) \vee T_A(y)]\right\rangle : x \in U\right\} = (\underline{C}(A))'.$$
Replacing A by $A'$ in this argument also proves $\underline{C}(A') = (\widetilde{C}(A))'$.
(2) Since $A \subseteq B$, $T_A(x) \le T_B(x)$, $I_B(x) \le I_A(x)$ and $F_B(x) \le F_A(x)$ for all $x \in U$. Therefore,
$$T_{\underline{C}(A)}(x) = \bigwedge_{y \in U}[F_{\widetilde{N}_x^{\beta}}(y) \vee T_A(y)] \le \bigwedge_{y \in U}[F_{\widetilde{N}_x^{\beta}}(y) \vee T_B(y)] = T_{\underline{C}(B)}(x),$$
$$I_{\underline{C}(A)}(x) = \bigwedge_{y \in U}[(1 - I_{\widetilde{N}_x^{\beta}}(y)) \vee I_A(y)] \ge \bigwedge_{y \in U}[(1 - I_{\widetilde{N}_x^{\beta}}(y)) \vee I_B(y)] = I_{\underline{C}(B)}(x),$$
$$F_{\underline{C}(A)}(x) = \bigvee_{y \in U}[T_{\widetilde{N}_x^{\beta}}(y) \wedge F_A(y)] \ge \bigvee_{y \in U}[T_{\widetilde{N}_x^{\beta}}(y) \wedge F_B(y)] = F_{\underline{C}(B)}(x).$$
Hence, $\underline{C}(A) \subseteq \underline{C}(B)$. In the same way, $\widetilde{C}(A) \subseteq \widetilde{C}(B)$.
(3)
$$\underline{C}(A \cap B) = \left\{\left\langle x, \bigwedge_{y \in U}[F_{\widetilde{N}_x^{\beta}}(y) \vee (T_A(y) \wedge T_B(y))],\ \bigwedge_{y \in U}[(1 - I_{\widetilde{N}_x^{\beta}}(y)) \vee (I_A(y) \vee I_B(y))],\ \bigvee_{y \in U}[T_{\widetilde{N}_x^{\beta}}(y) \wedge (F_A(y) \vee F_B(y))]\right\rangle : x \in U\right\}$$
$$= \left\{\left\langle x, \bigwedge_{y \in U}[(F_{\widetilde{N}_x^{\beta}}(y) \vee T_A(y)) \wedge (F_{\widetilde{N}_x^{\beta}}(y) \vee T_B(y))],\ \bigwedge_{y \in U}[((1 - I_{\widetilde{N}_x^{\beta}}(y)) \vee I_A(y)) \vee ((1 - I_{\widetilde{N}_x^{\beta}}(y)) \vee I_B(y))],\ \bigvee_{y \in U}[(T_{\widetilde{N}_x^{\beta}}(y) \wedge F_A(y)) \vee (T_{\widetilde{N}_x^{\beta}}(y) \wedge F_B(y))]\right\rangle : x \in U\right\}$$
$$= \underline{C}(A) \cap \underline{C}(B).$$
Similarly, we can obtain $\widetilde{C}(A \cup B) = \widetilde{C}(A) \cup \widetilde{C}(B)$.
(4) Since $A \subseteq A \cup B$, $B \subseteq A \cup B$, $A \cap B \subseteq A$ and $A \cap B \subseteq B$, by (2),
$$\underline{C}(A) \subseteq \underline{C}(A \cup B),\quad \underline{C}(B) \subseteq \underline{C}(A \cup B),\quad \widetilde{C}(A \cap B) \subseteq \widetilde{C}(A)\ \text{and}\ \widetilde{C}(A \cap B) \subseteq \widetilde{C}(B).$$
Hence, $\underline{C}(A \cup B) \supseteq \underline{C}(A) \cup \underline{C}(B)$ and $\widetilde{C}(A \cap B) \subseteq \widetilde{C}(A) \cap \widetilde{C}(B)$. □
We propose the other SVN covering rough set model, which concerns the crisp lower and upper approximations of each crisp set in the SVN environment.
Definition 7.
Let $(U, \widehat{C})$ be a SVN β-covering approximation space. For each crisp subset $X \in \mathcal{P}(U)$ ($\mathcal{P}(U)$ is the power set of U), we define the SVN covering upper approximation $\overline{C}(X)$ and lower approximation $\underline{C}(X)$ of X as:
$$\overline{C}(X) = \{x \in U : \overline{N}_x^{\beta} \cap X \ne \emptyset\},\qquad \underline{C}(X) = \{x \in U : \overline{N}_x^{\beta} \subseteq X\}.$$
If $\overline{C}(X) \ne \underline{C}(X)$, then X is called the second type of SVN covering rough set.
Example 4
(Continued from Example 2). Let $\beta = \langle 0.5, 0.3, 0.8\rangle$, $X = \{x_1, x_2\}$ and $Y = \{x_2, x_4, x_5\}$. Then,
$$\overline{C}(X) = \{x_1, x_2, x_4\},\quad \underline{C}(X) = \{x_1, x_2\},\quad \overline{C}(Y) = \{x_1, x_2, x_3, x_4, x_5\},\quad \underline{C}(Y) = \{x_2, x_4, x_5\},$$
$$\overline{C}(U) = U,\quad \underline{C}(U) = U,\quad \overline{C}(\emptyset) = \emptyset,\quad \underline{C}(\emptyset) = \emptyset.$$
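The second model works on crisp sets, so it is particularly easy to check. Below is a minimal sketch (names ours), starting from the β-neighborhoods already computed in Example 2.

```python
# Recomputing Example 4; crisp β-neighborhoods from Example 2 (x_i encoded as i).
N = {1: {1, 2}, 2: {2}, 3: {3, 5}, 4: {2, 4}, 5: {5}}

def upper(X):  # C̄(X) = {x : N̄_x ∩ X ≠ ∅}
    return {x for x in N if N[x] & X}

def lower(X):  # C_(X) = {x : N̄_x ⊆ X}
    return {x for x in N if N[x] <= X}

X, Y = {1, 2}, {2, 4, 5}
assert upper(X) == {1, 2, 4} and lower(X) == {1, 2}
assert upper(Y) == {1, 2, 3, 4, 5} and lower(Y) == {2, 4, 5}
assert upper(set(N)) == set(N) and lower(set()) == set()  # boundary cases
```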
Proposition 5.
Let C ^ be a SVN β-covering of U. Then, the SVN covering upper and lower approximation operators in Definition 7 satisfy the following properties: for all X , Y P ( U ) ,
(1) $\underline{C}(\emptyset) = \emptyset$, $\overline{C}(U) = U$.
(2) $\underline{C}(U) = U$, $\overline{C}(\emptyset) = \emptyset$.
(3) $\underline{C}(X) = \sim\overline{C}(\sim X)$, $\overline{C}(X) = \sim\underline{C}(\sim X)$, where $\sim X$ denotes the complement $U - X$.
(4) If $X \subseteq Y$, then $\underline{C}(X) \subseteq \underline{C}(Y)$ and $\overline{C}(X) \subseteq \overline{C}(Y)$.
(5) $\underline{C}(X \cap Y) = \underline{C}(X) \cap \underline{C}(Y)$, $\overline{C}(X \cup Y) = \overline{C}(X) \cup \overline{C}(Y)$.
(6) $\underline{C}(X \cup Y) \supseteq \underline{C}(X) \cup \underline{C}(Y)$, $\overline{C}(X \cap Y) \subseteq \overline{C}(X) \cap \overline{C}(Y)$.
(7) $\underline{C}(\underline{C}(X)) \subseteq \underline{C}(X)$, $\overline{C}(\overline{C}(X)) \supseteq \overline{C}(X)$.
(8) $\underline{C}(X) \subseteq X \subseteq \overline{C}(X)$.
(9) If $X \subseteq Y$ or $Y \subseteq X$, then $\underline{C}(X \cup Y) = \underline{C}(X) \cup \underline{C}(Y)$ and $\overline{C}(X \cap Y) = \overline{C}(X) \cap \overline{C}(Y)$.
Proof. 
It can be directly followed from Definitions 5 and 7. □

5. Matrix Representations of These Single Valued Neutrosophic Covering Rough Set Models

In this section, matrix representations of the proposed SVN covering rough set models are investigated. Firstly, some new matrices and matrix operations are presented. Then, we show the matrix representations of these SVN approximation operators defined in Definitions 6 and 7. The order of elements in U is given.
Definition 8.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U with $U = \{x_1, x_2, \ldots, x_n\}$. Then, $M_{\widehat{C}} = (C_j(x_i))_{n \times m}$ is named the matrix representation of $\widehat{C}$, and $M_{\widehat{C}}^{\beta} = (s_{ij})_{n \times m}$ is called the β-matrix representation of $\widehat{C}$, where
$$s_{ij} = \begin{cases} 1, & C_j(x_i) \ge \beta; \\ 0, & \text{otherwise.} \end{cases}$$
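The β-matrix representation is a simple thresholding of the membership table. A short sketch (names ours, values from Table 1 as printed in Example 5):

```python
# Building the β-matrix representation M^β of Definition 8.
C = [
    [(0.7,0.2,0.5),(0.5,0.3,0.2),(0.4,0.5,0.2),(0.6,0.1,0.7),(0.3,0.2,0.6)],  # C1
    [(0.6,0.2,0.4),(0.5,0.2,0.8),(0.2,0.3,0.6),(0.4,0.5,0.7),(0.7,0.3,0.5)],  # C2
    [(0.4,0.1,0.5),(0.4,0.5,0.4),(0.5,0.2,0.4),(0.3,0.6,0.5),(0.6,0.3,0.5)],  # C3
    [(0.1,0.5,0.6),(0.6,0.1,0.7),(0.6,0.3,0.4),(0.5,0.3,0.2),(0.8,0.1,0.2)],  # C4
]
beta = (0.5, 0.3, 0.8)
ge = lambda u, v: u[0] >= v[0] and u[1] <= v[1] and u[2] <= v[2]

# s_ij = 1 iff C_j(x_i) >= beta (rows indexed by x_i, columns by C_j)
S = [[1 if ge(C[j][i], beta) else 0 for j in range(4)] for i in range(5)]
assert S == [[1,1,0,0], [1,1,0,1], [0,0,1,1], [1,0,0,1], [0,1,1,1]]
```

The asserted matrix is exactly the $M_{\widehat{C}}^{\beta}$ of Example 5.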
Example 5
(Continued from Example 1). Let β = 0.5 , 0.3 , 0.8 .
$$M_{\widehat{C}} = \begin{pmatrix} \langle 0.7,0.2,0.5\rangle & \langle 0.6,0.2,0.4\rangle & \langle 0.4,0.1,0.5\rangle & \langle 0.1,0.5,0.6\rangle \\ \langle 0.5,0.3,0.2\rangle & \langle 0.5,0.2,0.8\rangle & \langle 0.4,0.5,0.4\rangle & \langle 0.6,0.1,0.7\rangle \\ \langle 0.4,0.5,0.2\rangle & \langle 0.2,0.3,0.6\rangle & \langle 0.5,0.2,0.4\rangle & \langle 0.6,0.3,0.4\rangle \\ \langle 0.6,0.1,0.7\rangle & \langle 0.4,0.5,0.7\rangle & \langle 0.3,0.6,0.5\rangle & \langle 0.5,0.3,0.2\rangle \\ \langle 0.3,0.2,0.6\rangle & \langle 0.7,0.3,0.5\rangle & \langle 0.6,0.3,0.5\rangle & \langle 0.8,0.1,0.2\rangle \end{pmatrix},\qquad M_{\widehat{C}}^{\beta} = \begin{pmatrix} 1&1&0&0 \\ 1&1&0&1 \\ 0&0&1&1 \\ 1&0&0&1 \\ 0&1&1&1 \end{pmatrix}.$$
Definition 9.
Let $A = (a_{ik})_{n \times m}$ be a 0–1 matrix and $B = (\langle b^+_{kj}, b_{kj}, b^-_{kj}\rangle)_{1 \le k \le m,\ 1 \le j \le l}$ a matrix of SVN numbers. We define $D = A * B = (\langle d^+_{ij}, d_{ij}, d^-_{ij}\rangle)_{1 \le i \le n,\ 1 \le j \le l}$, where
$$\langle d^+_{ij}, d_{ij}, d^-_{ij}\rangle = \left\langle \bigwedge_{k=1}^m [(1 - a_{ik}) \vee b^+_{kj}],\ 1 - \bigwedge_{k=1}^m [(1 - a_{ik}) \vee (1 - b_{kj})],\ 1 - \bigwedge_{k=1}^m [(1 - a_{ik}) \vee (1 - b^-_{kj})] \right\rangle.$$
Based on Definitions 8 and 9, all N ˜ x β for any x U can be obtained by matrix operations.
Proposition 6.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U with $U = \{x_1, x_2, \ldots, x_n\}$. Then,
$$M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T} = (\widetilde{N}_{x_i}^{\beta}(x_j))_{1 \le i \le n,\ 1 \le j \le n},$$
where M C ^ T is the transpose of M C ^ .
Proof. 
Suppose $M_{\widehat{C}}^{T} = (C_k(x_j))_{m \times n}$, $M_{\widehat{C}}^{\beta} = (s_{ik})_{n \times m}$ and $M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T} = (\langle d^+_{ij}, d_{ij}, d^-_{ij}\rangle)_{1 \le i, j \le n}$. Since $\widehat{C}$ is a SVN β-covering of U, for each i ($1 \le i \le n$) there exists k ($1 \le k \le m$) such that $s_{ik} = 1$. Then,
$$\langle d^+_{ij}, d_{ij}, d^-_{ij}\rangle = \left\langle \bigwedge_{k=1}^m [(1 - s_{ik}) \vee T_{C_k}(x_j)],\ 1 - \bigwedge_{k=1}^m [(1 - s_{ik}) \vee (1 - I_{C_k}(x_j))],\ 1 - \bigwedge_{k=1}^m [(1 - s_{ik}) \vee (1 - F_{C_k}(x_j))] \right\rangle$$
$$= \left\langle \bigwedge_{s_{ik} = 1} T_{C_k}(x_j),\ 1 - \bigwedge_{s_{ik} = 1} (1 - I_{C_k}(x_j)),\ 1 - \bigwedge_{s_{ik} = 1} (1 - F_{C_k}(x_j)) \right\rangle = \left\langle \bigwedge_{C_k(x_i) \ge \beta} T_{C_k}(x_j),\ \bigvee_{C_k(x_i) \ge \beta} I_{C_k}(x_j),\ \bigvee_{C_k(x_i) \ge \beta} F_{C_k}(x_j) \right\rangle$$
$$= \Big(\bigcap_{C_k(x_i) \ge \beta} C_k\Big)(x_j) = \widetilde{N}_{x_i}^{\beta}(x_j),\qquad 1 \le i, j \le n.$$
Hence, M C ^ β * M C ^ T = ( N ˜ x i β ( x j ) ) 1 i n , 1 j n . □
Example 6
(Continued from Example 1).
$$M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T} = \begin{pmatrix} 1&1&0&0 \\ 1&1&0&1 \\ 0&0&1&1 \\ 1&0&0&1 \\ 0&1&1&1 \end{pmatrix} * \begin{pmatrix} \langle 0.7,0.2,0.5\rangle & \langle 0.5,0.3,0.2\rangle & \langle 0.4,0.5,0.2\rangle & \langle 0.6,0.1,0.7\rangle & \langle 0.3,0.2,0.6\rangle \\ \langle 0.6,0.2,0.4\rangle & \langle 0.5,0.2,0.8\rangle & \langle 0.2,0.3,0.6\rangle & \langle 0.4,0.5,0.7\rangle & \langle 0.7,0.3,0.5\rangle \\ \langle 0.4,0.1,0.5\rangle & \langle 0.4,0.5,0.4\rangle & \langle 0.5,0.2,0.4\rangle & \langle 0.3,0.6,0.5\rangle & \langle 0.6,0.3,0.5\rangle \\ \langle 0.1,0.5,0.6\rangle & \langle 0.6,0.1,0.7\rangle & \langle 0.6,0.3,0.4\rangle & \langle 0.5,0.3,0.2\rangle & \langle 0.8,0.1,0.2\rangle \end{pmatrix}$$
$$= \begin{pmatrix} \langle 0.6,0.2,0.5\rangle & \langle 0.5,0.3,0.8\rangle & \langle 0.2,0.5,0.6\rangle & \langle 0.4,0.5,0.7\rangle & \langle 0.3,0.3,0.6\rangle \\ \langle 0.1,0.5,0.6\rangle & \langle 0.5,0.3,0.8\rangle & \langle 0.2,0.5,0.6\rangle & \langle 0.4,0.5,0.7\rangle & \langle 0.3,0.3,0.6\rangle \\ \langle 0.1,0.5,0.6\rangle & \langle 0.4,0.5,0.7\rangle & \langle 0.5,0.3,0.4\rangle & \langle 0.3,0.6,0.5\rangle & \langle 0.6,0.3,0.5\rangle \\ \langle 0.1,0.5,0.6\rangle & \langle 0.5,0.3,0.7\rangle & \langle 0.4,0.5,0.4\rangle & \langle 0.5,0.3,0.7\rangle & \langle 0.3,0.2,0.6\rangle \\ \langle 0.1,0.5,0.6\rangle & \langle 0.4,0.5,0.8\rangle & \langle 0.2,0.3,0.6\rangle & \langle 0.3,0.6,0.7\rangle & \langle 0.6,0.3,0.5\rangle \end{pmatrix} = (\widetilde{N}_{x_i}^{\beta}(x_j))_{1 \le i, j \le 5}.$$
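Proposition 6 can be verified mechanically on Example 6. The sketch below (our names; data from Example 5) implements the $*$ product of Definition 9 and compares it entry by entry with the SVN β-neighborhoods of Definition 4.

```python
# Checking Proposition 6: M^β * M^T equals the matrix of Ñ_{x_i}^β(x_j).
C = [
    [(0.7,0.2,0.5),(0.5,0.3,0.2),(0.4,0.5,0.2),(0.6,0.1,0.7),(0.3,0.2,0.6)],  # C1
    [(0.6,0.2,0.4),(0.5,0.2,0.8),(0.2,0.3,0.6),(0.4,0.5,0.7),(0.7,0.3,0.5)],  # C2
    [(0.4,0.1,0.5),(0.4,0.5,0.4),(0.5,0.2,0.4),(0.3,0.6,0.5),(0.6,0.3,0.5)],  # C3
    [(0.1,0.5,0.6),(0.6,0.1,0.7),(0.6,0.3,0.4),(0.5,0.3,0.2),(0.8,0.1,0.2)],  # C4
]
beta = (0.5, 0.3, 0.8)
ge = lambda u, v: u[0] >= v[0] and u[1] <= v[1] and u[2] <= v[2]
S = [[1 if ge(C[j][i], beta) else 0 for j in range(4)] for i in range(5)]  # M^β

def star(Ab, B):  # Definition 9: Ab is 0-1 (n×m), B holds SVN triples (m×l)
    n, m, l = len(Ab), len(B), len(B[0])
    return [[(min(max(1 - Ab[i][k], B[k][j][0]) for k in range(m)),
              1 - min(max(1 - Ab[i][k], 1 - B[k][j][1]) for k in range(m)),
              1 - min(max(1 - Ab[i][k], 1 - B[k][j][2]) for k in range(m)))
             for j in range(l)] for i in range(n)]

def nbhd(x):  # Ñ_x^β per Definition 4
    sel = [Cj for Cj in C if ge(Cj[x], beta)]
    return [(min(c[y][0] for c in sel), max(c[y][1] for c in sel),
             max(c[y][2] for c in sel)) for y in range(5)]

rnd = lambda M: [[tuple(round(v, 2) for v in e) for e in row] for row in M]
D = star(S, C)  # C, read row-wise, is exactly the transpose M^T
assert rnd(D) == rnd([nbhd(x) for x in range(5)])
assert rnd(D)[0][0] == (0.6, 0.2, 0.5)  # entry (1,1) of the matrix above
```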
Definition 10.
Let $A = (\langle c^+_{ij}, c_{ij}, c^-_{ij}\rangle)_{m \times n}$ and $B = (\langle d^+_j, d_j, d^-_j\rangle)_{n \times 1}$ be two matrices of SVN numbers. We define $C = A \circledast B = (\langle e^+_i, e_i, e^-_i\rangle)_{m \times 1}$ and $D = A \circledcirc B = (\langle f^+_i, f_i, f^-_i\rangle)_{m \times 1}$, where
$$\langle e^+_i, e_i, e^-_i\rangle = \left\langle \bigvee_{j=1}^n (c^+_{ij} \wedge d^+_j),\ \bigvee_{j=1}^n (c_{ij} \wedge d_j),\ \bigwedge_{j=1}^n (c^-_{ij} \vee d^-_j) \right\rangle,$$
$$\langle f^+_i, f_i, f^-_i\rangle = \left\langle \bigwedge_{j=1}^n (c^-_{ij} \vee d^+_j),\ \bigwedge_{j=1}^n [(1 - c_{ij}) \vee d_j],\ \bigvee_{j=1}^n (c^+_{ij} \wedge d^-_j) \right\rangle.$$
According to Proposition 6 and Definition 10, the set representations of C ˜ ( A ) and C ( A ) (for any  A S V N ( U ) ) can be converted to matrix representations.
Theorem 3.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U with $U = \{x_1, x_2, \ldots, x_n\}$. Then, for any $A \in SVN(U)$,
$$\widetilde{C}(A) = (M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T}) \circledast A,\qquad \underline{C}(A) = (M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T}) \circledcirc A,$$
where $A = (a_i)_{n \times 1}$ with $a_i = \langle T_A(x_i), I_A(x_i), F_A(x_i)\rangle$ is the vector representation of the SVNS A, and $\widetilde{C}(A)$ and $\underline{C}(A)$ are also vector representations.
Proof. 
According to Proposition 6 and Definitions 6 and 10, for any $x_i$ ($i = 1, 2, \ldots, n$),
$$((M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T}) \circledast A)(x_i) = \left\langle \bigvee_{j=1}^n (T_{\widetilde{N}_{x_i}^{\beta}}(x_j) \wedge T_A(x_j)),\ \bigvee_{j=1}^n (I_{\widetilde{N}_{x_i}^{\beta}}(x_j) \wedge I_A(x_j)),\ \bigwedge_{j=1}^n (F_{\widetilde{N}_{x_i}^{\beta}}(x_j) \vee F_A(x_j)) \right\rangle = (\widetilde{C}(A))(x_i),$$
and
$$((M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T}) \circledcirc A)(x_i) = \left\langle \bigwedge_{j=1}^n (F_{\widetilde{N}_{x_i}^{\beta}}(x_j) \vee T_A(x_j)),\ \bigwedge_{j=1}^n [(1 - I_{\widetilde{N}_{x_i}^{\beta}}(x_j)) \vee I_A(x_j)],\ \bigvee_{j=1}^n (T_{\widetilde{N}_{x_i}^{\beta}}(x_j) \wedge F_A(x_j)) \right\rangle = (\underline{C}(A))(x_i).$$
Hence, $\widetilde{C}(A) = (M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T}) \circledast A$ and $\underline{C}(A) = (M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T}) \circledcirc A$. □
Example 7
(Continued from Example 3). Let $\beta = \langle 0.5, 0.3, 0.8\rangle$ and let A be as in Example 3. Using the matrix $M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T}$ computed in Example 6,
$$\widetilde{C}(A) = (M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T}) \circledast \begin{pmatrix} \langle 0.6,0.3,0.5\rangle \\ \langle 0.4,0.5,0.1\rangle \\ \langle 0.3,0.2,0.6\rangle \\ \langle 0.5,0.3,0.4\rangle \\ \langle 0.7,0.2,0.3\rangle \end{pmatrix} = \begin{pmatrix} \langle 0.6,0.3,0.5\rangle \\ \langle 0.4,0.3,0.6\rangle \\ \langle 0.6,0.5,0.5\rangle \\ \langle 0.5,0.3,0.6\rangle \\ \langle 0.6,0.5,0.5\rangle \end{pmatrix},$$
and
$$\underline{C}(A) = (M_{\widehat{C}}^{\beta} * M_{\widehat{C}}^{T}) \circledcirc \begin{pmatrix} \langle 0.6,0.3,0.5\rangle \\ \langle 0.4,0.5,0.1\rangle \\ \langle 0.3,0.2,0.6\rangle \\ \langle 0.5,0.3,0.4\rangle \\ \langle 0.7,0.2,0.3\rangle \end{pmatrix} = \begin{pmatrix} \langle 0.6,0.5,0.5\rangle \\ \langle 0.6,0.5,0.4\rangle \\ \langle 0.4,0.4,0.5\rangle \\ \langle 0.4,0.5,0.4\rangle \\ \langle 0.6,0.4,0.3\rangle \end{pmatrix}.$$
Two operations on Boolean matrices are defined in [28]. We can use them to study the matrix representations of $\underline{C}(X)$ and $\overline{C}(X)$ for every crisp subset $X \in \mathcal{P}(U)$.
Definition 11
([28]). Let $A = (a_{ik})_{n \times m}$ and $B = (b_{kj})_{m \times l}$ be two Boolean matrices. We define $C = A \cdot B = (c_{ij})_{n \times l}$ and $D = A \odot B = (d_{ij})_{n \times l}$ as follows:
$$c_{ij} = \bigvee_{k=1}^m (a_{ik} \wedge b_{kj}),\qquad d_{ij} = \bigwedge_{k=1}^m [(1 - a_{ik}) \vee b_{kj}],\qquad i = 1, 2, \ldots, n,\ j = 1, 2, \ldots, l.$$
Let $U = \{x_1, \ldots, x_n\}$ and $X \in \mathcal{P}(U)$. Then, the characteristic function $\chi_X$ of the crisp subset X is defined by
$$\chi_X(x_i) = \begin{cases} 1, & x_i \in X; \\ 0, & \text{otherwise.} \end{cases}$$
Proposition 7.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U with $U = \{x_1, x_2, \ldots, x_n\}$. Then,
$$M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T} = (\chi_{\overline{N}_{x_i}^{\beta}}(x_j))_{1 \le i \le n,\ 1 \le j \le n}.$$
Proof. 
Suppose $M_{\widehat{C}}^{\beta} = (s_{ik})_{n \times m}$ and $M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T} = (t_{ij})_{n \times n}$. Since $\widehat{C}$ is a SVN β-covering of U, for each i ($1 \le i \le n$) there exists k ($1 \le k \le m$) such that $s_{ik} = 1$. If $t_{ij} = 1$, then $\bigwedge_{k=1}^m [(1 - s_{ik}) \vee s_{jk}] = 1$.
This implies that $s_{ik} = 1$ entails $s_{jk} = 1$. Hence, $C_k(x_i) \ge \beta$ implies $C_k(x_j) \ge \beta$. Therefore, $x_j \in \overline{N}_{x_i}^{\beta}$, i.e., $\chi_{\overline{N}_{x_i}^{\beta}}(x_j) = 1 = t_{ij}$.
If $t_{ij} = 0$, then $\bigwedge_{k=1}^m [(1 - s_{ik}) \vee s_{jk}] = 0$, so there exists k with $s_{ik} = 1$ and $s_{jk} = 0$, i.e., $C_k(x_i) \ge \beta$ but $C_k(x_j) \not\ge \beta$. Thus, $x_j \notin \overline{N}_{x_i}^{\beta}$, i.e., $\chi_{\overline{N}_{x_i}^{\beta}}(x_j) = 0 = t_{ij}$. □
Example 8
(Continued from Example 2). According to M C ^ β in Example 5, we have the following result.
$$M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T} = \begin{pmatrix} 1&1&0&0&0 \\ 0&1&0&0&0 \\ 0&0&1&0&1 \\ 0&1&0&1&0 \\ 0&0&0&0&1 \end{pmatrix} = (\chi_{\overline{N}_{x_i}^{\beta}}(x_j))_{1 \le i, j \le 5}.$$
For any $X \in \mathcal{P}(U)$, we also denote by $\chi_X = (a_i)_{n \times 1}$ the column vector with $a_i = 1$ iff $x_i \in X$, and $a_i = 0$ otherwise.
Then, the set representations of C ¯ ( X ) and C ̲ ( X ) (for any X P ( U ) ) can be converted to matrix representations.
Theorem 4.
Let $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ be a SVN β-covering of U with $U = \{x_1, x_2, \ldots, x_n\}$. Then, for any $X \in \mathcal{P}(U)$,
$$\chi_{\overline{C}(X)} = (M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T}) \cdot \chi_X,\qquad \chi_{\underline{C}(X)} = (M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T}) \odot \chi_X.$$
Proof. 
Suppose $(M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T}) \cdot \chi_X = (a_i)_{n \times 1}$ and $(M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T}) \odot \chi_X = (b_i)_{n \times 1}$. For any $x_i \in U$ ($i = 1, 2, \ldots, n$),
$$x_i \in \overline{C}(X) \Leftrightarrow \chi_{\overline{C}(X)}(x_i) = 1 \Leftrightarrow a_i = 1 \Leftrightarrow \bigvee_{k=1}^n [\chi_{\overline{N}_{x_i}^{\beta}}(x_k) \wedge \chi_X(x_k)] = 1 \Leftrightarrow \exists k \in \{1, \ldots, n\},\ \chi_{\overline{N}_{x_i}^{\beta}}(x_k) = \chi_X(x_k) = 1 \Leftrightarrow \exists k,\ x_k \in \overline{N}_{x_i}^{\beta} \cap X \Leftrightarrow \overline{N}_{x_i}^{\beta} \cap X \ne \emptyset,$$
and
$$x_i \in \underline{C}(X) \Leftrightarrow \chi_{\underline{C}(X)}(x_i) = 1 \Leftrightarrow b_i = 1 \Leftrightarrow \bigwedge_{k=1}^n [(1 - \chi_{\overline{N}_{x_i}^{\beta}}(x_k)) \vee \chi_X(x_k)] = 1 \Leftrightarrow (\chi_{\overline{N}_{x_i}^{\beta}}(x_k) = 1 \Rightarrow \chi_X(x_k) = 1,\ k = 1, \ldots, n) \Leftrightarrow \overline{N}_{x_i}^{\beta} \subseteq X. \ \square$$
Example 9
(Continued from Example 4). Let $X = \{x_1, x_2\}$. By $M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T}$ in Example 8, we have
$$(M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T}) \cdot \chi_X = \begin{pmatrix} 1&1&0&0&0 \\ 0&1&0&0&0 \\ 0&0&1&0&1 \\ 0&1&0&1&0 \\ 0&0&0&0&1 \end{pmatrix} \cdot \begin{pmatrix} 1\\1\\0\\0\\0 \end{pmatrix} = \begin{pmatrix} 1\\1\\0\\1\\0 \end{pmatrix} = \chi_{\overline{C}(X)},\qquad (M_{\widehat{C}}^{\beta} \odot (M_{\widehat{C}}^{\beta})^{T}) \odot \chi_X = \begin{pmatrix} 1&1&0&0&0 \\ 0&1&0&0&0 \\ 0&0&1&0&1 \\ 0&1&0&1&0 \\ 0&0&0&0&1 \end{pmatrix} \odot \begin{pmatrix} 1\\1\\0\\0\\0 \end{pmatrix} = \begin{pmatrix} 1\\1\\0\\0\\0 \end{pmatrix} = \chi_{\underline{C}(X)}.$$
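The two Boolean operations of Definition 11 are one-liners; the sketch below (our names) reproduces Example 9 from the matrix of Example 8.

```python
# The Boolean products of Definition 11, applied as in Theorem 4 / Example 9.

def bool_dot(A, v):   # c_i = ∨_k (a_ik ∧ v_k)
    return [max(min(a, b) for a, b in zip(row, v)) for row in A]

def bool_odot(A, v):  # d_i = ∧_k [(1 − a_ik) ∨ v_k]
    return [min(max(1 - a, b) for a, b in zip(row, v)) for row in A]

T = [[1,1,0,0,0],     # M^β ⊙ (M^β)^T from Example 8
     [0,1,0,0,0],
     [0,0,1,0,1],
     [0,1,0,1,0],
     [0,0,0,0,1]]
chiX = [1, 1, 0, 0, 0]                     # X = {x1, x2}
assert bool_dot(T, chiX) == [1, 1, 0, 1, 0]   # χ of C̄(X) = {x1, x2, x4}
assert bool_odot(T, chiX) == [1, 1, 0, 0, 0]  # χ of C_(X) = {x1, x2}
```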

6. An Application to Decision Making Problems

In this section, we present a novel approach to DM problems based on the SVN covering rough set model. Then, a comparative study with other methods is shown.

6.1. The Problem of Decision Making

Let $U = \{x_k : k = 1, 2, \ldots, l\}$ be the set of patients and $V = \{y_i : i = 1, 2, \ldots, m\}$ the m main symptoms (for example, cough, fever, and so on) of a Disease B. Assume that Doctor R evaluates every Patient $x_k$ ($k = 1, 2, \ldots, l$).
Assume that Doctor R believes each Patient $x_k \in U$ ($k = 1, 2, \ldots, l$) has a symptom value $C_i$ ($i = 1, 2, \ldots, m$), denoted by $C_i(x_k) = \langle T_{C_i}(x_k), I_{C_i}(x_k), F_{C_i}(x_k)\rangle$, where $T_{C_i}(x_k) \in [0,1]$ is the degree to which Doctor R confirms that Patient $x_k$ has symptom $y_i$, $I_{C_i}(x_k) \in [0,1]$ is the degree to which Doctor R is not sure whether Patient $x_k$ has symptom $y_i$, $F_{C_i}(x_k) \in [0,1]$ is the degree to which Doctor R confirms that Patient $x_k$ does not have symptom $y_i$, and $T_{C_i}(x_k) + I_{C_i}(x_k) + F_{C_i}(x_k) \le 3$.
Let $\beta = \langle a, b, c\rangle$ be the critical value. If for every Patient $x_k \in U$ there is at least one symptom $y_i \in V$ such that the symptom value $C_i(x_k)$ is not less than β, then $\widehat{C} = \{C_1, C_2, \ldots, C_m\}$ is a SVN β-covering of U for the SVN number β.
Let d be the possible degree, e the indeterminacy degree and f the impossible degree of Disease B for every Patient $x_k \in U$ as diagnosed by Doctor R, denoted by $A(x_k) = \langle d, e, f\rangle$. The decision maker (Doctor R) then needs to evaluate which Patients $x_k \in U$ have Disease B.

6.2. The Decision Making Algorithm

In this subsection, we give an approach to the DM problem with the above characterizations by means of the first type of SVN covering rough set model. According to the characterization of the DM problem in Section 6.1, we construct the SVN decision information system and present Algorithm 1 for DM under the framework of the first type of SVN covering rough set model.
Algorithm 1 The decision making algorithm based on the SVN covering rough set model.
Input: SVN decision information system ( U , C ^ , β , A ).
Output: The score ordering for all alternatives.
 
  1: Compute the SVN β-neighborhood $\widetilde{N}_x^{\beta}$ of x induced by $\widehat{C}$, for all $x \in U$, according to Definition 4;
  2: Compute the SVN covering upper approximation $\widetilde{C}(A)$ and lower approximation $\underline{C}(A)$ of A according to Definition 6;
  3: Compute $\widetilde{R}_A = \widetilde{C}(A) \oplus \underline{C}(A)$ according to (6) in the basic operations on $SVN(U)$;
  4: Compute
$$s(x) = \frac{T_{\widetilde{R}_A}(x)}{\sqrt{(T_{\widetilde{R}_A}(x))^2 + (I_{\widetilde{R}_A}(x))^2 + (F_{\widetilde{R}_A}(x))^2}};$$
  5: Rank all the alternatives by the value of $s(x)$ and select the most possible patient.
According to the above process, we can make the decision according to the ranking. In Step 4, $s(x)$ is the cosine similarity measure between $\widetilde{R}_A(x)$ and the ideal solution $\langle 1, 0, 0\rangle$, which was proposed by Ye [44].
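Steps 3 through 5 of Algorithm 1 can be sketched directly in code. The approximation values below are the ones obtained in Example 3 (reused in Example 10); the helper names are ours.

```python
import math

# Upper/lower approximation vectors from Example 3 / Example 10, as (T, I, F).
upper = [(0.6,0.3,0.5), (0.4,0.3,0.6), (0.6,0.5,0.5), (0.5,0.3,0.6), (0.6,0.5,0.5)]
lower = [(0.6,0.5,0.5), (0.6,0.5,0.4), (0.4,0.4,0.5), (0.4,0.5,0.4), (0.6,0.4,0.3)]

# Step 3: R̃_A = C̃(A) ⊕ C(A) via operation (6)
R = [(t1 + t2 - t1 * t2, i1 * i2, f1 * f2)
     for (t1, i1, f1), (t2, i2, f2) in zip(upper, lower)]
assert [tuple(round(v, 2) for v in r) for r in R] == \
    [(0.84,0.15,0.25), (0.76,0.15,0.24), (0.76,0.2,0.25), (0.7,0.15,0.24), (0.84,0.2,0.15)]

# Step 4: cosine similarity to the ideal solution <1, 0, 0>
s = [t / math.sqrt(t * t + i * i + f * f) for (t, i, f) in R]

# Step 5: rank; Patient x5 gets the highest score
assert max(range(5), key=lambda k: s[k]) == 4
```

Running the sketch confirms the $\widetilde{R}_A$ values of Example 10 and that $x_5$ ranks first.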

6.3. An Applied Example

Example 10.
Assume that U = { x 1 , x 2 , x 3 , x 4 , x 5 } is a set of patients. According to the patients’ symptoms, we write V = { y 1 , y 2 , y 3 , y 4 } to be four main symptoms (cough, fever, sore and headache) for Disease B. Assume that Doctor R evaluates every Patient x k ( k = 1 , 2 , , 5 ) as shown in Table 1.
Let β = ⟨ 0.5 , 0.3 , 0.8 ⟩ be the critical value. Then, Ĉ = { C 1 , C 2 , C 3 , C 4 } is an SVN β-covering of U. The N ˜ x k β  ( k = 1 , 2 , 3 , 4 , 5 ) are shown in Table 2.
Assume that Doctor R's diagnosis of Disease B for every patient is given by A = ( 0.6 , 0.3 , 0.5 ) / x 1 + ( 0.4 , 0.5 , 0.1 ) / x 2 + ( 0.3 , 0.2 , 0.6 ) / x 3 + ( 0.5 , 0.3 , 0.4 ) / x 4 + ( 0.7 , 0.2 , 0.3 ) / x 5 . Then,
C ˜ ( A ) = { x 1 , 0.6 , 0.3 , 0.5 , x 2 , 0.4 , 0.3 , 0.6 , x 3 , 0.6 , 0.5 , 0.5 , x 4 , 0.5 , 0.3 , 0.6 , x 5 , 0.6 , 0.5 , 0.5 } , C ( A ) = { x 1 , 0.6 , 0.5 , 0.5 , x 2 , 0.6 , 0.5 , 0.4 , x 3 , 0.4 , 0.4 , 0.5 , x 4 , 0.4 , 0.5 , 0.4 , x 5 , 0.6 , 0.4 , 0.3 } .
Then,
R̃A = C̃(A) ⊕ C(A) = { ⟨ x 1 , 0.84 , 0.15 , 0.25 ⟩ , ⟨ x 2 , 0.76 , 0.15 , 0.24 ⟩ , ⟨ x 3 , 0.76 , 0.2 , 0.25 ⟩ , ⟨ x 4 , 0.7 , 0.15 , 0.24 ⟩ , ⟨ x 5 , 0.84 , 0.2 , 0.15 ⟩ } .
Hence, we can obtain s ( x k ) ( k = 1 , 2 , , 5 ) in Table 3.
Ranking these values by numerical order, we have:
s ( x 4 ) < s ( x 3 ) < s ( x 2 ) < s ( x 1 ) < s ( x 5 ) .
Therefore, Doctor R diagnoses Patient x 5 as more likely to be sick with Disease B.

6.4. A Comparison Analysis

To validate the feasibility of the proposed decision making method, a comparative study was conducted with three other methods, introduced by Liu [43], Yang et al. [32] and Ye [44], on the same SVN information system.

6.4.1. The Results of Liu’s Method

Liu’s method is shown in Algorithm 2.
Algorithm 2 The decision making algorithm [43].
Input: A SVN decision matrix D, a weight vector w and γ.
Output: The score ordering for all alternatives.
 
  1: Compute
$$
\begin{aligned}
n_k &= \langle T_{n_k}, I_{n_k}, F_{n_k} \rangle = \mathrm{HSVNNWA}(n_{k1}, n_{k2}, \ldots, n_{km}) \\
&= \Bigg\langle \frac{\prod_{i=1}^{m}\left(1+(\gamma-1)T_{ki}\right)^{w_i} - \prod_{i=1}^{m}\left(1-T_{ki}\right)^{w_i}}{\prod_{i=1}^{m}\left(1+(\gamma-1)T_{ki}\right)^{w_i} + (\gamma-1)\prod_{i=1}^{m}\left(1-T_{ki}\right)^{w_i}}, \\
&\qquad \frac{\gamma \prod_{i=1}^{m} I_{ki}^{\,w_i}}{\prod_{i=1}^{m}\left(1+(\gamma-1)(1-I_{ki})\right)^{w_i} + (\gamma-1)\prod_{i=1}^{m} I_{ki}^{\,w_i}}, \\
&\qquad \frac{\gamma \prod_{i=1}^{m} F_{ki}^{\,w_i}}{\prod_{i=1}^{m}\left(1+(\gamma-1)(1-F_{ki})\right)^{w_i} + (\gamma-1)\prod_{i=1}^{m} F_{ki}^{\,w_i}} \Bigg\rangle
\quad (k = 1, 2, \ldots, l);
\end{aligned}
$$

  2: Calculate $s(n_k) = \dfrac{T_{n_k}}{\sqrt{T_{n_k}^2 + I_{n_k}^2 + F_{n_k}^2}}$;
  3: Obtain the ranking of all s ( n k ) by numerical order and select the most likely patient.
Then, Algorithm 2 can be used for Example 10. Let n k i = ⟨ T k i , I k i , F k i ⟩ be the evaluation information of x k on C i in Table 1. That is to say, Table 1 is the SVN decision matrix D. We suppose the weight vector of the criteria is w = ( 0.35 , 0.25 , 0.3 , 0.1 ) and γ = 1 .
Step 1: Based on HSVNNWA operator, we get
n 1 = 0.557 , 0.178 , 0.482 , n 2 = 0.484 , 0.283 , 0.395 , n 3 = 0.414 , 0.318 , 0.347 , n 4 = 0.465 , 0.286 , 0.558 , n 5 = 0.578 , 0.233 , 0.486 .
Step 2: We get
s ( n 1 ) = 0.735 ,   s ( n 2 ) = 0.706 ,   s ( n 3 ) = 0.660 ,   s ( n 4 ) = 0.596 ,   s ( n 5 ) = 0.734 .
Step 3: According to the cosine similarity degrees s ( n k ) ( k = 1 , 2 , , 5 ), we obtain x 4 < x 3 < x 2 < x 5 < x 1 .
Therefore, Patient x 1 is more likely to be sick with Disease B.
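Since γ = 1 here, substituting γ = 1 into the displayed HSVNNWA formula collapses it to the algebraic SVN weighted average: T = 1 − ∏(1 − T_ki)^{w_i}, I = ∏ I_ki^{w_i}, F = ∏ F_ki^{w_i}. The Python sketch below (not the authors' code) reproduces Step 1 for patients x1 and x5, with their rows copied from Table 1:

```python
from math import prod, sqrt

# Rows of Table 1 for patients x1 and x5; each entry is <T, I, F> on C1..C4.
rows = {"x1": [(0.7, 0.2, 0.5), (0.6, 0.2, 0.4), (0.4, 0.1, 0.5), (0.1, 0.5, 0.6)],
        "x5": [(0.3, 0.2, 0.6), (0.7, 0.3, 0.5), (0.6, 0.3, 0.5), (0.8, 0.1, 0.2)]}
w = (0.35, 0.25, 0.3, 0.1)  # weight vector assumed in the text

def hsvnnwa_gamma1(row, w):
    # With gamma = 1, HSVNNWA collapses to the algebraic SVN weighted average
    # (substitute gamma = 1 into the displayed formula: all Hamacher terms cancel).
    t = 1 - prod((1 - tk) ** wi for (tk, _, _), wi in zip(row, w))
    i = prod(ik ** wi for (_, ik, _), wi in zip(row, w))
    f = prod(fk ** wi for (_, _, fk), wi in zip(row, w))
    return (t, i, f)

def score(n):
    # Step 2: cosine similarity to the ideal solution <1, 0, 0>
    t, i, f = n
    return t / sqrt(t * t + i * i + f * f)

n1 = tuple(round(v, 3) for v in hsvnnwa_gamma1(rows["x1"], w))
n5 = tuple(round(v, 3) for v in hsvnnwa_gamma1(rows["x5"], w))
```

Step 2 then ranks the patients by `score`, as in the text.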

6.4.2. The Results of Yang’s Method

Yang’s method is shown in Algorithm 3.
Algorithm 3 The decision making algorithm [32].
Input: A generalized SVN approximation space ( U , V , R ˜ ), B S V N ( V ) .
Output: The score ordering for all alternatives.
 
  1: Calculate the upper and lower approximations R ˜ ¯ ( B ) and R ˜ ̲ ( B ) ;
  2: Compute n x k = ( R ˜ ¯ ( B ) ⊕ R ˜ ̲ ( B ) ) ( x k ) ( k = 1 , 2 , … , l );
  3: Compute
$$ s(n_{x_k}, n^*) = \frac{T_{n_{x_k}} \cdot T_{n^*} + I_{n_{x_k}} \cdot I_{n^*} + F_{n_{x_k}} \cdot F_{n^*}}{\sqrt{T_{n_{x_k}}^2 + I_{n_{x_k}}^2 + F_{n_{x_k}}^2} \cdot \sqrt{(T_{n^*})^2 + (I_{n^*})^2 + (F_{n^*})^2}} \quad (k = 1, 2, \ldots, l), $$
where $n^* = \langle T_{n^*}, I_{n^*}, F_{n^*} \rangle = \langle 1, 0, 0 \rangle$;
  4: Obtain the ranking of all s ( n x k , n * ) by numerical order and select the most likely patient.
For Example 10, we suppose Disease B is described by B ∈ S V N ( V ) with B = ( 0.3 , 0.6 , 0.5 ) / y 1 + ( 0.7 , 0.2 , 0.1 ) / y 2 + ( 0.6 , 0.4 , 0.3 ) / y 3 + ( 0.8 , 0.4 , 0.5 ) / y 4 . According to Table 1, the generalized SVN approximation space ( U , V , R ˜ ) is given in Table 4, where U = { x 1 , x 2 , x 3 , x 4 , x 5 } and V = { y 1 , y 2 , y 3 , y 4 } .
Step 1: We get
R ˜ ¯ ( B ) = { x 1 , 0.6 , 0.2 , 0.4 , x 2 , 0.6 , 0.2 , 0.4 , x 3 , 0.6 , 0.3 , 0.4 , x 4 , 0.5 , 0.4 , 0.5 , x 5 , 0.8 , 0.3 , 0.5 } , R ˜ ̲ ( B ) = { x 1 , 0.5 , 0.6 , 0.5 , x 2 , 0.3 , 0.6 , 0.5 , x 3 , 0.3 , 0.5 , 0.5 , x 4 , 0.6 , 0.6 , 0.5 , x 5 , 0.6 , 0.6 , 0.5 } .
Step 2:
R ˜ ¯ ( B ) ⊕ R ˜ ̲ ( B ) = { ⟨ x 1 , 0.80 , 0.12 , 0.20 ⟩ , ⟨ x 2 , 0.72 , 0.12 , 0.20 ⟩ , ⟨ x 3 , 0.72 , 0.15 , 0.20 ⟩ , ⟨ x 4 , 0.80 , 0.24 , 0.25 ⟩ , ⟨ x 5 , 0.92 , 0.18 , 0.25 ⟩ } .
Step 3: Let n * = 1 , 0 , 0 . Then,
s ( n x 1 , n * ) = 0.960 ,   s ( n x 2 , n * ) = 0.951 ,   s ( n x 3 , n * ) = 0.945 ,   s ( n x 4 , n * ) = 0.918 ,   s ( n x 5 , n * ) = 0.948 .
Step 4:
s ( n x 4 , n * ) < s ( n x 3 , n * ) < s ( n x 5 , n * ) < s ( n x 2 , n * ) < s ( n x 1 , n * ) .
Therefore, Patient x 1 is more likely to be sick with Disease B.
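Steps 2 and 3 of Yang's method can be reproduced numerically. In the sketch below (not the authors' code), `combine` is our reading of the operation applied in Step 2, i.e., the SVN sum ⟨T1 + T2 − T1·T2, I1·I2, F1·F2⟩, which matches the displayed values:

```python
from math import sqrt

# Upper and lower approximations of B from Step 1 (values copied from the text).
upper = {"x1": (0.6, 0.2, 0.4), "x2": (0.6, 0.2, 0.4), "x3": (0.6, 0.3, 0.4),
         "x4": (0.5, 0.4, 0.5), "x5": (0.8, 0.3, 0.5)}
lower = {"x1": (0.5, 0.6, 0.5), "x2": (0.3, 0.6, 0.5), "x3": (0.3, 0.5, 0.5),
         "x4": (0.6, 0.6, 0.5), "x5": (0.6, 0.6, 0.5)}

def combine(a, b):
    # SVN sum <T1 + T2 - T1*T2, I1*I2, F1*F2> (our reading of Step 2's operation)
    return (a[0] + b[0] - a[0] * b[0], a[1] * b[1], a[2] * b[2])

def cos_sim(n):
    # Step 3: cosine similarity to the ideal solution n* = <1, 0, 0>
    t, i, f = n
    return t / sqrt(t * t + i * i + f * f)

n = {x: combine(upper[x], lower[x]) for x in upper}
s = {x: round(cos_sim(n[x]), 3) for x in n}
ranking = sorted(s, key=s.get)   # ascending, as in Step 4
```

The final line reproduces the ascending order given in Step 4.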

6.4.3. The Results of Ye’s Methods

Ye [44] presented two methods; thus, Algorithms 4 and 5 are applied to Example 10.
Algorithm 4 The decision making algorithm [44].
Input: A SVN decision matrix D and a weight vector w .
Output: The score ordering for all alternatives.
 
  1: Compute
$$ W_k(x_k, A^*) = \frac{\sum_{i=1}^{m} w_i \left[ a_{ki} \cdot a_i^* + b_{ki} \cdot b_i^* + c_{ki} \cdot c_i^* \right]}{\sqrt{\sum_{i=1}^{m} w_i \left[ a_{ki}^2 + b_{ki}^2 + c_{ki}^2 \right]} \cdot \sqrt{\sum_{i=1}^{m} w_i \left[ (a_i^*)^2 + (b_i^*)^2 + (c_i^*)^2 \right]}} \quad (k = 1, 2, \ldots, l), $$
where $\alpha_i^* = \langle a_i^*, b_i^*, c_i^* \rangle = \langle 1, 0, 0 \rangle$ $(i = 1, 2, \ldots, m)$;
  2: Obtain the ranking of all W k ( x k , A * ) by numerical order and select the most likely patient.
For Example 10, Table 1 is the SVN decision matrix D. We suppose the weight vector of the criteria is w = ( 0.35 , 0.25 , 0.3 , 0.1 ) .
Step 1:
W 1 ( x 1 , A * ) = 0.677 ,   W 2 ( x 2 , A * ) = 0.608 ,   W 3 ( x 3 , A * ) = 0.580 ,   W 4 ( x 4 , A * ) = 0.511 ,   W 5 ( x 5 , A * ) = 0.666 .
Step 2: The ranking order of { x 1 , x 2 , , x 5 } is x 4 < x 3 < x 2 < x 5 < x 1 . Therefore, Patient x 1 is more likely to be sick with Disease B.
Algorithm 5 The other decision making algorithm [44].
Input: A SVN decision matrix D and a weight vector w .
Output: The score ordering for all alternatives.
 
  1: Compute
$$ M_k(x_k, A^*) = \sum_{i=1}^{m} w_i \, \frac{a_{ki} \cdot a_i^* + b_{ki} \cdot b_i^* + c_{ki} \cdot c_i^*}{\sqrt{a_{ki}^2 + b_{ki}^2 + c_{ki}^2} \cdot \sqrt{(a_i^*)^2 + (b_i^*)^2 + (c_i^*)^2}} \quad (k = 1, 2, \ldots, l), $$
where $\alpha_i^* = \langle a_i^*, b_i^*, c_i^* \rangle = \langle 1, 0, 0 \rangle$ $(i = 1, 2, \ldots, m)$;
  2: Obtain the ranking of all M k ( x k , A * ) by numerical order and select the most likely patient.
By Algorithm 5, we have:
Step 1:
M 1 ( x 1 , A * ) = 0.676 ,   M 2 ( x 2 , A * ) = 0.637 ,   M 3 ( x 3 , A * ) = 0.581 , M 4 ( x 4 , A * ) = 0.521 ,   M 5 ( x 5 , A * ) = 0.654 .
Step 2: The ranking order of { x 1 , x 2 , , x 5 } is x 4 < x 3 < x 2 < x 5 < x 1 . Therefore, Patient x 1 is more likely to be sick with Disease B.
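Both of Ye's measures can be computed directly from Table 1. Below is a sketch (not the authors' code), with Table 1 hard-coded and the weight vector w = (0.35, 0.25, 0.3, 0.1) assumed as in the text:

```python
from math import sqrt

# Table 1 as the SVN decision matrix D (rows: patients; columns: C1..C4).
D = {"x1": [(0.7, 0.2, 0.5), (0.6, 0.2, 0.4), (0.4, 0.1, 0.5), (0.1, 0.5, 0.6)],
     "x2": [(0.5, 0.3, 0.2), (0.5, 0.2, 0.8), (0.4, 0.5, 0.4), (0.6, 0.1, 0.7)],
     "x3": [(0.4, 0.5, 0.2), (0.2, 0.3, 0.6), (0.5, 0.2, 0.4), (0.6, 0.3, 0.4)],
     "x4": [(0.6, 0.1, 0.7), (0.4, 0.5, 0.7), (0.3, 0.6, 0.5), (0.5, 0.3, 0.2)],
     "x5": [(0.3, 0.2, 0.6), (0.7, 0.3, 0.5), (0.6, 0.3, 0.5), (0.8, 0.1, 0.2)]}
w = (0.35, 0.25, 0.3, 0.1)  # weight vector assumed in the text

def W(row):
    # Algorithm 4: one weighted cosine between the whole row and A* = (<1,0,0>, ...).
    num = sum(wi * a for (a, b, c), wi in zip(row, w))          # b* = c* = 0
    den = sqrt(sum(wi * (a*a + b*b + c*c) for (a, b, c), wi in zip(row, w)))
    return num / (den * sqrt(sum(w)))                           # sum(w) = 1 here

def M(row):
    # Algorithm 5: weighted sum of per-criterion cosines to <1, 0, 0>.
    return sum(wi * a / sqrt(a*a + b*b + c*c) for (a, b, c), wi in zip(row, w))

Wv = {x: round(W(r), 3) for x, r in D.items()}
Mv = {x: round(M(r), 3) for x, r in D.items()}
```

Rounded to three decimals, `Wv` and `Mv` reproduce the values listed in Step 1 of each algorithm, and sorting either dictionary gives the stated ranking.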
All results are shown in Table 5, Figure 1 and Figure 2.
Liu [43] and Ye [44] presented methods based on SVN theory; in these methods, the ranking order can change with different values of w and γ. Both Yang et al. [32] and we use rough set models to make the decision: Yang et al. present an SVN rough set model based on SVN relations, while we present a new SVN rough set model based on coverings. The results of Yang's method and ours differ, although both methods rank the alternatives with an operator presented by Ye [44].
For any of these methods, if more than one patient ranks as most possible, each of them would be an optimal decision, and other methods are needed to make a further choice. Different methods may also yield different results; to achieve the most accurate diagnosis, further examination combining other hybrid methods is necessary.

7. Conclusions

This paper builds a bridge linking SVNSs and covering-based rough sets. By introducing some definitions and properties of SVN β-covering approximation spaces, we present two types of SVN covering rough set models, investigate their characterizations and matrix representations, and propose an application to the problem of DM. The main conclusions of this paper and directions for further work are listed as follows.
  • Two types of SVN covering rough set models are presented for the first time, combining SVNSs with covering-based rough sets. Some definitions and properties of the covering-based rough set model, such as coverings and neighborhoods, are generalized to SVN covering rough set models. Neutrosophic sets and related algebraic structures [47,48,49] will be connected with the research content of this paper in further research.
  • It would be tedious and complicated to compute SVN covering approximation operators through their set representations. The matrix representations of these operators make it possible to compute them through the new matrices and matrix operations. With these matrix representations, the calculations become algorithmic and can easily be implemented by computers.
  • We propose a method for DM problems under one of the SVN covering rough set models; it is the first DM method based on approximation operators of SVN covering rough sets. The comparison analysis illustrates the differences between the proposed method and other methods.

Author Contributions

All authors contributed equally to this paper. The idea of the paper was put forward by X.Z., who also completed the preparatory work. J.W. analyzed the existing work on rough sets and SVNSs, and wrote the paper.

Funding

This work was supported by the National Natural Science Foundation of China under Grant Nos. 61573240 and 61473239.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Pawlak, Z. Rough sets. Int. J. Comput. Inf. Sci. 1982, 11, 341–356.
2. Pawlak, Z. Rough Sets: Theoretical Aspects of Reasoning about Data; Kluwer Academic Publishers: Boston, MA, USA, 1991.
3. Bartol, W.; Miro, J.; Pioro, K.; Rossello, F. On the coverings by tolerance classes. Inf. Sci. 2004, 166, 193–211.
4. Bianucci, D.; Cattaneo, G.; Ciucci, D. Entropies and co-entropies of coverings with application to incomplete information systems. Fundam. Inform. 2007, 75, 77–105.
5. Zhu, W. Relationship among basic concepts in covering-based rough sets. Inf. Sci. 2009, 179, 2478–2486.
6. Yao, Y.; Zhao, Y. Attribute reduction in decision-theoretic rough set models. Inf. Sci. 2008, 178, 3356–3373.
7. Wang, J.; Zhang, X. Matrix approaches for some issues about minimal and maximal descriptions in covering-based rough sets. Int. J. Approx. Reason. 2019, 104, 126–143.
8. Li, F.; Yin, Y. Approaches to knowledge reduction of covering decision systems based on information theory. Inf. Sci. 2009, 179, 1694–1704.
9. Wu, W. Attribute reduction based on evidence theory in incomplete decision systems. Inf. Sci. 2008, 178, 1355–1371.
10. Wang, J.; Zhu, W. Applications of bipartite graphs and their adjacency matrices to covering-based rough sets. Fundam. Inform. 2017, 156, 237–254.
11. Dai, J.; Wang, W.; Xu, Q.; Tian, H. Uncertainty measurement for interval-valued decision systems based on extended conditional entropy. Knowl.-Based Syst. 2012, 27, 443–450.
12. Wang, C.; Chen, D.; Wu, C.; Hu, Q. Data compression with homomorphism in covering information systems. Int. J. Approx. Reason. 2011, 52, 519–525.
13. Li, X.; Yi, H.; Liu, S. Rough sets and matroids from a lattice-theoretic viewpoint. Inf. Sci. 2016, 342, 37–52.
14. Wang, J.; Zhang, X. Four operators of rough sets generalized to matroids and a matroidal method for attribute reduction. Symmetry 2018, 10, 418.
15. Wang, J.; Zhu, W. Applications of matrices to a matroidal structure of rough sets. J. Appl. Math. 2013, 2013, 493201.
16. Wang, J.; Zhu, W.; Wang, F.; Liu, G. Conditions for coverings to induce matroids. Int. J. Mach. Learn. Cybern. 2014, 5, 947–954.
17. Chen, J.; Li, J.; Lin, Y.; Lin, G.; Ma, Z. Relations of reduction between covering generalized rough sets and concept lattices. Inf. Sci. 2015, 304, 16–27.
18. Zhang, X.; Dai, J.; Yu, Y. On the union and intersection operations of rough sets based on various approximation spaces. Inf. Sci. 2015, 292, 214–229.
19. D'eer, L.; Cornelis, C.; Godo, L. Fuzzy neighborhood operators based on fuzzy coverings. Fuzzy Sets Syst. 2017, 312, 17–35.
20. Yang, B.; Hu, B. On some types of fuzzy covering-based rough sets. Fuzzy Sets Syst. 2017, 312, 36–65.
21. Zhang, X.; Miao, D.; Liu, C.; Le, M. Constructive methods of rough approximation operators and multigranulation rough sets. Knowl.-Based Syst. 2016, 91, 114–125.
22. Wang, J.; Zhang, X. Two types of intuitionistic fuzzy covering rough sets and an application to multiple criteria group decision making. Symmetry 2018, 10, 462.
23. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353.
24. Medina, J.; Ojeda-Aciego, M. Multi-adjoint t-concept lattices. Inf. Sci. 2010, 180, 712–725.
25. Pozna, C.; Minculete, N.; Precup, R.E.; Kóczy, L.T.; Ballagi, Á. Signatures: Definitions, operators and applications to fuzzy modeling. Fuzzy Sets Syst. 2012, 201, 86–104.
26. Jankowski, J.; Kazienko, P.; Watróbski, J.; Lewandowska, A.; Ziemba, P.; Zioło, M. Fuzzy multi-objective modeling of effectiveness and user experience in online advertising. Expert Syst. Appl. 2016, 65, 315–331.
27. Vrkalovic, S.; Lunca, E.C.; Borlea, I.D. Model-free sliding mode and fuzzy controllers for reverse osmosis desalination plants. Int. J. Artif. Intell. 2018, 16, 208–222.
28. Ma, L. Two fuzzy covering rough set models and their generalizations over fuzzy lattices. Fuzzy Sets Syst. 2016, 294, 1–17.
29. Wang, H.; Smarandache, F.; Zhang, Y.; Sunderraman, R. Single valued neutrosophic sets. Multispace Multistruct. 2010, 4, 410–413.
30. Atanassov, K. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96.
31. Mondal, K.; Pramanik, S. Rough neutrosophic multi-attribute decision-making based on grey relational analysis. Neutrosophic Sets Syst. 2015, 7, 8–17.
32. Yang, H.; Zhang, C.; Guo, Z.; Liu, Y.; Liao, X. A hybrid model of single valued neutrosophic sets and rough sets: Single valued neutrosophic rough set model. Soft Comput. 2017, 21, 6253–6267.
33. Zhang, X.; Xu, Z. The extended TOPSIS method for multi-criteria decision making based on hesitant heterogeneous information. In Proceedings of the 2014 2nd International Conference on Software Engineering, Knowledge Engineering and Information Engineering (SEKEIE 2014), Singapore, 5–6 August 2014.
34. Cheng, J.; Zhang, Y.; Feng, Y.; Liu, Z.; Tan, J. Structural optimization of a high-speed press considering multi-source uncertainties based on a new heterogeneous TOPSIS. Appl. Sci. 2018, 8, 126.
35. Liu, J.; Zhao, H.; Li, J.; Liu, S. Decision process in MCDM with large number of criteria and heterogeneous risk preferences. Oper. Res. Perspect. 2017, 4, 106–112.
36. Watróbski, J.; Jankowski, J.; Ziemba, P.; Karczmarczyk, A.; Zioło, M. Generalised framework for multi-criteria method selection. Omega 2018.
37. Faizi, S.; Sałabun, W.; Rashid, T.; Wa̧tróbski, J.; Zafar, S. Group decision-making for hesitant fuzzy sets based on characteristic objects method. Symmetry 2017, 9, 136.
38. Faizi, S.; Rashid, T.; Sałabun, W.; Zafar, S.; Wa̧tróbski, J. Decision making with uncertainty using hesitant fuzzy sets. Int. J. Fuzzy Syst. 2018, 20, 93–103.
39. Zhan, J.; Ali, M.I.; Mehmood, N. On a novel uncertain soft set model: Z-soft fuzzy rough set model and corresponding decision making methods. Appl. Soft Comput. 2017, 56, 446–457.
40. Zhan, J.; Alcantud, J.C.R. A novel type of soft rough covering and its application to multicriteria group decision making. Artif. Intell. Rev. 2018, 4, 1–30.
41. Zhang, Z. An approach to decision making based on intuitionistic fuzzy rough sets over two universes. J. Oper. Res. Soc. 2013, 64, 1079–1089.
42. Akram, M.; Ali, G.; Alshehri, N.O. A new multi-attribute decision-making method based on m-polar fuzzy soft rough sets. Symmetry 2017, 9, 271.
43. Liu, P. The aggregation operators based on archimedean t-conorm and t-norm for single-valued neutrosophic numbers and their application to decision making. Int. J. Fuzzy Syst. 2016, 18, 849–863.
44. Ye, J. Multicriteria decision-making method using the correlation coefficient under single-valued neutrosophic environment. Int. J. Gen. Syst. 2013, 42, 386–394.
45. Bonikowski, Z.; Bryniarski, E.; Wybraniec-Skardowska, U. Extensions and intentions in the rough set theory. Inf. Sci. 1998, 107, 149–167.
46. Pomykala, J.A. Approximation operations in approximation space. Bull. Pol. Acad. Sci. 1987, 35, 653–662.
47. Zhang, X.; Bo, C.; Smarandache, F.; Dai, J. New inclusion relation of neutrosophic sets with applications and related lattice structure. Int. J. Mach. Learn. Cybern. 2018, 9, 1753–1763.
48. Zhang, X. Fuzzy anti-grouped filters and fuzzy normal filters in pseudo-BCI algebras. J. Intell. Fuzzy Syst. 2017, 33, 1767–1774.
49. Zhang, X.; Park, C.; Wu, S. Soft set theoretical approach to pseudo-BCI algebras. J. Intell. Fuzzy Syst. 2018, 34, 559–568.
Figure 1. The first chart of the different values of the patients obtained by the different methods in Example 10.
Figure 2. The second chart of the different values of the patients obtained by the different methods in Example 10.
Table 1. The tabular representation of the single valued neutrosophic (SVN) β-covering Ĉ.
U C 1 C 2 C 3 C 4
x 1 0.7 , 0.2 , 0.5 0.6 , 0.2 , 0.4 0.4 , 0.1 , 0.5 0.1 , 0.5 , 0.6
x 2 0.5 , 0.3 , 0.2 0.5 , 0.2 , 0.8 0.4 , 0.5 , 0.4 0.6 , 0.1 , 0.7
x 3 0.4 , 0.5 , 0.2 0.2 , 0.3 , 0.6 0.5 , 0.2 , 0.4 0.6 , 0.3 , 0.4
x 4 0.6 , 0.1 , 0.7 0.4 , 0.5 , 0.7 0.3 , 0.6 , 0.5 0.5 , 0.3 , 0.2
x 5 0.3 , 0.2 , 0.6 0.7 , 0.3 , 0.5 0.6 , 0.3 , 0.5 0.8 , 0.1 , 0.2
Table 2. The tabular representation of N ˜ x k β ( k = 1 , 2 , 3 , 4 , 5 ).
N ˜ x k β x 1 x 2 x 3 x 4 x 5
N ˜ x 1 β 0.6 , 0.2 , 0.5 0.5 , 0.3 , 0.8 0.2 , 0.5 , 0.6 0.4 , 0.5 , 0.7 0.3 , 0.3 , 0.6
N ˜ x 2 β 0.1 , 0.5 , 0.6 0.5 , 0.3 , 0.8 0.2 , 0.5 , 0.6 0.4 , 0.5 , 0.7 0.3 , 0.3 , 0.6
N ˜ x 3 β 0.1 , 0.5 , 0.6 0.4 , 0.5 , 0.7 0.5 , 0.3 , 0.4 0.3 , 0.6 , 0.5 0.6 , 0.3 , 0.5
N ˜ x 4 β 0.1 , 0.5 , 0.6 0.5 , 0.3 , 0.7 0.4 , 0.5 , 0.4 0.5 , 0.3 , 0.7 0.3 , 0.2 , 0.6
N ˜ x 5 β 0.1 , 0.5 , 0.6 0.4 , 0.5 , 0.8 0.2 , 0.3 , 0.6 0.3 , 0.6 , 0.7 0.6 , 0.3 , 0.5
Table 3. s ( x k ) ( k = 1 , 2 , … , 5 ).
U x 1 x 2 x 3 x 4 x 5
s ( x k ) 0.945 0.937 0.922 0.909 0.958
Table 4. The generalized SVN approximation space ( U , V , R ˜ ).
R ˜ x 1 x 2 x 3 x 4 x 5
y 1 0.7 , 0.2 , 0.5 0.5 , 0.3 , 0.2 0.4 , 0.5 , 0.2 0.6 , 0.1 , 0.7 0.3 , 0.2 , 0.6
y 2 0.6 , 0.2 , 0.4 0.5 , 0.2 , 0.8 0.2 , 0.3 , 0.6 0.4 , 0.5 , 0.7 0.7 , 0.3 , 0.5
y 3 0.4 , 0.1 , 0.5 0.4 , 0.5 , 0.4 0.5 , 0.2 , 0.4 0.3 , 0.6 , 0.5 0.6 , 0.3 , 0.5
y 4 0.1 , 0.5 , 0.6 0.6 , 0.1 , 0.7 0.6 , 0.3 , 0.4 0.5 , 0.3 , 0.2 0.8 , 0.1 , 0.2
Table 5. The results utilizing the different methods of Example 10.
Methods   The Final Ranking   The Patient Most Likely to Have Disease B
Algorithm 2 in Liu [43] x 4 , x 3 , x 2 , x 5 , x 1 x 1
Algorithm 3 in Yang et al. [32] x 4 , x 3 , x 5 , x 2 , x 1 x 1
Algorithm 4 in Ye [44] x 4 , x 3 , x 2 , x 5 , x 1 x 1
Algorithm 5 in Ye [44] x 4 , x 3 , x 2 , x 5 , x 1 x 1
Algorithm 1 in this paper x 4 , x 3 , x 2 , x 1 , x 5 x 5
