1. Introduction
Multi-attribute decision-making (MADM) problems are an important part of decision theory, in which we choose the best alternative from a finite set of alternatives based on the collective information. Traditionally, it has been assumed that the information regarding the assessment of the alternatives is given in the form of real numbers. However, uncertainty and fuzziness are big issues in real-world problems nowadays and can be found everywhere, as in our discussions or the way we process information. To deal with such situations, the theory of fuzzy sets (FSs) [1] and extended fuzzy sets such as the intuitionistic fuzzy set (IFS) [2] and interval-valued IFS (IVIFS) [3] are among the most successful, characterizing the attribute values in terms of membership degrees. During the last few decades, researchers have been paying more attention to these theories and have successfully applied them to various situations in the decision-making process. The two important aspects of solving an MADM problem are, first, to design an appropriate function that aggregates the different preferences of the decision-makers into a collective one and, second, to design appropriate measures to rank the alternatives. For the former part, an aggregation operator is an important part of decision-making; it usually takes the form of a mathematical function that aggregates all the individual input data into a single value. Over the last decade, numerous attempts have been made by different researchers to process information values using different aggregation operators under the IFS and IVIFS environments. For instance, Xu and Yager [4] and Xu [5] presented some weighted averaging and geometric aggregation operators to aggregate different intuitionistic fuzzy numbers (IFNs). Garg [6] and Garg [7] presented some interactive improved aggregation operators for IFNs using Einstein norm operations.
Wang and Wang [8] characterized the preferences of the decision-makers in terms of interval numbers, and then an MADM method with completely unknown weight vectors was presented accordingly. Wei [9] presented some induced geometric aggregation operators with intuitionistic fuzzy information. Arora and Garg [10] and Arora and Garg [11] presented some aggregation operators by considering different parameterization factors in the analysis in the intuitionistic fuzzy soft set environment. Zhou and Xu [12] presented some extreme weighted averaging aggregation operators for solving decision-making problems from the optimism and pessimism points of view. Garg [13] presented some improved geometric aggregation operators for IVIFSs. A complete overview of the aggregation operators for IVIFSs was summarized by Xu and Guo in [14]. Jamkhaneh and Garg [15] presented some new operations for generalized IFSs and applied them to solve decision-making problems. Garg and Singh [16] presented a new triangular interval Type-2 IFS and its corresponding aggregation operators.
With regard to the information measure, the entropy measure is basically known as a measure of information, originating from the fundamental 1948 paper “The Mathematical Theory of Communication” by C. E. Shannon [17]. Information theory is one of the trusted areas for measuring the degree of uncertainty in data. However, classical information measures deal with information that is precise in nature. In order to overcome this, De Luca and Termini [18] proposed a set of axioms for fuzzy entropy. Later on, Szmidt and Kacprzyk [19] extended the axioms of De Luca and Termini [18] to the IFS environment. Vlachos and Sergiadis [20] extended their measure to the IFS environment. Burillo and Bustince [21] introduced the entropy of IFSs as a tool to measure the degree of intuitionism associated with an IFS. Garg et al. [22] presented a generalized intuitionistic fuzzy entropy measure of order α and degree β to solve decision-making problems. Wei et al. [23] presented an entropy measure based on trigonometric functions. Garg et al. [24] presented an entropy-based method for solving decision-making problems. Zhang and Jiang [25] presented an intuitionistic fuzzy entropy by generalizing the measure of De Luca and Termini [18]. Verma and Sharma [26] presented an exponential order measure between IFSs.
In contrast to the entropy measures, distance or similarity measures are also used by researchers to measure the similarity between two IFSs. In that direction, Taneja [27] presented a theory of generalized information measures in the fuzzy environment. Boekee and Van der Lubbe [28] presented the R-norm information measure. Hung and Yang [29] presented similarity measures between two different IFSs based on the Hausdorff distance. Garg [30] and Garg and Arora [31] presented a series of distance and similarity measures in different set environments to solve decision-making problems. Joshi and Kumar [32] presented an (R,S)-norm fuzzy information measure to solve decision-making problems. Garg and Kumar [33,34] presented some similarity and distance measures of IFSs by using set pair analysis theory. Meanwhile, decision-making methods based on some measures (such as distance, similarity degree, correlation coefficient and entropy) were proposed to deal with fuzzy IF and interval-valued IF MADM problems [35,36,37,38].
In [39,40,41,42,43], emphasis was given by the researchers to the attribute weights during ranking of the alternatives. It is quite obvious that the final ranking order of the alternatives highly depends on the attribute weights, because the variation of weight values may result in a different final ranking order of alternatives [39,44,45,46,47]. Now, based on the characteristics of the attribute weights, the decision-making problem can be classified into three types: (a) the decision-making situation where the attribute weights are completely known; (b) the decision-making situation where the attribute weights are completely unknown; (c) the decision-making situation where the attribute weights are partially known. Thus, based on these types, the attribute weights in MADM can be classified as subjective and objective attribute weights based on the information acquisition approach. If the decision-maker gives weights to the attributes, then such information is called subjective. The classical approaches to determine the subjective attribute weights are the analytic hierarchy process (AHP) method [48] and the Delphi method [49]. On the other hand, the objective attribute weights are determined by the decision-making matrix, and one of the most important approaches is the Shannon entropy method [17], which expresses the relative intensities of the attributes’ importance to signify the average intrinsic information transmitted to the decision-maker. In the literature, several authors [39,44,50,51,52] have addressed the MADM problem with subjective weight information. However, some researchers formulated a nonlinear programming model to determine the attribute weights. For instance, Chen and Li [44] presented an approach to assess the attribute weights by utilizing IF entropy in the IFS environment. Garg [53] presented a generalized intuitionistic fuzzy entropy measure to determine the completely unknown attribute weight to solve the decision-making problems. 
Although some researchers have put effort into determining unknown attribute weights [45,46,54,55] under different environments, this still remains an open problem.
Therefore, in an attempt to address such problems, and motivated by the ability of IFSs to describe the uncertainties in the data, this paper develops a new entropy measure to quantify the degree of fuzziness of a set in the IFS environment. The aim of this entropy is to determine the attribute weights when they are either partially known or completely unknown. For this, we propose a novel entropy measure, named the (R,S)-norm-based information measure, which makes the decision more flexible and reliable corresponding to different values of the parameters R and S. Some of the desirable properties of the proposed measures are investigated, and some of their correlations are derived. Some of the existing measures are obtained from the proposed entropy measures as special cases. Furthermore, we propose two approaches for solving the MADM problem based on the proposed entropy measures, by considering the characteristics of the attribute weights being either partially known or completely unknown. Two illustrative examples are considered to demonstrate the approaches and compare their results with those of some existing approaches.
The rest of this paper is organized as follows. In Section 2, we present some basic concepts of IFSs and the existing entropy measures. In Section 3, we propose a new (R,S)-norm-based information measure in the IFS environment. Various desirable relations among the proposed measures are also investigated in detail. Section 4 describes two approaches for solving the MADM problem under the condition that the attribute weights are either partially known or completely unknown. The developed approaches are illustrated with a numerical example. Finally, a concrete conclusion and discussion are presented in Section 5.
2. Preliminaries
In this section, some basic concepts related to IFSs and the aggregation operators over the universal set X are highlighted.
Definition 1.
[2] An IFS A defined in X is an ordered pair given by:
A={〈x,ζA(x),ϑA(x)〉∣x∈X}
where ζA, ϑA : X ⟶ [0,1] represent, respectively, the membership and non-membership degrees of the element x, such that ζA(x), ϑA(x) ∈ [0,1] and ζA(x) + ϑA(x) ≤ 1 for all x. For convenience, this pair is denoted by A = 〈ζA, ϑA〉 and called an intuitionistic fuzzy number (IFN) [4,5].
Definition 2.
[4,5] Let IFS(X) denote the family of all intuitionistic fuzzy sets of the universal set X, and let A, B ∈ IFS(X). Then, some operations can be defined as follows:
1. A ⊆ B if ζA(x) ≤ ζB(x) and ϑA(x) ≥ ϑB(x) for all x ∈ X;
2. A ⊇ B if ζA(x) ≥ ζB(x) and ϑA(x) ≤ ϑB(x) for all x ∈ X;
3. A = B iff ζA(x) = ζB(x) and ϑA(x) = ϑB(x) for all x ∈ X;
4. A ∪ B = {〈x, max(ζA(x), ζB(x)), min(ϑA(x), ϑB(x))〉 : x ∈ X};
5. A ∩ B = {〈x, min(ζA(x), ζB(x)), max(ϑA(x), ϑB(x))〉 : x ∈ X};
6. Ac = {〈x, ϑA(x), ζA(x)〉 : x ∈ X}.
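To make these operations concrete, the following minimal sketch (ours, not from the paper) implements the set operations of Definition 2 for IFSs modeled as dictionaries of (membership, non-membership) pairs; all function names are our own choices.

```python
# A minimal sketch (ours, not from the paper) of the operations in Definition 2.
# An IFS is modeled as a dict mapping each element x to a pair
# (membership degree, non-membership degree).

def ifs_union(A, B):
    """A ∪ B: element-wise max of memberships, min of non-memberships."""
    return {x: (max(A[x][0], B[x][0]), min(A[x][1], B[x][1])) for x in A}

def ifs_intersection(A, B):
    """A ∩ B: element-wise min of memberships, max of non-memberships."""
    return {x: (min(A[x][0], B[x][0]), max(A[x][1], B[x][1])) for x in A}

def ifs_complement(A):
    """A^c: swap the membership and non-membership degrees."""
    return {x: (nu, mu) for x, (mu, nu) in A.items()}

A = {"x1": (0.5, 0.3), "x2": (0.7, 0.2)}
B = {"x1": (0.6, 0.2), "x2": (0.4, 0.5)}
print(ifs_union(A, B))         # {'x1': (0.6, 0.2), 'x2': (0.7, 0.2)}
print(ifs_intersection(A, B))  # {'x1': (0.5, 0.3), 'x2': (0.4, 0.5)}
print(ifs_complement(A))       # {'x1': (0.3, 0.5), 'x2': (0.2, 0.7)}
```

Note that, as in items 4 and 6 above, complementation simply swaps the two degrees, so applying it twice returns the original set.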
Definition 3.
[19] An entropy E : IFS(X) ⟶ ℝ+ on IFS(X) is a real-valued functional satisfying the following four axioms for A, B ∈ IFS(X):
(P1) E(A) = 0 if and only if A is a crisp set, i.e., either ζA(x) = 1, ϑA(x) = 0 or ζA(x) = 0, ϑA(x) = 1 for all x ∈ X.
(P2) E(A) = 1 if and only if ζA(x) = ϑA(x) for all x ∈ X.
(P3) E(A) = E(Ac).
(P4) If A ⊆ B, that is, if ζA(x) ≤ ζB(x) and ϑA(x) ≥ ϑB(x) for any x ∈ X, then E(A) ≤ E(B).
Vlachos and Sergiadis [20] proposed the measure of intuitionistic fuzzy entropy in the IFS environment as follows:
\[
E(A)=-\frac{1}{n\ln 2}\sum_{i=1}^{n}\Big[\zeta_A(x_i)\ln\zeta_A(x_i)+\vartheta_A(x_i)\ln\vartheta_A(x_i)-\big(1-\pi_A(x_i)\big)\ln\big(1-\pi_A(x_i)\big)-\pi_A(x_i)\ln 2\Big]
\]
Zhang and Jiang [25] presented a measure of intuitionistic fuzzy entropy based on a generalization of the measure of De Luca and Termini [18] as:
\[
E(A)=-\frac{1}{n}\sum_{i=1}^{n}\Big[\frac{\zeta_A(x_i)+1-\vartheta_A(x_i)}{2}\log\frac{\zeta_A(x_i)+1-\vartheta_A(x_i)}{2}+\frac{\vartheta_A(x_i)+1-\zeta_A(x_i)}{2}\log\frac{\vartheta_A(x_i)+1-\zeta_A(x_i)}{2}\Big]
\]
Verma and Sharma [26] proposed an exponential order entropy in the IFS environment as:
\[
E(A)=\frac{1}{n(\sqrt{e}-1)}\sum_{i=1}^{n}\Big[\frac{\zeta_A(x_i)+1-\vartheta_A(x_i)}{2}\,e^{\,1-\frac{\zeta_A(x_i)+1-\vartheta_A(x_i)}{2}}+\frac{\vartheta_A(x_i)+1-\zeta_A(x_i)}{2}\,e^{\,1-\frac{\vartheta_A(x_i)+1-\zeta_A(x_i)}{2}}-1\Big]
\]
Garg et al. [22] generalized the entropy measure E_α^β(A) of order α and degree β as:
\[
E_{\alpha}^{\beta}(A)=\frac{2-\beta}{n(2-\beta-\alpha)}\sum_{i=1}^{n}\log\Big[\Big(\zeta_A^{\frac{\alpha}{2-\beta}}(x_i)+\vartheta_A^{\frac{\alpha}{2-\beta}}(x_i)\Big)\big(\zeta_A(x_i)+\vartheta_A(x_i)\big)^{1-\frac{\alpha}{2-\beta}}+2^{1-\frac{\alpha}{2-\beta}}\big(1-\zeta_A(x_i)-\vartheta_A(x_i)\big)\Big]
\]
where log is to base two, α > 0, β ∈ [0,1] and α + β ≠ 2.
3. Proposed (R,S)-Norm Intuitionistic Fuzzy Information Measure
In this section, we define a new (R,S)-norm information measure, denoted by H_R^S, in the IFS environment. For it, let Ω be the collection of all IFSs.
Definition 4.
For an IFS A = {〈x, ζA(x), ϑA(x)〉 ∣ x ∈ X}, an information measure H_R^S : Ω^n → ℝ; n ≥ 2 is defined as follows:
\[
H_R^S(A)=\begin{cases}
\dfrac{R\times S}{n(R-S)}\sum\limits_{i=1}^{n}\Big[\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1}{R}}\Big]; & \text{either } R>1,\ 0<S<1\\ & \text{or } 0<R<1,\ S>1;\\[1ex]
\dfrac{R}{n(R-1)}\sum\limits_{i=1}^{n}\Big[1-\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1}{R}}\Big]; & \text{when } S=1;\ 0<R<1;\\[1ex]
\dfrac{S}{n(1-S)}\sum\limits_{i=1}^{n}\Big[\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1}{S}}-1\Big]; & \text{when } R=1;\ 0<S<1;\\[1ex]
-\dfrac{1}{n}\sum\limits_{i=1}^{n}\big[\zeta_A(x_i)\log\zeta_A(x_i)+\vartheta_A(x_i)\log\vartheta_A(x_i)+\pi_A(x_i)\log\pi_A(x_i)\big]; & R=1=S.
\end{cases}\tag{6}
\]
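As an illustration, the piecewise definition above can be evaluated numerically. The following sketch (ours, not from the paper; the function name is our own choice) implements all four cases of Equation (6) for an IFS given as a list of (ζ, ϑ) pairs, with π = 1 − ζ − ϑ:

```python
import math

# A sketch (ours) of the four cases of Equation (6). `pairs` lists the
# (membership, non-membership) degrees; the hesitancy is pi = 1 - zeta - vartheta.

def ifs_entropy(pairs, R, S):
    n = len(pairs)
    total = 0.0
    for z, v in pairs:
        comps = (z, v, max(1.0 - z - v, 0.0))
        if R == 1 and S == 1:      # Shannon-type case, R = 1 = S
            total += -sum(c * math.log2(c) for c in comps if c > 0)
        elif S == 1:               # case S = 1, 0 < R < 1
            total += R / (R - 1) * (1 - sum(c**R for c in comps)**(1 / R))
        elif R == 1:               # case R = 1, 0 < S < 1
            total += S / (1 - S) * (sum(c**S for c in comps)**(1 / S) - 1)
        else:                      # general case, R != S
            total += R * S / (R - S) * (sum(c**S for c in comps)**(1 / S)
                                        - sum(c**R for c in comps)**(1 / R))
    return total / n

# A crisp set has zero entropy (property (P1)):
crisp = [(1.0, 0.0), (0.0, 1.0)]
print(ifs_entropy(crisp, R=0.3, S=2.0))  # 0.0
```

Swapping each (ζ, ϑ) pair leaves the value unchanged, which is the symmetry property (P4) of Theorem 1.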
Theorem 1.
An intuitionistic fuzzy entropy measure H_R^S(A) defined in Equation (6) for IFSs is a valid measure, i.e., it satisfies the following properties.
(P1) H_R^S(A) = 0 if and only if A is a crisp set, i.e., ζA(xi) = 1, ϑA(xi) = 0 or ζA(xi) = 0, ϑA(xi) = 1 for all xi ∈ X.
(P2) H_R^S(A) = 1 if and only if ζA(xi) = ϑA(xi) for all xi ∈ X.
(P3) H_R^S(A) ≤ H_R^S(B) if A is crisper than B, i.e., if ζA(xi) ≤ ζB(xi) and ϑA(xi) ≤ ϑB(xi) for max{ζB(xi), ϑB(xi)} ≤ 1/3, and ζA(xi) ≥ ζB(xi) and ϑA(xi) ≥ ϑB(xi) for min{ζB(xi), ϑB(xi)} ≥ 1/3, for all xi ∈ X.
(P4) H_R^S(A) = H_R^S(Ac) for all A ∈ IFS(X).
Proof.
To prove that the measure defined by Equation (6) is a valid information measure, we will have to prove that it satisfies the four properties defined in the definition of the intuitionistic fuzzy information measure.
1. Sharpness: In order to prove (P1), we need to show that H_R^S(A) = 0 if and only if A is a crisp set, i.e., either ζA(x) = 1, ϑA(x) = 0 or ζA(x) = 0, ϑA(x) = 1 for all x ∈ X.
Firstly, we assume that H_R^S(A) = 0 for R, S > 0 and R ≠ S. Therefore, from Equation (6), we have:
\[
\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1}{R}}\Big]=0
\]
\[
\Rightarrow\quad \big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1}{S}}=\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1}{R}}\quad\text{for all } i=1,2,\ldots,n.
\]
Since R, S > 0 and R ≠ S, the above equation is satisfied only if ζA(xi) = 0, ϑA(xi) = 1 or ζA(xi) = 1, ϑA(xi) = 0 for all i = 1, 2, …, n.
Conversely, we assume that the set A = (ζA, ϑA) is a crisp set, i.e., either ζA(xi) = 1, ϑA(xi) = 0 or ζA(xi) = 0, ϑA(xi) = 1. Now, for R, S > 0 and R ≠ S, we obtain:
\[
\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1}{R}}=0
\]
for all i = 1, 2, …, n, which gives H_R^S(A) = 0.
Hence, H_R^S(A) = 0 iff A is a crisp set.
2. Maximality: We will find the maxima of the function H_R^S(A); for this purpose, we differentiate Equation (6) with respect to ζA(xi) and ϑA(xi). We get:
\[
\frac{\partial H_R^S(A)}{\partial \zeta_A(x_i)}=\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1-S}{S}}\big(\zeta_A^{S-1}(x_i)-\pi_A^{S-1}(x_i)\big)-\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1-R}{R}}\big(\zeta_A^{R-1}(x_i)-\pi_A^{R-1}(x_i)\big)\Big]
\]
and:
\[
\frac{\partial H_R^S(A)}{\partial \vartheta_A(x_i)}=\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1-S}{S}}\big(\vartheta_A^{S-1}(x_i)-\pi_A^{S-1}(x_i)\big)-\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1-R}{R}}\big(\vartheta_A^{R-1}(x_i)-\pi_A^{R-1}(x_i)\big)\Big]
\]
In order to check the convexity of the function, we calculate its second order derivatives as follows:
\[
\frac{\partial^2 H_R^S(A)}{\partial \zeta_A^2(x_i)}=\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[(1-S)\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1-2S}{S}}\big(\zeta_A^{S-1}(x_i)-\pi_A^{S-1}(x_i)\big)^2+(S-1)\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1-S}{S}}\big(\zeta_A^{S-2}(x_i)+\pi_A^{S-2}(x_i)\big)-(1-R)\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1-2R}{R}}\big(\zeta_A^{R-1}(x_i)-\pi_A^{R-1}(x_i)\big)^2-(R-1)\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1-R}{R}}\big(\zeta_A^{R-2}(x_i)+\pi_A^{R-2}(x_i)\big)\Big]
\]
\[
\frac{\partial^2 H_R^S(A)}{\partial \vartheta_A^2(x_i)}=\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[(1-S)\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1-2S}{S}}\big(\vartheta_A^{S-1}(x_i)-\pi_A^{S-1}(x_i)\big)^2+(S-1)\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1-S}{S}}\big(\vartheta_A^{S-2}(x_i)+\pi_A^{S-2}(x_i)\big)-(1-R)\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1-2R}{R}}\big(\vartheta_A^{R-1}(x_i)-\pi_A^{R-1}(x_i)\big)^2-(R-1)\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1-R}{R}}\big(\vartheta_A^{R-2}(x_i)+\pi_A^{R-2}(x_i)\big)\Big]
\]
and
\[
\frac{\partial^2 H_R^S(A)}{\partial \vartheta_A(x_i)\,\partial \zeta_A(x_i)}=\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[(1-S)\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1-2S}{S}}\big(\vartheta_A^{S-1}(x_i)-\pi_A^{S-1}(x_i)\big)\big(\zeta_A^{S-1}(x_i)-\pi_A^{S-1}(x_i)\big)-(1-R)\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1-2R}{R}}\big(\vartheta_A^{R-1}(x_i)-\pi_A^{R-1}(x_i)\big)\big(\zeta_A^{R-1}(x_i)-\pi_A^{R-1}(x_i)\big)\Big]
\]
To find the maximum/minimum point, we set ∂H_R^S(A)/∂ζA(xi) = 0 and ∂H_R^S(A)/∂ϑA(xi) = 0, which gives ζA(xi) = ϑA(xi) = πA(xi) = 1/3 for all i; this is hence called the critical point of the function H_R^S.
(a) When R < 1, S > 1, then at the critical point ζA(xi) = ϑA(xi) = πA(xi) = 1/3, we compute that:
\[
\frac{\partial^2 H_R^S(A)}{\partial \zeta_A^2(x_i)}<0\quad\text{and}\quad \frac{\partial^2 H_R^S(A)}{\partial \zeta_A^2(x_i)}\cdot\frac{\partial^2 H_R^S(A)}{\partial \vartheta_A^2(x_i)}-\Big(\frac{\partial^2 H_R^S(A)}{\partial \vartheta_A(x_i)\,\partial \zeta_A(x_i)}\Big)^2>0
\]
Therefore, the Hessian matrix of H_R^S(A) is negative definite, and hence, H_R^S(A) is a concave function. As the critical point of H_R^S is ζA = ϑA = 1/3 and by the concavity, we get that H_R^S(A) has a relative maximum value at ζA = ϑA = 1/3.
(b) When R > 1, S < 1, then at the critical point, we can again easily obtain that:
\[
\frac{\partial^2 H_R^S(A)}{\partial \zeta_A^2(x_i)}<0\quad\text{and}\quad \frac{\partial^2 H_R^S(A)}{\partial \zeta_A^2(x_i)}\cdot\frac{\partial^2 H_R^S(A)}{\partial \vartheta_A^2(x_i)}-\Big(\frac{\partial^2 H_R^S(A)}{\partial \vartheta_A(x_i)\,\partial \zeta_A(x_i)}\Big)^2>0
\]
This proves that H_R^S(A) is a concave function with its global maximum at ζA(xi) = ϑA(xi) = 1/3.
Thus, for all R, S > 0 with either R < 1, S > 1 or R > 1, S < 1, the global maximum value of H_R^S(A) is attained at the point ζA(xi) = ϑA(xi) = 1/3, i.e., H_R^S(A) is maximum if and only if A is the most fuzzy set.
3. Resolution: In order to prove that our proposed entropy function is monotonically increasing and monotonically decreasing with respect to ζA(xi) and ϑA(xi), respectively, for convenience, let ζA(xi) = x, ϑA(xi) = y and πA(xi) = 1 − x − y; then it is sufficient to prove that for R, S > 0, R ≠ S, the entropy function:
\[
f(x,y)=\frac{R\times S}{n(R-S)}\Big[\big(x^S+y^S+(1-x-y)^S\big)^{\frac{1}{S}}-\big(x^R+y^R+(1-x-y)^R\big)^{\frac{1}{R}}\Big]
\]
where x, y ∈ [0,1], is an increasing function w.r.t. x and a decreasing function w.r.t. y.
Taking the partial derivative of f with respect to x and y respectively, we get:
\[
\frac{\partial f}{\partial x}=\frac{R\times S}{n(R-S)}\Big[\big(x^S+y^S+(1-x-y)^S\big)^{\frac{1-S}{S}}\big(x^{S-1}-(1-x-y)^{S-1}\big)-\big(x^R+y^R+(1-x-y)^R\big)^{\frac{1-R}{R}}\big(x^{R-1}-(1-x-y)^{R-1}\big)\Big]
\]
and:
\[
\frac{\partial f}{\partial y}=\frac{R\times S}{n(R-S)}\Big[\big(x^S+y^S+(1-x-y)^S\big)^{\frac{1-S}{S}}\big(y^{S-1}-(1-x-y)^{S-1}\big)-\big(x^R+y^R+(1-x-y)^R\big)^{\frac{1-R}{R}}\big(y^{R-1}-(1-x-y)^{R-1}\big)\Big]
\]
For the extreme point of f, we set ∂f/∂x = 0 and ∂f/∂y = 0 and get x = y = 1/3.
Furthermore, ∂f/∂x ≥ 0 when x ≤ y for R, S > 0, R ≠ S, i.e., f(x, y) is increasing with respect to x when x ≤ y, and ∂f/∂x ≤ 0, i.e., f is decreasing with respect to x, when x ≥ y. On the other hand, ∂f/∂y ≥ 0 and ∂f/∂y ≤ 0 when x ≥ y and x ≤ y, respectively.
Further, since H_R^S(A) is a concave function on the IFS A, if max{ζB(xi), ϑB(xi)} ≤ 1/3 with ζA(xi) ≤ ζB(xi) and ϑA(xi) ≤ ϑB(xi), then:
\[
\zeta_A(x_i)\le \zeta_B(x_i)\le \tfrac{1}{3};\qquad \vartheta_A(x_i)\le \vartheta_B(x_i)\le \tfrac{1}{3};\qquad \pi_A(x_i)\ge \pi_B(x_i)\ge \tfrac{1}{3}
\]
Thus, we observe that (ζB(xi), ϑB(xi), πB(xi)) is closer to (1/3, 1/3, 1/3) than (ζA(xi), ϑA(xi), πA(xi)). Hence, H_R^S(A) ≤ H_R^S(B).
Similarly, if min{ζB(xi), ϑB(xi)} ≥ 1/3, then we get H_R^S(A) ≤ H_R^S(B).
4. Symmetry: By the definition of H_R^S(A), we can easily obtain that H_R^S(Ac) = H_R^S(A).
Hence, H_R^S(A) satisfies all the properties of the intuitionistic fuzzy information measure and is, therefore, a valid measure of intuitionistic fuzzy entropy. ☐
Consider two IFSs A and B defined over X = {x1, x2, …, xn}. Take the disjoint partition of X as:
X1 = {xi ∈ X ∣ A ⊆ B} = {xi ∈ X ∣ ζA(xi) ≤ ζB(xi); ϑA(xi) ≥ ϑB(xi)}
and:
X2 = {xi ∈ X ∣ A ⊇ B} = {xi ∈ X ∣ ζA(xi) ≥ ζB(xi); ϑA(xi) ≤ ϑB(xi)}
Next, we define the joint and conditional entropies between IFSs A and B as follows:
1. Joint entropy:
\begin{align*}
H_R^S(A\cup B)&=\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[\big(\zeta_{A\cup B}^S(x_i)+\vartheta_{A\cup B}^S(x_i)+(1-\zeta_{A\cup B}(x_i)-\vartheta_{A\cup B}(x_i))^S\big)^{\frac{1}{S}}\\
&\qquad\qquad-\big(\zeta_{A\cup B}^R(x_i)+\vartheta_{A\cup B}^R(x_i)+(1-\zeta_{A\cup B}(x_i)-\vartheta_{A\cup B}(x_i))^R\big)^{\frac{1}{R}}\Big]\\
&=\frac{R\times S}{n(R-S)}\sum_{x_i\in X_1}\Big[\big(\zeta_{B}^S(x_i)+\vartheta_{B}^S(x_i)+(1-\zeta_{B}(x_i)-\vartheta_{B}(x_i))^S\big)^{\frac{1}{S}}-\big(\zeta_{B}^R(x_i)+\vartheta_{B}^R(x_i)+(1-\zeta_{B}(x_i)-\vartheta_{B}(x_i))^R\big)^{\frac{1}{R}}\Big]\\
&\quad+\frac{R\times S}{n(R-S)}\sum_{x_i\in X_2}\Big[\big(\zeta_{A}^S(x_i)+\vartheta_{A}^S(x_i)+(1-\zeta_{A}(x_i)-\vartheta_{A}(x_i))^S\big)^{\frac{1}{S}}-\big(\zeta_{A}^R(x_i)+\vartheta_{A}^R(x_i)+(1-\zeta_{A}(x_i)-\vartheta_{A}(x_i))^R\big)^{\frac{1}{R}}\Big]
\end{align*}
2. Conditional entropy:
\[
H_R^S(A|B)=\frac{R\times S}{n(R-S)}\sum_{x_i\in X_2}\Big[\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1}{R}}-\big(\zeta_B^S(x_i)+\vartheta_B^S(x_i)+\pi_B^S(x_i)\big)^{\frac{1}{S}}+\big(\zeta_B^R(x_i)+\vartheta_B^R(x_i)+\pi_B^R(x_i)\big)^{\frac{1}{R}}\Big]
\]
and:
\[
H_R^S(B|A)=\frac{R\times S}{n(R-S)}\sum_{x_i\in X_1}\Big[\big(\zeta_B^S(x_i)+\vartheta_B^S(x_i)+\pi_B^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_B^R(x_i)+\vartheta_B^R(x_i)+\pi_B^R(x_i)\big)^{\frac{1}{R}}-\big(\zeta_A^S(x_i)+\vartheta_A^S(x_i)+\pi_A^S(x_i)\big)^{\frac{1}{S}}+\big(\zeta_A^R(x_i)+\vartheta_A^R(x_i)+\pi_A^R(x_i)\big)^{\frac{1}{R}}\Big]
\]
Theorem 2.
Let A and B be two IFSs defined on the universal set X = {x1, x2, …, xn}, where A = {〈xi, ζA(xi), ϑA(xi)〉 ∣ xi ∈ X} and B = {〈xi, ζB(xi), ϑB(xi)〉 ∣ xi ∈ X}, such that for each xi ∈ X either A ⊆ B or A ⊇ B; then:
\[
H_R^S(A\cup B)+H_R^S(A\cap B)=H_R^S(A)+H_R^S(B)
\]
Proof.
Let X1 and X2 be the two disjoint subsets of X, where:
X1={x∈X:A⊆B},X2={x∈X:A⊇B}
i.e., for xi ∈ X1, we have ζA(xi) ≤ ζB(xi), ϑA(xi) ≥ ϑB(xi), and for xi ∈ X2, ζA(xi) ≥ ζB(xi), ϑA(xi) ≤ ϑB(xi). Therefore:
\begin{align*}
&H_R^S(A\cup B)+H_R^S(A\cap B)\\
&=\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[\big(\zeta_{A\cup B}^S(x_i)+\vartheta_{A\cup B}^S(x_i)+\pi_{A\cup B}^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_{A\cup B}^R(x_i)+\vartheta_{A\cup B}^R(x_i)+\pi_{A\cup B}^R(x_i)\big)^{\frac{1}{R}}\Big]\\
&\quad+\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[\big(\zeta_{A\cap B}^S(x_i)+\vartheta_{A\cap B}^S(x_i)+\pi_{A\cap B}^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_{A\cap B}^R(x_i)+\vartheta_{A\cap B}^R(x_i)+\pi_{A\cap B}^R(x_i)\big)^{\frac{1}{R}}\Big]\\
&=\frac{R\times S}{n(R-S)}\sum_{x_i\in X_1}\Big[\big(\zeta_{B}^S(x_i)+\vartheta_{B}^S(x_i)+\pi_{B}^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_{B}^R(x_i)+\vartheta_{B}^R(x_i)+\pi_{B}^R(x_i)\big)^{\frac{1}{R}}\Big]\\
&\quad+\frac{R\times S}{n(R-S)}\sum_{x_i\in X_2}\Big[\big(\zeta_{A}^S(x_i)+\vartheta_{A}^S(x_i)+\pi_{A}^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_{A}^R(x_i)+\vartheta_{A}^R(x_i)+\pi_{A}^R(x_i)\big)^{\frac{1}{R}}\Big]\\
&\quad+\frac{R\times S}{n(R-S)}\sum_{x_i\in X_1}\Big[\big(\zeta_{A}^S(x_i)+\vartheta_{A}^S(x_i)+\pi_{A}^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_{A}^R(x_i)+\vartheta_{A}^R(x_i)+\pi_{A}^R(x_i)\big)^{\frac{1}{R}}\Big]\\
&\quad+\frac{R\times S}{n(R-S)}\sum_{x_i\in X_2}\Big[\big(\zeta_{B}^S(x_i)+\vartheta_{B}^S(x_i)+\pi_{B}^S(x_i)\big)^{\frac{1}{S}}-\big(\zeta_{B}^R(x_i)+\vartheta_{B}^R(x_i)+\pi_{B}^R(x_i)\big)^{\frac{1}{R}}\Big]\\
&=H_R^S(A)+H_R^S(B)
\end{align*}
☐
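The additivity in Theorem 2 can also be checked numerically. The sketch below (ours, not from the paper; it hard-codes R = 0.3, S = 2 and a small pair of element-wise nested IFSs) evaluates the general case of Equation (6) on A ∪ B, A ∩ B, A and B:

```python
# Numerical sanity check (ours) of Theorem 2, using the general case of
# Equation (6) with R = 0.3, S = 2 on two element-wise nested IFSs.

def hrs(pairs, R=0.3, S=2.0):
    n = len(pairs)
    total = sum(
        sum(c**S for c in (z, v, 1 - z - v))**(1 / S)
        - sum(c**R for c in (z, v, 1 - z - v))**(1 / R)
        for z, v in pairs)
    return R * S / (n * (R - S)) * total

A = [(0.3, 0.5), (0.6, 0.2)]   # A ⊆ B on x1 and A ⊇ B on x2
B = [(0.5, 0.3), (0.4, 0.4)]
union = [(max(a[0], b[0]), min(a[1], b[1])) for a, b in zip(A, B)]
inter = [(min(a[0], b[0]), max(a[1], b[1])) for a, b in zip(A, B)]

# H(A ∪ B) + H(A ∩ B) = H(A) + H(B)
print(abs(hrs(union) + hrs(inter) - hrs(A) - hrs(B)) < 1e-9)  # True
```

The identity holds here because the union and intersection merely redistribute the per-element terms between A and B, exactly as in the regrouping step of the proof.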
Theorem 3.
The maximum and minimum values of the entropy H_R^S(A) are independent of the parameters R and S.
Proof.
As from the above theorem, we conclude that the entropy is maximum if and only if A is the most fuzzy set and minimum when A is a crisp set. Therefore, it is enough to show that the value of H_R^S(A) in these conditions is independent of R and S. When A is the most fuzzy set, i.e., ζA(xi) = ϑA(xi) for all xi ∈ X, then H_R^S(A) = 1, and when A is a crisp set, i.e., either ζA(xi) = 0, ϑA(xi) = 1 or ζA(xi) = 1, ϑA(xi) = 0 for all xi ∈ X, then H_R^S(A) = 0. Hence, in both cases, H_R^S(A) is independent of the parameters R and S. ☐
Remark 1.
From the proposed measure, it is observed that some of the existing measures can be obtained from it by assigning particular values to R and S. For instance:
1. When πA(xi) = 0 for all xi ∈ X, the proposed measure reduces to the entropy measure of Joshi and Kumar [32].
2. When R = S and S > 0, the proposed measure reduces to the measure of Taneja [27].
3. When R = 1 and R ≠ S, the measure is equivalent to the R-norm entropy presented by Boekee and Van der Lubbe [28].
4. When R = S = 1, the proposed measure becomes the well-known Shannon entropy.
5. When S = 1 and R ≠ S, the proposed measure becomes the measure of Bajaj et al. [37].
Theorem 4.
Let A and B be two IFSs defined over the set X such that either A ⊆ B or B ⊆ A; then the following statements hold:
1. H_R^S(A∪B) = H_R^S(A) + H_R^S(B|A);
2. H_R^S(A∪B) = H_R^S(B) + H_R^S(A|B);
3. H_R^S(A∪B) = H_R^S(A) + H_R^S(B|A) = H_R^S(B) + H_R^S(A|B).
Proof.
For two IFSs A and B and by using the definitions of joint, conditional and the proposed entropy measures, we get:
1. Consider:
For brevity, write Δ_C(xi) = (ζ_C^S(xi) + ϑ_C^S(xi) + π_C^S(xi))^{1/S} − (ζ_C^R(xi) + ϑ_C^R(xi) + π_C^R(xi))^{1/R} for C ∈ {A, B}, so that H_R^S(C) = (R×S)/(n(R−S)) Σ_i Δ_C(xi). Recalling that A∪B coincides with B on X1 and with A on X2, we have:
\begin{align*}
&H_R^S(A\cup B)-H_R^S(A)-H_R^S(B|A)\\
&=\frac{R\times S}{n(R-S)}\Big[\Big(\sum_{x_i\in X_1}\Delta_B(x_i)+\sum_{x_i\in X_2}\Delta_A(x_i)\Big)-\Big(\sum_{x_i\in X_1}\Delta_A(x_i)+\sum_{x_i\in X_2}\Delta_A(x_i)\Big)-\sum_{x_i\in X_1}\big(\Delta_B(x_i)-\Delta_A(x_i)\big)\Big]\\
&=0
\end{align*}
2. Consider:
Again writing Δ_C(xi) = (ζ_C^S(xi) + ϑ_C^S(xi) + π_C^S(xi))^{1/S} − (ζ_C^R(xi) + ϑ_C^R(xi) + π_C^R(xi))^{1/R} for C ∈ {A, B}, and recalling that A∪B coincides with B on X1 and with A on X2, we have:
\begin{align*}
&H_R^S(A\cup B)-H_R^S(B)-H_R^S(A|B)\\
&=\frac{R\times S}{n(R-S)}\Big[\Big(\sum_{x_i\in X_1}\Delta_B(x_i)+\sum_{x_i\in X_2}\Delta_A(x_i)\Big)-\Big(\sum_{x_i\in X_1}\Delta_B(x_i)+\sum_{x_i\in X_2}\Delta_B(x_i)\Big)-\sum_{x_i\in X_2}\big(\Delta_A(x_i)-\Delta_B(x_i)\big)\Big]\\
&=0
\end{align*}
3. This can be deduced from Parts (1) and (2).
☐
Before elaborating on the comparison between the proposed entropy function and the existing entropy functions, we state a definition [56] for an IFS A = {〈x, ζA(x), ϑA(x)〉 ∣ x ∈ X} defined on the universal set X, as follows:
A^n = {〈x, [ζA(x)]^n, 1 − [1 − ϑA(x)]^n〉 ∣ x ∈ X}    (12)
Definition 5.
The concentration of an IFS A of the universe X is denoted by CON(A) and is defined by:
CON(A) = {〈x, ζCON(A)(x), ϑCON(A)(x)〉 ∣ x ∈ X}
where ζCON(A)(x) = [ζA(x)]^2 and ϑCON(A)(x) = 1 − [1 − ϑA(x)]^2, i.e., the operation of concentration of an IFS is defined by CON(A) = A^2.
Definition 6.
The dilation of an IFS A of the universe X is denoted by DIL(A) and is defined by:
DIL(A) = {〈x, ζDIL(A)(x), ϑDIL(A)(x)〉 ∣ x ∈ X}
where ζDIL(A)(x) = [ζA(x)]^{1/2} and ϑDIL(A)(x) = 1 − [1 − ϑA(x)]^{1/2}, i.e., the operation of dilation of an IFS is defined by DIL(A) = A^{1/2}.
Example 1.
Consider a universe of discourse X = {x1, x2, x3, x4, x5}, and an IFS A “LARGE” of X may be defined by:
LARGE={(x1,0.1,0.8),(x2,0.3,0.5),(x3,0.5,0.4),(x4,0.9,0),(x5,1,0)}
Using the operations as defined in Equation (12), we have generated the following IFSs A^{1/2}, A^2, A^3 and A^4, which are defined as follows:
A^{1/2} may be treated as “More or less LARGE”;
A^2 may be treated as “very LARGE”;
A^3 may be treated as “quite very LARGE”;
A^4 may be treated as “very very LARGE”;
and their corresponding sets are computed as:
A^{1/2} = {(x1, 0.3162, 0.5528), (x2, 0.5477, 0.2929), (x3, 0.7071, 0.2254), (x4, 0.9487, 0), (x5, 1, 0)}
A^2 = {(x1, 0.01, 0.96), (x2, 0.09, 0.75), (x3, 0.25, 0.64), (x4, 0.81, 0), (x5, 1, 0)}
A^3 = {(x1, 0.001, 0.9920), (x2, 0.0270, 0.8750), (x3, 0.1250, 0.7840), (x4, 0.7290, 0), (x5, 1, 0)}
A^4 = {(x1, 0.0001, 0.9984), (x2, 0.0081, 0.9375), (x3, 0.0625, 0.8704), (x4, 0.6561, 0), (x5, 1, 0)}
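These modifier sets can be reproduced mechanically; the following sketch (ours, not from the paper) applies the operation A^n of Equation (12) to the IFS “LARGE”, rounding to four decimals as in the lists above:

```python
# A sketch (ours) of the modifier operation A^n from Equation (12):
# the membership is raised to the power n, while the non-membership
# becomes 1 - (1 - v)^n; values are rounded to four decimals.

def ifs_power(A, n):
    return [(round(z**n, 4), round(1 - (1 - v)**n, 4)) for z, v in A]

LARGE = [(0.1, 0.8), (0.3, 0.5), (0.5, 0.4), (0.9, 0.0), (1.0, 0.0)]

print(ifs_power(LARGE, 0.5))  # A^(1/2): "More or less LARGE"
print(ifs_power(LARGE, 2))    # A^2: "very LARGE"
print(ifs_power(LARGE, 4))    # A^4: "very very LARGE"
```

For example, for x1 = (0.1, 0.8), concentration gives (0.1^2, 1 − 0.2^2) = (0.01, 0.96), matching the listed A^2.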
From the viewpoint of mathematical operations, the entropy values of the above-defined IFSs A^{1/2}, A, A^2, A^3 and A^4 have the following requirement:
E(A^{1/2}) > E(A) > E(A^2) > E(A^3) > E(A^4)    (13)
Based on the dataset given above, we compute the entropy measure for them at different values of R and S. The results corresponding to these different pairs of values are summarized in Table 1, along with the existing approaches’ results. From these computed values, it is observed that the ranking order of the linguistic variables by the proposed entropy follows the pattern described in Equation (13) for some suitable pairs of (R, S), while the performance order pattern corresponding to [19,21,57] and [58] is E(A) > E(A^{1/2}) > E(A^2) > E(A^3) > E(A^4), which does not satisfy the requirement given in Equation (13). Hence, the proposed entropy measure is a good alternative and performs better than the existing measures. Furthermore, for different pairs of (R, S), a decision-maker has more choices for assessing the alternatives from the viewpoint of structured linguistic variables.
[ Table omitted. See PDF. ]
4. MADM Problem Based on the Proposed Entropy Measure
In this section, we present a method for solving the MADM problem based on the proposed entropy measure.
4.1. Approach I: When the Attribute Weight Is Completely Unknown
In this section, we present a decision-making approach for solving the multi-attribute decision-making problem in the intuitionistic fuzzy set environment. For this, consider a set of ‘n’ different alternatives, denoted by A1, A2, …, An, which are evaluated by a decision-maker under ‘m’ different attributes G1, G2, …, Gm. Assume that the decision-maker has evaluated these alternatives in the intuitionistic fuzzy environment and noted their rating values in the form of the IFNs αij = 〈ζij, ϑij〉, where ζij denotes the degree to which the alternative Ai satisfies the attribute Gj, while ϑij denotes the degree to which the alternative Ai dissatisfies Gj, such that ζij, ϑij ∈ [0,1] and ζij + ϑij ≤ 1 for i = 1, 2, …, n and j = 1, 2, …, m. Further, assume that the weight ωj (j = 1, 2, …, m) of each attribute is completely unknown. Hence, based on the decision-maker's preferences αij, the collective values are summarized in the form of the decision matrix D as follows:
\[
D=\begin{array}{c|cccc}
 & G_1 & G_2 & \cdots & G_m\\\hline
A_1 & \langle\zeta_{11},\vartheta_{11}\rangle & \langle\zeta_{12},\vartheta_{12}\rangle & \cdots & \langle\zeta_{1m},\vartheta_{1m}\rangle\\
A_2 & \langle\zeta_{21},\vartheta_{21}\rangle & \langle\zeta_{22},\vartheta_{22}\rangle & \cdots & \langle\zeta_{2m},\vartheta_{2m}\rangle\\
\vdots & \vdots & \vdots & \ddots & \vdots\\
A_n & \langle\zeta_{n1},\vartheta_{n1}\rangle & \langle\zeta_{n2},\vartheta_{n2}\rangle & \cdots & \langle\zeta_{nm},\vartheta_{nm}\rangle
\end{array}
\]
Then, the following steps of the proposed approach are summarized to find the best alternative(s).
Step 1: Normalize the rating values of the decision-maker, if required, by converting the rating values corresponding to the cost type attribute into the benefit type. For this, the following normalization formula is used:
\[
r_{ij}=\begin{cases}\langle \zeta_{ij},\vartheta_{ij}\rangle; & \text{for the benefit type attribute}\\[0.5ex] \langle \vartheta_{ij},\zeta_{ij}\rangle; & \text{for the cost type attribute}\end{cases}
\]
and hence, we obtain the normalized IF decision matrix R = (rij)n×m.
Step 2: Based on the matrix R, the information entropy of attribute Gj (j = 1, 2, …, m) is computed as:
\[
(H_R^S)_j=\frac{R\times S}{n(R-S)}\sum_{i=1}^{n}\Big[\big(\zeta_{ij}^S+\vartheta_{ij}^S+\pi_{ij}^S\big)^{\frac{1}{S}}-\big(\zeta_{ij}^R+\vartheta_{ij}^R+\pi_{ij}^R\big)^{\frac{1}{R}}\Big]\tag{16}
\]
where R, S > 0 and R ≠ S.
Step 3: Based on the entropy values defined in Equation (16), the degree of divergence dj of the average intrinsic information provided by the ratings on the attribute Gj can be defined as dj = 1 − κj, where κj = (H_R^S)_j, j = 1, 2, …, m. Here, the value of dj represents the inherent contrast intensity of attribute Gj, and hence, based on this, the attribute weights ωj (j = 1, 2, …, m) are given as:
\[
\omega_j=\frac{d_j}{\sum_{j=1}^{m}d_j}=\frac{1-\kappa_j}{\sum_{j=1}^{m}(1-\kappa_j)}=\frac{1-\kappa_j}{m-\sum_{j=1}^{m}\kappa_j}\tag{17}
\]
Step 4: Construct the weighted sum of each alternative by multiplying the score function of each criterion by its assigned weight as:
\[
Q(A_i)=\sum_{j=1}^{m}\omega_j(\zeta_{ij}-\vartheta_{ij});\quad i=1,2,\ldots,n\tag{18}
\]
Step 5: Rank all the alternatives Ai (i = 1, 2, …, n) according to the highest value of Q(Ai) and, hence, choose the best alternative.
The above-mentioned approach is illustrated with a practical decision-making example, which can be read as follows:
Example 2.
Consider a decision-making problem from the field of the recruitment sector. Assume that a pharmaceutical company wants to select a lab technician for a microbiology laboratory. For this, the company published a notification in a newspaper and considered four attributes required for technician selection, namely academic record (G1), personal interview evaluation (G2), experience (G3) and technical capability (G4). On the basis of the notification conditions, only five candidates, A1, A2, A3, A4 and A5, are interested and selected as alternatives to be presented to the panel of experts for this post. Then, the main objective of the company is to choose the best candidate among them for the task. In order to describe the ambiguity and uncertainties in the data, the preferences related to each alternative are represented in the IFS environment. The preferences of each alternative are represented in the form of IFNs as follows:
\[
D=\begin{array}{c|cccc}
 & G_1 & G_2 & G_3 & G_4\\\hline
A_1 & \langle 0.7,0.2\rangle & \langle 0.5,0.4\rangle & \langle 0.6,0.2\rangle & \langle 0.6,0.3\rangle\\
A_2 & \langle 0.7,0.1\rangle & \langle 0.5,0.2\rangle & \langle 0.7,0.2\rangle & \langle 0.4,0.5\rangle\\
A_3 & \langle 0.6,0.3\rangle & \langle 0.5,0.1\rangle & \langle 0.5,0.3\rangle & \langle 0.6,0.2\rangle\\
A_4 & \langle 0.8,0.1\rangle & \langle 0.6,0.3\rangle & \langle 0.3,0.7\rangle & \langle 0.6,0.3\rangle\\
A_5 & \langle 0.6,0.3\rangle & \langle 0.4,0.6\rangle & \langle 0.7,0.2\rangle & \langle 0.5,0.4\rangle
\end{array}
\]
Then, the steps of the proposed approach are followed to find the best alternative(s) as below:
Step 1: Since all the attributes are of the same type, so there is no need for the normalization process.
Step 2: Without loss of generality, we take R=0.3 R=0.3 and S=2 S=2 and, hence, compute the entropy measurement value for each attribute by using Equation (16). The results corresponding to it are HRS(G1)=3.4064 HRS(G1)=3.4064 , HRS(G2)=3.372 HRS(G2)=3.372 , HRS(G3)=3.2491 HRS(G3)=3.2491 and HRS(G4)=3.7564 HRS(G4)=3.7564.
Step 3: Based on these entropy values, the weight of each criterion is calculated as ω = (0.2459, 0.2425, 0.2298, 0.2817)^T.
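The paper's weight formula (Equation (17)) is not reproduced in this excerpt; however, the standard entropy-weight normalization ωj = (1 − Hj)/(m − ∑k Hk) reproduces the reported vector from the Step 2 entropies up to rounding, as the following sketch shows:

```python
# Entropy-to-weight step, assuming the standard entropy-weight normalization
#   w_j = (1 - H_j) / (m - sum_k H_k),
# which matches the reported weight vector given the reported entropy values.
H = [3.4064, 3.372, 3.2491, 3.7564]   # H_RS(G1..G4) from Step 2
m = len(H)
weights = [(1 - h) / (m - sum(H)) for h in H]
# weights is approximately (0.2459, 0.2425, 0.2298, 0.2817)
```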
Step 4: The overall weighted score values of the alternatives corresponding to R = 0.3, S = 2 and ω = (0.2459, 0.2425, 0.2298, 0.2817)^T obtained by using Equation (18) are Q(A1) = 0.3237, Q(A2) = 0.3071, Q(A3) = 0.3294, Q(A4) = 0.2375 and Q(A5) = 0.1684.
Step 5: Since Q(A3) > Q(A1) > Q(A2) > Q(A4) > Q(A5), the ranking order of the alternatives is A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5. Thus, the best alternative is A3.
However, in order to analyze the influence of the parameters R and S on the final ranking order, the steps of the proposed approach are executed by varying R from 0.1 to 1.0 and S from 1.0 to 5.0. The overall score values of each alternative, along with the ranking order, are summarized in Table 2. From this analysis, we conclude that the decision-maker can choose the values of R and S, and hence the corresponding best alternative, according to his or her goal. Therefore, the proposed measures give the decision-maker various choices to reach the target.
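The sensitivity analysis can be sketched as a loop over (R, S) pairs. Both the per-attribute entropy normalization (Equation (16)) and the entropy-weight formula (Equation (17)) are assumptions here, since neither equation appears in this excerpt, so the sketch illustrates the workflow rather than reproducing the numbers of Table 2:

```python
# Sensitivity sketch: for each (R, S), recompute per-attribute entropies, weights
# and weighted scores, and record the resulting ranking. The entropy normalization
# and the entropy-weight formula below are assumptions, so numbers are illustrative.

def h_rs(col, R, S):
    # (R,S)-norm entropy of one attribute column of IFNs (z, t); p = hesitancy
    n = len(col)
    total = sum((z**S + t**S + max(0.0, 1 - z - t)**S) ** (1 / S)
                - (z**R + t**R + max(0.0, 1 - z - t)**R) ** (1 / R)
                for (z, t) in col)
    return R * S / (n * (R - S)) * total

def rank_for(matrix, R, S):
    m = len(matrix[0])
    H = [h_rs([row[j] for row in matrix], R, S) for j in range(m)]
    w = [(1 - h) / (m - sum(H)) for h in H]          # assumed entropy-weight step
    Q = [sum(wj * (z - t) for wj, (z, t) in zip(w, row)) for row in matrix]
    return sorted(range(len(matrix)), key=lambda i: Q[i], reverse=True)

matrix = [[(0.7, 0.2), (0.7, 0.1), (0.6, 0.3), (0.8, 0.1)],
          [(0.6, 0.3), (0.5, 0.4), (0.5, 0.2), (0.5, 0.1)],
          [(0.6, 0.3), (0.4, 0.6), (0.6, 0.2), (0.7, 0.2)],
          [(0.5, 0.3), (0.3, 0.7), (0.7, 0.2), (0.6, 0.3)],
          [(0.4, 0.5), (0.6, 0.2), (0.6, 0.3), (0.5, 0.4)]]
results = {(R, S): rank_for(matrix, R, S)
           for R in (0.1, 0.5, 1.0) for S in (2.0, 5.0)}
```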
[ Table omitted. See PDF. ]
4.2. Approach II: When the Attribute Weight Is Partially Known
In this section, we present an approach for solving the multi-attribute decision-making problem in the IFS environment where the information about the attribute weight is partially known. The description of the MADM problem is mentioned in Section 4.1.
Since decision-making in real-life situations is highly complex due to a large number of constraints and the inherent subjectivity of human thinking, the importance of the attribute weight vector is often incompletely known. In order to represent this incomplete information about the weights, the following relationships have been defined for i ≠ j:
1. A weak ranking: ωi ≥ ωj;
2. A strict ranking: ωi − ωj ≥ σi, (σi > 0);
3. A ranking with multiples: ωi ≥ σi ωj, (0 ≤ σi ≤ 1);
4. An interval form: λi ≤ ωi ≤ λi + δi, (0 ≤ λi ≤ λi + δi ≤ 1);
5. A ranking of differences: ωi − ωj ≥ ωk − ωl, (j ≠ k ≠ l).
The set of this known weight information is denoted by Δ in this paper.
Then, the proposed approach is summarized in the following steps to obtain the most desirable alternative(s).
Step 1: Similar to Approach I.
Step 2: Similar to Approach I.
Step 3: The overall entropy of the alternative Ai (i = 1, 2, …, n) over the attributes Gj is given by:
H(Ai) = ∑_{j=1}^{m} HRS(αij) = (R × S)/(n(R − S)) ∑_{j=1}^{m} ∑_{i=1}^{n} [(ζij^S + ϑij^S + πij^S)^{1/S} − (ζij^R + ϑij^R + πij^R)^{1/R}]
where R, S > 0 and R ≠ S.
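The displayed entropy can be transcribed directly. The sketch below implements the double sum as written, with the hesitancy degree πij = 1 − ζij − ϑij; since the per-attribute Equation (16) used in the examples is not reproduced in this excerpt, its exact normalization is an assumption, and numeric values may therefore differ from those reported:

```python
# Direct transcription of the displayed (R,S)-norm entropy:
#   H = (R*S)/(n(R-S)) * sum_j sum_i [(z^S + t^S + p^S)^(1/S) - (z^R + t^R + p^R)^(1/R)]
# with p = 1 - z - t (clamped at 0 to guard against floating-point round-off).

def overall_entropy(matrix, R, S):
    assert R > 0 and S > 0 and R != S
    n = len(matrix)                          # number of alternatives
    total = 0.0
    for row in matrix:
        for (z, t) in row:
            p = max(0.0, 1.0 - z - t)        # hesitancy degree
            total += (z**S + t**S + p**S) ** (1 / S) - (z**R + t**R + p**R) ** (1 / R)
    return R * S / (n * (R - S)) * total

# Small illustrative matrix of IFNs (membership, non-membership)
matrix = [[(0.7, 0.2), (0.6, 0.3)],
          [(0.5, 0.4), (0.4, 0.5)]]
H = overall_entropy(matrix, R=0.3, S=2)      # positive whenever R != S
```

By construction the measure is symmetric in the membership and non-membership degrees, which is easy to check by swapping them.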
By considering the importance of each attribute in terms of the weight vector ω = (ω1, ω2, …, ωm)^T, we formulate a linear programming model to determine the weight vector as follows:
min H = ∑_{i=1}^{n} H(Ai) = ∑_{i=1}^{n} ∑_{j=1}^{m} ωj HRS(αij) = (R × S)/(n(R − S)) ∑_{j=1}^{m} ωj ∑_{i=1}^{n} [(ζij^S + ϑij^S + πij^S)^{1/S} − (ζij^R + ϑij^R + πij^R)^{1/R}]
s.t. ∑_{j=1}^{m} ωj = 1, ωj ≥ 0, ω ∈ Δ
After solving this model, we get the optimal weight vector ω = (ω1, ω2, …, ωm)^T.
Step 4: Construct the weighted sum of each alternative by multiplying the score function of each criterion by its assigned weight as:
Q(Ai) = ∑_{j=1}^{m} ωj (ζij − ϑij), i = 1, 2, …, n
Step 5: Rank all the alternatives Ai (i = 1, 2, …, n) according to the highest value of Q(Ai) and, hence, choose the best alternative.
To demonstrate the above-mentioned approach, a numerical example has been taken, which is stated as below.
Example 3.
Consider the MADM problem stated and described in Example 2, where the five alternatives A1, A2, …, A5 are assessed under the four attributes G1, G2, G3, G4 in the IFS environment. Here, we assume that the information about the attribute weight is partially known and is given by the decision-maker as Δ = {0.15 ≤ ω1 ≤ 0.45, 0.2 ≤ ω2 ≤ 0.5, 0.1 ≤ ω3 ≤ 0.3, 0.1 ≤ ω4 ≤ 0.2, ω1 ≥ ω4, ∑_{j=1}^{4} ωj = 1}. Then, based on the rating values mentioned in Equation (19), the steps of Approach II are executed as below:
Step 1: Since all the attributes are of the same type, there is no need for normalization.
Step 2: Without loss of generality, we take R = 0.3 and S = 2 and, hence, compute the entropy value for each attribute by using Equation (20). The results are HRS(G1) = 3.4064, HRS(G2) = 3.372, HRS(G3) = 3.2491 and HRS(G4) = 3.7564.
Step 3: Formulate the optimization model by utilizing the rating values and the partial weight information Δ = {0.15 ≤ ω1 ≤ 0.45, 0.2 ≤ ω2 ≤ 0.5, 0.1 ≤ ω3 ≤ 0.3, 0.1 ≤ ω4 ≤ 0.2, ω1 ≥ ω4, ∑_{j=1}^{4} ωj = 1} as:
min H = 3.4064ω1 + 3.372ω2 + 3.2491ω3 + 3.7564ω4
subject to 0.15 ≤ ω1 ≤ 0.45, 0.2 ≤ ω2 ≤ 0.5, 0.1 ≤ ω3 ≤ 0.3, 0.1 ≤ ω4 ≤ 0.2, ω1 ≥ ω4, and ω1 + ω2 + ω3 + ω4 = 1.
Solving this model with the help of MATLAB, we obtain the weight vector ω = (0.15, 0.45, 0.30, 0.10)^T.
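The same linear program can also be solved outside MATLAB; a sketch with SciPy's `linprog` (HiGHS backend), which recovers the weight vector above:

```python
import numpy as np
from scipy.optimize import linprog

# LP of Step 3: minimize 3.4064*w1 + 3.372*w2 + 3.2491*w3 + 3.7564*w4
# subject to the box constraints in Delta, w1 >= w4, and sum(w) = 1.
c = [3.4064, 3.372, 3.2491, 3.7564]
A_ub = [[-1, 0, 0, 1]]                 # w1 >= w4  <=>  -w1 + w4 <= 0
b_ub = [0]
A_eq = [[1, 1, 1, 1]]                  # weights sum to one
b_eq = [1]
bounds = [(0.15, 0.45), (0.2, 0.5), (0.1, 0.3), (0.1, 0.2)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
# res.x is approximately [0.15, 0.45, 0.30, 0.10]
```

Since the objective coefficients are distinct, the optimum is a unique vertex: the mass is pushed onto the attributes with the smallest coefficients (G3, then G2) subject to the bounds.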
Step 4: The overall weighted score values of the alternatives corresponding to R = 0.3, S = 2 and ω = (0.15, 0.45, 0.30, 0.10)^T obtained by using Equation (21) are Q(A1) = 0.2700, Q(A2) = 0.3650, Q(A3) = 0.3250, Q(A4) = 0.1500 and Q(A5) = 0.1150.
Step 5: Since Q(A2) > Q(A3) > Q(A1) > Q(A4) > Q(A5), the ranking order of the alternatives is A2 ≻ A3 ≻ A1 ≻ A4 ≻ A5. Thus, the best alternative is A2.
5. Conclusions
In this paper, we propose an entropy measure based on the (R, S)-norm in the IFS environment. Since the uncertainties present in the data play a crucial role during the decision-making process, we addressed a novel (R, S)-norm-based information measure to quantify the degree of fuzziness of a set while maintaining its advantages. Various desirable relations, as well as some of its properties, were investigated in detail. It was observed that some of the existing measures are special cases of the proposed measures. Furthermore, based on the different parametric values of R and S, the decision-maker(s) may have different choices to make a decision according to his/her preference. In addition, to explore the structural characteristics and functioning of the proposed measures, two decision-making approaches were presented to solve MADM problems in the IFS environment in which the attribute weights are either partially known or completely unknown. The presented approaches were illustrated with numerical examples. The major advantage of the proposed measure is that it gives various choices to select the best alternative according to the decision-makers' desired goals and, hence, makes decision-making more flexible and reliable. From the studies, it is concluded that the proposed work provides a new and easy way to handle the uncertainty and vagueness in the data and, hence, provides an alternative way to solve decision-making problems in the IFS environment. In the future, the results of this paper can be extended to other uncertain and fuzzy environments [59,60,61,62].
Author Contributions
Conceptualization, Methodology, Validation, H.G.; Formal Analysis, Investigation, H.G., J.K.; Writing-Original Draft Preparation, H.G.; Writing-Review & Editing, H.G.; Visualization, H.G.
Conflicts of Interest
The authors declare no conflict of interest.
1. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353.
2. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96.
3. Atanassov, K.; Gargov, G. Interval-valued intuitionistic fuzzy sets. Fuzzy Sets Syst. 1989, 31, 343–349.
4. Xu, Z.S.; Yager, R.R. Some geometric aggregation operators based on intuitionistic fuzzy sets. Int. J. Gen. Syst. 2006, 35, 417–433.
5. Xu, Z.S. Intuitionistic fuzzy aggregation operators. IEEE Trans. Fuzzy Syst. 2007, 15, 1179–1187.
6. Garg, H. Generalized intuitionistic fuzzy interactive geometric interaction operators using Einstein t-norm and t-conorm and their application to decision-making. Comput. Ind. Eng. 2016, 101, 53–69.
7. Garg, H. Novel intuitionistic fuzzy decision-making method based on an improved operation laws and its application. Eng. Appl. Artif. Intell. 2017, 60, 164–174.
8. Wang, W.; Wang, Z. An approach to multi-attribute interval-valued intuitionistic fuzzy decision-making with incomplete weight information. In Proceedings of the 15th IEEE International Conference on Fuzzy Systems and Knowledge Discovery, Jinan, China, 18–20 October 2008; Volume 3, pp. 346–350.
9. Wei, G. Some induced geometric aggregation operators with intuitionistic fuzzy information and their application to group decision-making. Appl. Soft Comput. 2010, 10, 423–431.
10. Arora, R.; Garg, H. Robust aggregation operators for multi-criteria decision-making with intuitionistic fuzzy soft set environment. Sci. Iran. E 2018, 25, 931–942.
11. Arora, R.; Garg, H. Prioritized averaging/geometric aggregation operators under the intuitionistic fuzzy soft set environment. Sci. Iran. 2018, 25, 466–482.
12. Zhou, W.; Xu, Z. Extreme intuitionistic fuzzy weighted aggregation operators and their applications in optimism and pessimism decision-making processes. J. Intell. Fuzzy Syst. 2017, 32, 1129–1138.
13. Garg, H. Some robust improved geometric aggregation operators under interval-valued intuitionistic fuzzy environment for multi-criteria decision-making process. J. Ind. Manag. Optim. 2018, 14, 283–308.
14. Xu, Z.; Gou, X. An overview of interval-valued intuitionistic fuzzy information aggregations and applications. Granul. Comput. 2017, 2, 13–39.
15. Jamkhaneh, E.B.; Garg, H. Some new operations over the generalized intuitionistic fuzzy sets and their application to decision-making process. Granul. Comput. 2018, 3, 111–122.
16. Garg, H.; Singh, S. A novel triangular interval type-2 intuitionistic fuzzy sets and their aggregation operators. Iran. J. Fuzzy Syst. 2018.
17. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
18. Deluca, A.; Termini, S. A definition of Non-probabilistic entropy in setting of fuzzy set theory. Inf. Control 1971, 20, 301–312.
19. Szmidt, E.; Kacprzyk, J. Entropy for intuitionistic fuzzy sets. Fuzzy Sets Syst. 2001, 118, 467–477.
20. Vlachos, I.K.; Sergiadis, G.D. Intuitionistic fuzzy information-application to pattern recognition. Pattern Recognit. Lett. 2007, 28, 197–206.
21. Burillo, P.; Bustince, H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets. Fuzzy Sets Syst. 1996, 78, 305–316.
22. Garg, H.; Agarwal, N.; Tripathi, A. Generalized Intuitionistic Fuzzy Entropy Measure of Order α and Degree β and its applications to Multi-criteria decision-making problem. Int. J. Fuzzy Syst. Appl. 2017, 6, 86–107.
23. Wei, C.P.; Gao, Z.H.; Guo, T.T. An intuitionistic fuzzy entropy measure based on the trigonometric function. Control Decis. 2012, 27, 571–574.
24. Garg, H.; Agarwal, N.; Tripathi, A. Entropy based multi-criteria decision-making method under Fuzzy Environment and Unknown Attribute Weights. Glob. J. Technol. Optim. 2015, 6, 13–20.
25. Zhang, Q.S.; Jiang, S.Y. A note on information entropy measure for vague sets. Inf. Sci. 2008, 178, 4184–4191.
26. Verma, R.; Sharma, B.D. Exponential entropy on intuitionistic fuzzy sets. Kybernetika 2013, 49, 114–127.
27. Taneja, I.J. On generalized information measures and their applications. In Advances in Electronics and Electron Physics; Elsevier: New York, NY, USA, 1989; Volume 76, pp. 327–413.
28. Boekee, D.E.; Van der Lubbe, J.C. The R-norm information measure. Inf. Control 1980, 45, 136–155.
29. Hung, W.L.; Yang, M.S. Similarity measures of intuitionistic fuzzy sets based on Hausdorff distance. Pattern Recognit. Lett. 2004, 25, 1603–1611.
30. Garg, H. Distance and similarity measure for intuitionistic multiplicative preference relation and its application. Int. J. Uncertain. Quantif. 2017, 7, 117–133.
31. Garg, H.; Arora, R. Distance and similarity measures for Dual hesistant fuzzy soft sets and their applications in multi criteria decision-making problem. Int. J. Uncertain. Quantif. 2017, 7, 229–248.
32. Joshi, R.; Kumar, S. An (R,S)-norm fuzzy information measure with its applications in multiple-attribute decision-making. Comput. Appl. Math. 2017, 1–22.
33. Garg, H.; Kumar, K. An advanced study on the similarity measures of intuitionistic fuzzy sets based on the set pair analysis theory and their application in decision making. Soft Comput. 2018, 1–12.
34. Garg, H.; Kumar, K. Distance measures for connection number sets based on set pair analysis and its applications to decision-making process. Appl. Intell. 2018, 1–14.
35. Garg, H.; Nancy. On single-valued neutrosophic entropy of order α. Neutrosophic Sets Syst. 2016, 14, 21–28.
36. Selvachandran, G.; Garg, H.; Alaroud, M.H.S.; Salleh, A.R. Similarity Measure of Complex Vague Soft Sets and Its Application to Pattern Recognition. Int. J. Fuzzy Syst. 2018, 1–14.
37. Bajaj, R.K.; Kumar, T.; Gupta, N. R-norm intuitionistic fuzzy information measures and its computational applications. In Eco-friendly Computing and Communication Systems; Springer: Berlin, Germany, 2012; pp. 372–380.
38. Garg, H.; Kumar, K. Improved possibility degree method for ranking intuitionistic fuzzy numbers and their application in multiattribute decision-making. Granul. Comput. 2018, 1–11.
39. Mei, Y.; Ye, J.; Zeng, Z. Entropy-weighted ANP fuzzy comprehensive evaluation of interim product production schemes in one-of-a-kind production. Comput. Ind. Eng. 2016, 100, 144–152.
40. Chen, S.M.; Chang, C.H. A novel similarity measure between Atanassov’s intuitionistic fuzzy sets based on transformation techniques with applications to pattern recognition. Inf. Sci. 2015, 291, 96–114.
41. Garg, H. Hesitant Pythagorean fuzzy sets and their aggregation operators in multiple attribute decision-making. Int. J. Uncertain. Quantif. 2018, 8, 267–289.
42. Chen, S.M.; Cheng, S.H.; Chiou, C.H. Fuzzy multiattribute group decision-making based on intuitionistic fuzzy sets and evidential reasoning methodology. Inf. Fusion 2016, 27, 215–227.
43. Kaur, G.; Garg, H. Multi-Attribute Decision-Making Based on Bonferroni Mean Operators under Cubic Intuitionistic Fuzzy Set Environment. Entropy 2018, 20, 65.
44. Chen, T.Y.; Li, C.H. Determining objective weights with intuitionistic fuzzy entropy measures: A comparative analysis. Inf. Sci. 2010, 180, 4207–4222.
45. Li, D.F. TOPSIS- based nonlinear-programming methodology for multiattribute decision-making with interval-valued intuitionistic fuzzy sets. IEEE Trans. Fuzzy Syst. 2010, 18, 299–311.
46. Garg, H.; Arora, R. A nonlinear-programming methodology for multi-attribute decision-making problem with interval-valued intuitionistic fuzzy soft sets information. Appl. Intell. 2017, 1–16.
47. Garg, H.; Nancy. Non-linear programming method for multi-criteria decision-making problems under interval neutrosophic set environment. Appl. Intell. 2017, 1–15.
48. Saaty, T.L. Axiomatic foundation of the analytic hierarchy process. Manag. Sci. 1986, 32, 841–845.
49. Hwang, C.L.; Lin, M.J. Group Decision Making under Multiple Criteria: Methods and Applications; Springer: Berlin, Germany, 1987.
50. Arora, R.; Garg, H. A robust correlation coefficient measure of dual hesistant fuzzy soft sets and their application in decision-making. Eng. Appl. Artif. Intell. 2018, 72, 80–92.
51. Garg, H.; Kumar, K. Some aggregation operators for linguistic intuitionistic fuzzy set and its application to group decision-making process using the set pair analysis. Arab. J. Sci. Eng. 2018, 43, 3213–3227.
52. Abdullah, L.; Najib, L. A new preference scale mcdm method based on interval-valued intuitionistic fuzzy sets and the analytic hierarchy process. Soft Comput. 2016, 20, 511–523.
53. Garg, H. Generalized intuitionistic fuzzy entropy-based approach for solving multi-attribute decision-making problems with unknown attribute weights. Proc. Natl. Acad. Sci. India Sect. A Phys. Sci. 2017, 1–11.
54. Xia, M.; Xu, Z. Entropy/cross entropy-based group decision-making under intuitionistic fuzzy environment. Inf. Fusion 2012, 13, 31–47.
55. Garg, H.; Nancy. Linguistic single-valued neutrosophic prioritized aggregation operators and their applications to multiple-attribute group decision-making. J. Ambient Intell. Humaniz. Comput. 2018, 1–23.
56. De, S.K.; Biswas, R.; Roy, A.R. Some operations on intuitionistic fuzzy sets. Fuzzy Sets Syst. 2000, 117, 477–484.
57. Zeng, W.; Li, H. Relationship between similarity measure and entropy of interval-valued fuzzy sets. Fuzzy Sets Syst. 2006, 157, 1477–1484.
58. Hung, W.L.; Yang, M.S. Fuzzy Entropy on intuitionistic fuzzy sets. Int. J. Intell. Syst. 2006, 21, 443–451.
59. Garg, H. Some methods for strategic decision-making problems with immediate probabilities in Pythagorean fuzzy environment. Int. J. Intell. Syst. 2018, 33, 687–712.
60. Garg, H. Linguistic Pythagorean fuzzy sets and its applications in multiattribute decision-making process. Int. J. Intell. Syst. 2018, 33, 1234–1263.
61. Garg, H. Generalized interaction aggregation operators in intuitionistic fuzzy multiplicative preference environment and their application to multicriteria decision-making. Appl. Intell. 2017, 1–17.
62. Garg, H.; Arora, R. Generalized and Group-based Generalized intuitionistic fuzzy soft sets with applications in decision-making. Appl. Intell. 2018, 48, 343–356.
School of Mathematics, Thapar Institute of Engineering & Technology, Deemed University, Patiala 147004, Punjab, India
*Author to whom correspondence should be addressed.
© 2018. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Abstract
The objective of this manuscript is to present a novel information measure for measuring the degree of fuzziness in intuitionistic fuzzy sets (IFSs). To achieve it, we define an (R,S)-norm-based information measure called the entropy to measure the degree of fuzziness of the set. Then, we prove that the proposed entropy measure is a valid measure and satisfies certain properties. An illustrative example related to a linguistic variable is given to demonstrate it. Then, we utilize it to propose two decision-making approaches to solve the multi-attribute decision-making (MADM) problem in the IFS environment by considering the attribute weights as either partially known or completely unknown. Finally, a practical example is provided to illustrate the decision-making process. The results corresponding to different pairs of (R,S) give different choices to the decision-maker to assess their results.