Article Open Access

Some Interval-Valued Pythagorean Fuzzy Einstein Weighted Averaging Aggregation Operators and Their Application to Group Decision Making

  • Khaista Rahman, Saleem Abdullah and Muhammad Sajjad Ali Khan
Published/Copyright: February 24, 2018

Abstract

In this paper, we introduce two Einstein aggregation operators, namely the interval-valued Pythagorean fuzzy Einstein weighted averaging aggregation operator and the interval-valued Pythagorean fuzzy Einstein ordered weighted averaging aggregation operator. We also discuss some of their desirable properties, such as idempotency, boundedness, commutativity, and monotonicity. The main advantage of the proposed operators is that they give the decision makers a more complete view of the problem and provide more accurate and precise results than the existing method. Finally, we apply these operators to multiple-attribute group decision making under interval-valued Pythagorean fuzzy information: we construct an algorithm for multiple-attribute group decision making and illustrate it with a numerical example.

1 Introduction

Atanassov [1] introduced the concept of intuitionistic fuzzy sets (IFSs), characterized by a membership function and a non-membership function. It is more suitable for dealing with fuzziness and uncertainty than the ordinary fuzzy set developed by Zadeh [33], which is characterized by a membership function only. Since 1986, many scholars [2], [3], [4], [5], [6], [22] have worked on IFSs and their applications. In particular, information aggregation is a crucial research area in IFS theory that has been receiving more and more attention. Xu [23] developed some basic arithmetic aggregation operators, including the intuitionistic fuzzy weighted averaging (IFWA) aggregation operator, the intuitionistic fuzzy ordered weighted averaging (IFOWA) aggregation operator, and the intuitionistic fuzzy hybrid averaging (IFHA) aggregation operator, and applied them to group decision making. Xu and Yager [26] defined some basic geometric aggregation operators, such as the intuitionistic fuzzy weighted geometric (IFWG) aggregation operator, the intuitionistic fuzzy ordered weighted geometric (IFOWG) aggregation operator, and the intuitionistic fuzzy hybrid geometric (IFHG) aggregation operator. In Refs. [24], [25], Chen and Xu introduced a series of new aggregation operators, such as the interval-valued IFWA (IIFWA) aggregation operator, the interval-valued IFOWA (IIFOWA) aggregation operator, the interval-valued IFHA (IIFHA) aggregation operator, the interval-valued IFWG (IIFWG) aggregation operator, the interval-valued IFOWG (IIFOWG) aggregation operator, and the interval-valued IFHG (IIFHG) aggregation operator. In Refs. 
[20], [21], Wang and Liu introduced the concept of intuitionistic fuzzy Einstein weighted geometric (IFEWG) aggregation operator, intuitionistic fuzzy Einstein ordered weighted geometric (IFEOWG) aggregation operator, intuitionistic fuzzy Einstein weighted averaging (IFEWA) aggregation operator, and intuitionistic fuzzy Einstein ordered weighted averaging (IFEOWA) aggregation operator, and applied them to group decision making. In Refs. [29], [30], [31], [32], Yu also worked in the field of IFS theory and introduced many aggregation operators and applied them to group decision making. However, there are many cases where the decision maker may provide the degree of membership and non-membership of a particular attribute in such a way that their sum is greater than one. Therefore, Yager [27] introduced the concept of another set called Pythagorean fuzzy set. The Pythagorean fuzzy set is a more powerful tool for solving uncertain problems. Like intuitionistic fuzzy aggregation operators, Pythagorean fuzzy aggregation operators have also become an interesting and important area for research, after the advent of the Pythagorean fuzzy set theory. In Ref. [28], Yager and Abbasov introduced the notion of two new Pythagorean fuzzy aggregation operators, such as Pythagorean fuzzy weighted averaging (PFWA) aggregation operator and Pythagorean fuzzy ordered weighted averaging (PFOWA) operator. In Refs. [12], [13], [14], [16], [17], Rahman et al. introduced the concept of Pythagorean fuzzy hybrid averaging (PFHA) aggregation operator, Pythagorean fuzzy weighted geometric (PFWG) aggregation operator, Pythagorean fuzzy ordered weighted geometric (PFOWG) operator, Pythagorean fuzzy hybrid geometric (PFHG) aggregation operator, and Pythagorean fuzzy Einstein weighted geometric (PFEWG) operator, and applied them to group decision making. In Refs. 
[7], [8], Garg introduced the notion of the Pythagorean fuzzy Einstein weighted averaging (PFEWA) aggregation operator, the Pythagorean fuzzy Einstein ordered weighted averaging (PFEOWA) operator, the generalized PFEWA (GPFEWA) aggregation operator, the generalized PFEOWA (GPFEOWA) aggregation operator, the PFWG aggregation operator, the PFOWG aggregation operator, the PFEWG aggregation operator, the Pythagorean fuzzy Einstein ordered weighted geometric (PFEOWG) aggregation operator, and the generalized PFEOWG (GPFEOWG) aggregation operator, and applied them to group decision making. In Ref. [11], Peng and Yang introduced the notion of the interval-valued PFWA (IVPFWA) aggregation operator and the interval-valued PFWG (IVPFWG) aggregation operator. In Refs. [15], [18], Rahman et al. introduced the concept of the interval-valued PFOWA (IVPFOWA) aggregation operator, the interval-valued PFHA (IVPFHA) aggregation operator, the interval-valued PFOWG (IVPFOWG) aggregation operator, and the interval-valued Pythagorean fuzzy hybrid weighted geometric operator, and applied them to group decision making. In Refs. [9], [10], [19], [34], many scholars worked on Pythagorean fuzzy set theory and aggregation operators.

Thus, keeping in view the advantages of these operators, in this paper we introduce the notion of the interval-valued PFEWA (IVPFEWA) aggregation operator and the interval-valued PFEOWA (IVPFEOWA) aggregation operator. A comparison with the existing method shows that the method developed in this paper is a good complement to it.

The remainder of this paper is structured as follows. In Section 2, we give some basic definitions and results, which will be used in later sections. In Section 3, we introduce some Einstein operations for interval-valued Pythagorean fuzzy values. In Section 4, we introduce the notion of IVPFEWA and IVPFEOWA aggregation operators. In Section 5, we apply these operators to deal with multiple-attribute group decision-making (MAGDM) problems with Pythagorean fuzzy information. In Section 6, we develop a numerical example. In Section 7, we present our conclusion.

2 Preliminaries

Definition 1 ([11]): Let K be a fixed set, then an IVPFS can be defined as

(1) I={⟨k, uI(k), vI(k)⟩ | k∈K},

where

(2) uI(k)=[uIa(k), uIb(k)]⊆[0, 1],
(3) vI(k)=[vIa(k), vIb(k)]⊆[0, 1].

As

(4) uIa(k)=inf(uI(k)),
(5) uIb(k)=sup(uI(k)),
(6) vIa(k)=inf(vI(k)),
(7) vIb(k)=sup(vI(k)),

and

(8) 0 ≤ (uIb(k))²+(vIb(k))² ≤ 1.

If

(9) πI(k)=[πIa(k),πIb(k)],

then it is called the interval-valued Pythagorean fuzzy index of k to I, where

(10) πIa(k)=√(1−(uIb(k))²−(vIb(k))²),
(11) πIb(k)=√(1−(uIa(k))²−(vIa(k))²).
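As a quick sanity check on Definition 1, the constraint of Eq. (8) and the index of Eqs. (10) and (11) translate directly into code. The following Python sketch is ours (function names and the sample value are illustrative, not from the paper):

```python
import math

# An IVPFN is stored as a pair of intervals ([ua, ub], [va, vb]);
# the representation and names below are our own convention.
def is_valid_ivpfn(u, v):
    """Check Eq. (8): 0 <= ub^2 + vb^2 <= 1, with both sub-intervals in [0, 1]."""
    (ua, ub), (va, vb) = u, v
    intervals_ok = 0 <= ua <= ub <= 1 and 0 <= va <= vb <= 1
    return intervals_ok and ub ** 2 + vb ** 2 <= 1

def hesitancy_interval(u, v):
    """Interval-valued Pythagorean fuzzy index of Eqs. (10)-(11)."""
    (ua, ub), (va, vb) = u, v
    pi_a = math.sqrt(1 - ub ** 2 - vb ** 2)   # lower bound, Eq. (10)
    pi_b = math.sqrt(1 - ua ** 2 - va ** 2)   # upper bound, Eq. (11)
    return pi_a, pi_b

pi_a, pi_b = hesitancy_interval([0.6, 0.7], [0.3, 0.5])
```

For the sample value, πIa=√0.26≈0.5099 and πIb=√0.55≈0.7416.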

Definition 2 ([11]). Let λ=([uλ, vλ ], [xλ, yλ ]) be an IVPFN, then the score function and accuracy function of λ can be defined as

(12) S(λ)=(1/2)[(uλ)²+(vλ)²−(xλ)²−(yλ)²],

and

(13) H(λ)=(1/2)[(uλ)²+(vλ)²+(xλ)²+(yλ)²].

If λ1 and λ2 are two IVPFNs, then

  1. If S(λ1)<S(λ2), then λ1<λ2.

  2. If S(λ1)=S(λ2), then we have the following three conditions:

  1. If H(λ1)=H(λ2), then λ1=λ2.

  2. If H(λ1)<H(λ2), then λ1<λ2.

  3. If H(λ1)>H(λ2), then λ1>λ2.
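The score-then-accuracy ranking above can be sketched in a few lines of Python; the helper names are ours, and the sample value is λ1 of Example 1 below:

```python
# Sketch of the comparison rules of Definition 2 for an IVPFN
# lam = ([u, v], [x, y]); function names are illustrative, not the paper's.
def score(lam):
    (u, v), (x, y) = lam
    return 0.5 * (u**2 + v**2 - x**2 - y**2)   # Eq. (12)

def accuracy(lam):
    (u, v), (x, y) = lam
    return 0.5 * (u**2 + v**2 + x**2 + y**2)   # Eq. (13)

def ranks_below(l1, l2):
    """True when l1 < l2: compare scores first, then accuracies."""
    if score(l1) != score(l2):
        return score(l1) < score(l2)
    return accuracy(l1) < accuracy(l2)

lam1 = ([0.3, 0.4], [0.5, 0.7])   # value lambda1 of Example 1
```

For lam1, S=(1/2)(0.09+0.16−0.25−0.49)=−0.245 and H=0.495.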

Definition 3 ([11]): Let λj=([uλj,vλj],[xλj,yλj]) (j=1,2,...,n) be a collection of IVPFVs, and let IVPFWA: Θn→Θ, if

(14) IVPFWAw(λ1, λ2, λ3, …, λn)=([√(1−∏j=1..n(1−(uλj)²)^wj), √(1−∏j=1..n(1−(vλj)²)^wj)], [∏j=1..n(xλj)^wj, ∏j=1..n(yλj)^wj]),

where w=(w1, w2, …, wn)T is the weighted vector of λj with wj∈[0, 1] and ∑j=1..n wj=1. Then, IVPFWA is called the interval-valued Pythagorean fuzzy weighted averaging operator.
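Eq. (14) translates directly into Python; the sketch below uses our own naming, and the test data are the four IVPFVs of Example 1 in Section 4:

```python
import math

# Sketch of the IVPFWA operator of Eq. (14); values are IVPFNs
# ([u, v], [x, y]) and the weights w are assumed to sum to 1.
def ivpfwa(values, w):
    pu = pv = px = py = 1.0
    for ((u, v), (x, y)), wj in zip(values, w):
        pu *= (1 - u**2) ** wj   # product over (1 - u^2)^wj
        pv *= (1 - v**2) ** wj
        px *= x ** wj            # geometric part for non-membership
        py *= y ** wj
    return ([math.sqrt(1 - pu), math.sqrt(1 - pv)], [px, py])

vals = [([0.3, 0.4], [0.5, 0.7]), ([0.2, 0.6], [0.3, 0.6]),
        ([0.3, 0.6], [0.3, 0.5]), ([0.4, 0.7], [0.2, 0.6])]
agg = ivpfwa(vals, [0.1, 0.2, 0.3, 0.4])   # the data of Example 1
```

For these inputs the result agrees with the paper's ([0.3306, 0.6321], [0.2684, 0.5768]) to three decimal places.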

Definition 4 ([18]): Let λj=([uλj,vλj],[xλj,yλj]) (j=1,2,...,n) be a collection of IVPFVs, and let IVPFOWA: Θn→Θ, if

(15) IVPFOWAw(λ1, λ2, λ3, …, λn)=([√(1−∏j=1..n(1−(uλσ(j))²)^wj), √(1−∏j=1..n(1−(vλσ(j))²)^wj)], [∏j=1..n(xλσ(j))^wj, ∏j=1..n(yλσ(j))^wj]),

where w=(w1, w2, …, wn)T is the weighted vector of λj with wj∈[0, 1] and ∑j=1..n wj=1, and λσ(j) is the jth largest value of the λj. Then, IVPFOWA is called the interval-valued Pythagorean fuzzy ordered weighted averaging operator.

Definition 5 ([18]): An IVPFHA operator of dimension n is a mapping IVPFHA: Θn→Θ, which has an associated vector w=(w1, w2, …, wn)T, such that wj∈[0, 1] and ∑j=1..n wj=1. Furthermore,

(16) IVPFHAω,w(λ1, λ2, λ3, …, λn)=([√(1−∏j=1..n(1−(uλ̇σ(j))²)^wj), √(1−∏j=1..n(1−(vλ̇σ(j))²)^wj)], [∏j=1..n(xλ̇σ(j))^wj, ∏j=1..n(yλ̇σ(j))^wj]),

where λ̇σ(j) is the jth largest of the weighted IVPFVs λ̇j (λ̇j=nωjλj), ω=(ω1, ω2, …, ωn)T is the weighted vector of the λj (j=1, 2, …, n) such that ωj∈[0, 1] and ∑j=1..n ωj=1, and n is the balancing coefficient, which plays a balancing role. If the vector (ω1, ω2, …, ωn)T approaches (1/n, 1/n, …, 1/n)T, then the vector (nω1λ1, nω2λ2, …, nωnλn)T approaches (λ1, λ2, …, λn)T.

3 Some Einstein Operations of Interval-Valued Pythagorean Fuzzy Sets

Definition 6: Let λ=([u, v], [x, y]), λ1=([u1, v1], [x1, y1]), and λ2=([u2, v2], [x2, y2]) be three IVPFNs and δ>0, then some Einstein operations for λ, λ1, λ2 can be defined as follows:

(17) λ1⊕ελ2=([√((u1²+u2²)/(1+u1²u2²)), √((v1²+v2²)/(1+v1²v2²))], [x1x2/√(1+(1−x1²)(1−x2²)), y1y2/√(1+(1−y1²)(1−y2²))]).
(18) λ1⊗ελ2=([u1u2/√(1+(1−u1²)(1−u2²)), v1v2/√(1+(1−v1²)(1−v2²))], [√((x1²+x2²)/(1+x1²x2²)), √((y1²+y2²)/(1+y1²y2²))]).
(19) δλ=([√(((1+u²)^δ−(1−u²)^δ)/((1+u²)^δ+(1−u²)^δ)), √(((1+v²)^δ−(1−v²)^δ)/((1+v²)^δ+(1−v²)^δ))], [√(2(x²)^δ/((2−x²)^δ+(x²)^δ)), √(2(y²)^δ/((2−y²)^δ+(y²)^δ))]).
(20) λ^δ=([√(2(u²)^δ/((2−u²)^δ+(u²)^δ)), √(2(v²)^δ/((2−v²)^δ+(v²)^δ))], [√(((1+x²)^δ−(1−x²)^δ)/((1+x²)^δ+(1−x²)^δ)), √(((1+y²)^δ−(1−y²)^δ)/((1+y²)^δ+(1−y²)^δ))]).
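Two of these operations, the Einstein sum of Eq. (17) and the Einstein scalar multiple of Eq. (19), are sketched below in Python (helper names are ours). Two useful sanity checks: the Einstein sum is commutative, and δλ with δ=1 returns λ itself:

```python
import math

# Einstein operations on IVPFNs ([u, v], [x, y]); our naming.
def einstein_sum(l1, l2):
    """Einstein sum of Eq. (17)."""
    (u1, v1), (x1, y1) = l1
    (u2, v2), (x2, y2) = l2
    u = math.sqrt((u1**2 + u2**2) / (1 + u1**2 * u2**2))
    v = math.sqrt((v1**2 + v2**2) / (1 + v1**2 * v2**2))
    x = x1 * x2 / math.sqrt(1 + (1 - x1**2) * (1 - x2**2))
    y = y1 * y2 / math.sqrt(1 + (1 - y1**2) * (1 - y2**2))
    return ([u, v], [x, y])

def einstein_scalar(d, lam):
    """Einstein scalar multiple of Eq. (19)."""
    (u, v), (x, y) = lam
    f = lambda t: math.sqrt(((1 + t**2)**d - (1 - t**2)**d) /
                            ((1 + t**2)**d + (1 - t**2)**d))
    g = lambda t: math.sqrt(2 * (t**2)**d / ((2 - t**2)**d + (t**2)**d))
    return ([f(u), f(v)], [g(x), g(y)])

l1 = ([0.3, 0.4], [0.5, 0.7])
l2 = ([0.2, 0.6], [0.3, 0.6])
s12 = einstein_sum(l1, l2)
s21 = einstein_sum(l2, l1)
unit = einstein_scalar(1, l1)   # should equal l1 component-wise
```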

4 Some Interval-Valued Pythagorean Fuzzy Einstein Averaging Aggregation Operators

In this section, we introduce two interval-valued Einstein aggregation operators, namely the IVPFEWA and IVPFEOWA operators. We also discuss some desirable properties of these proposed operators, such as idempotency, boundedness, commutativity, and monotonicity. These operators provide more accurate and precise results as compared to the existing method.

4.1 IVPFEWA Aggregation Operator

Definition 7. Let λj=([uj, vj], [xj, yj]) (j=1, 2, 3, …, n) be the collection of IVPFVs, then an IVPFEWA operator of dimension n is a mapping IVPFEWAw: Θn→Θ, and

(21) IVPFEWAw(λ1, λ2, λ3, …, λn)=([√((∏j=1..n(1+(uλj)²)^wj−∏j=1..n(1−(uλj)²)^wj)/(∏j=1..n(1+(uλj)²)^wj+∏j=1..n(1−(uλj)²)^wj)), √((∏j=1..n(1+(vλj)²)^wj−∏j=1..n(1−(vλj)²)^wj)/(∏j=1..n(1+(vλj)²)^wj+∏j=1..n(1−(vλj)²)^wj))], [√(2∏j=1..n((xλj)²)^wj/(∏j=1..n(2−(xλj)²)^wj+∏j=1..n((xλj)²)^wj)), √(2∏j=1..n((yλj)²)^wj/(∏j=1..n(2−(yλj)²)^wj+∏j=1..n((yλj)²)^wj))]),

where w=(w1, w2, w3, …, wn)T is the weighted vector of λj such that wj∈[0, 1] and ∑j=1..n wj=1.
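Assuming the weights sum to one, Eq. (21) translates directly into Python (our naming; the sample data are the IVPFVs of Example 1 below):

```python
import math

# Sketch of the IVPFEWA operator of Eq. (21); values are IVPFNs
# ([u, v], [x, y]) and the weights w are assumed to sum to 1.
def ivpfewa(values, w):
    au = bu = av = bv = px = qx = py = qy = 1.0
    for ((u, v), (x, y)), wj in zip(values, w):
        au *= (1 + u**2) ** wj;  bu *= (1 - u**2) ** wj
        av *= (1 + v**2) ** wj;  bv *= (1 - v**2) ** wj
        px *= (x**2) ** wj;      qx *= (2 - x**2) ** wj
        py *= (y**2) ** wj;      qy *= (2 - y**2) ** wj
    return ([math.sqrt((au - bu) / (au + bu)),
             math.sqrt((av - bv) / (av + bv))],
            [math.sqrt(2 * px / (qx + px)),
             math.sqrt(2 * py / (qy + py))])

vals = [([0.3, 0.4], [0.5, 0.7]), ([0.2, 0.6], [0.3, 0.6]),
        ([0.3, 0.6], [0.3, 0.5]), ([0.4, 0.7], [0.2, 0.6])]
agg = ivpfewa(vals, [0.1, 0.2, 0.3, 0.4])   # reproduces Example 1
```

For these inputs the result matches the paper's ([0.3289, 0.6293], [0.2693, 0.5790]) to within rounding.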

Theorem 1. Let λj=([uλj,vλj],[xλj,yλj])(j=1,2,...,n) be the collection of IVPFVs, then their aggregated value by using the IVPFEWA operator is also an IVPFV, and

(22) IVPFEWAw(λ1, λ2, λ3, …, λn)=([√((∏j=1..n(1+(uλj)²)^wj−∏j=1..n(1−(uλj)²)^wj)/(∏j=1..n(1+(uλj)²)^wj+∏j=1..n(1−(uλj)²)^wj)), √((∏j=1..n(1+(vλj)²)^wj−∏j=1..n(1−(vλj)²)^wj)/(∏j=1..n(1+(vλj)²)^wj+∏j=1..n(1−(vλj)²)^wj))], [√(2∏j=1..n((xλj)²)^wj/(∏j=1..n(2−(xλj)²)^wj+∏j=1..n((xλj)²)^wj)), √(2∏j=1..n((yλj)²)^wj/(∏j=1..n(2−(yλj)²)^wj+∏j=1..n((yλj)²)^wj))]),

where w=(w1, w2, w3, …, wn)T is the weighted vector of λj (j=1, 2, 3, …, n) such that wj∈[0, 1] (j=1, 2, 3, …, n) and ∑j=1..n wj=1.

Proof. We can prove this theorem by mathematical induction. First we show that Eq. (22) holds for n=1.

Taking the left-hand side,

(23) IVPFEWAw(λ)=([√(((1+(uλ)²)^w−(1−(uλ)²)^w)/((1+(uλ)²)^w+(1−(uλ)²)^w)), √(((1+(vλ)²)^w−(1−(vλ)²)^w)/((1+(vλ)²)^w+(1−(vλ)²)^w))], [√(2((xλ)²)^w/((2−(xλ)²)^w+((xλ)²)^w)), √(2((yλ)²)^w/((2−(yλ)²)^w+((yλ)²)^w))]).

Taking the right-hand side,

(24) ([√((∏j=1..n(1+(uλj)²)^wj−∏j=1..n(1−(uλj)²)^wj)/(∏j=1..n(1+(uλj)²)^wj+∏j=1..n(1−(uλj)²)^wj)), √((∏j=1..n(1+(vλj)²)^wj−∏j=1..n(1−(vλj)²)^wj)/(∏j=1..n(1+(vλj)²)^wj+∏j=1..n(1−(vλj)²)^wj))], [√(2∏j=1..n((xλj)²)^wj/(∏j=1..n(2−(xλj)²)^wj+∏j=1..n((xλj)²)^wj)), √(2∏j=1..n((yλj)²)^wj/(∏j=1..n(2−(yλj)²)^wj+∏j=1..n((yλj)²)^wj))])=([√(((1+(uλ)²)^w−(1−(uλ)²)^w)/((1+(uλ)²)^w+(1−(uλ)²)^w)), √(((1+(vλ)²)^w−(1−(vλ)²)^w)/((1+(vλ)²)^w+(1−(vλ)²)^w))], [√(2((xλ)²)^w/((2−(xλ)²)^w+((xλ)²)^w)), √(2((yλ)²)^w/((2−(yλ)²)^w+((yλ)²)^w))]).

From Eqs. (23) and (24), we have Eq. (22) holds for n=1. Now we show that Eq. (22) holds for n=k. That is

(25) IVPFEWAw(λ1, λ2, λ3, …, λk)=([√((∏j=1..k(1+(uλj)²)^wj−∏j=1..k(1−(uλj)²)^wj)/(∏j=1..k(1+(uλj)²)^wj+∏j=1..k(1−(uλj)²)^wj)), √((∏j=1..k(1+(vλj)²)^wj−∏j=1..k(1−(vλj)²)^wj)/(∏j=1..k(1+(vλj)²)^wj+∏j=1..k(1−(vλj)²)^wj))], [√(2∏j=1..k((xλj)²)^wj/(∏j=1..k(2−(xλj)²)^wj+∏j=1..k((xλj)²)^wj)), √(2∏j=1..k((yλj)²)^wj/(∏j=1..k(2−(yλj)²)^wj+∏j=1..k((yλj)²)^wj))]).

If Eq. (25) holds for n=k, then we show that Eq. (22) also holds for n=k+1.

(26) IVPFEWAw(λ1, λ2, λ3, …, λk+1)=([√((∏j=1..k(1+(uλj)²)^wj−∏j=1..k(1−(uλj)²)^wj)/(∏j=1..k(1+(uλj)²)^wj+∏j=1..k(1−(uλj)²)^wj)), √((∏j=1..k(1+(vλj)²)^wj−∏j=1..k(1−(vλj)²)^wj)/(∏j=1..k(1+(vλj)²)^wj+∏j=1..k(1−(vλj)²)^wj))], [√(2∏j=1..k((xλj)²)^wj/(∏j=1..k(2−(xλj)²)^wj+∏j=1..k((xλj)²)^wj)), √(2∏j=1..k((yλj)²)^wj/(∏j=1..k(2−(yλj)²)^wj+∏j=1..k((yλj)²)^wj))])⊕ε([√(((1+(uλk+1)²)^wk+1−(1−(uλk+1)²)^wk+1)/((1+(uλk+1)²)^wk+1+(1−(uλk+1)²)^wk+1)), √(((1+(vλk+1)²)^wk+1−(1−(vλk+1)²)^wk+1)/((1+(vλk+1)²)^wk+1+(1−(vλk+1)²)^wk+1))], [√(2((xλk+1)²)^wk+1/((2−(xλk+1)²)^wk+1+((xλk+1)²)^wk+1)), √(2((yλk+1)²)^wk+1/((2−(yλk+1)²)^wk+1+((yλk+1)²)^wk+1))]).

Let

t1=∏j=1..k(1+(uλj)²)^wj−∏j=1..k(1−(uλj)²)^wj,

t2=∏j=1..k(1+(uλj)²)^wj+∏j=1..k(1−(uλj)²)^wj,

p1=∏j=1..k(1+(vλj)²)^wj−∏j=1..k(1−(vλj)²)^wj,

p2=∏j=1..k(1+(vλj)²)^wj+∏j=1..k(1−(vλj)²)^wj,

r1=2∏j=1..k((xλj)²)^wj,

r2=∏j=1..k(2−(xλj)²)^wj+∏j=1..k((xλj)²)^wj,

s1=2∏j=1..k((yλj)²)^wj,

s2=∏j=1..k(2−(yλj)²)^wj+∏j=1..k((yλj)²)^wj,

w1=(1+(uλk+1)²)^wk+1−(1−(uλk+1)²)^wk+1,

w2=(1+(uλk+1)²)^wk+1+(1−(uλk+1)²)^wk+1,

a1=(1+(vλk+1)²)^wk+1−(1−(vλk+1)²)^wk+1,

a2=(1+(vλk+1)²)^wk+1+(1−(vλk+1)²)^wk+1,

b1=2((xλk+1)²)^wk+1,

b2=(2−(xλk+1)²)^wk+1+((xλk+1)²)^wk+1,

c1=2((yλk+1)²)^wk+1,

c2=(2−(yλk+1)²)^wk+1+((yλk+1)²)^wk+1.

Now putting these values in Eq. (26), we have

(27) IVPFEWAw(λ1, λ2, λ3, …, λk+1)=([√(t1/t2), √(p1/p2)], [√(r1/r2), √(s1/s2)])⊕ε([√(w1/w2), √(a1/a2)], [√(b1/b2), √(c1/c2)])=([√((t1w2+t2w1)/(t2w2+t1w1)), √((p1a2+p2a1)/(p2a2+p1a1))], [√(r1b1/(2r2b2−r2b1−r1b2+r1b1)), √(s1c1/(2s2c2−s2c1−s1c2+s1c1))]).

Again, substituting the values of t1w2+t2w1, t2w2+t1w1, p1a2+p2a1, p2a2+p1a1, r1b1, 2r2b2−r2b1−r1b2+r1b1, s1c1, and 2s2c2−s2c1−s1c2+s1c1 into Eq. (27), we have

IVPFEWAw(λ1, λ2, λ3, …, λk+1)=([√((∏j=1..k+1(1+(uλj)²)^wj−∏j=1..k+1(1−(uλj)²)^wj)/(∏j=1..k+1(1+(uλj)²)^wj+∏j=1..k+1(1−(uλj)²)^wj)), √((∏j=1..k+1(1+(vλj)²)^wj−∏j=1..k+1(1−(vλj)²)^wj)/(∏j=1..k+1(1+(vλj)²)^wj+∏j=1..k+1(1−(vλj)²)^wj))], [√(2∏j=1..k+1((xλj)²)^wj/(∏j=1..k+1(2−(xλj)²)^wj+∏j=1..k+1((xλj)²)^wj)), √(2∏j=1..k+1((yλj)²)^wj/(∏j=1..k+1(2−(yλj)²)^wj+∏j=1..k+1((yλj)²)^wj))]).

Hence, Eq. (22) holds for n=k+1. Thus, Eq. (22) holds for all n.□

Lemma 1 ([16]). Let λj>0, wj>0 (j=1, 2, …, n) and ∑j=1..n wj=1, then

(28) ∏j=1..n(λj)^wj ≤ ∑j=1..n wjλj,

where the equality holds if and only if λ1=λ2= … =λn.
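Lemma 1 is the weighted arithmetic–geometric mean inequality; a small numerical illustration, with made-up values, is:

```python
# Illustration of Lemma 1: the weighted geometric mean never
# exceeds the weighted arithmetic mean. Values below are ours.
lams = [0.5, 1.0, 2.0, 4.0]
w = [0.1, 0.2, 0.3, 0.4]

geo = 1.0   # prod lambda_j^{w_j}
ari = 0.0   # sum  w_j * lambda_j
for lj, wj in zip(lams, w):
    geo *= lj ** wj
    ari += wj * lj
# Here geo = 2^(-0.1) * 2^0.3 * 2^0.8 = 2.0, while ari = 2.45.
```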

Theorem 2. Let λj=([uλj, vλj], [xλj, yλj]) (j=1, 2, …, n) be the collection of IVPFVs, where the weighted vector of λj is w=(w1, w2, …, wn)T such that wj∈[0, 1] and ∑j=1..n wj=1, then

(29) IVPFEWAw(λ1, λ2, λ3, …, λn) ≤ IVPFWAw(λ1, λ2, λ3, …, λn).

Proof. Straightforward.□

Example 1. Let

λ1=([0.3,0.4],[0.5,0.7]),λ2=([0.2,0.6],[0.3,0.6]),λ3=([0.3,0.6],[0.3,0.5]),λ4=([0.4,0.7],[0.2,0.6]),

and let w=(0.1, 0.2, 0.3, 0.4)T be the weighted vector of λj (j=1, 2, 3, 4), then we have

Using Eq. (22) with n=4, IVPFEWAw(λ1, λ2, λ3, λ4)=([0.3289, 0.6293], [0.2693, 0.5790]).

Now, using Eq. (14),

IVPFWAw(λ1, λ2, λ3, λ4)=([0.3306, 0.6321], [0.2684, 0.5768]).

The Einstein aggregate has smaller membership bounds and larger non-membership bounds than the ordinary aggregate, so IVPFEWAw(λ1, λ2, λ3, λ4) ≤ IVPFWAw(λ1, λ2, λ3, λ4), which illustrates Theorem 2.

Theorem 3. Commutativity: Let λj and λj′ (j=1, 2, …, n) be two collections of IVPFVs, where (λ1′, λ2′, …, λn′) is any permutation of (λ1, λ2, …, λn), then

(30) IVPFEWAw(λ1, λ2, λ3, …, λn)=IVPFEWAw(λ1′, λ2′, λ3′, …, λn′).

Proof. As we know that

(31) IVPFEWAw(λ1, λ2, λ3, …, λn)=w1λ1⊕εw2λ2⊕ε…⊕εwnλn,

and

(32) IVPFEWAw(λ1′, λ2′, λ3′, …, λn′)=w1λ1′⊕εw2λ2′⊕ε…⊕εwnλn′,

as (λ1′, λ2′, λ3′, …, λn′) is any permutation of (λ1, λ2, λ3, …, λn). Thus, Eq. (30) always holds.□

Theorem 4. Idempotency: If λj=λ for all j (j=1, 2, 3, …, n), where λ=([u, v], [x, y]), then

(33) IVPFEWAw(λ1,λ2,λ3,...,λn)=λ.

Proof. As λj=λ for all j, then we have

IVPFEWAw(λ1, λ2, λ3, …, λn)=w1λ⊕εw2λ⊕εw3λ⊕ε…⊕εwnλ=(w1+w2+w3+…+wn)λ=λ.

This completes the proof.□

Theorem 5. Boundedness: Let λj=([uλj, vλj], [xλj, yλj]) (j=1, 2, …, n) be a collection of IVPFVs and let w=(w1, w2, …, wn)T be the weighted vector of λj, such that wj∈[0, 1] and ∑j=1..n wj=1, then

(34) λmin ≤ IVPFEWAw(λ1, λ2, λ3, …, λn) ≤ λmax,

for all wj and also

(35) λmax=maxj(λj),
(36) λmin=minj(λj).

Proof. Let

(37) IVPFEWAw(λ1, λ2, λ3, …, λn)=λ=([u, v], [x, y]).

Now by the score function, we have

(38) ([umin, vmin], [xmax, ymax]) ≤ ([u, v], [x, y]),
(39) ([umax, vmax], [xmin, ymin]) ≥ ([u, v], [x, y]).

Thus, from Eqs. (38) and (39), we have

λmin ≤ IVPFEWAw(λ1, λ2, λ3, …, λn) ≤ λmax.

Thus, Eq. (34) always holds.□

Theorem 6. Monotonicity: If λj ≤ λj′ for all j, where j=1, 2, 3, …, n, then

(40) IVPFEWAw(λ1, λ2, λ3, …, λn) ≤ IVPFEWAw(λ1′, λ2′, λ3′, …, λn′).

Proof. As we know that

(41) IVPFEWAw(λ1, λ2, λ3, …, λn)=w1λ1⊕εw2λ2⊕ε…⊕εwnλn,

and

(42) IVPFEWAw(λ1′, λ2′, λ3′, …, λn′)=w1λ1′⊕εw2λ2′⊕ε…⊕εwnλn′,

as λj ≤ λj′ for all j. Thus, Eq. (40) always holds.□

4.2 IVPFEOWA Aggregation Operator

Definition 8: Let λj (j=1, 2, 3, …, n) be a collection of IVPFVs, then an IVPFEOWA operator of dimension n is a mapping IVPFEOWAw: Θn→Θ, and

(43) IVPFEOWAw(λ1, λ2, λ3, …, λn)=([√((∏j=1..n(1+(uλσ(j))²)^wj−∏j=1..n(1−(uλσ(j))²)^wj)/(∏j=1..n(1+(uλσ(j))²)^wj+∏j=1..n(1−(uλσ(j))²)^wj)), √((∏j=1..n(1+(vλσ(j))²)^wj−∏j=1..n(1−(vλσ(j))²)^wj)/(∏j=1..n(1+(vλσ(j))²)^wj+∏j=1..n(1−(vλσ(j))²)^wj))], [√(2∏j=1..n((xλσ(j))²)^wj/(∏j=1..n(2−(xλσ(j))²)^wj+∏j=1..n((xλσ(j))²)^wj)), √(2∏j=1..n((yλσ(j))²)^wj/(∏j=1..n(2−(yλσ(j))²)^wj+∏j=1..n((yλσ(j))²)^wj))]),

where (σ(1), σ(2), …, σ(n)) is a permutation of (1, 2, …, n) such that λσ(j−1) ≥ λσ(j) for all j, and w=(w1, w2, …, wn)T is the weighted vector of the λσ(j) (j=1, 2, …, n) such that wj∈[0, 1] and ∑j=1..n wj=1.

Theorem 7. Let λj=([uλj,vλj],[xλj,yλj]) (j=1,2,,n) be the collection of IVPFVs, then their aggregated value by using the IVPFEOWA operator is also an IVPFV, and

(44) IVPFEOWAw(λ1, λ2, λ3, …, λn)=([√((∏j=1..n(1+(uλσ(j))²)^wj−∏j=1..n(1−(uλσ(j))²)^wj)/(∏j=1..n(1+(uλσ(j))²)^wj+∏j=1..n(1−(uλσ(j))²)^wj)), √((∏j=1..n(1+(vλσ(j))²)^wj−∏j=1..n(1−(vλσ(j))²)^wj)/(∏j=1..n(1+(vλσ(j))²)^wj+∏j=1..n(1−(vλσ(j))²)^wj))], [√(2∏j=1..n((xλσ(j))²)^wj/(∏j=1..n(2−(xλσ(j))²)^wj+∏j=1..n((xλσ(j))²)^wj)), √(2∏j=1..n((yλσ(j))²)^wj/(∏j=1..n(2−(yλσ(j))²)^wj+∏j=1..n((yλσ(j))²)^wj))]),

where (σ(1), σ(2), …, σ(n)) is a permutation of (1, 2, …, n) such that λσ(j−1) ≥ λσ(j) for all j, and w=(w1, w2, …, wn)T is the weighted vector of the λσ(j) (j=1, 2, …, n) such that wj∈[0, 1] (j=1, 2, …, n) and ∑j=1..n wj=1.

Proof. Proof is similar to Theorem 1.□
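A Python sketch of the IVPFEOWA operator (our naming): the arguments are first ordered decreasingly by the score function of Eq. (12), and the position weights are then applied exactly as in IVPFEWA. The check below exercises the idempotency property stated in Theorem 10:

```python
import math

def score(lam):
    """Score function of Eq. (12) for an IVPFN ([u, v], [x, y])."""
    (u, v), (x, y) = lam
    return 0.5 * (u**2 + v**2 - x**2 - y**2)

def ivpfeowa(values, w):
    """Sketch of the IVPFEOWA operator of Eq. (44); w sums to 1."""
    ordered = sorted(values, key=score, reverse=True)  # lambda_sigma(j)
    au = bu = av = bv = px = qx = py = qy = 1.0
    for ((u, v), (x, y)), wj in zip(ordered, w):
        au *= (1 + u**2) ** wj;  bu *= (1 - u**2) ** wj
        av *= (1 + v**2) ** wj;  bv *= (1 - v**2) ** wj
        px *= (x**2) ** wj;      qx *= (2 - x**2) ** wj
        py *= (y**2) ** wj;      qy *= (2 - y**2) ** wj
    return ([math.sqrt((au - bu) / (au + bu)),
             math.sqrt((av - bv) / (av + bv))],
            [math.sqrt(2 * px / (qx + px)),
             math.sqrt(2 * py / (qy + py))])

lam = ([0.4, 0.6], [0.2, 0.5])
same = ivpfeowa([lam] * 4, [0.1, 0.2, 0.3, 0.4])   # idempotency check
```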

Theorem 8. Let λj=([uλj, vλj], [xλj, yλj]) (j=1, 2, …, n) be a collection of IVPFVs, where the weighted vector of λj is w=(w1, w2, w3, …, wn)T such that wj∈[0, 1] and ∑j=1..n wj=1, then

(45) IVPFEOWAw(λ1, λ2, λ3, …, λn) ≤ IVPFOWAw(λ1, λ2, λ3, …, λn).

Proof. Straightforward.□

Theorem 9. Commutativity: If (λ1′, λ2′, …, λn′) is any permutation of (λ1, λ2, …, λn), then

(46) IVPFEOWAw(λ1, λ2, λ3, …, λn)=IVPFEOWAw(λ1′, λ2′, λ3′, …, λn′).

Proof. Proof is similar to Theorem 3.□

Theorem 10. Idempotency: If λj=λ for all j (j=1, 2, 3, …, n), where λ=([u, v], [x, y]), then

(47) IVPFEOWAw(λ1,λ2,λ3,,λn)=λ.

Proof. Proof is similar to Theorem 4.□

Theorem 11. Boundedness: Let λj=([uλj, vλj], [xλj, yλj]) (j=1, 2, …, n) be a collection of IVPFVs and let w=(w1, w2, …, wn)T be the weighted vector of the λσ(j), such that wj∈[0, 1] and ∑j=1..n wj=1, then

(48) λmin ≤ IVPFEOWAw(λ1, λ2, λ3, …, λn) ≤ λmax,

for all wj and

(49) λmax=maxj(λj),
(50) λmin=minj(λj).

Proof. Proof is similar to Theorem 5.□

Theorem 12. Monotonicity: If λj ≤ λj′ for all j, where j=1, 2, …, n, then

(51) IVPFEOWAw(λ1, λ2, λ3, …, λn) ≤ IVPFEOWAw(λ1′, λ2′, λ3′, …, λn′).

Proof. Proof is similar to Theorem 6.□

5 An Approach to the MAGDM Problem Based on Interval-Valued Pythagorean Fuzzy Information

Algorithm 1. Let X={X1, X2, X3, …, Xm} be a finite set of m alternatives, and let C={C1, C2, C3, …, Cn} be a finite set of attributes. Suppose the grades of the alternatives Xi (i=1, 2, …, m) on the attributes Cj (j=1, 2, …, n) given by the decision makers are IVPFNs. Let D={D1, D2, D3, …, Dk} be the set of k decision makers, let w=(w1, w2, …, wn)T be the weighted vector of the attributes Cj (j=1, 2, …, n), such that wj∈[0, 1] and ∑j=1..n wj=1, and let ω=(ω1, ω2, …, ωk)T be the weighted vector of the decision makers Ds (s=1, 2, …, k), such that ωs∈[0, 1] and ∑s=1..k ωs=1. Let D=(aji)=〈[uji, vji], [xji, yji]〉 (i=1, 2, …, m; j=1, 2, 3, …, n), where [uji, vji] indicates the interval degree to which the alternative Xi satisfies the attribute Cj and [xji, yji] indicates the interval degree to which the alternative Xi does not satisfy the attribute Cj. Also, [uji, vji]⊆[0, 1] and [xji, yji]⊆[0, 1], with the condition 0≤(vji)²+(yji)²≤1 (i=1, 2, …, m; j=1, 2, …, n). This method has the following steps.

  • Step 1: In this step, we construct the interval-valued Pythagorean fuzzy decision-making matrices, Ds=[aji(s)]n×m (s=1, 2, …, k) for the decision.

  • Step 2: If the criteria have two types, such as benefit criteria and cost criteria, then the interval-valued Pythagorean fuzzy decision matrices, Ds=[ajis]n×m can be converted into the normalized interval-valued Pythagorean fuzzy decision matrices, Rs=[rji(s)]n×m, where

    rji(s) = aji(s) for benefit criteria Cj, and rji(s) = āji(s) for cost criteria Cj (j=1, 2, …, n; i=1, 2, …, m),

    where āji(s) is the complement of aji(s). If all the criteria have the same type, then there is no need for normalization.

  • Step 3: In this step, we apply the IVPFEWA operator to aggregate all the individual normalized interval-valued Pythagorean fuzzy decision matrices, Rs=[rji(s)]n×m(s=1,,k), into a single interval-valued Pythagorean fuzzy decision matrix, R=[rji]n×m .

  • Step 4: In this step, we apply the IVPFEWA operator to aggregate all preference values.

  • Step 5: In this step, we calculate the score functions. If there is no difference between two or more than two scores, then we must find out the accuracy degrees of the collective overall preference values.

  • Step 6: Arrange the scores of all alternatives in descending order and select that alternative having the highest score function.
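The six steps above can be sketched end to end in Python. The data below are made up for illustration (two experts, two benefit-type attributes, and two alternatives, with X1 dominating X2 component-wise); they are not the paper's example, and all names are ours:

```python
import math

def ivpfewa(values, w):
    """IVPFEWA of Eq. (22); values are IVPFNs ([u, v], [x, y])."""
    au = bu = av = bv = px = qx = py = qy = 1.0
    for ((u, v), (x, y)), wj in zip(values, w):
        au *= (1 + u**2) ** wj;  bu *= (1 - u**2) ** wj
        av *= (1 + v**2) ** wj;  bv *= (1 - v**2) ** wj
        px *= (x**2) ** wj;      qx *= (2 - x**2) ** wj
        py *= (y**2) ** wj;      qy *= (2 - y**2) ** wj
    return ([math.sqrt((au - bu) / (au + bu)),
             math.sqrt((av - bv) / (av + bv))],
            [math.sqrt(2 * px / (qx + px)),
             math.sqrt(2 * py / (qy + py))])

def score(lam):
    (u, v), (x, y) = lam
    return 0.5 * (u**2 + v**2 - x**2 - y**2)   # Eq. (12)

# experts[s][i][j]: rating of alternative i on attribute j by expert s.
# All attributes are benefit type, so no normalization (Step 2) is needed.
experts = [
    [[([0.6, 0.7], [0.2, 0.3]), ([0.5, 0.7], [0.3, 0.4])],   # X1
     [([0.3, 0.4], [0.5, 0.6]), ([0.2, 0.4], [0.5, 0.7])]],  # X2
    [[([0.5, 0.8], [0.1, 0.3]), ([0.6, 0.8], [0.2, 0.3])],
     [([0.2, 0.5], [0.4, 0.6]), ([0.3, 0.5], [0.4, 0.6])]],
]
omega = [0.4, 0.6]   # expert weights (Step 3)
w = [0.3, 0.7]       # attribute weights (Step 4)

# Step 3: fuse the experts cell by cell; Step 4: fuse the attributes.
collective = [[ivpfewa([experts[s][i][j] for s in range(2)], omega)
               for j in range(2)] for i in range(2)]
overall = [ivpfewa(collective[i], w) for i in range(2)]
# Steps 5-6: score and rank the alternatives in descending order.
ranking = sorted(range(2), key=lambda i: score(overall[i]), reverse=True)
```

Since X1 dominates X2 in every cell, monotonicity (Theorem 6) guarantees that X1 is ranked first here.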

6 Illustrative Example

Suppose a company wants to invest money in one of the following options: X1, a car company; X2, a food company; and X3, a computer company. Three experts Ds (s=1, 2, 3) act as decision makers, whose weight vector is ω=(0.2, 0.3, 0.5)T. Many factors must be considered when selecting the most suitable company; here, however, we consider only the following four criteria, whose weighted vector is w=(0.1, 0.2, 0.3, 0.4)T:

  1. C1: risk analysis,

  2. C2: growth analysis,

  3. C3: social political impact analysis,

  4. C4: environmental analysis,

where C1 and C3 are cost-type criteria and C2 and C4 are benefit-type criteria, i.e. the attributes are of two types. Thus, we must change the cost-type criteria into benefit-type criteria.

  • Step 1: The decision makers give their decisions in Tables 1–11.

Table 1:

Interval-Valued Pythagorean Fuzzy Decision Matrix of D1.

X1 X2 X3
C1 ([0.5, 0.8], [0.3, 0.4]) ([0.6, 0.7], [0.3, 0.6]) ([0.3, 0.7], [0.3, 0.5])
C2 ([0.3, 0.5], [0.6, 0.7]) ([0.3, 0.7], [0.2, 0.6]) ([0.3, 0.6], [0.4, 0.7])
C3 ([0.5, 0.7], [0.3, 0.7]) ([0.5, 0.6], [0.3, 0.7]) ([0.2, 0.6], [0.3, 0.7])
C4 ([0.3, 0.6], [0.6, 0.7]) ([0.6, 0.5], [0.2, 0.7]) ([0.3, 0.4], [0.5, 0.6])
Table 2:

Interval-Valued Pythagorean Fuzzy Decision Matrix of D2.

X1 X2 X3
C1 ([0.5, 0.6], [0.3, 0.5]) ([0.5, 0.7], [0.3, 0.6]) ([0.2, 0.8], [0.3, 0.4])
C2 ([0.3, 0.4], [0.6, 0.8]) ([0.3, 0.8], [0.2, 0.6]) ([0.3, 0.6], [0.3, 0.7])
C3 ([0.4, 0.5], [0.3, 0.8]) ([0.5, 0.7], [0.3, 0.6]) ([0.2, 0.6], [0.3, 0.8])
C4 ([0.3, 0.6], [0.5, 0.7]) ([0.3, 0.4], [0.2, 0.8]) ([0.3, 0.5], [0.5, 0.7])
Table 3:

Interval-Valued Pythagorean Fuzzy Decision Matrix of D3.

X1 X2 X3
C1 ([0.3, 0.8], [0.5, 0.6]) ([0.3, 0.5], [0.5, 0.7]) ([0.2, 0.4], [0.5, 0.7])
C2 ([0.5, 0.7], [0.3, 0.4]) ([0.4, 0.6], [0.5, 0.8]) ([0.5, 0.7], [0.2, 0.5])
C3 ([0.3, 0.6], [0.4, 0.6]) ([0.3, 0.5], [0.5, 0.6]) ([0.2, 0.8], [0.4, 0.6])
C4 ([0.5, 0.7], [0.3, 0.4]) ([0.5, 0.7], [0.2, 0.4]) ([0.5, 0.6], [0.3, 0.5])
Table 4:

Normalized Pythagorean Fuzzy Decision Matrix R1.

X1 X2 X3
C1 ([0.3, 0.4], [0.5, 0.8]) ([0.3, 0.6], [0.6, 0.7]) ([0.3, 0.5], [0.3, 0.7])
C2 ([0.3, 0.5], [0.6, 0.7]) ([0.3, 0.7], [0.2, 0.6]) ([0.3, 0.6], [0.4, 0.7])
C3 ([0.3, 0.7], [0.5, 0.7]) ([0.3, 0.7], [0.5, 0.6]) ([0.3, 0.7], [0.2, 0.6])
C4 ([0.3, 0.6], [0.6, 0.7]) ([0.6, 0.5], [0.2, 0.7]) ([0.3, 0.4], [0.5, 0.6])
Table 5:

Normalized Pythagorean Fuzzy Decision Matrix R2.

X1 X2 X3
C1 ([0.3, 0.5], [0.5, 0.6]) ([0.3, 0.6], [0.5, 0.7]) ([0.3, 0.4], [0.2, 0.8])
C2 ([0.3, 0.4], [0.6, 0.8]) ([0.3, 0.8], [0.2, 0.6]) ([0.3, 0.6], [0.3, 0.7])
C3 ([0.3, 0.8], [0.4, 0.5]) ([0.3, 0.6], [0.5, 0.7]) ([0.3, 0.8], [0.2, 0.6])
C4 ([0.3, 0.6], [0.5, 0.7]) ([0.3, 0.4], [0.2, 0.8]) ([0.3, 0.5], [0.5, 0.7])
Table 6:

Normalized Pythagorean Fuzzy Decision Matrix R3.

X1 X2 X3
C1 ([0.5, 0.6], [0.3, 0.8]) ([0.5, 0.7], [0.3, 0.5]) ([0.5, 0.7], [0.2, 0.4])
C2 ([0.5, 0.7], [0.3, 0.4]) ([0.4, 0.6], [0.5, 0.8]) ([0.5, 0.7], [0.2, 0.5])
C3 ([0.4, 0.6], [0.3, 0.6]) ([0.5, 0.6], [0.3, 0.5]) ([0.4, 0.6], [0.2, 0.8])
C4 ([0.5, 0.7], [0.3, 0.4]) ([0.5, 0.7], [0.2, 0.4]) ([0.5, 0.6], [0.3, 0.5])
Table 7:

Collective Interval-Valued Pythagorean Fuzzy Decision Matrix R.

X1 X2 X3
C1 ([0.413, 0.537], [0.389, 0.738]) ([0.413, 0.653], [0.405, 0.595]) ([0.413, 0.593], [0.216, 0.562])
C2 ([0.413, 0.593], [0.429, 0.563]) ([0.352, 0.692], [0.320, 0.697]) ([0.413, 0.653], [0.259, 0.595])
C3 ([0.352, 0.692], [0.363, 0.587]) ([0.413, 0.622], [0.389, 0.576]) ([0.352, 0.693], [0.200, 0.697])
C4 ([0.413, 0.653], [0.405, 0.536]) ([0.475, 0.593], [0.200, 0.563]) ([0.413, 0.538], [0.389, 0.576])
Table 8:

Pythagorean Fuzzy Ordered Decision Matrix R1.

X1 X2 X3
C1 ([0.3, 0.7], [0.5, 0.7]) ([0.3, 0.7], [0.2, 0.6]) ([0.3, 0.7], [0.2, 0.6])
C2 ([0.3, 0.6], [0.6, 0.7]) ([0.6, 0.5], [0.2, 0.7]) ([0.3, 0.6], [0.4, 0.7])
C3 ([0.3, 0.5], [0.6, 0.7]) ([0.3, 0.7], [0.5, 0.6]) ([0.3, 0.5], [0.3, 0.7])
C4 ([0.3, 0.4], [0.5, 0.8]) ([0.3, 0.6], [0.6, 0.7]) ([0.3, 0.4], [0.5, 0.6])
Table 9:

Pythagorean Fuzzy Ordered Decision Matrix R2.

X1 X2 X3
C1 ([0.3, 0.8], [0.4, 0.5]) ([0.3, 0.8], [0.2, 0.6]) ([0.3, 0.8], [0.2, 0.6])
C2 ([0.3, 0.5], [0.5, 0.6]) ([0.3, 0.6], [0.5, 0.7]) ([0.3, 0.6], [0.3, 0.7])
C3 ([0.3, 0.6], [0.5, 0.7]) ([0.3, 0.6], [0.5, 0.7]) ([0.3, 0.5], [0.5, 0.7])
C4 ([0.3, 0.4], [0.6, 0.8]) ([0.3, 0.4], [0.2, 0.8]) ([0.3, 0.4], [0.2, 0.8])
Table 10:

Pythagorean Fuzzy Ordered Decision Matrix R3.

X1 X2 X3
C1 ([0.5, 0.7], [0.3, 0.4]) ([0.5, 0.7], [0.2, 0.4]) ([0.5, 0.7], [0.2, 0.4])
C2 ([0.5, 0.7], [0.3, 0.4]) ([0.5, 0.7], [0.3, 0.5]) ([0.5, 0.7], [0.2, 0.5])
C3 ([0.4, 0.6], [0.3, 0.6]) ([0.5, 0.6], [0.3, 0.5]) ([0.5, 0.6], [0.3, 0.5])
C4 ([0.5, 0.6], [0.3, 0.8]) ([0.4, 0.6], [0.5, 0.8]) ([0.4, 0.6], [0.2, 0.8])
Table 11:

Collective Pythagorean Fuzzy Ordered Decision Matrix R.

X1 X2 X3
C1 ([0.413, 0.734], [0.363, 0.481]) ([0.413, 0.734], [0.200, 0.495]) ([0.413, 0.734], [0.200, 0.492])
C2 ([0.413, 0.630], [0.404, 0.509]) ([0.476, 0.638], [0.323, 0.595]) ([0.413, 0.653], [0.259, 0.595])
C3 ([0.352, 0.582], [0.404, 0.649]) ([0.413, 0.622], [0.389, 0.576]) ([0.413, 0.553], [0.351, 0.595])
C4 ([0.413, 0.512], [0.412, 0.800]) ([0.352, 0.550], [0.399, 0.779]) ([0.352, 0.512], [0.241, 0.758])
  • Step 2: In this step, we normalize the decision matrices.

  • Step 3: In this step, we apply the IVPFEWA operator to aggregate all the individual normalized interval-valued Pythagorean fuzzy decision matrices, Rs=[rji(s)]n×m, into a single interval-valued Pythagorean fuzzy decision matrix, R=[rji]n×m .

  • Step 4: In this step, we apply the IVPFEWA aggregation operator to aggregate all preference values:

    r1=([0.395,0.644],[0.394,0.575]).

    r2=([0.427,0.617],[0.289,0.596]).

    r3=([0.495,0.619],[0.277,0.613]).

  • Step 5: In this step, we calculate the score functions:

    S(r1)=0.042,S(r2)=0.069,S(r3)=0.083.

  • Step 6: Arrange the scores of all alternatives in descending order and select that alternative having the highest score function. Hence, X3>X2>X1. Thus, the best alternative is X3.

For the IVPFEOWA aggregation operator:

  • Step 1: In this step, we construct the interval-valued Pythagorean fuzzy ordered decision matrices.

  • Step 2: In this step, we apply the IVPFEOWA operator to aggregate all the individual interval-valued Pythagorean fuzzy ordered decision matrices, Rs=[rji(s)]n×m, into a single interval-valued Pythagorean fuzzy decision matrix, R=[rji]n×m .

  • Step 3: In this step, we apply the IVPFEOWA aggregation operator to aggregate all preference values:

    r1=([0.395,0.579],[0.402,0.659]).

    r2=([0.403,0.612],[0.354,0.649]).

    r3=([0.389,0.576],[0.268,0.646]).

  • Step 4: In this step, we calculate the score functions:

    S(r1)=−0.052, S(r2)=−0.005, S(r3)=−0.003.

  • Step 5: Arrange the scores of all alternatives in descending order and select that alternative having the highest score function. Hence, X3>X2>X1. Thus, the best alternative is X3.

7 Conclusion

In this paper, we have developed the notions of the IVPFEWA operator and the IVPFEOWA operator. We have also discussed some of their desirable properties, such as idempotency, boundedness, commutativity, and monotonicity. Finally, we have applied these operators to deal with the MAGDM problem under interval-valued Pythagorean fuzzy information. For this, we constructed an algorithm for the MAGDM problem. Lastly, we also developed a numerical example for MAGDM.

Bibliography

[1] K. Atanassov, Intuitionistic fuzzy sets, Fuzzy Sets Syst. 20 (1986), 87–96.

[2] K. Atanassov, New operations defined over the intuitionistic fuzzy sets, Fuzzy Sets Syst. 61 (1994), 137–142.

[3] K. Atanassov, Remarks on the intuitionistic fuzzy sets, III, Fuzzy Sets Syst. 75 (1995), 401–402.

[4] K. Atanassov, Equality between intuitionistic fuzzy sets, Fuzzy Sets Syst. 79 (1996), 257–258.

[5] K. Atanassov, Intuitionistic fuzzy sets: theory and applications, Physica-Verlag, Heidelberg, Germany, 1999.

[6] S. K. De, R. Biswas and A. R. Roy, Some operations on intuitionistic fuzzy sets, Fuzzy Sets Syst. 114 (2000), 477–484.

[7] H. Garg, A new generalized Pythagorean fuzzy information aggregation using Einstein operations and its application to decision making, Int. J. Intell. Syst. 31 (2016), 1–35.

[8] H. Garg, Generalized Pythagorean fuzzy geometric aggregation operators using Einstein t-norm and t-conorm for multicriteria decision-making process, Int. J. Intell. Syst. (2016), 1–34.

[9] X. J. Gou, Z. S. Xu and P. J. Ren, The properties of continuous Pythagorean fuzzy information, Int. J. Intell. Syst. 31 (2016), 401–424.

[10] Z. M. Ma and Z. S. Xu, Symmetric Pythagorean fuzzy weighted geometric averaging operators and their application in multi-criteria decision making problems, Int. J. Intell. Syst. 31 (2016), 1198–1219.

[11] X. Peng and Y. Yang, Fundamental properties of interval-valued Pythagorean fuzzy aggregation operators, Int. J. Intell. Syst. 31 (2015), 444–487.

[12] K. Rahman, S. Abdullah, R. Ahmed and M. Ullah, Pythagorean fuzzy Einstein weighted geometric aggregation operator and their application to multiple attribute group decision making, J. Intell. Fuzzy Syst. 33 (2016), 1–13.

[13] K. Rahman, S. Abdullah, M. S. Ali Khan and M. Shakeel, Pythagorean fuzzy hybrid geometric aggregation operator and their applications to multiple attribute decision making, Int. J. Comput. Sci. Inform. Secur. (IJCSIS) 14 (2016), 837–854.

[14] K. Rahman, S. Abdullah, F. Husain, M. S. Ali Khan and M. Shakeel, Pythagorean fuzzy ordered weighted geometric aggregation operator and their application to multiple attribute group decision making, J. Appl. Environ. Biol. Sci. 7 (2017), 67–83.

[15] K. Rahman, S. Abdullah, M. Shakeel, M. Sajjad Ali Khan and M. Ullah, Interval-valued Pythagorean fuzzy geometric aggregation operators and their application to group decision making, Cogent Math. 4 (2017), 1–20.

[16] K. Rahman, M. S. Ali Khan, M. Ullah and A. Fahmi, Multiple attribute group decision making for plant location selection with Pythagorean fuzzy weighted geometric aggregation operator, Nucleus 1 (2017), 66–74.

[17] K. Rahman, S. Abdullah, M. S. Ali Khan, A. Ali and F. Amin, Pythagorean fuzzy hybrid averaging aggregation operator and its application to multiple attribute decision making, Ital. J. Pure Appl. Math. (in press).

[18] K. Rahman, A. Ali and M. S. A. Khan, Some interval-valued Pythagorean fuzzy weighted averaging aggregation operators and their application to multiple attribute decision making, Punjab Univ. J. Math. 50 (2018).

[19] P. J. Ren, Z. S. Xu and X. J. Gou, Pythagorean fuzzy TODIM approach to multi-criteria decision making, Appl. Soft Comput. 42 (2016), 246–259.

[20] W. Z. Wang and X. W. Liu, Intuitionistic fuzzy geometric aggregation operators based on Einstein operations, Int. J. Intell. Syst. 26 (2011), 1049–1075.

[21] W. Z. Wang and X. W. Liu, Intuitionistic fuzzy information aggregation using Einstein operations, IEEE Trans. Fuzzy Syst. 20 (2012), 923–938.

[22] M. Xia and Z. S. Xu, Generalized point operators for aggregating intuitionistic fuzzy information, Int. J. Intell. Syst. 25 (2010), 1061–1080.

[23] Z. S. Xu, Intuitionistic fuzzy aggregation operators, IEEE Trans. Fuzzy Syst. 15 (2007), 1179–1187.

[24] Z. S. Xu, Methods for aggregating interval-valued intuitionistic fuzzy information and their application to decision making, Control Decis. 22 (2007), 215–219 (in Chinese).Search in Google Scholar

[25] Z. S. Xu and J. Chen, On geometric aggregation over interval-valued intuitionistic fuzzy information, in: Fourth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), vol. 2, pp. 466–471, 2007.10.1109/FSKD.2007.427Search in Google Scholar

[26] Z. S. Xu and R. R. Yager, Some geometric aggregation operators based on intuitionistic fuzzy sets, Int. J. Gen. Syst. 35 (2006), 417–433.10.1080/03081070600574353Search in Google Scholar

[27] R. R. Yager, Pythagorean fuzzy subsets, in: Proc. Joint IFSA World Congress and NAFIPS Annual Meeting, Edmonton, Canada, pp. 57–61, 2013.10.1109/IFSA-NAFIPS.2013.6608375Search in Google Scholar

[28] R. R. Yager and A. M. Abbasov, Pythagorean membership grades, complex numbers and decision making, Int. J. Intell. Syst. 28 (2013), 436–452.10.1002/int.21584Search in Google Scholar

[29] D. Yu, Decision making based on generalized geometric operator under interval-valued intuitionistic fuzzy environment, J. Intell. Fuzzy Syst. 25 (2013), 471–480.10.3233/IFS-120652Search in Google Scholar

[30] D. Yu, Multi-criteria decision making based on generalized prioritized aggregation operators under intuitionistic fuzzy environment, Int. J. Fuzzy Syst. 15 (2013), 47–54.Search in Google Scholar

[31] D. Yu, A scientometrics review on aggregation operator research, Scientometrics 105 (2015), 115–133.10.1007/s11192-015-1695-2Search in Google Scholar

[32] D. Yu, Group decision making under interval-valued multiplicative intuitionistic fuzzy environment based on Archimedean t-conorm and t-norm, Int. J. Intell. Syst. 30 (2015), 590–616.10.1002/int.21710Search in Google Scholar

[33] L. A. Zadeh, Fuzzy sets, Inf. Control 8 (1965), 338–353.10.21236/AD0608981Search in Google Scholar

[34] X. L. Zhang and Z. S. Xu, Extension of TOPSIS to multiple criteria decision making with Pythagorean fuzzy sets, Int. J. Intell. Syst. 29 (2014), 1061–1078.10.1002/int.21676Search in Google Scholar

Received: 2017-04-03
Published Online: 2018-02-24

©2020 Walter de Gruyter GmbH, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 Public License.
