Article Open Access

Pythagorean Fuzzy Einstein Hybrid Averaging Aggregation Operator and its Application to Multiple-Attribute Group Decision Making

  • Khaista Rahman, Saleem Abdullah, Asad Ali and Fazli Amin
Published/Copyright: July 9, 2018

Abstract

The Pythagorean fuzzy set is one of the successful extensions of the intuitionistic fuzzy set for handling uncertainty in information. Under this environment, in this paper, we introduce the notion of the Pythagorean fuzzy Einstein hybrid averaging (PFEHA) aggregation operator along with some of its properties, namely idempotency, boundedness, and monotonicity. The PFEHA aggregation operator is a generalization of the Pythagorean fuzzy Einstein weighted averaging aggregation operator and the Pythagorean fuzzy Einstein ordered weighted averaging aggregation operator, and it provides more accurate and precise results than the existing operators. Finally, we apply the proposed operator and method to multiple-attribute group decision making.

1 Introduction

Multi-criteria decision making is the process of finding the optimal alternative among all feasible alternatives according to some criteria or attributes. Traditionally, it has been generally assumed that all data that assess the alternatives in terms of the criteria and their corresponding weights are expressed as crisp numbers. However, most decisions in real-life situations are taken in an environment where the goals and constraints are generally imprecise or vague in nature. In order to handle such uncertainties, the intuitionistic fuzzy set theory [1], one of the successful extensions of the fuzzy set theory [36], which is characterized by a degree of membership and a degree of non-membership, has been presented. Xu [25] developed some basic arithmetic aggregation operators, including the intuitionistic fuzzy weighted averaging operator, the intuitionistic fuzzy ordered weighted averaging operator, and the intuitionistic fuzzy hybrid averaging operator. Xu and Yager [29] defined some basic geometric aggregation operators, such as the intuitionistic fuzzy weighted geometric operator, the intuitionistic fuzzy ordered weighted geometric operator, and the intuitionistic fuzzy hybrid geometric operator. Wang and Liu [22], [23] introduced some Einstein aggregation operators, such as the intuitionistic fuzzy Einstein weighted geometric operator, the intuitionistic fuzzy Einstein ordered weighted geometric operator, the intuitionistic fuzzy Einstein weighted averaging operator, and the intuitionistic fuzzy Einstein ordered weighted averaging operator, and applied them to group decision making. In Refs. [5], [6], [8], [9], [20], [21], [24], [26], [27], [30], [31], [35], many scholars worked in the field of intuitionistic fuzzy sets, introduced many aggregation operators, and applied them to group decision making.

However, there are many cases where the decision maker may provide the degrees of membership and non-membership of a particular attribute in such a way that their sum is greater than 1. For example, suppose a man expresses his preferences toward an alternative in such a way that the degree of satisfaction is 0.6 and the degree of rejection is 0.8. Obviously, their sum is greater than 1; however, 0.6² + 0.8² = 0.36 + 0.64 = 1 ≤ 1, so such an evaluation can still be handled once the squares of the degrees are constrained. Therefore, Yager [33] introduced the concept of another set, called the Pythagorean fuzzy set, which is a more powerful tool for solving uncertain problems. Like intuitionistic fuzzy aggregation operators, Pythagorean fuzzy aggregation operators have also become an interesting and important area of research after the advent of the Pythagorean fuzzy set theory. In 2013, Yager and Abbasov [34] introduced two new Pythagorean fuzzy aggregation operators, namely the Pythagorean fuzzy weighted averaging operator and the Pythagorean fuzzy ordered weighted averaging operator. In Refs. [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], Rahman et al. introduced many aggregation operators using Pythagorean fuzzy numbers and also applied them to group decision making. In Refs. [2], [3], [4], Garg introduced the notion of the Einstein averaging aggregation operator and the Einstein geometric aggregation operator, and applied them to group decision making. In Ref. [37], Zhang and Xu introduced the TOPSIS method for multiple-criteria decision making with Pythagorean fuzzy sets. Xue et al. [32], Liang et al. [7], and Xu and Da [28] developed some further methods and aggregation operators using Pythagorean fuzzy information.

Thus, keeping the advantages and applications of the above-mentioned aggregation operators, in this paper, we introduce the notion of Pythagorean fuzzy Einstein hybrid averaging (PFEHA) aggregation operator along with its desirable properties, namely idempotency, boundedness, and monotonicity. Actually, Pythagorean fuzzy Einstein weighted averaging (PFEWA) aggregation operator weights only the Pythagorean fuzzy arguments and Pythagorean fuzzy Einstein ordered weighted averaging (PFEOWA) aggregation operator weights only the ordered positions of the Pythagorean fuzzy arguments instead of weighting the Pythagorean fuzzy arguments themselves. To overcome these limitations, we introduce the concept of PFEHA aggregation operator, which weights both the given Pythagorean fuzzy value and its ordered position. Thus, the method proposed in this paper is more general, more flexible, and provides more accurate and precise results compared to the existing methods.

The remainder of this paper is structured as follows. In Section 2, we give some basic definitions and results, which will be used in later sections. In Section 3, we introduce the notion of PFEHA aggregation operator and method. In Section 4, we apply the proposed aggregation operator to multiple-attribute group decision-making problems with Pythagorean fuzzy information. In Section 5, we construct a numerical example. In Section 6, we compare the proposed method to other methods. In Section 7, we provide our conclusion.

2 Preliminaries

In the following, we review the Pythagorean fuzzy set, the score function, and the accuracy function.

Definition 1 ([37]). Let Z be a universal set; then a Pythagorean fuzzy set P can be defined as

(1) $P=\{\langle z,\mu_P(z),\eta_P(z)\rangle \mid z\in Z\},$

where μP(z) and ηP(z) are mappings from Z to [0, 1], such that 0 ≤ μP(z) ≤ 1, 0 ≤ ηP(z) ≤ 1, and also 0 ≤ μP²(z) + ηP²(z) ≤ 1 for all z ∈ Z; they denote the membership degree and the non-membership degree of the element z ∈ Z in the set P, respectively. Let $\pi_P(z)=\sqrt{1-\mu_P^2(z)-\eta_P^2(z)}$; then πP(z) is called the Pythagorean fuzzy index of the element z ∈ Z in the set P, representing the degree of indeterminacy of z to P. Also, 0 ≤ πP(z) ≤ 1 for every z ∈ Z.
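To make the constraint of Definition 1 concrete, the following minimal Python sketch (the helper names are ours, not from the paper) checks the Pythagorean condition and computes the index πP(z):

```python
import math

def is_pythagorean(mu: float, eta: float, tol: float = 1e-12) -> bool:
    """Check the constraint 0 <= mu^2 + eta^2 <= 1 of Definition 1.

    A tiny tolerance absorbs floating-point round-off (e.g. 0.8**2 is
    slightly above 0.64 in binary floating point).
    """
    return 0.0 <= mu <= 1.0 and 0.0 <= eta <= 1.0 and mu**2 + eta**2 <= 1.0 + tol

def hesitancy(mu: float, eta: float) -> float:
    """Pythagorean fuzzy index pi_P(z) = sqrt(1 - mu^2 - eta^2)."""
    # max(0, ...) guards against a tiny negative value caused by round-off
    return math.sqrt(max(0.0, 1.0 - mu**2 - eta**2))

# (0.6, 0.8) is not an intuitionistic fuzzy pair (0.6 + 0.8 > 1),
# but it is Pythagorean, since 0.36 + 0.64 = 1:
print(is_pythagorean(0.6, 0.8))  # True
print(hesitancy(0.6, 0.8))       # ~0.0
```

The pair (0.6, 0.8) is exactly the preference example discussed in the introduction.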

Definition 2 ([37]). Let α=⟨μα, ηα⟩ be a Pythagorean fuzzy number, then the score function of α can be defined as

(2) $s(\alpha)=\mu_\alpha^2-\eta_\alpha^2,$

where s(α)∈[−1, 1].

Definition 3 ([37]). Let α=⟨μα, ηα⟩ be a Pythagorean fuzzy number, then the accuracy function of α can be defined as

(3) $h(\alpha)=\mu_\alpha^2+\eta_\alpha^2,$

where h(α)∈[0, 1].

Definition 4 ([37]). Let α1=⟨μα1, ηα1⟩ and α2=⟨μα2, ηα2⟩ be two Pythagorean fuzzy numbers; then the following conditions hold:

  • 1. If s(α1)≺s(α2), then α1≺α2.

  • 2. If s(α1)=s(α2), then

    • (a) If h(α1)=h(α2), then α1=α2.

    • (b) If h(α1)≺h(α2), then α1≺α2.

    • (c) If h(α1)≻h(α2), then α1≻α2.
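The score-then-accuracy comparison of Definitions 2–4 can be transcribed directly; in the sketch below (our own helper names, with a Pythagorean fuzzy number represented as a `(mu, eta)` pair) the lexicographic key reproduces the ranking rule:

```python
def score(alpha):
    """Score function s(alpha) = mu^2 - eta^2 of Definition 2."""
    mu, eta = alpha
    return mu**2 - eta**2

def accuracy(alpha):
    """Accuracy function h(alpha) = mu^2 + eta^2 of Definition 3."""
    mu, eta = alpha
    return mu**2 + eta**2

def ranking_key(alpha):
    """Comparison of Definition 4: order by score first; ties broken by accuracy."""
    return (score(alpha), accuracy(alpha))

# sort some Pythagorean fuzzy numbers in descending order
values = [(0.4, 0.7), (0.7, 0.6), (0.5, 0.8)]
print(sorted(values, key=ranking_key, reverse=True))
# -> [(0.7, 0.6), (0.4, 0.7), (0.5, 0.8)]
```

Because Python compares tuples lexicographically, `ranking_key` applies the accuracy function only when two scores coincide, exactly as Definition 4 prescribes.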

In the following, we present the Einstein operational laws for the sum, product, power, and scalar multiple.

Definition 5 ([2]). Let α=⟨μα, ηα⟩ and αj=⟨μαj, ηαj⟩ (j=1, 2) be Pythagorean fuzzy values and δ≻0 be any real number; then

(1) $\alpha_1\oplus_\varepsilon\alpha_2=\left(\sqrt{\dfrac{\mu_{\alpha_1}^2+\mu_{\alpha_2}^2}{1+\mu_{\alpha_1}^2\mu_{\alpha_2}^2}},\ \dfrac{\eta_{\alpha_1}\eta_{\alpha_2}}{\sqrt{1+(1-\eta_{\alpha_1}^2)(1-\eta_{\alpha_2}^2)}}\right).$

(2) $\alpha_1\otimes_\varepsilon\alpha_2=\left(\dfrac{\mu_{\alpha_1}\mu_{\alpha_2}}{\sqrt{1+(1-\mu_{\alpha_1}^2)(1-\mu_{\alpha_2}^2)}},\ \sqrt{\dfrac{\eta_{\alpha_1}^2+\eta_{\alpha_2}^2}{1+\eta_{\alpha_1}^2\eta_{\alpha_2}^2}}\right).$

(3) $\alpha^{\wedge_\varepsilon\delta}=\left(\sqrt{\dfrac{2(\mu_\alpha^2)^\delta}{(2-\mu_\alpha^2)^\delta+(\mu_\alpha^2)^\delta}},\ \sqrt{\dfrac{(1+\eta_\alpha^2)^\delta-(1-\eta_\alpha^2)^\delta}{(1+\eta_\alpha^2)^\delta+(1-\eta_\alpha^2)^\delta}}\right).$

(4) $\delta\cdot_\varepsilon\alpha=\left(\sqrt{\dfrac{(1+\mu_\alpha^2)^\delta-(1-\mu_\alpha^2)^\delta}{(1+\mu_\alpha^2)^\delta+(1-\mu_\alpha^2)^\delta}},\ \sqrt{\dfrac{2(\eta_\alpha^2)^\delta}{(2-\eta_\alpha^2)^\delta+(\eta_\alpha^2)^\delta}}\right).$
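The Einstein laws above can be implemented in a few lines; the sketch below (our own helper names, assuming values are (μ, η) pairs) covers the sum (1) and the scalar multiple (4), and checks it against the weighted value α̇₁ = 0.4 ·ε (0.4, 0.7) that appears later in Example 1:

```python
import math

def einstein_sum(a, b):
    """Einstein sum of two Pythagorean fuzzy values, law (1) of Definition 5."""
    (m1, e1), (m2, e2) = a, b
    mu = math.sqrt((m1**2 + m2**2) / (1 + m1**2 * m2**2))
    eta = (e1 * e2) / math.sqrt(1 + (1 - e1**2) * (1 - e2**2))
    return (mu, eta)

def einstein_scalar(delta, a):
    """Einstein scalar multiple delta . alpha, law (4) of Definition 5."""
    mu, eta = a
    num = (1 + mu**2) ** delta - (1 - mu**2) ** delta
    den = (1 + mu**2) ** delta + (1 - mu**2) ** delta
    new_mu = math.sqrt(num / den)
    new_eta = math.sqrt(2 * (eta**2) ** delta
                        / ((2 - eta**2) ** delta + (eta**2) ** delta))
    return (new_mu, new_eta)

# delta = 1 leaves a value unchanged
print(einstein_scalar(1.0, (0.4, 0.7)))   # -> approximately (0.4, 0.7)
# delta = 0.4 reproduces the first Einstein-weighted value of Example 1
print(einstein_scalar(0.4, (0.4, 0.7)))   # -> approximately (0.254, 0.882)
```

The δ = 1 check is a quick sanity test: for δ = 1 the formulas in law (4) collapse to μ and η themselves.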

In the following, we recall some existing aggregation operators, namely the Pythagorean fuzzy hybrid averaging (PFHA) operator, the PFEWA aggregation operator, and the PFEOWA aggregation operator.

Definition 6 ([17]). Let αj=⟨μαj, ηαj⟩ (j=1, 2, 3, …, n) be a collection of Pythagorean fuzzy values, then the PFHA aggregation operator can be defined as

(4) $\mathrm{PFHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)=\left(\sqrt{1-\prod_{j=1}^{n}\left(1-\mu_{\dot\alpha_{\sigma(j)}}^2\right)^{w_j}},\ \prod_{j=1}^{n}\left(\eta_{\dot\alpha_{\sigma(j)}}\right)^{w_j}\right),$

where $\dot\alpha_{\sigma(j)}$ is the jth largest of the weighted Pythagorean fuzzy values $\dot\alpha_j$ ($\dot\alpha_j=n\omega_j\alpha_j$), w=(w1, w2, w3, …, wn)T is the weighting vector associated with the PFHA operator, such that wj∈[0, 1] and $\sum_{j=1}^{n}w_j=1$, ω=(ω1, ω2, ω3, …, ωn)T is the weighting vector of the αj (j=1, 2, 3, …, n), such that ωj∈[0, 1] and $\sum_{j=1}^{n}\omega_j=1$, and n is the balancing coefficient. If the vector w=(w1, w2, w3, …, wn)T approaches (1/n, 1/n, 1/n, …, 1/n)T, then the vector (nω1α1, nω2α2, nω3α3, …, nωnαn)T approaches (α1, α2, α3, …, αn)T.
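The two-stage structure of the PFHA operator (first weight the arguments, then average them by ordered position) can be sketched as follows; the helper names are ours, and the algebraic scalar multiple used here is the standard one for Pythagorean fuzzy values, not code from the paper:

```python
import math

def weighted_value(n, omega_j, alpha):
    """Algebraic scalar multiple n*omega_j*alpha used to build the dotted values."""
    mu, eta = alpha
    d = n * omega_j
    return (math.sqrt(1 - (1 - mu**2) ** d), eta ** d)

def pfha(alphas, omega, w):
    """PFHA operator of Definition 6 (Eq. (4))."""
    n = len(alphas)
    dotted = [weighted_value(n, oj, a) for oj, a in zip(omega, alphas)]
    # sigma: descending order of the score function mu^2 - eta^2
    dotted.sort(key=lambda a: a[0]**2 - a[1]**2, reverse=True)
    mu = math.sqrt(1 - math.prod((1 - m**2) ** wj for (m, e), wj in zip(dotted, w)))
    eta = math.prod(e ** wj for (m, e), wj in zip(dotted, w))
    return (mu, eta)

# with identical arguments and uniform weights the operator is idempotent
print(pfha([(0.6, 0.5)] * 4, [0.25] * 4, [0.25] * 4))  # -> ~(0.6, 0.5)
```

With uniform ω each α̇j equals αj, so the idempotency check exercises both stages at once.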

Definition 7 ([2]). Let αj=⟨μαj, ηαj⟩ (j=1, 2, 3, …, n) be a collection of Pythagorean fuzzy values, then the PFEWA aggregation operator can be defined as

(5) $\mathrm{PFEWA}_{w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)=\left(\dfrac{\sqrt{\prod_{j=1}^{n}(1+\mu_{\alpha_j}^2)^{w_j}-\prod_{j=1}^{n}(1-\mu_{\alpha_j}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{n}(1+\mu_{\alpha_j}^2)^{w_j}+\prod_{j=1}^{n}(1-\mu_{\alpha_j}^2)^{w_j}}},\ \dfrac{\sqrt{2}\prod_{j=1}^{n}(\eta_{\alpha_j})^{w_j}}{\sqrt{\prod_{j=1}^{n}(2-\eta_{\alpha_j}^2)^{w_j}+\prod_{j=1}^{n}(\eta_{\alpha_j}^2)^{w_j}}}\right),$

where w=(w1, w2, w3, …, wn)T is the weighting vector of the αj (j=1, 2, 3, …, n), such that wj∈[0, 1] and $\sum_{j=1}^{n}w_j=1$.

Definition 8 ([2]). Let αj=⟨μαj, ηαj⟩ (j=1, 2, 3, …, n) be a collection of Pythagorean fuzzy values, then the PFEOWA aggregation operator can be defined as

(6) $\mathrm{PFEOWA}_{w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)=\left(\dfrac{\sqrt{\prod_{j=1}^{n}(1+\mu_{\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{n}(1-\mu_{\alpha_{\sigma(j)}}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{n}(1+\mu_{\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(1-\mu_{\alpha_{\sigma(j)}}^2)^{w_j}}},\ \dfrac{\sqrt{2}\prod_{j=1}^{n}(\eta_{\alpha_{\sigma(j)}})^{w_j}}{\sqrt{\prod_{j=1}^{n}(2-\eta_{\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(\eta_{\alpha_{\sigma(j)}}^2)^{w_j}}}\right),$

where w=(w1, w2, w3, …, wn)T is the weighting vector associated with the PFEOWA operator, such that wj∈[0, 1] and $\sum_{j=1}^{n}w_j=1$.

Also, (σ(1), σ(2), σ(3), …, σ(n)) is a permutation of (1, 2, 3, …, n) such that ασ(j−1) ⪰ ασ(j) for all j.

3 PFEHA Aggregation Operator

In this section, we introduce the concept of PFEHA aggregation operator along with some of its basic properties, such as idempotency, boundedness, and monotonicity.

Definition 9. The PFEHA aggregation operator can be defined as follows:

(7) $\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)=\left(\dfrac{\sqrt{\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}},\ \dfrac{\sqrt{2}\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j}}{\sqrt{\prod_{j=1}^{n}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\right),$

where $\dot\alpha_{\sigma(j)}$ is the jth largest of the weighted Pythagorean fuzzy values $\dot\alpha_j$ ($\dot\alpha_j=n\omega_j\alpha_j$), w=(w1, w2, w3, …, wn)T is the weighting vector associated with the PFEHA operator, such that wj∈[0, 1] and $\sum_{j=1}^{n}w_j=1$, ω=(ω1, ω2, ω3, …, ωn)T is the weighting vector of the αj (j=1, 2, 3, …, n), such that ωj∈[0, 1] and $\sum_{j=1}^{n}\omega_j=1$, and n is the balancing coefficient. If the vector w=(w1, w2, w3, …, wn)T approaches (1/n, 1/n, 1/n, …, 1/n)T, then the vector (nω1α1, nω2α2, nω3α3, …, nωnαn)T approaches (α1, α2, α3, …, αn)T.
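Definition 9 combines the Einstein scalar multiple, the score-based reordering, and the Einstein weighted average; a minimal sketch (our own helper names, with values as (μ, η) pairs) reads:

```python
import math

def einstein_scalar(delta, alpha):
    """Einstein scalar multiple (law (4) of Definition 5)."""
    mu, eta = alpha
    a, b = (1 + mu**2) ** delta, (1 - mu**2) ** delta
    new_mu = math.sqrt((a - b) / (a + b))
    c, d = (eta**2) ** delta, (2 - eta**2) ** delta
    return (new_mu, math.sqrt(2 * c / (d + c)))

def pfeha(alphas, omega, w):
    """PFEHA operator of Definition 9 (Eq. (7))."""
    n = len(alphas)
    dotted = [einstein_scalar(n * oj, a) for oj, a in zip(omega, alphas)]
    dotted.sort(key=lambda a: a[0]**2 - a[1]**2, reverse=True)  # sigma ordering
    A = math.prod((1 + m**2) ** wj for (m, e), wj in zip(dotted, w))
    B = math.prod((1 - m**2) ** wj for (m, e), wj in zip(dotted, w))
    C = math.prod((e**2) ** wj for (m, e), wj in zip(dotted, w))
    D = math.prod((2 - e**2) ** wj for (m, e), wj in zip(dotted, w))
    return (math.sqrt((A - B) / (A + B)), math.sqrt(2 * C / (D + C)))

# idempotency (cf. Theorem 3): aggregating identical values returns that value
print(pfeha([(0.6, 0.5)] * 4, [0.25] * 4, [0.25] * 4))  # -> ~(0.6, 0.5)
```

With uniform ω the Einstein scalar multiple uses δ = 1 and leaves each argument unchanged, so the idempotency check exercises the full pipeline.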

Theorem 1. Let αj=⟨μαj, ηαj⟩ (j=1, 2, …, n) be a collection of Pythagorean fuzzy values; then their aggregated value obtained by using the PFEHA aggregation operator is also a Pythagorean fuzzy value, and

(8) $\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)=\left(\dfrac{\sqrt{\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}},\ \dfrac{\sqrt{2}\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j}}{\sqrt{\prod_{j=1}^{n}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\right).$

Proof. We can prove this theorem by mathematical induction on n.

For n=2

$w_1\cdot_\varepsilon\dot\alpha_1=\left(\dfrac{\sqrt{(1+\mu_{\dot\alpha_1}^2)^{w_1}-(1-\mu_{\dot\alpha_1}^2)^{w_1}}}{\sqrt{(1+\mu_{\dot\alpha_1}^2)^{w_1}+(1-\mu_{\dot\alpha_1}^2)^{w_1}}},\ \dfrac{\sqrt{2}\,(\eta_{\dot\alpha_1})^{w_1}}{\sqrt{(2-\eta_{\dot\alpha_1}^2)^{w_1}+(\eta_{\dot\alpha_1}^2)^{w_1}}}\right)$

and

$w_2\cdot_\varepsilon\dot\alpha_2=\left(\dfrac{\sqrt{(1+\mu_{\dot\alpha_2}^2)^{w_2}-(1-\mu_{\dot\alpha_2}^2)^{w_2}}}{\sqrt{(1+\mu_{\dot\alpha_2}^2)^{w_2}+(1-\mu_{\dot\alpha_2}^2)^{w_2}}},\ \dfrac{\sqrt{2}\,(\eta_{\dot\alpha_2})^{w_2}}{\sqrt{(2-\eta_{\dot\alpha_2}^2)^{w_2}+(\eta_{\dot\alpha_2}^2)^{w_2}}}\right).$

Then

$\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2)=\left(\dfrac{\sqrt{\prod_{j=1}^{2}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{2}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{2}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{2}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}},\ \dfrac{\sqrt{2}\prod_{j=1}^{2}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j}}{\sqrt{\prod_{j=1}^{2}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{2}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\right).$

Thus, the result is true for n=2. Now, we assume that Eq. (8) holds for n=k. Thus

$\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_k)=\left(\dfrac{\sqrt{\prod_{j=1}^{k}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{k}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{k}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{k}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}},\ \dfrac{\sqrt{2}\prod_{j=1}^{k}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j}}{\sqrt{\prod_{j=1}^{k}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{k}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\right).$

If Eq. (8) is true for n=k, then we show that Eq. (8) is true for n=k+1. Thus

(9) $\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_{k+1})=\left(\dfrac{\sqrt{\prod_{j=1}^{k}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{k}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{k}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{k}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}},\ \dfrac{\sqrt{2}\prod_{j=1}^{k}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j}}{\sqrt{\prod_{j=1}^{k}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{k}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\right)\oplus_\varepsilon\left(\dfrac{\sqrt{(1+\mu_{\dot\alpha_{k+1}}^2)^{w_{k+1}}-(1-\mu_{\dot\alpha_{k+1}}^2)^{w_{k+1}}}}{\sqrt{(1+\mu_{\dot\alpha_{k+1}}^2)^{w_{k+1}}+(1-\mu_{\dot\alpha_{k+1}}^2)^{w_{k+1}}}},\ \dfrac{\sqrt{2}\,(\eta_{\dot\alpha_{k+1}})^{w_{k+1}}}{\sqrt{(2-\eta_{\dot\alpha_{k+1}}^2)^{w_{k+1}}+(\eta_{\dot\alpha_{k+1}}^2)^{w_{k+1}}}}\right).$

Let

$p_1=\sqrt{\prod_{j=1}^{k}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{k}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}},\qquad q_1=\sqrt{\prod_{j=1}^{k}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{k}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}},$

$p_2=\sqrt{(1+\mu_{\dot\alpha_{k+1}}^2)^{w_{k+1}}-(1-\mu_{\dot\alpha_{k+1}}^2)^{w_{k+1}}},\qquad q_2=\sqrt{(1+\mu_{\dot\alpha_{k+1}}^2)^{w_{k+1}}+(1-\mu_{\dot\alpha_{k+1}}^2)^{w_{k+1}}},$

$r_1=\sqrt{2}\prod_{j=1}^{k}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j},\qquad s_1=\sqrt{\prod_{j=1}^{k}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{k}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}},$

$r_2=\sqrt{2}\,(\eta_{\dot\alpha_{k+1}})^{w_{k+1}},\qquad s_2=\sqrt{(2-\eta_{\dot\alpha_{k+1}}^2)^{w_{k+1}}+(\eta_{\dot\alpha_{k+1}}^2)^{w_{k+1}}}.$

Now, putting these values in Eq. (9), we have

$\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_{k+1})=\left(\dfrac{p_1}{q_1},\dfrac{r_1}{s_1}\right)\oplus_\varepsilon\left(\dfrac{p_2}{q_2},\dfrac{r_2}{s_2}\right).$

By using the Einstein operational law for the sum, we have

(10) $\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_{k+1})=\left(\dfrac{p_1}{q_1},\dfrac{r_1}{s_1}\right)\oplus_\varepsilon\left(\dfrac{p_2}{q_2},\dfrac{r_2}{s_2}\right)=\left(\sqrt{\dfrac{p_1^2q_2^2+p_2^2q_1^2}{q_1^2q_2^2+p_1^2p_2^2}},\ \dfrac{r_1r_2}{\sqrt{2s_1^2s_2^2-s_1^2r_2^2-r_1^2s_2^2+r_1^2r_2^2}}\right).$

Now, substituting the values of $p_1^2q_2^2+p_2^2q_1^2$, $q_1^2q_2^2+p_1^2p_2^2$, $r_1r_2$, and $2s_1^2s_2^2-s_1^2r_2^2-r_1^2s_2^2+r_1^2r_2^2$ into Eq. (10) and simplifying, we obtain

$\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_{k+1})=\left(\dfrac{\sqrt{\prod_{j=1}^{k+1}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{k+1}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{k+1}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{k+1}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}},\ \dfrac{\sqrt{2}\prod_{j=1}^{k+1}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j}}{\sqrt{\prod_{j=1}^{k+1}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{k+1}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\right).$

Thus, Eq. (8) is true for n=k+1; hence, Eq. (8) holds for all n.□

Lemma 1 ([24], [27]). Let αj≻0 and wj≻0 (j=1, 2, …, n) with $\sum_{j=1}^{n}w_j=1$, then

(11) $\prod_{j=1}^{n}(\alpha_j)^{w_j}\le\sum_{j=1}^{n}w_j\alpha_j,$

where the equality holds if and only if α1=α2=…=αn.
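Lemma 1 is the weighted arithmetic–geometric mean inequality; a quick numeric spot-check with illustrative values of our own choosing:

```python
import math

alphas = [0.3, 0.9, 1.5]
weights = [0.2, 0.3, 0.5]  # non-negative, summing to 1

geometric = math.prod(a ** w for a, w in zip(alphas, weights))
arithmetic = sum(w * a for a, w in zip(alphas, weights))
print(geometric <= arithmetic)  # True; equality only when all alphas coincide
```

Repeating the check with all αj equal (say, all 0.7) makes both sides coincide, matching the equality condition stated above.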

Theorem 2. Let αj=⟨μαj, ηαj⟩ (j=1, 2, 3, …, n) be a collection of Pythagorean fuzzy values, then

(12) $\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)\preceq\mathrm{PFHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n).$

Proof. As

$\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}\le\sum_{j=1}^{n}w_j(1+\mu_{\dot\alpha_{\sigma(j)}}^2)+\sum_{j=1}^{n}w_j(1-\mu_{\dot\alpha_{\sigma(j)}}^2).$

Also

$\sum_{j=1}^{n}w_j(1+\mu_{\dot\alpha_{\sigma(j)}}^2)+\sum_{j=1}^{n}w_j(1-\mu_{\dot\alpha_{\sigma(j)}}^2)=2,$

then

$\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}\le 2,$

thus

(13) $\dfrac{\sqrt{\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}=\sqrt{1-\dfrac{2\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}{\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\le\sqrt{1-\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}},$

where the equality holds if and only if the $\mu_{\dot\alpha_{\sigma(j)}}$ (j=1, 2, 3, …, n) are all equal. Again,

$\prod_{j=1}^{n}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}\le\sum_{j=1}^{n}w_j(2-\eta_{\dot\alpha_{\sigma(j)}}^2)+\sum_{j=1}^{n}w_j\,\eta_{\dot\alpha_{\sigma(j)}}^2.$

Also

$\sum_{j=1}^{n}w_j(2-\eta_{\dot\alpha_{\sigma(j)}}^2)+\sum_{j=1}^{n}w_j\,\eta_{\dot\alpha_{\sigma(j)}}^2=2,$

then

$\prod_{j=1}^{n}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}\le 2,$

thus,

(14) $\dfrac{\sqrt{2}\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j}}{\sqrt{\prod_{j=1}^{n}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\ge\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j},$

where the equality holds if and only if the $\eta_{\dot\alpha_{\sigma(j)}}$ (j=1, 2, 3, …, n) are all equal.

Let

(15) $\mathrm{PFHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)=\alpha$

and

(16) $\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)=\alpha_\varepsilon.$

Then, Eqs. (13) and (14) can be transformed into the following forms:

(17) $\mu_{\alpha}\ge\mu_{\alpha_\varepsilon},\qquad \eta_{\alpha}\le\eta_{\alpha_\varepsilon},$

thus

(18) $s(\alpha)\ge s(\alpha_\varepsilon).$

If

(19) $s(\alpha)\succ s(\alpha_\varepsilon),$

then

(20) $\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)\prec\mathrm{PFHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n).$

If

(21) s(α)=s(αε),

then

(22) h(α)=h(αε),

thus

(23) PFEHAω,w(α1,α2,α3,,αn)=PFHAω,w(α1,α2,α3,,αn).

From Eqs. (20) to (23), Eq. (12) always holds.□

Example 1: Let

α1=(0.4,0.7),α2=(0.5,0.8),α3=(0.6,0.7),α4=(0.7,0.6),

and let ω = w = (0.1, 0.2, 0.3, 0.4)T; then, computing the weighted values α̇j = 4ωjαj with the algebraic operational laws, we have

α˙1=(0.259,0.867),α˙2=(0.456,0.836),α˙3=(0.643,0.651),α˙4=(0.812,0.441).

By calculating the score function, we have

s(α̇1) = −0.684, s(α̇2) = −0.491, s(α̇3) = −0.010, s(α̇4) = 0.465.

Hence,

s(α̇4) ≻ s(α̇3) ≻ s(α̇2) ≻ s(α̇1).

Thus

$\mathrm{PFHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\alpha_4)=\left(\sqrt{1-\prod_{j=1}^{4}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}},\ \prod_{j=1}^{4}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j}\right)=(0.517,\ 0.717).$

Now, for the PFEHA operator, computing the weighted values α̇j = 4ωj ·ε αj with the Einstein operational laws, we have

α̇1 = (0.253, 0.882), α̇2 = (0.448, 0.841), α̇3 = (0.650, 0.641), α̇4 = (0.833, 0.402).

By calculating the score function, we have

s(α̇1) = −0.711, s(α̇2) = −0.505, s(α̇3) = 0.012, s(α̇4) = 0.532.

As

s(α̇4) ≻ s(α̇3) ≻ s(α̇2) ≻ s(α̇1),

thus

$\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\alpha_4)=\left(\dfrac{\sqrt{\prod_{j=1}^{4}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{4}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}{\sqrt{\prod_{j=1}^{4}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{4}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}},\ \dfrac{\sqrt{2}\prod_{j=1}^{4}(\eta_{\dot\alpha_{\sigma(j)}})^{w_j}}{\sqrt{\prod_{j=1}^{4}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{4}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\right)=(0.507,\ 0.742).$

Theorem 3. Let αj=⟨μαj, ηαj⟩ (j=1, 2, 3, …, n) be a collection of Pythagorean fuzzy values, then the following properties hold:

  1. Idempotency: If α̇σ(j) = α̇ for all j, then

    (24) PFEHAω,w(α1,α2,α3,,αn)=α˙.
  2. Boundedness:

    (25) $\dot\alpha_{\min}\preceq\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)\preceq\dot\alpha_{\max},$

    where

    (26) $\dot\alpha_{\min}=\left(\min_j\mu_{\dot\alpha_{\sigma(j)}},\ \max_j\eta_{\dot\alpha_{\sigma(j)}}\right),$
    (27) $\dot\alpha_{\max}=\left(\max_j\mu_{\dot\alpha_{\sigma(j)}},\ \min_j\eta_{\dot\alpha_{\sigma(j)}}\right).$
  3. Monotonicity: Let $\alpha_{\sigma(j)}^{*}=\langle\mu_{\alpha_{\sigma(j)}^{*}},\eta_{\alpha_{\sigma(j)}^{*}}\rangle$ (j=1, 2, …, n) be another collection of Pythagorean fuzzy values such that $\mu_{\alpha_{\sigma(j)}}\le\mu_{\alpha_{\sigma(j)}^{*}}$ and $\eta_{\alpha_{\sigma(j)}}\ge\eta_{\alpha_{\sigma(j)}^{*}}$ for all j; then

    (28) $\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)\preceq\mathrm{PFEHA}_{\omega,w}(\alpha_1^{*},\alpha_2^{*},\alpha_3^{*},\dots,\alpha_n^{*}).$

Proof. Idempotency: As α̇σ(j) = α̇ for all j, we have

$\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)=\left(\dfrac{\sqrt{(1+\mu_{\dot\alpha}^2)^{\sum_{j=1}^{n}w_j}-(1-\mu_{\dot\alpha}^2)^{\sum_{j=1}^{n}w_j}}}{\sqrt{(1+\mu_{\dot\alpha}^2)^{\sum_{j=1}^{n}w_j}+(1-\mu_{\dot\alpha}^2)^{\sum_{j=1}^{n}w_j}}},\ \dfrac{\sqrt{2}\,(\eta_{\dot\alpha})^{\sum_{j=1}^{n}w_j}}{\sqrt{(2-\eta_{\dot\alpha}^2)^{\sum_{j=1}^{n}w_j}+(\eta_{\dot\alpha}^2)^{\sum_{j=1}^{n}w_j}}}\right)=\left(\sqrt{\dfrac{(1+\mu_{\dot\alpha}^2)-(1-\mu_{\dot\alpha}^2)}{(1+\mu_{\dot\alpha}^2)+(1-\mu_{\dot\alpha}^2)}},\ \dfrac{\sqrt{2}\,\eta_{\dot\alpha}}{\sqrt{(2-\eta_{\dot\alpha}^2)+\eta_{\dot\alpha}^2}}\right)=(\mu_{\dot\alpha},\eta_{\dot\alpha})=\dot\alpha.$

Boundedness: Let $f(x)=\sqrt{\frac{2-x^2}{x^2}}$, $x\in(0,1]$; then $f'(x)=-\frac{2}{x^3}\sqrt{\frac{x^2}{2-x^2}}\le 0$, i.e. f(x) is a decreasing function on (0, 1]. As $\eta_{\dot\alpha_{\min}}\le\eta_{\dot\alpha_{\sigma(j)}}\le\eta_{\dot\alpha_{\max}}$ for all j, then $f(\eta_{\dot\alpha_{\max}})\le f(\eta_{\dot\alpha_{\sigma(j)}})\le f(\eta_{\dot\alpha_{\min}})$, that is, $\sqrt{\frac{2-\eta_{\dot\alpha_{\max}}^2}{\eta_{\dot\alpha_{\max}}^2}}\le\sqrt{\frac{2-\eta_{\dot\alpha_{\sigma(j)}}^2}{\eta_{\dot\alpha_{\sigma(j)}}^2}}\le\sqrt{\frac{2-\eta_{\dot\alpha_{\min}}^2}{\eta_{\dot\alpha_{\min}}^2}}$. Then

(29) $\prod_{j=1}^{n}\left(\frac{2-\eta_{\dot\alpha_{\max}}^2}{\eta_{\dot\alpha_{\max}}^2}\right)^{w_j}\le\prod_{j=1}^{n}\left(\frac{2-\eta_{\dot\alpha_{\sigma(j)}}^2}{\eta_{\dot\alpha_{\sigma(j)}}^2}\right)^{w_j}\le\prod_{j=1}^{n}\left(\frac{2-\eta_{\dot\alpha_{\min}}^2}{\eta_{\dot\alpha_{\min}}^2}\right)^{w_j}\ \Leftrightarrow\ \frac{2}{\eta_{\dot\alpha_{\max}}^2}\le\prod_{j=1}^{n}\left(\frac{2-\eta_{\dot\alpha_{\sigma(j)}}^2}{\eta_{\dot\alpha_{\sigma(j)}}^2}\right)^{w_j}+1\le\frac{2}{\eta_{\dot\alpha_{\min}}^2}\ \Leftrightarrow\ \frac{\eta_{\dot\alpha_{\min}}^2}{2}\le\frac{1}{\prod_{j=1}^{n}\left(\frac{2-\eta_{\dot\alpha_{\sigma(j)}}^2}{\eta_{\dot\alpha_{\sigma(j)}}^2}\right)^{w_j}+1}\le\frac{\eta_{\dot\alpha_{\max}}^2}{2}\ \Leftrightarrow\ \eta_{\dot\alpha_{\min}}\le\sqrt{\frac{2\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}{\prod_{j=1}^{n}(2-\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(\eta_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\le\eta_{\dot\alpha_{\max}}.$

Again, let $g(y)=\sqrt{\frac{1-y^2}{1+y^2}}$, $y\in[0,1]$; then $g'(y)=-\frac{2y}{(1+y^2)^2}\sqrt{\frac{1+y^2}{1-y^2}}\le 0$, i.e. g(y) is a decreasing function on [0, 1]. As $\mu_{\dot\alpha_{\min}}\le\mu_{\dot\alpha_{\sigma(j)}}\le\mu_{\dot\alpha_{\max}}$ for all j, then $g(\mu_{\dot\alpha_{\max}})\le g(\mu_{\dot\alpha_{\sigma(j)}})\le g(\mu_{\dot\alpha_{\min}})$, that is, $\sqrt{\frac{1-\mu_{\dot\alpha_{\max}}^2}{1+\mu_{\dot\alpha_{\max}}^2}}\le\sqrt{\frac{1-\mu_{\dot\alpha_{\sigma(j)}}^2}{1+\mu_{\dot\alpha_{\sigma(j)}}^2}}\le\sqrt{\frac{1-\mu_{\dot\alpha_{\min}}^2}{1+\mu_{\dot\alpha_{\min}}^2}}$. Then

(30) $\prod_{j=1}^{n}\left(\frac{1-\mu_{\dot\alpha_{\max}}^2}{1+\mu_{\dot\alpha_{\max}}^2}\right)^{w_j}\le\prod_{j=1}^{n}\left(\frac{1-\mu_{\dot\alpha_{\sigma(j)}}^2}{1+\mu_{\dot\alpha_{\sigma(j)}}^2}\right)^{w_j}\le\prod_{j=1}^{n}\left(\frac{1-\mu_{\dot\alpha_{\min}}^2}{1+\mu_{\dot\alpha_{\min}}^2}\right)^{w_j}\ \Leftrightarrow\ \frac{2}{1+\mu_{\dot\alpha_{\max}}^2}\le\prod_{j=1}^{n}\left(\frac{1-\mu_{\dot\alpha_{\sigma(j)}}^2}{1+\mu_{\dot\alpha_{\sigma(j)}}^2}\right)^{w_j}+1\le\frac{2}{1+\mu_{\dot\alpha_{\min}}^2}\ \Leftrightarrow\ \frac{1+\mu_{\dot\alpha_{\min}}^2}{2}\le\frac{1}{\prod_{j=1}^{n}\left(\frac{1-\mu_{\dot\alpha_{\sigma(j)}}^2}{1+\mu_{\dot\alpha_{\sigma(j)}}^2}\right)^{w_j}+1}\le\frac{1+\mu_{\dot\alpha_{\max}}^2}{2}\ \Leftrightarrow\ \mu_{\dot\alpha_{\min}}\le\sqrt{\frac{\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}-\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}{\prod_{j=1}^{n}(1+\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}+\prod_{j=1}^{n}(1-\mu_{\dot\alpha_{\sigma(j)}}^2)^{w_j}}}\le\mu_{\dot\alpha_{\max}}.$

Let

(31) $\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)=\dot\alpha=(\mu_{\dot\alpha},\eta_{\dot\alpha}).$

Then, Eqs. (29) and (30) can be written as

(32) $\mu_{\dot\alpha_{\min}}\le\mu_{\dot\alpha}\le\mu_{\dot\alpha_{\max}}$

and

(33) $\eta_{\dot\alpha_{\min}}\le\eta_{\dot\alpha}\le\eta_{\dot\alpha_{\max}},$

thus

(34) $s(\dot\alpha)\le s(\dot\alpha_{\max})$

and

(35) $s(\dot\alpha)\ge s(\dot\alpha_{\min}).$

If

(36) $s(\dot\alpha)\prec s(\dot\alpha_{\max})$

and

(37) $s(\dot\alpha)\succ s(\dot\alpha_{\min}),$

then

(38) $\dot\alpha_{\min}\prec\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)\prec\dot\alpha_{\max}.$

If

(39) s(α˙)=s(α˙max),

then

(40) h(α˙)=h(α˙max).

Thus

(41) PFEHAω,w(α1,α2,α3,,αn)=α˙max.

If

(42) s(α˙)=s(α˙min),

then

(43) h(α˙)=h(α˙min).

Thus

(44) PFEHAω,w(α1,α2,α3,,αn)=α˙min.

Thus, from Eqs. (38) to (44), we have

$\dot\alpha_{\min}\preceq\mathrm{PFEHA}_{\omega,w}(\alpha_1,\alpha_2,\alpha_3,\dots,\alpha_n)\preceq\dot\alpha_{\max}.$

Monotonicity: The proof is similar to that of boundedness, so it is omitted here.□

Theorem 4. The PFEWA operator is a special case of the PFEHA operator: if w = (1/n, 1/n, …, 1/n)T, then the PFEHA operator reduces to the PFEWA operator.

Theorem 5. The PFEOWA operator is a special case of the PFEHA operator: if ω = (1/n, 1/n, …, 1/n)T, then α̇j = αj for all j, and the PFEHA operator reduces to the PFEOWA operator.

4 An Application of the PFEHA Aggregation Operator to Multiple-Attribute Group Decision Making

In this section, we investigate an application of the PFEHA aggregation operators to multiple-attribute group decision making with Pythagorean fuzzy information.

Algorithm: Let G={G1, G2, G3, …, Gm} be the set of m alternatives, A={A1, A2, A3, …, An} be the set of n attributes, and D={D1, D2, D3, …, Dk} be the set of k decision makers. Let ω=(ω1, ω2, ω3, …, ωn)T be the weighting vector of the attributes Aj (j=1, 2, 3, …, n), such that ωj∈[0, 1] and $\sum_{j=1}^{n}\omega_j=1$. Let w=(w1, w2, w3, …, wk)T be the weighting vector of the decision makers Ds (s=1, 2, 3, …, k), such that ws∈[0, 1] and $\sum_{s=1}^{k}w_s=1$.

  1. Construct the decision matrices Ds=[αij(s)]m×n, one for each decision maker. If the criteria are of two types, namely benefit criteria and cost criteria, then the decision matrices Ds=[αij(s)]m×n are converted into normalized decision matrices Rs=[rij(s)]m×n, where

    $r_{ij}^{(s)}=\begin{cases}\alpha_{ij}^{(s)}, & \text{for benefit criteria } A_j,\\ \bar\alpha_{ij}^{(s)}, & \text{for cost criteria } A_j,\end{cases}\quad(j=1,2,\dots,n;\ i=1,2,\dots,m),$

    and $\bar\alpha_{ij}^{(s)}$ is the complement of $\alpha_{ij}^{(s)}$, i.e. if $\alpha_{ij}^{(s)}=(\mu_{ij}^{(s)},\eta_{ij}^{(s)})$, then $\bar\alpha_{ij}^{(s)}=(\eta_{ij}^{(s)},\mu_{ij}^{(s)})$. If all the criteria are of the same type, then there is no need for normalization.

  2. Utilize the PFEWA aggregation operators to aggregate all the individual normalized decision matrices Rs=[rij(s)]m×n into a single Pythagorean fuzzy decision matrix R=[rij]m×n, where rij=(μij, ηij)(i=1, 2, …, m, j=1, 2, …, n).

  3. Utilize α̇ij = nωj rij to compute the weighted Pythagorean fuzzy values.

  4. Utilize the PFEHA aggregation operator to derive the overall preference values ri of the alternatives Gi.

  5. Calculate the scores s(ri) (i=1, 2, 3, …, m). If there is no difference between two or more scores, then we have to find the accuracy degrees of the collective overall preference values.

  6. Arrange the scores of all alternatives in descending order and select the alternative with the highest score function.
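The normalization in step 1 can be sketched as follows; `complement`, `normalize`, and the `is_cost` flags are names of our own illustration, not from the paper:

```python
def complement(alpha):
    """Complement of a Pythagorean fuzzy value: swap membership and non-membership."""
    mu, eta = alpha
    return (eta, mu)

def normalize(matrix, is_cost):
    """Turn a decision matrix D^(s) into R^(s): complement the cost-criteria columns."""
    return [[complement(a) if is_cost[j] else a for j, a in enumerate(row)]
            for row in matrix]

# first row of D1 from the numerical example below; A1 and A3 are cost criteria
row = [(0.8, 0.5), (0.7, 0.4), (0.7, 0.4), (0.7, 0.5)]
print(normalize([row], [True, False, True, False])[0])
# -> [(0.5, 0.8), (0.7, 0.4), (0.4, 0.7), (0.7, 0.5)]
```

The printed row matches the first row of the normalized matrix R1 in the numerical example.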

5 Numerical Example

Suppose a company wants to invest a sum of money in one of the following five companies: G1, a car company; G2, a food company; G3, a computer company; G4, a TV company; and G5, a fan company. The company must take its decision according to the following four attributes, whose weighting vector is ω=(0.4, 0.3, 0.2, 0.1)T: A1, risk analysis; A2, growth analysis; A3, social-political impact analysis; and A4, environmental impact analysis, where A1 and A3 are cost-type criteria and A2 and A4 are benefit-type criteria. There are four experts Ds (s=1, 2, 3, 4) acting as decision makers, whose weighting vector is w=(0.1, 0.2, 0.3, 0.4)T.

Step 1: Construct the decision matrices (Tables 1–4).

Table 1:

Pythagorean Fuzzy Decision Matrix D1.

A1 A2 A3 A4
G1 (0.8, 0.5) (0.7, 0.4) (0.7, 0.4) (0.7, 0.5)
G2 (0.8, 0.4) (0.7, 0.5) (0.8, 0.5) (0.8, 0.3)
G3 (0.5, 0.6) (0.6, 0.5) (0.7, 0.5) (0.8, 0.3)
G4 (0.6, 0.5) (0.6, 0.4) (0.6, 0.4) (0.8, 0.4)
G5 (0.6, 0.8) (0.6, 0.6) (0.7, 0.3) (0.6, 0.5)
Table 2:

Pythagorean Fuzzy Decision Matrix D2.

A1 A2 A3 A4
G1 (0.6, 0.5) (0.8, 0.4) (0.6, 0.4) (0.6, 0.5)
G2 (0.7, 0.3) (0.8, 0.4) (0.7, 0.5) (0.7, 0.4)
G3 (0.6, 0.6) (0.6, 0.5) (0.6, 0.6) (0.7, 0.4)
G4 (0.7, 0.5) (0.6, 0.6) (0.7, 0.4) (0.8, 0.5)
G5 (0.6, 0.4) (0.7, 0.2) (0.8, 0.4) (0.8, 0.4)
Table 3:

Pythagorean Fuzzy Decision Matrix D3.

A1 A2 A3 A4
G1 (0.7, 0.5) (0.7, 0.4) (0.6, 0.5) (0.6, 0.5)
G2 (0.8, 0.3) (0.7, 0.3) (0.8, 0.3) (0.9, 0.2)
G3 (0.6, 0.5) (0.6, 0.6) (0.7, 0.4) (0.8, 0.3)
G4 (0.7, 0.5) (0.8, 0.5) (0.9, 0.1) (0.6, 0.5)
G5 (0.7, 0.5) (0.8, 0.2) (0.8, 0.2) (0.7, 0.3)
Table 4:

Pythagorean Fuzzy Decision Matrix D4.

A1 A2 A3 A4
G1 (0.8, 0.3) (0.8, 0.4) (0.7, 0.4) (0.7, 0.5)
G2 (0.8, 0.3) (0.8, 0.3) (0.8, 0.3) (0.8, 0.2)
G3 (0.6, 0.6) (0.7, 0.6) (0.7, 0.4) (0.8, 0.3)
G4 (0.7, 0.4) (0.8, 0.6) (0.8, 0.2) (0.7, 0.5)
G5 (0.6, 0.6) (0.8, 0.2) (0.8, 0.2) (0.8, 0.3)

Step 2: Construct the normalized decision matrices (Tables 5–8).

Table 5:

Normalized Decision Matrix R1.

A1 A2 A3 A4
G1 (0.5, 0.8) (0.7, 0.4) (0.4, 0.7) (0.7, 0.5)
G2 (0.4, 0.8) (0.7, 0.5) (0.5, 0.8) (0.8, 0.3)
G3 (0.6, 0.5) (0.6, 0.5) (0.5, 0.7) (0.8, 0.3)
G4 (0.5, 0.6) (0.6, 0.4) (0.4, 0.6) (0.8, 0.4)
G5 (0.8, 0.6) (0.6, 0.6) (0.3, 0.7) (0.6, 0.5)
Table 6:

Normalized Decision Matrix R2.

A1 A2 A3 A4
G1 (0.5, 0.6) (0.8, 0.4) (0.4, 0.6) (0.6, 0.5)
G2 (0.3, 0.7) (0.8, 0.4) (0.5, 0.7) (0.7, 0.4)
G3 (0.6, 0.6) (0.6, 0.5) (0.6, 0.6) (0.7, 0.4)
G4 (0.5, 0.7) (0.6, 0.6) (0.4, 0.7) (0.8, 0.5)
G5 (0.4, 0.6) (0.7, 0.2) (0.4, 0.8) (0.8, 0.4)
Table 7:

Normalized Decision Matrix R3.

A1 A2 A3 A4
G1 (0.5, 0.7) (0.7, 0.4) (0.5, 0.6) (0.6, 0.5)
G2 (0.3, 0.8) (0.7, 0.3) (0.3, 0.8) (0.9, 0.2)
G3 (0.5, 0.6) (0.6, 0.6) (0.4, 0.7) (0.8, 0.3)
G4 (0.5, 0.7) (0.8, 0.5) (0.1, 0.9) (0.6, 0.5)
G5 (0.5, 0.7) (0.8, 0.2) (0.2, 0.8) (0.7, 0.3)
Table 8:

Normalized Decision Matrix R4.

A1 A2 A3 A4
G1 (0.3, 0.8) (0.8, 0.4) (0.4, 0.7) (0.7, 0.5)
G2 (0.3, 0.8) (0.8, 0.3) (0.3, 0.8) (0.8, 0.2)
G3 (0.6, 0.6) (0.7, 0.6) (0.4, 0.7) (0.8, 0.3)
G4 (0.4, 0.7) (0.8, 0.6) (0.2, 0.8) (0.7, 0.5)
G5 (0.6, 0.6) (0.8, 0.2) (0.2, 0.8) (0.8, 0.3)

Step 3: Utilizing the PFEWA operator, we obtain Table 9.

Table 9:

Collective Pythagorean Fuzzy Decision Matrix R.

A1 A2 A3 A4
G1 (0.432, 0.728) (0.764, 0.400) (0.432, 0.649) (0.653, 0.500)
G2 (0.311, 0.779) (0.764, 0.335) (0.372, 0.779) (0.823, 0.239)
G3 (0.572, 0.589) (0.643, 0.568) (0.459, 0.679) (0.782, 0.317)
G4 (0.463, 0.689) (0.753, 0.546) (0.259, 0.789) (0.684, 0.489)
G5 (0.568, 0.629) (0.767, 0.224) (0.263, 0.789) (0.757, 0.335)

Step 4: Utilizing α̇ij = nωj rij with n = 4, we have

α̇11 = (0.542, 0.572), α̇12 = (0.815, 0.318), α̇13 = (0.387, 0.718), α̇14 = (0.424, 0.793),
α̇21 = (0.393, 0.648), α̇22 = (0.815, 0.318), α̇23 = (0.333, 0.824), α̇24 = (0.564, 0.627),
α̇31 = (0.704, 0.390), α̇32 = (0.695, 0.485), α̇33 = (0.411, 0.793), α̇34 = (0.527, 0.687),
α̇41 = (0.578, 0.518), α̇42 = (0.805, 0.470), α̇43 = (0.232, 0.830), α̇44 = (0.453, 0.787),
α̇51 = (0.700, 0.439), α̇52 = (0.818, 0.156), α̇53 = (0.249, 0.832), α̇54 = (0.505, 0.699).
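As a spot-check on the first entry above, the Einstein scalar multiple α̇11 = 4ω1 ·ε r11, with ω1 = 0.4 and r11 = (0.432, 0.728) taken from Table 9, can be recomputed with a small sketch (our own helper, not code from the paper):

```python
import math

def einstein_scalar(delta, alpha):
    """Einstein scalar multiple delta . alpha (law (4) of Definition 5)."""
    mu, eta = alpha
    a, b = (1 + mu**2) ** delta, (1 - mu**2) ** delta
    c, d = (eta**2) ** delta, (2 - eta**2) ** delta
    return (math.sqrt((a - b) / (a + b)), math.sqrt(2 * c / (d + c)))

mu, eta = einstein_scalar(4 * 0.4, (0.432, 0.728))
print(round(mu, 3), round(eta, 3))  # close to the tabulated (0.542, 0.572)
```

The remaining α̇ij values can be reproduced the same way, column by column, with δ = 4ωj.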

By calculating the score function of each α̇ij and rearranging the values of each row in descending order, we obtain Table 10.

Table 10:

Pythagorean Fuzzy Hybrid Decision Matrix.

A1 A2 A3 A4
G1 (0.815, 0.318) (0.542, 0.572) (0.387, 0.718) (0.424, 0.793)
G2 (0.815, 0.318) (0.564, 0.627) (0.393, 0.648) (0.333, 0.824)
G3 (0.704, 0.390) (0.695, 0.485) (0.527, 0.687) (0.411, 0.793)
G4 (0.805, 0.470) (0.578, 0.518) (0.453, 0.787) (0.232, 0.830)
G5 (0.818, 0.156) (0.700, 0.439) (0.505, 0.699) (0.249, 0.832)

Step 5: Utilizing the PFEHA aggregation operator, we have

r1 = (0.65, 0.49), r2 = (0.65, 0.50), r3 = (0.64, 0.50), r4 = (0.65, 0.57), r5 = (0.70, 0.35).

Now, calculating the scores s(ri) (i=1, 2, 3, 4, 5), we have

s(r1)=0.18,s(r2)=0.17,s(r3)=0.16,s(r4)=0.09,s(r5)=0.37.

Step 6: Arranging the scores in descending order, we have s(r5) ≻ s(r1) ≻ s(r2) ≻ s(r3) ≻ s(r4); hence, G5 is the best option (Table 11).

Table 11:

Comparisons with Previous Operators.

Operators Score functions Ranking
PFEWA operator s(r5)≻s(r2)≻s(r1)≻s(r3)≻s(r4) 5≻2≻1≻3≻4
PFEOWA operator s(r5)≻s(r2)≻s(r3)≻s(r1)≻s(r4) 5≻2≻3≻1≻4
PFEHA operator s(r5)≻s(r1)≻s(r2)≻s(r3)≻s(r4) 5≻1≻2≻3≻4

6 Comparison with Other Methods

In order to verify the effectiveness of the proposed method, we compare it with other methods. First, we compare the proposed method with the method of Rahman et al. [17]. The aggregation operator proposed by Rahman et al. [17] is based on algebraic operations, whereas the operator in this paper is based on Einstein operations; the operator proposed here is therefore more general and more flexible. The Einstein operators proposed by Garg [2] are special cases of the proposed operator: the PFEWA aggregation operator weights only the Pythagorean fuzzy arguments, and the PFEOWA aggregation operator weights only the ordered positions of the Pythagorean fuzzy arguments instead of the arguments themselves. To overcome these limitations, we have developed the notion of the PFEHA aggregation operator, which weights both the given Pythagorean fuzzy value and its ordered position.

7 Conclusion

The objective of this paper was to present the PFEHA aggregation operator based on Pythagorean fuzzy numbers and to apply it to multi-attribute group decision-making problems in which the attribute values are Pythagorean fuzzy numbers. First, we developed the PFEHA aggregation operator along with its desirable properties. Furthermore, we developed a method for multi-criteria group decision making based on this operator, and the operational process has been illustrated in detail. An illustrative example of selecting the best company for an investment has been used to demonstrate the approach; the suggested methodology can be applied to any type of selection problem involving any number of selection attributes.

In further research, it will be necessary and meaningful to extend the applications of this operator to other domains, such as induced aggregation, interval numbers, fuzzy numbers, linguistic variables, pattern recognition, fuzzy cluster analysis, and uncertain programming.

Bibliography

[1] K. Atanassov, Intuitionistic fuzzy sets, Fuzzy Sets Syst. 20 (1986), 87–96. doi:10.1016/S0165-0114(86)80034-3.

[2] H. Garg, A new generalized Pythagorean fuzzy information aggregation using Einstein operations and its application to decision making, Int. J. Intell. Syst. 31 (2016), 1–35. doi:10.1002/int.21809.

[3] H. Garg, Generalized Pythagorean fuzzy geometric aggregation operators using Einstein t-norm and t-conorm for multicriteria decision-making process, Int. J. Intell. Syst. 32 (2016), 1–34. doi:10.1002/int.21860.

[4] H. Garg, Confidence levels based Pythagorean fuzzy aggregation operators and its application to decision-making process, Comput. Math. Organ. Theory 23 (2017), 546–571. doi:10.1007/s10588-017-9242-8.

[5] X. Gou, Z. Xu and P. Ren, The properties of continuous Pythagorean fuzzy information, Int. J. Intell. Syst. 31 (2016), 401–424. doi:10.1002/int.21788.

[6] D. Liang and Z. Xu, Projection model for fusing information in Pythagorean fuzzy multi-criteria group decision making based on geometric Bonferroni mean, Int. J. Intell. Syst. 9 (2017), 966–987. doi:10.1002/int.21879.

[7] D. C. Liang, Y. R. J. Zhang, Z. S. Xu and A. P. Darko, Pythagorean fuzzy Bonferroni mean aggregation operator and its accelerative calculating algorithm with the multithreading, Int. J. Intell. Syst. 33 (2018), 615–633. doi:10.1002/int.21960.

[8] H. Liao and Z. Xu, Intuitionistic fuzzy hybrid weighted aggregation operators, Int. J. Intell. Syst. 29 (2014), 971–993. doi:10.1002/int.21672.

[9] Z. M. Ma and Z. Xu, Symmetric Pythagorean fuzzy weighted geometric/averaging operators and their application in multi-criteria decision making problems, Int. J. Intell. Syst. 31 (2016), 1198–1219. doi:10.1002/int.21823.

[10] K. Rahman, S. Abdullah, F. Husain and M. S. Ali Khan, Approaches to Pythagorean fuzzy geometric aggregation operators, Int. J. Comput. Sci. Inform. Security 14 (2016), 174–200.

[11] K. Rahman, S. Abdullah, M. S. Ali Khan and M. Shakeel, Pythagorean fuzzy hybrid geometric aggregation operator and their applications to multiple attribute decision making, Int. J. Comput. Sci. Inform. Security 14 (2016), 837–854.

[12] K. Rahman, M. S. Ali Khan, M. Ullah and A. Fahmi, Multiple attribute group decision making for plant location selection with Pythagorean fuzzy weighted geometric aggregation operator, Nucleus 1 (2017), 66–74. doi:10.71330/thenucleus.2017.107.

[13] K. Rahman, M. S. Ali Khan and M. Ullah, New approaches to Pythagorean fuzzy averaging aggregation operators, Math. Lett. 3 (2017), 29–36. doi:10.11648/j.ml.20170302.12.

[14] K. Rahman, S. Abdullah, A. Ali and F. Amin, Some induced averaging aggregation operators based on Pythagorean fuzzy numbers, Math. Lett. 3 (2017), 40–45. doi:10.11648/j.ml.20170304.11.

[15] K. Rahman, S. Abdullah, F. Husain, M. S. Ali Khan and M. Shakeel, Pythagorean fuzzy ordered weighted geometric aggregation operator and their application to multiple attribute group decision making, J. Appl. Environ. Biol. Sci. 7 (2017), 67–83.

[16] K. Rahman, S. Abdullah, M. S. Ali Khan, M. Ibrar and F. Husain, Some basic operations on Pythagorean fuzzy sets, J. Appl. Environ. Biol. Sci. 7 (2017), 111–119.

[17] K. Rahman, M. S. Ali Khan, S. Abdullah, F. Husain and M. Ibrar, Some properties of Pythagorean fuzzy hybrid averaging aggregation operator, J. Appl. Environ. Biol. Sci. 7 (2017), 122–133.

[18] K. Rahman, S. Abdullah, R. Ahmed and M. Ullah, Pythagorean fuzzy Einstein weighted geometric aggregation operator and their application to multiple attribute group decision making, J. Intell. Fuzzy Syst. 33 (2017), 635–647. doi:10.3233/JIFS-16797.

[19] K. Rahman, A. Ali, M. Shakeel, M. S. Ali Khan and M. Ullah, Pythagorean fuzzy weighted averaging aggregation operator and its application to decision making theory, Nucleus 54 (2017), 190–196. doi:10.71330/thenucleus.2017.184.

[20] K. Rahman, S. Abdullah, M. Jamil and M. Y. Khan, Some generalized intuitionistic fuzzy Einstein hybrid aggregation operators and their application to multiple attribute group decision making, Int. J. Fuzzy Syst. (2018), 1–9. doi:10.1007/s40815-018-0452-0.

[21] P. Ren, Z. Xu and X. Gou, Pythagorean fuzzy TODIM approach to multi-criteria decision making, Appl. Soft Comput. 42 (2016), 246–259. doi:10.1016/j.asoc.2015.12.020.

[22] W. Wang and X. Liu, Intuitionistic fuzzy geometric aggregation operators based on Einstein operations, Int. J. Intell. Syst. 26 (2011), 1049–1075. doi:10.1002/int.20498.

[23] W. Wang and X. Liu, Intuitionistic fuzzy information aggregation using Einstein operations, IEEE Trans. Fuzzy Syst. 20 (2012), 923–938. doi:10.1109/TFUZZ.2012.2189405.

[24] Z. Xu, On consistency of the weighted geometric mean complex judgement matrix in AHP, Eur. J. Oper. Res. 126 (2000), 683–687. doi:10.1016/S0377-2217(99)00082-X.

[25] Z. S. Xu, Intuitionistic fuzzy aggregation operators, IEEE Trans. Fuzzy Syst. 15 (2007), 1179–1187. doi:10.1109/TFUZZ.2006.890678.

[26] Z. Xu, Intuitionistic Fuzzy Aggregation and Clustering, Springer, Berlin, 2013. doi:10.1007/978-3-642-28406-9.

[27] Z. Xu and Q. L. Da, The ordered weighted geometric averaging operators, Int. J. Intell. Syst. 17 (2002), 709–716. doi:10.1002/int.10045.

[28] Z. S. Xu and Q. L. Da, An overview of operators for aggregating information, Int. J. Intell. Syst. 18 (2003), 953–969. doi:10.1002/int.10127.

[29] Z. S. Xu and R. R. Yager, Some geometric aggregation operators based on intuitionistic fuzzy sets, Int. J. Gen. Syst. 35 (2006), 417–433. doi:10.1080/03081070600574353.

[30] Z. Xu and H. Hu, Projection models for intuitionistic fuzzy multiple attribute decision making, Int. J. Inform. Technol. Decis. Making 9 (2010), 267–280. doi:10.1142/S0219622010003816.

[31] Z. Xu and X. Q. Cai, Recent advances in intuitionistic fuzzy information aggregation, Fuzzy Optim. Decis. Making 9 (2010), 359–381. doi:10.1007/s10700-010-9090-1.

[32] W. T. Xue, Z. S. Xu, X. L. Zhang and X. L. Tian, Pythagorean fuzzy LINMAP method based on the entropy theory for railway project investment decision making, Int. J. Intell. Syst. 33 (2018), 93–125. doi:10.1002/int.21941.

[33] R. R. Yager, Pythagorean fuzzy subsets, in: Proc. Joint IFSA World Congress and NAFIPS Annual Meeting, Edmonton, Canada, pp. 57–61, 2013. doi:10.1109/IFSA-NAFIPS.2013.6608375.

[34] R. R. Yager and A. M. Abbasov, Pythagorean membership grades, complex numbers and decision making, Int. J. Intell. Syst. 28 (2013), 436–452. doi:10.1002/int.21584.

[35] D. Yu and H. Liao, Visualization and quantitative research on intuitionistic fuzzy studies, Int. J. Fuzzy Syst. 30 (2016), 3653–3663.10.3233/IFS-162111Suche in Google Scholar

[36] L. A. Zadeh, Fuzzy sets, Inf. Control 8 (1965), 338–353.10.21236/AD0608981Suche in Google Scholar

[37] X. Zhang and Z. S. Xu, Extension of TOPSIS to multiple criteria decision making with Pythagorean fuzzy sets, Int. J. Intell. Syst. 29 (2014), 1061–1078.10.1002/int.21676Suche in Google Scholar

Received: 2018-02-03
Published Online: 2018-07-09

©2020 Walter de Gruyter GmbH, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 Public License.
