Article Open Access

Pythagorean Hesitant Fuzzy Information Aggregation and Their Application to Multi-Attribute Group Decision-Making Problems

  • Muhammad Sajjad Ali Khan EMAIL logo , Saleem Abdullah , Asad Ali and Khaista Rahman
Published/Copyright: January 5, 2018

Abstract

In this paper, we introduce the concept of the Pythagorean hesitant fuzzy set (PHFS), which is the generalization of the intuitionistic hesitant fuzzy set under the restriction that the square sum of its membership and non-membership degrees is ≤1. In decision making with PHFSs, aggregation operators play a key role because they can be used to synthesize multidimensional evaluation values represented as Pythagorean hesitant fuzzy values into collective values. Under PHFS environments, the Pythagorean hesitant fuzzy ordered weighted averaging and Pythagorean hesitant fuzzy ordered weighted geometric operators are used to aggregate the Pythagorean hesitant fuzzy values. The main advantage of these operators is that they provide more accurate and valuable results. Furthermore, these operators are applied to decision-making problems in which experts provide their preferences in the Pythagorean hesitant fuzzy environment, to show the validity, practicality, and effectiveness of the new approach. Finally, we compare the proposed approach to the existing methods.

1 Introduction

The concept of fuzzy set was first proposed by Zadeh in his seminal paper [31] to handle uncertainty. The aggregation of various inputs into a single output is a major problem and has been discussed by many others [23, 24, 30]. Bellman and Zadeh [4] subsequently applied fuzzy sets to decision making in order to handle the uncertainty that arises from human preferences. Dubois [7] compared old and new methods for fuzzy decision analysis. Liu and Liao [17] conducted a bibliometric analysis on fuzzy decision-related research to find underlying patterns and dynamics in this research direction. The fuzzy set is characterized by a membership degree only; therefore, Atanassov defined the intuitionistic fuzzy set (IFS), which is the generalization of the fuzzy set and is characterized by a membership function and a non-membership function [1, 2]. The notion of IFS is more appropriate for dealing with uncertainty and fuzziness than that of the fuzzy set. IFS is very suitable for expressing the uncertainty and vagueness of an object, and hence an IFS can be used as a powerful tool to obtain precise data information under different fuzzy environments; it has received great attention. In decision-making problems, the concept of IFS is broadly applied [3, 5, 6, 10]. Liao and Xu [13] proposed a series of intuitionistic hybrid operators, namely the intuitionistic hybrid weighted average operator, intuitionistic hybrid weighted geometric operator, generalized intuitionistic hybrid weighted average operator, and generalized intuitionistic hybrid weighted geometric operator. In Ref. [15], Liao et al. proposed an enhanced consensus-reaching process for group decision making with intuitionistic fuzzy preference relations (IFPRs). In Ref. [25], Xu and Liao presented a comprehensive survey on decision making with IFPRs with the aim of providing a clear perspective on the originality, consistency, prioritization, and consensus of IFPRs. In Ref. 
[29], Yu and Liao made a scientometric review on IFS studies to reveal the most cited papers, influential authors, and influential journals in this domain, based on the 1318 references retrieved from the Science Citation Index Expanded and Social Science Citation Index databases via Web of Science.

Extending the notion of IFS, Yager in Refs. [26, 27] initiated the notion of the Pythagorean fuzzy set (PFS), under the restriction that the sum of the squares of the membership degree and non-membership degree is ≤1. Many researchers have paid attention to group decision-making problems using the concept of Pythagorean fuzzy sets. In Ref. [28], the relation between Pythagorean membership degrees and complex numbers was discussed; the authors showed that Pythagorean membership degrees are a subclass of complex numbers, called Π−i numbers. Zhang and Xu in Ref. [33] introduced a method for order preference by similarity to an ideal solution to solve the multiple criteria decision-making (MCDM) problem with Pythagorean fuzzy information. In Ref. [27], Yager proposed a series of aggregation operators, namely the Pythagorean fuzzy weighted average operator, Pythagorean fuzzy weighted geometric average operator, Pythagorean fuzzy weighted power average operator, and Pythagorean fuzzy weighted power geometric average operator, to aggregate different Pythagorean fuzzy numbers (PFNs). These operators were demonstrated with an application to the MCDM problem. Peng and Yang [18] introduced some new operations in PFS, namely division and subtraction, and discussed their corresponding properties. The authors also developed the superiority and inferiority ranking method to solve multi-attribute group decision-making problems with Pythagorean fuzzy information. Liang et al. [11] initiated the concept of the Pythagorean fuzzy geometric Bonferroni mean and weighted Pythagorean fuzzy geometric Bonferroni mean operators. In Ref. [8], Garg developed the interval-valued Pythagorean fuzzy weighted average operator and interval-valued Pythagorean fuzzy geometric operator, and introduced the concept of a new accuracy function under an interval-valued Pythagorean fuzzy environment.

The concept of fuzzy set was further extended by Torra in Ref. [22], who then introduced the notion of hesitant fuzzy sets (HFSs). HFSs permit the situation of the membership having a set of possible values. Using the concept of HFS, many researchers solved group decision-making problems with aggregation operators in Refs. [16, 22, 23, 30, 32]. Liao and Xu [12] proposed the concepts of hesitant fuzzy hybrid arithmetic averaging (HFHAA) operator, hesitant fuzzy hybrid arithmetic geometric (HFHAG) operator, quasi-HFHAA operator, and quasi-HFHAG operator, and investigated some of their properties. Liao et al. [14] developed a generalized family of hybrid operators under a hesitant fuzzy environment, namely generalized hesitant fuzzy hybrid weighted averaging operator, generalized hesitant fuzzy hybrid weighted geometric operator, generalized quasi-hesitant fuzzy hybrid weighted averaging operator, generalized quasi-hesitant fuzzy hybrid weighted geometric operator, and their induced forms.

In Ref. [21], Qian et al. generalized the notion of HFSs with IFSs and referred to them as generalized HFSs, which, in essence, extended the elements of HFSs from real numbers to intuitionistic fuzzy numbers (IFNs). Zhu et al. [34] developed the concept of dual HFSs and also discussed their basic operations and properties. Peng et al. [19] introduced an MCDM approach with hesitant interval-valued IFSs, which are an extension of dual interval-valued HFSs. Dual HFSs are defined in terms of sets of values, as opposed to precise numbers, for the membership degrees and non-membership degrees of IFSs. In Ref. [20], the authors applied the concept of the intuitionistic HFS (IHFS) to group decision-making problems using fuzzy cross-entropy. PFSs, HFSs, and IHFSs have attracted more and more scholars’ attention due to their power in expressing vagueness and uncertainty. An IHFS satisfies the condition that the sum of its membership and non-membership degrees is ≤1. However, there may be a situation where the decision maker provides the degrees of membership and non-membership of a particular attribute in such a way that their sum is >1. To overcome this shortcoming, Khan et al. [9] initiated the concept of the Pythagorean HFS (PHFS), which is the generalization of the notion of IHFS. A PHFS satisfies the condition that the square sum of its membership and non-membership degrees is ≤1. They introduced score and accuracy functions and developed aggregation operators, namely the Pythagorean hesitant fuzzy weighted average (PHFWA) operator and Pythagorean hesitant fuzzy weighted geometric (PHFWG) operator. In this paper, we develop aggregation operators, namely the Pythagorean hesitant fuzzy ordered weighted average (PHFOWA) operator and Pythagorean hesitant fuzzy ordered weighted geometric (PHFOWG) operator. We discuss some properties, such as idempotency, boundedness, and monotonicity, of these operators. The remainder of the paper is organized as follows.

In the next section, we discuss some basic definitions and properties. In Section 3, we develop aggregation operators, such as the PHFOWA and PHFOWG operators. In Section 4, we develop multi-attribute decision making based on the proposed aggregation operators in which experts provide their preferences in the form of Pythagorean hesitant fuzzy numbers (PHFNs).

In Section 5, we give a numerical example to show the validity, practicality, and effectiveness of the proposed approach. In Section 6, we compare the proposed approach to the existing methods. Conclusion is given in Section 7.

2 Preliminaries

In this section, we review some basic definitions and results.

Definition 1 ([26]): Let X be a fixed set. Then, a PFS P in X can be defined as follows:

(1) $P=\{\langle x, h_P(x), h'_P(x)\rangle \mid x\in X\}$,

where $h_P(x)$ and $h'_P(x)$ are mappings from X to [0, 1], such that $0\le h_P(x)\le 1$, $0\le h'_P(x)\le 1$, and also $0\le h_P^2(x)+h'^2_P(x)\le 1$, for all $x\in X$; here, $h_P(x)$ and $h'_P(x)$ denote the membership degree and non-membership degree of element $x\in X$ to set P, respectively. Let $\pi_P(x)=\sqrt{1-h_P^2(x)-h'^2_P(x)}$. Then, it is commonly called the Pythagorean fuzzy index of element $x\in X$ to set P, representing the degree of indeterminacy of x to P. Also, $0\le\pi_P(x)\le 1$, for every $x\in X$. For convenience, we denote a PFN by $p=\langle \Lambda_p,\Gamma_p\rangle$.

To compare two PFNs, the authors of Ref. [33] introduced the concepts of score function and accuracy degree, and discussed the relation between them.

Definition 2 ([33]): Let $p_1=\langle\Lambda_{p_1},\Gamma_{p_1}\rangle$ and $p_2=\langle\Lambda_{p_2},\Gamma_{p_2}\rangle$ be two PFNs. Then, $S(p_1)=\Lambda_{p_1}^2-\Gamma_{p_1}^2$ and $S(p_2)=\Lambda_{p_2}^2-\Gamma_{p_2}^2$ are the scores of $p_1$ and $p_2$, respectively, and $H(p_1)=\Lambda_{p_1}^2+\Gamma_{p_1}^2$, $H(p_2)=\Lambda_{p_2}^2+\Gamma_{p_2}^2$ are the accuracy degrees of $p_1$, $p_2$, respectively. Then, we have

  1. If $S(p_1)<S(p_2)$, then $p_1$ is smaller than $p_2$, denoted by $p_1<p_2$.

  2. If $S(p_1)=S(p_2)$, then

    1. If $H(p_1)=H(p_2)$, then $p_1$ and $p_2$ represent the same information, i.e. $\Lambda_{p_1}^2=\Lambda_{p_2}^2$ and $\Gamma_{p_1}^2=\Gamma_{p_2}^2$, denoted by $p_1=p_2$.

    2. If $H(p_1)<H(p_2)$, then $p_1$ is smaller than $p_2$, denoted by $p_1<p_2$.

    3. If $H(p_1)>H(p_2)$, then $p_1$ is greater than $p_2$, denoted by $p_1>p_2$.
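The score-then-accuracy comparison of Definition 2 is easy to mechanize. The following is a minimal Python sketch (not from the paper; the function names and the tuple encoding of a PFN as a (membership, non-membership) pair are illustrative assumptions):

```python
def pfn_score(p):
    """S(p) = membership^2 - non-membership^2 (Definition 2)."""
    mu, nu = p
    return mu**2 - nu**2

def pfn_accuracy(p):
    """H(p) = membership^2 + non-membership^2 (Definition 2)."""
    mu, nu = p
    return mu**2 + nu**2

def pfn_compare(p1, p2):
    """Return -1, 0, or 1 according to p1 < p2, p1 = p2, p1 > p2.

    Scores are compared first; accuracy degrees break ties.
    """
    s1, s2 = pfn_score(p1), pfn_score(p2)
    if s1 != s2:
        return -1 if s1 < s2 else 1
    h1, h2 = pfn_accuracy(p1), pfn_accuracy(p2)
    if h1 != h2:
        return -1 if h1 < h2 else 1
    return 0
```

For instance, p1 = (0.8, 0.3) outranks p2 = (0.6, 0.5) by score alone, while (0.5, 0.5) and (0.7, 0.7) have equal scores and are separated by accuracy.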

Definition 3 ([22]): Let X be a fixed set. Then, an HFS H in X can be defined as follows:

(2) $H=\{\langle x, h_H(x)\rangle \mid x\in X\}$,

where $h_H(x)$ denotes a set of values in [0, 1], i.e. the possible membership degrees of the element $x\in X$ to the set H. For convenience, we denote a hesitant fuzzy number (HFN) by $h=h_H(x)$ and the set of all HFNs by HFNS.

Definition 4 ([22]): Let $h$, $h_1$, and $h_2$ be three HFNs. Then, some basic operations on HFNs can be defined as follows:

  1. $h^c=\bigcup_{\delta\in h}\{1-\delta\}$.

  2. $h_1\cup h_2=\bigcup_{\delta_1\in h_1,\,\delta_2\in h_2}\max\{\delta_1,\delta_2\}$.

  3. $h_1\cap h_2=\bigcup_{\delta_1\in h_1,\,\delta_2\in h_2}\min\{\delta_1,\delta_2\}$.
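The three HFN operations above act elementwise over all combinations of values. A minimal Python sketch, representing an HFN as a set of floats (the function names are illustrative, and rounding guards against floating-point noise):

```python
def hfn_complement(h):
    """h^c: complement each possible membership degree."""
    return {round(1 - d, 10) for d in h}

def hfn_union(h1, h2):
    """h1 ∪ h2: pairwise maxima over all value combinations."""
    return {round(max(d1, d2), 10) for d1 in h1 for d2 in h2}

def hfn_intersection(h1, h2):
    """h1 ∩ h2: pairwise minima over all value combinations."""
    return {round(min(d1, d2), 10) for d1 in h1 for d2 in h2}
```

Note that the pairwise max/min can collapse duplicates, so the result may contain fewer values than the product of the input sizes.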

Definition 5 ([20]): Let X be a fixed set. Then, an IHFS IH in X can be defined as follows:

(3) $IH=\{\langle x,\Lambda_{IH}(x),\Gamma_{IH}(x)\rangle \mid x\in X\}$,

where $\Lambda_{IH}(x)$ and $\Gamma_{IH}(x)$ are mappings from X to [0, 1], denoting the possible membership degrees and non-membership degrees of element $x\in X$ in IH, respectively, such that for every $x\in X$: for all $h_{IH}(x)\in\Lambda_{IH}(x)$ there exists $h'_{IH}(x)\in\Gamma_{IH}(x)$ with $0\le h_{IH}(x)+h'_{IH}(x)\le 1$, and for all $h'_{IH}(x)\in\Gamma_{IH}(x)$ there exists $h_{IH}(x)\in\Lambda_{IH}(x)$ with $0\le h_{IH}(x)+h'_{IH}(x)\le 1$. If X has only one element, $\langle x,\Lambda_{IH}(x),\Gamma_{IH}(x)\rangle$ is said to be an intuitionistic HFN (IHFN) and is denoted by $\hat h=\langle\Lambda_{\hat h},\Gamma_{\hat h}\rangle$. The set of all IHFNs is denoted by IHFNS.

2.1 Intuitionistic Hesitant Fuzzy Aggregation Operators

Definition 6 ([20]): Let $\hat h_i=(\Lambda_{\hat h_i},\Gamma_{\hat h_i})$ (i=1, 2, 3,…,n) be a collection of IHFNs, and w=(w1, w2,…,wn) be the weight vector of $\hat h_i$ (i=1, 2, 3,…,n) with wi≥0 (i=1, 2, 3,…,n) such that wi∈[0, 1] and $\sum_{i=1}^n w_i=1$. Then, the aggregation result using the IHFWA operator is also an IHFN and

(4) $\mathrm{IHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)=\left\langle\bigcup_{h_{\hat h_1}\in\Lambda_{\hat h_1},\ldots,h_{\hat h_n}\in\Lambda_{\hat h_n}}\left\{1-\prod_{i=1}^n\left(1-h_{\hat h_i}\right)^{w_i}\right\},\ \bigcup_{h'_{\hat h_1}\in\Gamma_{\hat h_1},\ldots,h'_{\hat h_n}\in\Gamma_{\hat h_n}}\left\{\prod_{i=1}^n\left(h'_{\hat h_i}\right)^{w_i}\right\}\right\rangle.$

Definition 7 ([20]): Let $\hat h_i=(\Lambda_{\hat h_i},\Gamma_{\hat h_i})$ (i=1, 2, 3,…,n) be a collection of IHFNs, and w=(w1, w2,…,wn) be the weight vector of $\hat h_i$ (i=1, 2, 3,…,n) with wi≥0 (i=1, 2, 3,…,n), where wi∈[0, 1] and $\sum_{i=1}^n w_i=1$. Then, the aggregation result using the IHFWG operator is also an IHFN, and

(5) $\mathrm{IHFWG}(\hat h_1,\hat h_2,\ldots,\hat h_n)=\left\langle\bigcup_{h_{\hat h_1}\in\Lambda_{\hat h_1},\ldots,h_{\hat h_n}\in\Lambda_{\hat h_n}}\left\{\prod_{i=1}^n\left(h_{\hat h_i}\right)^{w_i}\right\},\ \bigcup_{h'_{\hat h_1}\in\Gamma_{\hat h_1},\ldots,h'_{\hat h_n}\in\Gamma_{\hat h_n}}\left\{1-\prod_{i=1}^n\left(1-h'_{\hat h_i}\right)^{w_i}\right\}\right\rangle.$


2.2 Pythagorean Hesitant Fuzzy Sets

In Ref. [9], Khan et al. generalized the concept of IHFS and introduced the concept of PHFSs. PHFS is defined by Definition 8.

Definition 8 ([9]): Let X be a fixed set. A PHFS PH in X is an object with the following notion:

(6) $PH=\{\langle x,\Lambda_{PH}(x),\Gamma_{PH}(x)\rangle \mid x\in X\}$,

where $\Lambda_{PH}(x)$ and $\Gamma_{PH}(x)$ are mappings from X to [0, 1], denoting the possible membership degrees and non-membership degrees of element $x\in X$ in PH, respectively, such that for each element $x\in X$: for all $h_{PH}(x)\in\Lambda_{PH}(x)$ there exists $h'_{PH}(x)\in\Gamma_{PH}(x)$ with $0\le h^2_{PH}(x)+h'^2_{PH}(x)\le 1$, and for all $h'_{PH}(x)\in\Gamma_{PH}(x)$ there exists $h_{PH}(x)\in\Lambda_{PH}(x)$ with $0\le h^2_{PH}(x)+h'^2_{PH}(x)\le 1$. Moreover, PHFS(X) denotes the set of all PHFSs on X. If X has only one element, $\langle x,\Lambda_{PH}(x),\Gamma_{PH}(x)\rangle$ is said to be a PHFN and is denoted by $\hat h=\langle\Lambda_{\hat h},\Gamma_{\hat h}\rangle$ for convenience. We denote the set of all PHFNs by PHFNS. If, for all $x\in X$, $\Lambda_{PH}(x)$ and $\Gamma_{PH}(x)$ each have only one element, then the PHFS becomes a PFS; if the non-membership degree is {0}, then the PHFS becomes an HFS. For any PHFS $PH=\{\langle x,\Lambda_{PH}(x),\Gamma_{PH}(x)\rangle \mid x\in X\}$ and for all $x\in X$, $\Pi_{PH}(x)=\bigcup_{h_{PH}(x)\in\Lambda_{PH}(x),\,h'_{PH}(x)\in\Gamma_{PH}(x)}\left\{\sqrt{1-h^2_{PH}(x)-h'^2_{PH}(x)}\right\}$ is said to be the degree of indeterminacy of x to PH, where $1-h^2_{PH}(x)-h'^2_{PH}(x)\ge 0$.
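The defining constraint of a PHFN can be checked programmatically. A minimal Python sketch, representing a PHFN as a pair of sets (Λ, Γ); note that it conservatively requires the square-sum condition for every membership/non-membership pair, which is one reading of the constraint (the function name is illustrative):

```python
def is_valid_phfn(h):
    """Check 0 <= h^2 + h'^2 <= 1 for every membership/non-membership pair."""
    mu, nu = h
    return all(0.0 <= a * a + b * b <= 1.0 for a in mu for b in nu)
```

For example, ⟨{0.6, 0.8}, {0.5}⟩ is admissible (0.8² + 0.5² = 0.89 ≤ 1), whereas ⟨{0.9}, {0.6}⟩ is not (0.81 + 0.36 = 1.17 > 1), even though 0.9 + 0.6 failing the intuitionistic sum condition is precisely the situation PHFSs relax.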

Definition 9 ([9]): Let $\hat h=\langle\Lambda_{\hat h},\Gamma_{\hat h}\rangle$, $\hat h_1=\langle\Lambda_{\hat h_1},\Gamma_{\hat h_1}\rangle$, and $\hat h_2=\langle\Lambda_{\hat h_2},\Gamma_{\hat h_2}\rangle$ be three PHFNs, and λ>0. Then, their operations are defined as follows; the result of each operation is again a PHFN.

  1. $\hat h_1\cup\hat h_2=\langle\max\{\Lambda_{\hat h_1},\Lambda_{\hat h_2}\},\ \min\{\Gamma_{\hat h_1},\Gamma_{\hat h_2}\}\rangle$.

  2. $\hat h_1\cap\hat h_2=\langle\min\{\Lambda_{\hat h_1},\Lambda_{\hat h_2}\},\ \max\{\Gamma_{\hat h_1},\Gamma_{\hat h_2}\}\rangle$.

  3. $\hat h^c=\langle\Gamma_{\hat h},\Lambda_{\hat h}\rangle$.

  4. $\hat h_1\oplus\hat h_2=\left\langle\bigcup_{h_{\hat h_1}\in\Lambda_{\hat h_1},h_{\hat h_2}\in\Lambda_{\hat h_2}}\left\{\sqrt{h_{\hat h_1}^2+h_{\hat h_2}^2-h_{\hat h_1}^2h_{\hat h_2}^2}\right\},\ \bigcup_{h'_{\hat h_1}\in\Gamma_{\hat h_1},h'_{\hat h_2}\in\Gamma_{\hat h_2}}\left\{h'_{\hat h_1}h'_{\hat h_2}\right\}\right\rangle$.

  5. $\hat h_1\otimes\hat h_2=\left\langle\bigcup_{h_{\hat h_1}\in\Lambda_{\hat h_1},h_{\hat h_2}\in\Lambda_{\hat h_2}}\left\{h_{\hat h_1}h_{\hat h_2}\right\},\ \bigcup_{h'_{\hat h_1}\in\Gamma_{\hat h_1},h'_{\hat h_2}\in\Gamma_{\hat h_2}}\left\{\sqrt{h'^2_{\hat h_1}+h'^2_{\hat h_2}-h'^2_{\hat h_1}h'^2_{\hat h_2}}\right\}\right\rangle$.

  6. $\lambda\hat h=\left\langle\bigcup_{h_{\hat h}\in\Lambda_{\hat h}}\left\{\sqrt{1-\left(1-h_{\hat h}^2\right)^\lambda}\right\},\ \bigcup_{h'_{\hat h}\in\Gamma_{\hat h}}\left\{\left(h'_{\hat h}\right)^\lambda\right\}\right\rangle$, λ>0.

  7. $\hat h^\lambda=\left\langle\bigcup_{h_{\hat h}\in\Lambda_{\hat h}}\left\{h_{\hat h}^\lambda\right\},\ \bigcup_{h'_{\hat h}\in\Gamma_{\hat h}}\left\{\sqrt{1-\left(1-h'^2_{\hat h}\right)^\lambda}\right\}\right\rangle$, λ>0.
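Operations 4 and 6 of Definition 9 can be sketched in Python, representing a PHFN as a pair of sets (Λ, Γ); the function names are illustrative, and results are rounded to four decimals to keep the value sets stable under floating-point noise:

```python
import math

def phfn_add(h1, h2):
    """h1 ⊕ h2 from Definition 9(4): probabilistic-sum on squared memberships,
    product on non-memberships."""
    mu1, nu1 = h1
    mu2, nu2 = h2
    mu = {round(math.sqrt(a*a + b*b - a*a*b*b), 4) for a in mu1 for b in mu2}
    nu = {round(a * b, 4) for a in nu1 for b in nu2}
    return mu, nu

def phfn_scalar(lmbda, h):
    """λh from Definition 9(6)."""
    mu, nu = h
    mu_out = {round(math.sqrt(1 - (1 - a*a) ** lmbda), 4) for a in mu}
    nu_out = {round(a ** lmbda, 4) for a in nu}
    return mu_out, nu_out
```

As a sanity check, ⟨{0}, {1}⟩ behaves as the identity for ⊕, and λ = 1 leaves a PHFN unchanged.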

To compare two PHFNs, the score function, accuracy function, and some basic comparison laws based on the score function are defined in Ref. [9] as follows.

Definition 10 ([9]): Let $\hat h=\langle\Lambda_{\hat h},\Gamma_{\hat h}\rangle$ be a PHFN. Then, we define the score function of $\hat h$ as follows:

(7) $S(\hat h)=\left(\frac{1}{l_{\Lambda_{\hat h}}}\sum_{h_{\hat h}\in\Lambda_{\hat h}}h_{\hat h}\right)^2-\left(\frac{1}{l_{\Gamma_{\hat h}}}\sum_{h'_{\hat h}\in\Gamma_{\hat h}}h'_{\hat h}\right)^2,$

where $S(\hat h)\in[-1,1]$, $l_{\Lambda_{\hat h}}$ denotes the number of elements in $\Lambda_{\hat h}$, and $l_{\Gamma_{\hat h}}$ denotes the number of elements in $\Gamma_{\hat h}$.

Definition 11 ([9]): Let $\hat h=\langle\Lambda_{\hat h},\Gamma_{\hat h}\rangle$ be a PHFN. Then, the accuracy degree of $\hat h$ is denoted by $\bar\sigma(\hat h)$ and can be defined as follows:

(8) $\bar\sigma(\hat h)=\sqrt{\left(\frac{1}{l_{\Lambda_{\hat h}}}\sum_{h_{\hat h}\in\Lambda_{\hat h}}\left(h_{\hat h}-S(\hat h)\right)\right)^2+\left(\frac{1}{l_{\Gamma_{\hat h}}}\sum_{h'_{\hat h}\in\Gamma_{\hat h}}\left(h'_{\hat h}-S(\hat h)\right)\right)^2}.$

Here, $S(\hat h)$ plays the role of a mean value, and $\bar\sigma(\hat h)$ plays the role of a deviation degree, reflecting how far the values in the PHFN $\hat h$ spread around their mean value. Let $\hat h_1$ and $\hat h_2$ be two PHFNs, $S(\hat h_1)$ and $S(\hat h_2)$ be their scores, and $\bar\sigma(\hat h_1)$ and $\bar\sigma(\hat h_2)$ be their deviation degrees. Then

  1. If S(h^1)<S(h^2), then h^1<h^2.

  2. If S(h^1)>S(h^2), then h^1>h^2.

  3. If $S(\hat h_1)=S(\hat h_2)$, then

    1. If σ¯(h^1)<σ¯(h^2), then h^1<h^2.

    2. If σ¯(h^1)>σ¯(h^2), then h^1>h^2.

    3. If $\bar\sigma(\hat h_1)=\bar\sigma(\hat h_2)$, then $\hat h_1$ and $\hat h_2$ are indifferent, denoted by $\hat h_1\sim\hat h_2$.
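Equations (7) and (8) translate directly into code. A hedged Python sketch, with a PHFN as a pair of collections (Λ, Γ) and illustrative function names; the square root in the deviation degree follows the reconstruction of Eq. (8) above:

```python
import math

def phfn_score(h):
    """Eq. (7): squared mean membership minus squared mean non-membership."""
    mu, nu = h
    return (sum(mu) / len(mu)) ** 2 - (sum(nu) / len(nu)) ** 2

def phfn_deviation(h):
    """Eq. (8): deviation of the values around the score S(h)."""
    mu, nu = h
    s = phfn_score(h)
    a = sum(x - s for x in mu) / len(mu)
    b = sum(x - s for x in nu) / len(nu)
    return math.sqrt(a * a + b * b)
```

For ⟨{0.6}, {0.3}⟩ the score is 0.36 − 0.09 = 0.27, and the deviation is √((0.6 − 0.27)² + (0.3 − 0.27)²).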

3 Pythagorean Hesitant Fuzzy Information Aggregation Operators

In this section, we develop some aggregation operators for PHFNs and investigate some of their properties.

3.1 PHFWA/Geometric Operator

Definition 12 ([9]): Let $\hat h_i=\langle\Lambda_{\hat h_i},\Gamma_{\hat h_i}\rangle$ (i=1, 2, 3,…,n) be a collection of PHFNs, and w=(w1, w2,…,wn)T be the weight vector of $\hat h_i$ (i=1, 2, 3,…,n) with wi≥0 (i=1, 2, 3,…,n), where wi∈[0, 1] and $\sum_{i=1}^n w_i=1$. Then, the PHFWA operator is a mapping $\mathrm{PHFWA}:\mathrm{PHFN}^n\to\mathrm{PHFN}$ defined as

(9) $\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)=w_1\hat h_1\oplus w_2\hat h_2\oplus\cdots\oplus w_n\hat h_n$,

and PHFWA is called the Pythagorean hesitant fuzzy weighted averaging operator.

Theorem 1 ([9]): Let h^i=Λh^i,Γh^i(i=1,2,3,,n) be a collection of all PHFNs, and w=(w1, w2,…,wn)T be the weight vector of h^i(i=1,2,3,,n) with wi≥0 (i=1, 2, 3,…,n), where wi∈[0, 1] and i=1nwi=1. Then, the aggregation result using PHFWA operator is also a PHFN and

(10) $\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)=\left\langle\bigcup_{h_{\hat h_1}\in\Lambda_{\hat h_1},\ldots,h_{\hat h_n}\in\Lambda_{\hat h_n}}\left\{\sqrt{1-\prod_{i=1}^n\left(1-h^2_{\hat h_i}\right)^{w_i}}\right\},\ \bigcup_{h'_{\hat h_1}\in\Gamma_{\hat h_1},\ldots,h'_{\hat h_n}\in\Gamma_{\hat h_n}}\left\{\prod_{i=1}^n\left(h'_{\hat h_i}\right)^{w_i}\right\}\right\rangle.$

Proof. Proof of the theorem follows from Theorem 4.2 in Ref. [9].□
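The closed form of Theorem 1 can be sketched directly. A minimal Python implementation (illustrative names; a PHFN is a pair of sets; results rounded to four decimals as in the paper's numerical examples):

```python
import math
from itertools import product

def phfwa(phfns, weights):
    """PHFWA via the closed form of Theorem 1 (Eq. 10)."""
    mus = [sorted(h[0]) for h in phfns]
    nus = [sorted(h[1]) for h in phfns]
    # Membership part: sqrt(1 - prod (1 - h_i^2)^{w_i}) over all combinations.
    mu_out = {round(math.sqrt(1 - math.prod((1 - x*x) ** w
                                            for x, w in zip(combo, weights))), 4)
              for combo in product(*mus)}
    # Non-membership part: prod (h'_i)^{w_i} over all combinations.
    nu_out = {round(math.prod(x ** w for x, w in zip(combo, weights)), 4)
              for combo in product(*nus)}
    return mu_out, nu_out
```

The idempotency property of Theorem 2 gives a quick check: aggregating identical single-valued PHFNs returns the same PHFN.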

In the following, we present some properties of the PHFWA operator.

Theorem 2: Let h^i=Λh^i,Γh^i(i=1,2,3,,n) be a collection of all PHFNs, and w=(w1, w2,…,wn)T be the weight vector of h^i(i=1,2,3,,n) with wi≥0 (i=1, 2, 3,…,n) where wi∈[0, 1] and i=1nwi=1. Then

  1. (Idempotency) If all h^i=Λh^i,Γh^i(i=1,2,3,,n) are equal, i.e. h^i(i=1,2,3,,n)=h^, then

    (11) PHFWA(h^1,h^2,,h^n)=h^.
  2. (Boundedness)

    (12) $\hat h^-\le \mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le \hat h^+$,

    where $\hat h^-=\langle h^-,h'^+\rangle$, $\hat h^+=\langle h^+,h'^-\rangle$, $h^-=\bigcup_{h_i\in\Lambda_{\hat h_i}}\min_i\{h_i\}$,

    $h^+=\bigcup_{h_i\in\Lambda_{\hat h_i}}\max_i\{h_i\}$, $h'^-=\bigcup_{h'_i\in\Gamma_{\hat h_i}}\min_i\{h'_i\}$, $h'^+=\bigcup_{h'_i\in\Gamma_{\hat h_i}}\max_i\{h'_i\}$.

  3. (Monotonicity) If $\hat h_i\le\hat h_i^*$ for all $i$, then

    (13) $\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le \mathrm{PHFWA}(\hat h_1^*,\hat h_2^*,\ldots,\hat h_n^*)$.

Proof. (1) By Theorem 1, if $\hat h_i=\hat h$ for all $i$, we have

$$\begin{aligned}
\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)
&=\left\langle\bigcup_{h_{\hat h_i}\in\Lambda_{\hat h_i}}\left\{\sqrt{1-\prod_{i=1}^n\left(1-h^2_{\hat h_i}\right)^{w_i}}\right\},\ \bigcup_{h'_{\hat h_i}\in\Gamma_{\hat h_i}}\left\{\prod_{i=1}^n\left(h'_{\hat h_i}\right)^{w_i}\right\}\right\rangle\\
&=\left\langle\bigcup_{h_{\hat h}\in\Lambda_{\hat h}}\left\{\sqrt{1-\left(1-h^2_{\hat h}\right)^{\sum_{i=1}^n w_i}}\right\},\ \bigcup_{h'_{\hat h}\in\Gamma_{\hat h}}\left\{\left(h'_{\hat h}\right)^{\sum_{i=1}^n w_i}\right\}\right\rangle\\
&=\left\langle\bigcup_{h_{\hat h}\in\Lambda_{\hat h}}\left\{\sqrt{1-\left(1-h^2_{\hat h}\right)}\right\},\ \bigcup_{h'_{\hat h}\in\Gamma_{\hat h}}\left\{h'_{\hat h}\right\}\right\rangle
=\left\langle\bigcup_{h_{\hat h}\in\Lambda_{\hat h}}\left\{h_{\hat h}\right\},\ \bigcup_{h'_{\hat h}\in\Gamma_{\hat h}}\left\{h'_{\hat h}\right\}\right\rangle=\hat h.
\end{aligned}$$

(2) For the membership and non-membership values, we have

(14) $\min_i\{h_i\}\le h_i\le\max_i\{h_i\}$ for every $h_i\in\Lambda_{\hat h_i}$,

(15) and $\min_i\{h'_i\}\le h'_i\le\max_i\{h'_i\}$ for every $h'_i\in\Gamma_{\hat h_i}$.

Thus, from Eq. (14),

$$\begin{aligned}
\min_i\{h_i^2\}\le h_i^2\le\max_i\{h_i^2\}
&\Rightarrow 1-\max_i\{h_i^2\}\le 1-h_i^2\le 1-\min_i\{h_i^2\}\\
&\Rightarrow \left(1-\max_i\{h_i^2\}\right)^{w_i}\le\left(1-h_i^2\right)^{w_i}\le\left(1-\min_i\{h_i^2\}\right)^{w_i}\\
&\Rightarrow \left(1-\max_i\{h_i^2\}\right)^{\sum_{i=1}^n w_i}\le\prod_{i=1}^n\left(1-h_i^2\right)^{w_i}\le\left(1-\min_i\{h_i^2\}\right)^{\sum_{i=1}^n w_i}\\
&\Rightarrow \min_i\{h_i^2\}\le 1-\prod_{i=1}^n\left(1-h_i^2\right)^{w_i}\le\max_i\{h_i^2\}\\
&\Rightarrow \min_i\{h_i\}\le\sqrt{1-\prod_{i=1}^n\left(1-h_i^2\right)^{w_i}}\le\max_i\{h_i\}.
\end{aligned}$$

Now, from Eq. (15),

$$\min_i\{h'_i\}=\prod_{i=1}^n\left(\min_i\{h'_i\}\right)^{w_i}\le\prod_{i=1}^n\left(h'_i\right)^{w_i}\le\prod_{i=1}^n\left(\max_i\{h'_i\}\right)^{w_i}=\max_i\{h'_i\}.$$

According to the score function, we have $\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)\ge\hat h^-$, with equality if and only if $\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)$ is the same as $\hat h^-$. Similarly, $\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le\hat h^+$, with equality if and only if $\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)$ is the same as $\hat h^+$. Hence, $\hat h^-\le\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le\hat h^+$.

(3) Suppose $\hat h_i\le\hat h_i^*$ for all $i$, i.e. $\Lambda_{\hat h_i}\le\Lambda_{\hat h_i^*}$ and $\Gamma_{\hat h_i}\ge\Gamma_{\hat h_i^*}$. If $\Lambda_{\hat h_i}\le\Lambda_{\hat h_i^*}$, then for all $h_i\in\Lambda_{\hat h_i}$ and $h_i^*\in\Lambda_{\hat h_i^*}$,

(16) $$h_i\le h_i^*\ \Rightarrow\ h_i^2\le (h_i^*)^2\ \Rightarrow\ 1-(h_i^*)^2\le 1-h_i^2\ \Rightarrow\ \prod_{i=1}^n\left(1-(h_i^*)^2\right)^{w_i}\le\prod_{i=1}^n\left(1-h_i^2\right)^{w_i}\ \Rightarrow\ \sqrt{1-\prod_{i=1}^n\left(1-h_i^2\right)^{w_i}}\le\sqrt{1-\prod_{i=1}^n\left(1-(h_i^*)^2\right)^{w_i}}.$$

Now, if $\Gamma_{\hat h_i}\ge\Gamma_{\hat h_i^*}$, then $h'_i\ge h'^*_i$ and

(17) $$\prod_{i=1}^n\left(h'^*_i\right)^{w_i}\le\prod_{i=1}^n\left(h'_i\right)^{w_i}.$$

Let $\hat h=\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)$ and $\hat h^*=\mathrm{PHFWA}(\hat h_1^*,\hat h_2^*,\ldots,\hat h_n^*)$. Then, from Eqs. (16) and (17), we have $S(\hat h)\le S(\hat h^*)$. If $S(\hat h)<S(\hat h^*)$, then $\mathrm{PHFWA}(\hat h_1,\ldots,\hat h_n)<\mathrm{PHFWA}(\hat h_1^*,\ldots,\hat h_n^*)$. If $S(\hat h)=S(\hat h^*)$, then

$$\left(\frac{1}{l_{\Lambda_{\hat h}}}\sum_{h\in\Lambda_{\hat h}}h\right)^2-\left(\frac{1}{l_{\Gamma_{\hat h}}}\sum_{h'\in\Gamma_{\hat h}}h'\right)^2=\left(\frac{1}{l_{\Lambda_{\hat h^*}}}\sum_{h\in\Lambda_{\hat h^*}}h\right)^2-\left(\frac{1}{l_{\Gamma_{\hat h^*}}}\sum_{h'\in\Gamma_{\hat h^*}}h'\right)^2,$$

which, together with Eqs. (16) and (17), forces

$$\frac{1}{l_{\Lambda_{\hat h}}}\sum_{h\in\Lambda_{\hat h}}h=\frac{1}{l_{\Lambda_{\hat h^*}}}\sum_{h\in\Lambda_{\hat h^*}}h\quad\text{and}\quad\frac{1}{l_{\Gamma_{\hat h}}}\sum_{h'\in\Gamma_{\hat h}}h'=\frac{1}{l_{\Gamma_{\hat h^*}}}\sum_{h'\in\Gamma_{\hat h^*}}h'.$$

Consequently,

$$\bar\sigma(\hat h)=\sqrt{\left(\frac{1}{l_{\Lambda_{\hat h}}}\sum_{h\in\Lambda_{\hat h}}\left(h-S(\hat h)\right)\right)^2+\left(\frac{1}{l_{\Gamma_{\hat h}}}\sum_{h'\in\Gamma_{\hat h}}\left(h'-S(\hat h)\right)\right)^2}=\bar\sigma(\hat h^*),$$

and therefore $\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)=\mathrm{PHFWA}(\hat h_1^*,\hat h_2^*,\ldots,\hat h_n^*)$ in this case. Hence, $\mathrm{PHFWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le\mathrm{PHFWA}(\hat h_1^*,\hat h_2^*,\ldots,\hat h_n^*)$.□

Definition 13 ([9]): Let $\hat h_i=\langle\Lambda_{\hat h_i},\Gamma_{\hat h_i}\rangle$ (i=1, 2, 3,…,n) be a collection of PHFNs, and w=(w1, w2,…,wn) be the weight vector of $\hat h_i$ (i=1, 2, 3,…,n) with wi≥0 (i=1, 2, 3,…,n) such that wi∈[0, 1] and $\sum_{i=1}^n w_i=1$. Then, the PHFWG operator is a mapping $\mathrm{PHFWG}:\mathrm{PHFN}^n\to\mathrm{PHFN}$ defined as

(18) $\mathrm{PHFWG}(\hat h_1,\hat h_2,\ldots,\hat h_n)=\hat h_1^{w_1}\otimes\hat h_2^{w_2}\otimes\cdots\otimes\hat h_n^{w_n}$,

and PHFWG is called the Pythagorean hesitant fuzzy weighted geometric operator.

Theorem 3 ([9]): Let h^i=Λh^i,Γh^i(i=1,2,3,,n) be a collection of all PHFNs, and w=(w1, w2,…,wn)T be the weight vector of h^i(i=1,2,3,,n) with wi≥0 (i=1, 2, 3,…,n), where wi∈[0, 1] and i=1nwi=1. Then, the aggregation result using the PHFWG operator is also a PHFN, and

(19) $\mathrm{PHFWG}(\hat h_1,\hat h_2,\ldots,\hat h_n)=\left\langle\bigcup_{h_{\hat h_1}\in\Lambda_{\hat h_1},\ldots,h_{\hat h_n}\in\Lambda_{\hat h_n}}\left\{\prod_{i=1}^n\left(h_{\hat h_i}\right)^{w_i}\right\},\ \bigcup_{h'_{\hat h_1}\in\Gamma_{\hat h_1},\ldots,h'_{\hat h_n}\in\Gamma_{\hat h_n}}\left\{\sqrt{1-\prod_{i=1}^n\left(1-h'^2_{\hat h_i}\right)^{w_i}}\right\}\right\rangle.$

Proof. Proof of the theorem follows from Theorem 4.5 in Ref. [9].□
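The PHFWG closed form is the dual of PHFWA: the product goes on the membership side and the square-root expression on the non-membership side. A hedged Python sketch under the same PHFN-as-pair-of-sets representation (illustrative names; four-decimal rounding):

```python
import math
from itertools import product

def phfwg(phfns, weights):
    """PHFWG via the closed form of Theorem 3 (Eq. 19)."""
    mus = [sorted(h[0]) for h in phfns]
    nus = [sorted(h[1]) for h in phfns]
    # Membership part: prod (h_i)^{w_i} over all combinations.
    mu_out = {round(math.prod(x ** w for x, w in zip(combo, weights)), 4)
              for combo in product(*mus)}
    # Non-membership part: sqrt(1 - prod (1 - h'_i^2)^{w_i}).
    nu_out = {round(math.sqrt(1 - math.prod((1 - x*x) ** w
                                            for x, w in zip(combo, weights))), 4)
              for combo in product(*nus)}
    return mu_out, nu_out
```

As with PHFWA, idempotency offers a quick sanity check.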

In the following, we present some properties of the PHFWG operator.

Theorem 4: Let h^i=Λh^i,Γh^i(i=1,2,3,,n) be a collection of all PHFNs, and w=(w1, w2,…,wn)T be the weight vector of h^i(i=1,2,3,,n) with wi≥0 (i=1, 2, 3,…,n) where wi∈[0, 1] and i=1nwi=1. Then

  1. (Idempotency) If all h^i=Λh^i,Γh^i(i=1,2,3,,n) are equal, i.e. h^i(i=1,2,3,,n)=h^, then

    (20) PHFWG(h^1,h^2,,h^n)=h^.
  2. (Boundedness)

    (21) $\hat h^-\le \mathrm{PHFWG}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le \hat h^+$,

    where $\hat h^-=\langle h^-,h'^+\rangle$, $\hat h^+=\langle h^+,h'^-\rangle$, $h^-=\bigcup_{h_i\in\Lambda_{\hat h_i}}\min_i\{h_i\}$,

    $h^+=\bigcup_{h_i\in\Lambda_{\hat h_i}}\max_i\{h_i\}$, $h'^-=\bigcup_{h'_i\in\Gamma_{\hat h_i}}\min_i\{h'_i\}$, $h'^+=\bigcup_{h'_i\in\Gamma_{\hat h_i}}\max_i\{h'_i\}$.

  3. (Monotonicity) If $\hat h_i\le\hat h_i^*$ for all $i$, then

    (22) $\mathrm{PHFWG}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le \mathrm{PHFWG}(\hat h_1^*,\hat h_2^*,\ldots,\hat h_n^*)$.

Proof. Proof of the theorem follows from Theorem 2.□

3.2 PHFOWA/Geometric Operator

In the following, we develop a PHFOWA operator and a PHFOWG operator. We also discuss some properties of the developed operators.

Definition 14: Let $\hat h_i=\langle\Lambda_{\hat h_i},\Gamma_{\hat h_i}\rangle$ (i=1, 2, 3,…,n) be a collection of PHFNs, σ be a permutation of (1, 2,…,n) such that $\hat h_{\sigma(i)}$ is the i-th largest of them, and w=(w1, w2,…,wn)T be the associated weight vector with wi≥0 (i=1, 2, 3,…,n) such that wi∈[0, 1] and $\sum_{i=1}^n w_i=1$. Then, the PHFOWA operator is a mapping $\mathrm{PHFOWA}:\mathrm{PHFN}^n\to\mathrm{PHFN}$ defined by

(23) $\mathrm{PHFOWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)=w_1\hat h_{\sigma(1)}\oplus w_2\hat h_{\sigma(2)}\oplus\cdots\oplus w_n\hat h_{\sigma(n)}$.

Theorem 5: Let $\hat h_i=\langle\Lambda_{\hat h_i},\Gamma_{\hat h_i}\rangle$ (i=1, 2, 3,…,n) be a collection of PHFNs, σ be a permutation of (1, 2,…,n) such that $\hat h_{\sigma(i)}$ is the i-th largest of them, and w=(w1, w2,…,wn) be the weight vector with wi≥0 (i=1, 2, 3,…,n) such that wi∈[0, 1] and $\sum_{i=1}^n w_i=1$. Then, the aggregation result using the PHFOWA operator is also a PHFN and

(24) $\mathrm{PHFOWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)=\left\langle\bigcup_{h_{\hat h_{\sigma(1)}}\in\Lambda_{\hat h_{\sigma(1)}},\ldots,h_{\hat h_{\sigma(n)}}\in\Lambda_{\hat h_{\sigma(n)}}}\left\{\sqrt{1-\prod_{i=1}^n\left(1-h^2_{\hat h_{\sigma(i)}}\right)^{w_i}}\right\},\ \bigcup_{h'_{\hat h_{\sigma(1)}}\in\Gamma_{\hat h_{\sigma(1)}},\ldots,h'_{\hat h_{\sigma(n)}}\in\Gamma_{\hat h_{\sigma(n)}}}\left\{\prod_{i=1}^n\left(h'_{\hat h_{\sigma(i)}}\right)^{w_i}\right\}\right\rangle.$

Proof. Proof of the theorem is the same as Theorem 1.□
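PHFOWA differs from PHFWA only in that the arguments are first reordered by decreasing score before the weights are applied. A hedged Python sketch (illustrative names; ties broken by score only, ignoring the accuracy-degree refinement of Definition 11 for brevity):

```python
import math
from itertools import product

def phfn_score(h):
    """Score function of Eq. (7)."""
    mu, nu = h
    return (sum(mu) / len(mu)) ** 2 - (sum(nu) / len(nu)) ** 2

def phfowa(phfns, weights):
    """Sort arguments by decreasing score, then apply the closed form of Eq. (24)."""
    ordered = sorted(phfns, key=phfn_score, reverse=True)
    mus = [sorted(h[0]) for h in ordered]
    nus = [sorted(h[1]) for h in ordered]
    mu_out = {round(math.sqrt(1 - math.prod((1 - x*x) ** w
                                            for x, w in zip(c, weights))), 4)
              for c in product(*mus)}
    nu_out = {round(math.prod(x ** w for x, w in zip(c, weights)), 4)
              for c in product(*nus)}
    return mu_out, nu_out
```

With weight vector (1, 0), the result is exactly the highest-scoring argument, which makes the ordering step easy to verify.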

Example 1: Suppose that three experts are invited to evaluate some decision alternatives. Their evaluations are expressed by the PHFNs $\hat h_1=\langle\{0.5,0.6,0.9\},\{0.3,0.6\}\rangle$, $\hat h_2=\langle\{0.3,0.6,0.8\},\{0.2,0.7,0.9\}\rangle$, and $\hat h_3=\langle\{0.3,0.4,0.7,0.9\},\{0.2,0.8,0.9\}\rangle$, respectively, with w=(0.35, 0.4, 0.25)T as the weight vector of $\hat h_i$ (i=1, 2, 3). To calculate the comprehensive evaluation of the three experts on the decision alternative using the PHFOWA operator, we have

$$\mathrm{PHFOWA}(\hat h_1,\hat h_2,\hat h_3)=\left\langle\bigcup_{h_{\hat h_{\sigma(i)}}\in\Lambda_{\hat h_{\sigma(i)}}}\left\{\sqrt{1-\prod_{i=1}^{3}\left(1-h^2_{\hat h_{\sigma(i)}}\right)^{w_i}}\right\},\ \bigcup_{h'_{\hat h_{\sigma(i)}}\in\Gamma_{\hat h_{\sigma(i)}}}\left\{\prod_{i=1}^{3}\left(h'_{\hat h_{\sigma(i)}}\right)^{w_i}\right\}\right\rangle.$$

First, we calculate the score functions of h^1,h^2 and h^3. For this, we have

S(h^1)=0.928, S(h^2)=0.117, S(h^3)=0.1192.

Thus, S(h^1)>S(h^3)>S(h^2). Now

PHFOWA(h^1,h^2,h^3)={0.39,0.47,0.57,0.42,0.50,0.59,0.57,0.62,0.68,0.74,0.76,0.800.44,0.51,0.60,0.47,0.54,0.62,0.60,0.65,0.70,0.76,0.78,0.81,0.69,0.72,0.76,0.70,0.73,0.77,0.82,0.85,0.86,0.88,0.79,0.76},{0.23,0.32,0.34,0.40,0.55,0.59,0.42,0.58,0.61,0.29,0.43,0.51,0.70,0.75,0.54,0.73,0.78}.

This is the required degree of satisfaction and degree of dissatisfaction of the three experts on the decision alternative.

Theorem 6: Let h^i=Λh^i,Γh^i(i=1,2,3,,n) be a collection of all PHFNs, and w=(w1, w2,…,wn)T be the weight vector of h^i(i=1,2,3,,n) with wi≥0 (i=1, 2, 3,…,n), where wi∈[0, 1]and i=1nwi=1. Then

  1. (Idempotency) If all h^i=Λh^i,Γh^i(i=1,2,3,,n) are equal, i.e. h^i(i=1,2,3,,n)=h^, then

    (25) PHFOWA(h^1,h^2,,h^n)=h^.
  2. (Boundedness)

    (26) $\hat h^-\le \mathrm{PHFOWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le \hat h^+$,

    where $\hat h^-=\langle h^-,h'^+\rangle$, $\hat h^+=\langle h^+,h'^-\rangle$, $h^-=\bigcup_{h_i\in\Lambda_{\hat h_i}}\min_i\{h_i\}$,

    $h^+=\bigcup_{h_i\in\Lambda_{\hat h_i}}\max_i\{h_i\}$, $h'^-=\bigcup_{h'_i\in\Gamma_{\hat h_i}}\min_i\{h'_i\}$, $h'^+=\bigcup_{h'_i\in\Gamma_{\hat h_i}}\max_i\{h'_i\}$.

  3. (Monotonicity) If $\hat h_i\le\hat h_i^*$ for all $i$, then

    (27) $\mathrm{PHFOWA}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le \mathrm{PHFOWA}(\hat h_1^*,\hat h_2^*,\ldots,\hat h_n^*)$.

Proof. Proof of the theorem follows from Theorem 2.□

Definition 15: Let $\hat h_i=\langle\Lambda_{\hat h_i},\Gamma_{\hat h_i}\rangle$ (i=1, 2, 3,…,n) be a collection of PHFNs, σ be a permutation of (1, 2,…,n) such that $\hat h_{\sigma(i)}$ is the i-th largest of them, and w=(w1, w2,…,wn) be the weight vector with wi≥0 (i=1, 2, 3,…,n) such that wi∈[0, 1] and $\sum_{i=1}^n w_i=1$. Then, the PHFOWG operator is a mapping $\mathrm{PHFOWG}:\mathrm{PHFN}^n\to\mathrm{PHFN}$ defined by

(28) $\mathrm{PHFOWG}(\hat h_1,\hat h_2,\ldots,\hat h_n)=\hat h_{\sigma(1)}^{w_1}\otimes\hat h_{\sigma(2)}^{w_2}\otimes\cdots\otimes\hat h_{\sigma(n)}^{w_n}$.

Theorem 7: Let $\hat h_i=\langle\Lambda_{\hat h_i},\Gamma_{\hat h_i}\rangle$ (i=1, 2, 3,…,n) be a collection of PHFNs, σ be a permutation of (1, 2,…,n) such that $\hat h_{\sigma(i)}$ is the i-th largest of them, and w=(w1, w2,…,wn) be the weight vector with wi≥0 (i=1, 2, 3,…,n) such that wi∈[0, 1] and $\sum_{i=1}^n w_i=1$. Then, the aggregation result using the PHFOWG operator is also a PHFN, and

(29) $\mathrm{PHFOWG}(\hat h_1,\hat h_2,\ldots,\hat h_n)=\left\langle\bigcup_{h_{\hat h_{\sigma(1)}}\in\Lambda_{\hat h_{\sigma(1)}},\ldots,h_{\hat h_{\sigma(n)}}\in\Lambda_{\hat h_{\sigma(n)}}}\left\{\prod_{i=1}^n\left(h_{\hat h_{\sigma(i)}}\right)^{w_i}\right\},\ \bigcup_{h'_{\hat h_{\sigma(1)}}\in\Gamma_{\hat h_{\sigma(1)}},\ldots,h'_{\hat h_{\sigma(n)}}\in\Gamma_{\hat h_{\sigma(n)}}}\left\{\sqrt{1-\prod_{i=1}^n\left(1-h'^2_{\hat h_{\sigma(i)}}\right)^{w_i}}\right\}\right\rangle.$

Proof. Proof of the theorem is the same as Theorem 3.□
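PHFOWG combines the score-based reordering of PHFOWA with the geometric closed form of Theorem 3. A hedged Python sketch under the same assumptions as the earlier blocks (illustrative names; ties broken by score only):

```python
import math
from itertools import product

def phfn_score(h):
    """Score function of Eq. (7)."""
    mu, nu = h
    return (sum(mu) / len(mu)) ** 2 - (sum(nu) / len(nu)) ** 2

def phfowg(phfns, weights):
    """Sort arguments by decreasing score, then apply the closed form of Eq. (29)."""
    ordered = sorted(phfns, key=phfn_score, reverse=True)
    mus = [sorted(h[0]) for h in ordered]
    nus = [sorted(h[1]) for h in ordered]
    mu_out = {round(math.prod(x ** w for x, w in zip(c, weights)), 4)
              for c in product(*mus)}
    nu_out = {round(math.sqrt(1 - math.prod((1 - x*x) ** w
                                            for x, w in zip(c, weights))), 4)
              for c in product(*nus)}
    return mu_out, nu_out
```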

Example 2: To compute the comprehensive evaluation of the three experts of Example 1 on the decision alternative using the Pythagorean hesitant fuzzy ordered weighted geometric operator, we have

$$\mathrm{PHFOWG}(\hat h_1,\hat h_2,\hat h_3)=\left\langle\bigcup_{h_{\hat h_{\sigma(i)}}\in\Lambda_{\hat h_{\sigma(i)}}}\left\{\prod_{i=1}^{3}\left(h_{\hat h_{\sigma(i)}}\right)^{w_i}\right\},\ \bigcup_{h'_{\hat h_{\sigma(i)}}\in\Gamma_{\hat h_{\sigma(i)}}}\left\{\sqrt{1-\prod_{i=1}^{3}\left(1-h'^2_{\hat h_{\sigma(i)}}\right)^{w_i}}\right\}\right\rangle.$$

First, we calculate the score functions of h^1,h^2 and h^3. For this, we have

S(h^1)=0.928, S(h^2)=0.117, S(h^3)=0.1192.

Thus, S(h^1)>S(h^3)>S(h^2). Now

PHFOWG(h^1,h^2,,h^n)={0.36,0.43,0.46,0.40,0.48,0.51,0.50,0.60,0.64,0.56,0.66,0.71,0.38,0.49,0.55,0.54,0.65,0.69,0.59,0.76,0.44,0.52,0.63,0.62,0.74,0.79,0.68,0.81,0.81}{0.24,0.44,0.61,0.63,0.68,0.76,0.71,0.82,0.41,0.54,0.67,0.66,0.72,0.75,0.79,0.84}.

This is the required degree of satisfaction and degree of dissatisfaction of the three experts on the decision alternative.

Theorem 8: Let h^i=Λh^i,Γh^i(i=1,2,3,,n) be a collection of all PHFNs, and w=(w1, w2,,wn)T be the weight vector of h^i(i=1,2,3,,n) with wi≥0 (i=1, 2, 3,…,n), where wi∈[0, 1] and i=1nwi=1. Then

  1. (Idempotency) If all h^i=Λh^i,Γh^i(i=1,2,3,,n) are equal, i.e. h^i(i=1,2,3,,n)=h^, then

    (30) PHFOWG(h^1,h^2,,h^n)=h^.
  2. (Boundedness)

    (31) $\hat h^-\le \mathrm{PHFOWG}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le \hat h^+$,

    where $\hat h^-=\langle h^-,h'^+\rangle$, $\hat h^+=\langle h^+,h'^-\rangle$, $h^-=\bigcup_{h_i\in\Lambda_{\hat h_i}}\min_i\{h_i\}$, $h^+=\bigcup_{h_i\in\Lambda_{\hat h_i}}\max_i\{h_i\}$, $h'^-=\bigcup_{h'_i\in\Gamma_{\hat h_i}}\min_i\{h'_i\}$, $h'^+=\bigcup_{h'_i\in\Gamma_{\hat h_i}}\max_i\{h'_i\}$.

  3. (Monotonicity) If $\hat h_i\le\hat h_i^*$ for all $i$, then

    (32) $\mathrm{PHFOWG}(\hat h_1,\hat h_2,\ldots,\hat h_n)\le \mathrm{PHFOWG}(\hat h_1^*,\hat h_2^*,\ldots,\hat h_n^*)$.

Proof. Proof of the theorem follows from Theorem 2.□

4 Group Decision Making Based on Pythagorean Hesitant Fuzzy Information

In this section, we apply the Pythagorean hesitant fuzzy aggregation operators to multi-attribute decision making with anonymity. Suppose that there are n alternatives X={x1, x2,…,xn} and m attributes A={A1, A2,…,Am} to be evaluated, with weight vector w=(w1, w2,…,wm)T such that wj∈[0, 1] (j=1, 2,…,m) and $\sum_{j=1}^m w_j=1$. To evaluate the performance of the alternative xi under the attribute Aj, the decision maker is required to provide not only the degree to which the alternative xi satisfies the attribute Aj, but also the degree to which it does not satisfy Aj. These two parts of the information can be expressed by Λij and Γij, respectively, so the performance of the alternative xi under the attribute Aj can be expressed by a PHFN $\hat h_{ij}=\langle\Lambda_{ij},\Gamma_{ij}\rangle$, with the condition that for all $h_{ij}\in\Lambda_{ij}$ there exists $h'_{ij}\in\Gamma_{ij}$ such that $0\le h_{ij}^2+h'^2_{ij}\le 1$, and for all $h'_{ij}\in\Gamma_{ij}$ there exists $h_{ij}\in\Lambda_{ij}$ such that $0\le h_{ij}^2+h'^2_{ij}\le 1$ (i=1, 2,…,n; j=1, 2,…,m). To obtain the ranking of the alternatives, the following steps are given.

Step 1: Construct the Pythagorean hesitant fuzzy decision matrix $C=(\hat h_{ij})_{m\times n}$, where $\hat h_{ij}=\langle\Lambda_{ij},\Gamma_{ij}\rangle$ (i=1, 2,…,n; j=1, 2,…,m).

If the attributes are of two types, such as cost and benefit attributes, then the Pythagorean hesitant fuzzy decision matrix can be converted into the normalized Pythagorean hesitant fuzzy decision matrix $D_N=(\gamma_{ij})_{m\times n}$, where

$$\gamma_{ij}=\begin{cases}\hat h_{ij}&\text{if the attribute is of benefit type,}\\ \hat h_{ij}^c&\text{if the attribute is of cost type,}\end{cases}$$

and $\hat h_{ij}^c=\langle\Gamma_{ij},\Lambda_{ij}\rangle$ (i=1, 2,…,n; j=1, 2,…,m). If all the attributes are of the same type, there is no need to normalize the decision matrix.
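The normalization of Step 1 is simply the complement operation of Definition 9(3) applied to cost-type columns. A minimal Python sketch (illustrative names; a decision matrix is a list of rows, each entry a PHFN pair):

```python
def normalize(matrix, is_cost):
    """Swap membership and non-membership sets in cost-type columns.

    matrix : list of rows, each entry a PHFN pair (mu_set, nu_set)
    is_cost: per-column flags, True for cost-type attributes
    """
    return [[(nu, mu) if is_cost[j] else (mu, nu)
             for j, (mu, nu) in enumerate(row)]
            for row in matrix]
```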

Step 2: Utilize the developed aggregation operators to derive the collective overall preference values $\hat h_i$ (i=1, 2,…,n) of the alternatives xi, where w=(w1, w2,…,wm)T is the weight vector of the attributes.

Step 3: By using Eqs. (7) and (8), calculate the scores $S(\hat h_i)$ (i=1, 2,…,n) and, if needed, the deviation degrees $\bar\sigma(\hat h_i)$ (i=1, 2,…,n) of all the overall values $\hat h_i$.

Step 4: Rank the alternatives xi (i=1, 2,…,n) according to the scores and then select the best one.
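Steps 1–4 above can be chained into one small routine. The following is a hedged Python sketch of the whole procedure, with PHFWA chosen as the Step 2 aggregator (all names are illustrative; PHFNs are pairs of sets, as in the earlier sketches):

```python
import math
from itertools import product

def phfn_score(h):
    """Step 3: score function of Eq. (7)."""
    mu, nu = h
    return (sum(mu) / len(mu)) ** 2 - (sum(nu) / len(nu)) ** 2

def phfwa(phfns, weights):
    """Step 2: PHFWA closed form of Eq. (10)."""
    mus = [sorted(h[0]) for h in phfns]
    nus = [sorted(h[1]) for h in phfns]
    mu_out = {round(math.sqrt(1 - math.prod((1 - x*x) ** w
                                            for x, w in zip(c, weights))), 4)
              for c in product(*mus)}
    nu_out = {round(math.prod(x ** w for x, w in zip(c, weights)), 4)
              for c in product(*nus)}
    return mu_out, nu_out

def rank_alternatives(matrix, weights, is_cost):
    """Return alternative indices ranked from best to worst."""
    # Step 1: normalize cost-type attributes by the complement operation.
    norm = [[(nu, mu) if is_cost[j] else (mu, nu)
             for j, (mu, nu) in enumerate(row)] for row in matrix]
    # Steps 2-3: aggregate each row and score the overall PHFN.
    scores = [phfn_score(phfwa(row, weights)) for row in norm]
    # Step 4: rank by decreasing score (deviation tie-break omitted for brevity).
    return sorted(range(len(matrix)), key=lambda i: scores[i], reverse=True)
```

On a toy two-alternative matrix in which one alternative dominates on every benefit attribute, the dominating alternative comes out first.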

5 Illustrative Example

In this section, we present a numerical example concerning the selection of an air-conditioning system to be installed in a university library, in order to demonstrate the applicability and the implementation process of the proposed method under Pythagorean hesitant fuzzy contexts. A university is planning to build a library, and one of the problems faced by the higher authority is to determine what kind of air-conditioning system should be installed. Suppose that there exist five air-conditioning systems (alternatives) Xi (i=1, 2, 3, 4, 5) that might be adapted to the physical structure of the library, and that three attributes (factors), A1=economic, A2=functional, and A3=operational, are taken into consideration in the installation problem, where A1 is a cost-type attribute. The five alternatives are evaluated using PHFNs by the decision makers under the above attributes, whose weight vector is w=(0.25, 0.4, 0.35)T. In the following, we utilize the developed method to select the desired air-conditioning system.

Step 1: In order to avoid influencing each other, the decision makers are required to provide their preferences anonymously; the decision matrix $C=(\hat h_{ij})_{m\times n}$ is presented in Table 1, where $\hat h_{ij}$ (i=1, 2, 3, 4, 5; j=1, 2, 3) are in the form of PHFNs.

Table 1:

Pythagorean Hesitant Fuzzy Decision Matrix C.

A1 A2 A3
X1 〈{0.5, 0.6}, {0.7, 0.8}〉 〈{0.3, 0.4}, {0.5, 0.9}〉 〈{0.3, 0.5, 0.6}, {0.8}〉
X2 〈{0.3, 0.5, 0.7}, {0.6, 0.7}〉 〈{0.3}, {0.9}〉 〈{0.4, 0.7}, {0.6}〉
X3 〈{0.7, 0.8}, {0.2, 0.6}〉 〈{0.6, 0.8}, {0.6, 0.7}〉 〈{0.7}, {0.5, 0.6}〉
X4 〈{0.8, 0.9}, {0.1}〉 〈{0.2, 0.3}, {0.9}〉 〈{0.4, 0.6}, {0.7, 0.8}〉
X5 〈{0.8, 0.9}, {0.4}〉 〈{0.3, 0.4}, {0.9}〉 〈{0.1}, {0.9}〉

Step 2: We utilize the decision information given in the normalized matrix $D_N=(\hat h_{ij})_{m\times n}$ and the PHFWA operator to obtain the overall preference values $\hat h_i$ of the air-conditioning systems Xi (i=1, 2, 3, 4, 5). We have

h^1={0.4611,0.5140,0.5513,0.4873,0.5361,0.5708,0.5275,0.5706,0.6016,0.5487,0.5889,0.6181},{0.5894,0.7456,0.61689,0.7804}.h^2={0.4355,0.5653,0.4841,0.5976},{0.5934,0.6742,0.7334}.h^3={0.5881,0.6931,0.6395,0.7283},{0.5850,0.6236,0.6222,0.6632,0.6049,0.6447,0.6434,0.6858}.h^4={0.2771,0.4007,0.3103,0.4223},{0.8003,0.8386,0.8242,0.8637}.h^5={0.3321,0.3870},{0.8337,0.9}.

Step 3: Calculate the scores $S(\hat h_i)$ (i=1, 2, 3, 4, 5) of the overall PHFNs $\hat h_i$:

$S(\hat h_1)=-0.1663,\ S(\hat h_2)=-0.17384,\ S(\hat h_3)=0.0365,\ S(\hat h_4)=-0.5674,\ S(\hat h_5)=-0.6574.$

Step 4: Rank all the alternatives Xi (i=1, 2, 3, 4, 5) in accordance with the scores $S(\hat h_i)$ (i=1, 2, 3, 4, 5) of the overall Pythagorean hesitant fuzzy preference values. We have $S(\hat h_3)>S(\hat h_1)>S(\hat h_2)>S(\hat h_4)>S(\hat h_5)$, which shows that X3>X1>X2>X4>X5. That is, the most desirable air-conditioning system is X3.

Next, we apply the PHFWG operator to the same problem; since Step 1 is unchanged, we start from Step 2.

Step 2: We utilize the decision information given in matrix D=(h^ij)m×n and the PHFWG operator to obtain the overall preference values h^i of the air-conditioning system Xi (i=1, 2, 3, 4, 5). We have

h^1 = 〈{0.3708, 0.4434, 0.4726, 0.4160, 0.4974, 0.5302, 0.3834, 0.4584, 0.4886, 0.4301, 0.5143, 0.5482}, {0.6480, 0.8155, 0.6652, 0.8235}〉
h^2 = 〈{0.3946, 0.4799, 0.4101, 0.4988}, {0.7550, 0.7683, 0.7925}〉
h^3 = 〈{0.4812, 0.5399, 0.6333, 0.7105}, {0.6707, 0.6287, 0.6452, 0.6692, 0.6435, 0.6676, 0.6819, 0.7028}〉
h^4 = 〈{0.2144, 0.2470, 0.2521, 0.2905}, {0.8277, 0.8492, 0.8553, 0.8731}〉
h^5 = 〈{0.2462, 0.2692}, {0.8815, 0.9000}〉
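The PHFWG operator is the geometric dual of the PHFWA: membership combinations aggregate as ∏ μj^wj and non-membership combinations as (1 − ∏(1 − νj²)^wj)^(1/2). A sketch under the same assumed weight vector w = (0.25, 0.40, 0.35) (inferred, not stated in this section), which reproduces the reported h^4:

```python
from itertools import product

def prod_pow(values, weights):
    """Weighted product: prod of x_j ** w_j."""
    p = 1.0
    for x, w in zip(values, weights):
        p *= x ** w
    return p

def phfwg(phfns, weights):
    """PHFWG over PHFNs given as (membership set, non-membership set) pairs."""
    mems = [round(prod_pow(combo, weights), 4)
            for combo in product(*(M for M, _ in phfns))]
    nons = [round((1 - prod_pow([1 - n * n for n in combo], weights)) ** 0.5, 4)
            for combo in product(*(N for _, N in phfns))]
    return mems, nons

# Row X4 of the normalized decision matrix (Table 2), assumed weights:
x4 = [([0.1], [0.8, 0.9]), ([0.2, 0.3], [0.9]), ([0.4, 0.6], [0.7, 0.8])]
mems, nons = phfwg(x4, (0.25, 0.40, 0.35))
print(mems)  # [0.2144, 0.2470, 0.2521, 0.2905]
print(nons)  # [0.8277, 0.8492, 0.8553, 0.8731]
```

Because the geometric mean penalizes small membership degrees more heavily, the PHFWG memberships are uniformly lower than their PHFWA counterparts.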

Step 3: Calculate the scores S(h^i) (i=1, 2, 3, 4, 5) of the overall PHFNs h^i:

S(h^1)=−0.3306, S(h^2)=−0.3971, S(h^3)=−0.0910, S(h^4)=−0.6618, S(h^5)=−0.7270.

Step 4: Rank all the alternatives Xi (i=1, 2, 3, 4, 5) according to the scores S(h^i) of the overall Pythagorean hesitant fuzzy preference numbers. We have S(h^3)>S(h^1)>S(h^2)>S(h^4)>S(h^5), and thus X3>X1>X2>X4>X5. That is, the most desirable air-conditioning system is X3.

Moreover, we apply the PHFOWA operator to the problem; the ordered decision matrix is given in Table 3.

Step 2: We utilize the decision information given in the ordered decision matrix and the PHFOWA operator to obtain the overall preference values h^i of the air-conditioning systems Xi (i=1, 2, 3, 4, 5). We have

h^1 = 〈{0.4611, 0.5140, 0.5513, 0.4873, 0.5361, 0.5708, 0.5275, 0.5706, 0.6016, 0.5487, 0.5888, 0.6181}, {0.5894, 0.7456, 0.6169, 0.7804}〉
h^2 = 〈{0.4393, 0.5822, 0.4873, 0.6127}, {0.5815, 0.6607, 0.7187}〉
h^3 = 〈{0.5506, 0.6287, 0.6681, 0.7209}, {0.6051, 0.6340, 0.6435, 0.6743, 0.6333, 0.6636, 0.6735, 0.7058}〉
h^4 = 〈{0.2450, 0.2783, 0.3490, 0.3716}, {0.8063, 0.8452, 0.8337, 0.8739}〉
h^5 = 〈{0.3321, 0.3870}, {0.8739, 0.9000}〉

Step 3: Calculate the scores S(h^i) (i=1, 2, 3, 4, 5) of the overall PHFNs h^i:

S(h^1)=−0.1663, S(h^2)=−0.1459, S(h^3)=−0.2221, S(h^4)=−0.6085, S(h^5)=−0.6574.

Step 4: Rank all the alternatives Xi (i=1, 2, 3, 4, 5) according to the scores S(h^i) of the overall Pythagorean hesitant fuzzy preference numbers. We have S(h^3)>S(h^1)>S(h^2)>S(h^4)>S(h^5), and thus X3>X1>X2>X4>X5. That is, the most desirable air-conditioning system is X3.
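The only difference between the ordered operators (PHFOWA/PHFOWG) and their weighted counterparts is a preliminary step: for each alternative, the attribute values are reordered by decreasing score before the positional weights are applied. A sketch of this ordering step, using the illustrative mean-of-squares score (the authors' score definition appears earlier in the paper):

```python
def score(M, N):
    # Illustrative score used only to order the arguments.
    return sum(m * m for m in M) / len(M) - sum(n * n for n in N) / len(N)

def order_arguments(phfns):
    """PHFOWA/PHFOWG first sort the per-alternative PHFNs by decreasing
    score; the positional weights are then applied to the sorted sequence."""
    return sorted(phfns, key=lambda h: score(*h), reverse=True)

# Row X2 of the normalized decision matrix (Table 2):
x2 = [([0.6, 0.7], [0.3, 0.5, 0.7]), ([0.3], [0.9]), ([0.4, 0.7], [0.6])]
print(order_arguments(x2))
```

With this criterion the sorted sequence reproduces the ordered row for X2 in Table 3: 〈{0.6, 0.7}, {0.3, 0.5, 0.7}〉, then 〈{0.4, 0.7}, {0.6}〉, then 〈{0.3}, {0.9}〉.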

Finally, we apply the PHFOWG operator to the problem; the normalized decision matrix is given in Table 2.

Table 2:

Normalized Pythagorean Hesitant Fuzzy Decision Matrix DN.

A1 A2 A3
X1 〈{0.7, 0.8}, {0.5, 0.6}〉 〈{0.3, 0.4}, {0.5, 0.9}〉 〈{0.3, 0.5, 0.6}, {0.8}〉
X2 〈{0.6, 0.7}, {0.3, 0.5, 0.7}〉 〈{0.3}, {0.9}〉 〈{0.4, 0.7}, {0.6}〉
X3 〈{0.2, 0.6}, {0.7, 0.8}〉 〈{0.6, 0.8}, {0.6, 0.7}〉 〈{0.7}, {0.5, 0.6}〉
X4 〈{0.1}, {0.8, 0.9}〉 〈{0.2, 0.3}, {0.9}〉 〈{0.4, 0.6}, {0.7, 0.8}〉
X5 〈{0.4}, {0.8, 0.9}〉 〈{0.3, 0.4}, {0.9}〉 〈{0.1}, {0.9}〉

Step 2: We utilize the decision information given in the ordered decision matrix and the PHFOWG operator to obtain the overall preference values h^i of the air-conditioning systems Xi (i=1, 2, 3, 4, 5). We have

h^1 = 〈{0.3708, 0.4434, 0.4726, 0.4160, 0.4974, 0.5302, 0.3834, 0.4584, 0.4886, 0.4301, 0.5143, 0.5482}, {0.6480, 0.8155, 0.6652, 0.8235}〉
h^2 = 〈{0.4003, 0.5007, 0.4156, 0.5204}, {0.7370, 0.7515, 0.7776}〉
h^3 = 〈{0.4245, 0.6236, 0.4763, 0.6996}, {0.6205, 0.6750, 0.6621, 0.7091, 0.6395, 0.6905, 0.6784, 0.7226}〉
h^4 = 〈{0.1866, 0.2195, 0.2065, 0.2429}, {0.8342, 0.8699, 0.8492, 0.8815}〉
h^5 = 〈{0.2462, 0.2692}, {0.8815, 0.9000}〉

Step 3: Calculate the scores S(h^i) (i=1, 2, 3, 4, 5) of the overall PHFNs h^i:

S(h^1)=−0.3306, S(h^2)=−0.3597, S(h^3)=−0.1461, S(h^4)=−0.6817, S(h^5)=−0.7270.

Step 4: Rank all the alternatives Xi (i=1, 2, 3, 4, 5) according to the scores S(h^i) of the overall Pythagorean hesitant fuzzy preference numbers. We have S(h^3)>S(h^1)>S(h^2)>S(h^4)>S(h^5), and thus X3>X1>X2>X4>X5. That is, the most desirable air-conditioning system is X3.

6 Comparison Analysis

In order to verify the validity and effectiveness of the proposed approach, a comparative study is conducted on the same illustrative example (Table 3) using the hesitant fuzzy set (HFS) method of Torra [22] and the intuitionistic hesitant fuzzy set (IHFS) method of Peng et al. [20], both of which handle special cases of PHFNs.

Table 3:

Ordered Pythagorean Hesitant Fuzzy Decision Matrix.

A1 A2 A3
X1 〈{0.7, 0.8}, {0.5, 0.6}〉 〈{0.3, 0.4}, {0.5, 0.9}〉 〈{0.3, 0.5, 0.6}, {0.8}〉
X2 〈{0.6, 0.7}, {0.3, 0.5, 0.7}〉 〈{0.4, 0.7}, {0.6}〉 〈{0.3}, {0.9}〉
X3 〈{0.7}, {0.5, 0.6}〉 〈{0.6, 0.8}, {0.6, 0.7}〉 〈{0.2, 0.6}, {0.7, 0.8}〉
X4 〈{0.4, 0.6}, {0.7, 0.8}〉 〈{0.2, 0.3}, {0.9}〉 〈{0.1}, {0.8, 0.9}〉
X5 〈{0.4, 0.5}, {0.9}〉 〈{0.1}, {0.9}〉 〈{0.4}, {0.8, 0.9}〉

For comparison with HFNs, a PHFN can be transformed into an HFN by retaining only its membership degrees. The resulting hesitant fuzzy information is presented in Table 4.

Table 4:

Hesitant Fuzzy Decision Matrix.

A1 A2 A3
X1 {0.7, 0.8} {0.3, 0.4} {0.3, 0.5, 0.6}
X2 {0.6, 0.7} {0.3} {0.4, 0.7}
X3 {0.2, 0.6} {0.6, 0.8} {0.7}
X4 {0.1} {0.2, 0.3} {0.4, 0.6}
X5 {0.4} {0.3, 0.4} {0.1}

The ordered hesitant fuzzy decision matrix is given in Table 5.

Table 5:

Ordered Hesitant Fuzzy Decision Matrix.

A1 A2 A3
X1 {0.7, 0.8} {0.3, 0.5, 0.6} {0.3, 0.4}
X2 {0.6, 0.7} {0.4, 0.7} {0.3}
X3 {0.6, 0.8} {0.7} {0.2, 0.6}
X4 {0.4, 0.6} {0.2, 0.3} {0.1}
X5 {0.4} {0.3, 0.4} {0.1}

Using the HFOWA operator [23], the score values are as follows:

S(h^1)=0.5326, S(h^2)=0.5145, S(h^3)=0.6268, S(h^4)=0.2907, S(h^5)=0.2865.

The ranking of the alternatives is X3>X1>X2>X4>X5.
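The HFOWA aggregation of Xia and Xu [23] used above combines each combination of one value per ordered hesitant fuzzy element as 1 − ∏(1 − γj)^wj. A sketch, again assuming the inferred weight vector w = (0.25, 0.40, 0.35):

```python
from itertools import product

def hfowa(sorted_hfes, weights):
    """HFOWA of Xia and Xu [23] applied to already-ordered hesitant fuzzy
    elements: each value combination aggregates as 1 - prod (1 - g_j)^w_j."""
    out = []
    for combo in product(*sorted_hfes):
        p = 1.0
        for g, w in zip(combo, weights):
            p *= (1 - g) ** w
        out.append(round(1 - p, 4))
    return out

# Row X5 of the ordered hesitant fuzzy matrix (Table 5), assumed weights:
print(hfowa([[0.4], [0.3, 0.4], [0.1]], (0.25, 0.40, 0.35)))  # [0.2645, 0.3085]
```

Under these assumed weights, the mean of the aggregated values, (0.2645 + 0.3085)/2 = 0.2865, reproduces the reported score S(h^5).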

Using the HFOWG operator [23], the score values are as follows:

S(h^1)=0.4746, S(h^2)=0.4594, S(h^3)=0.5564, S(h^4)=0.2213, S(h^5)=0.2329.

The ranking of the alternatives is X3>X1>X2>X5>X4.

The ranking produced by the HFOWA operator coincides with that of the proposed method, and the HFOWG ranking differs only in the order of the last two alternatives; in every case the best alternative is X3. Nevertheless, PHFSs are more suitable than HFSs: HFNs consider only the membership degrees of an element and ignore the non-membership degrees, which may cause information distortion and loss.

For comparison with IHFNs, a PHFN can be transformed into an IHFN by restricting the sum (rather than the square sum) of its membership and non-membership degrees to be ≤1. The resulting intuitionistic hesitant fuzzy information is presented in Table 6, and the ordered intuitionistic hesitant fuzzy decision matrix is given in Table 7.

Table 6:

Intuitionistic Hesitant Fuzzy Decision Matrix.

A1 A2 A3
X1 〈{0.3, 0.4}, {0.5, 0.6}〉 〈{0.3, 0.4}, {0.5, 0.6}〉 〈{0.3, 0.5, 0.6}, {0.4}〉
X2 〈{0.3, 0.4}, {0.3, 0.5, 0.6}〉 〈{0.3}, {0.7}〉 〈{0.4, 0.5}, {0.5}〉
X3 〈{0.2, 0.6}, {0.3, 0.4}〉 〈{0.6, 0.8}, {0.1, 0.2}〉 〈{0.6}, {0.4, 0.5}〉
X4 〈{0.1}, {0.8, 0.9}〉 〈{0.2, 0.3}, {0.9}〉 〈{0.4, 0.5}, {0.5, 0.6}〉
X5 〈{0.4}, {0.5, 0.6}〉 〈{0.3, 0.4}, {0.6}〉 〈{0.1}, {0.9}〉
Table 7:

Ordered Intuitionistic Hesitant Fuzzy Decision Matrix.

A1 A2 A3
X1 〈{0.5, 0.6}, {0.3, 0.4}〉 〈{0.3, 0.5, 0.6}, {0.4}〉 〈{0.3, 0.4}, {0.5, 0.6}〉
X2 〈{0.4, 0.5}, {0.5}〉 〈{0.3, 0.4}, {0.3, 0.5, 0.6}〉 〈{0.3}, {0.7}〉
X3 〈{0.6, 0.8}, {0.1, 0.2}〉 〈{0.6}, {0.4, 0.5}〉 〈{0.2, 0.6}, {0.3, 0.4}〉
X4 〈{0.4, 0.5}, {0.5, 0.6}〉 〈{0.2, 0.3}, {0.9}〉 〈{0.1}, {0.8, 0.9}〉
X5 〈{0.4}, {0.5, 0.6}〉 〈{0.3, 0.4}, {0.6}〉 〈{0.1}, {0.9}〉

Using the IHFOWA operator [20], the score values are as follows:

S(h^1)=0.0257, S(h^2)=−0.1809, S(h^3)=0.2727, S(h^4)=−0.5179, S(h^5)=−0.3896.

The ranking of the alternatives is X3>X1>X2>X5>X4.

Using the IHFOWG operator [20], the score values are as follows:

S(h^1)=0.0134, S(h^2)=−0.2219, S(h^3)=0.1719, S(h^4)=−0.6257, S(h^5)=−0.5139.

The ranking of the alternatives is X3>X1>X2>X4>X5.

In both cases the best alternative coincides with that obtained by the proposed method. However, PHFSs are more suitable than IHFSs: IHFNs require the sum of the membership and non-membership degrees to be ≤1, whereas the proposed approach only requires the square sum to be ≤1, so PHFNs can model a wider range of expert evaluations.
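The enlarged admissible region is easy to demonstrate with a degree pair that is valid under the Pythagorean constraint but not under the intuitionistic one:

```python
# mu = 0.7, nu = 0.6 fails the IHFS constraint but satisfies the PHFS one.
mu, nu = 0.7, 0.6
print(mu + nu <= 1.0)        # False: 0.7 + 0.6 = 1.3 > 1 (IHFS constraint)
print(mu**2 + nu**2 <= 1.0)  # True:  0.49 + 0.36 = 0.85 <= 1 (PHFS constraint)
```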

7 Conclusion

The PHFS, proposed by Khan et al. [9], is a powerful tool for dealing with uncertainty and fuzziness. In this paper, we developed a multi-attribute decision-making approach under Pythagorean hesitant fuzzy information. We discussed some properties of the PHFWA and PHFWG operators [9]. Furthermore, we generalized these operators and developed the PHFOWA and PHFOWG operators to solve decision-making problems with anonymity. Through the illustrative example, we have shown the change trends of the results derived by the developed aggregation operators. Finally, the proposed approach has been compared with existing methods.

In the future, we will introduce the concept of generalized Pythagorean hesitant fuzzy aggregation operators, Pythagorean hesitant fuzzy hybrid aggregation operators, and their generalizations, TOPSIS and TODIM methods, under the Pythagorean hesitant fuzzy environment.

Bibliography

[1] K. T. Atanassov, Intuitionistic fuzzy sets, Fuzzy Sets Syst. 20 (1986), 87–96. doi:10.1016/S0165-0114(86)80034-3.

[2] K. T. Atanassov, Intuitionistic fuzzy sets: theory and applications, Physica-Verlag, Heidelberg, 1999. doi:10.1007/978-3-7908-1870-3.

[3] I. Beg and T. Rashid, Multi-criteria trapezoidal valued intuitionistic fuzzy decision making with Choquet integral based TOPSIS, OPSEARCH 51 (2014), 98–129. doi:10.1007/s12597-013-0134-5.

[4] R. E. Bellman and L. A. Zadeh, Decision-making in a fuzzy environment, Manage. Sci. 17 (1970), 141–164. doi:10.1287/mnsc.17.4.B141.

[5] F. E. Boran, S. Genç, M. Kurt and D. Akay, A multi-criteria intuitionistic fuzzy group decision making for supplier selection with TOPSIS method, Expert Syst. Appl. 36 (2009), 11363–11368. doi:10.1016/j.eswa.2009.03.039.

[6] S. K. De, R. Biswas and A. R. Roy, An application of intuitionistic fuzzy sets in medical diagnosis, Fuzzy Sets Syst. 117 (2001), 209–213. doi:10.1016/S0165-0114(98)00235-8.

[7] D. Dubois, The role of fuzzy sets in decision sciences: old techniques and new directions, Fuzzy Sets Syst. 184 (2011), 3–28. doi:10.1016/j.fss.2011.06.003.

[8] H. Garg, A novel accuracy function under interval-valued Pythagorean fuzzy environment for solving multi-criteria decision making problem, J. Intell. Fuzzy Syst. 31 (2016), 529–540. doi:10.3233/IFS-162165.

[9] M. S. A. Khan, S. Abdullah, A. Ali, N. Siddiqui and F. Amin, Pythagorean hesitant fuzzy sets and their application to group decision making with incomplete weight information, J. Intell. Fuzzy Syst. 33 (2017), 3971–3985. doi:10.3233/JIFS-17811.

[10] D. F. Li, Multi-attribute decision making models and methods using intuitionistic fuzzy sets, J. Comput. Syst. Sci. 70 (2005), 73–85. doi:10.1016/j.jcss.2004.06.002.

[11] D. Liang, Z. Xu and A. P. Darko, Projection model for fusing the information of Pythagorean fuzzy multi-criteria group decision making based on geometric Bonferroni mean, Int. J. Intell. Syst. 32 (2017), 966–987. doi:10.1002/int.21879.

[12] H. C. Liao and Z. Xu, Some new hybrid weighted aggregation operators under hesitant fuzzy multi-criteria decision making environment, J. Intell. Fuzzy Syst. 26 (2014), 1601–1617. doi:10.3233/IFS-130841.

[13] H. Liao and Z. Xu, Intuitionistic fuzzy hybrid weighted aggregation operators, Int. J. Intell. Syst. 29 (2014), 971–993. doi:10.1002/int.21672.

[14] H. Liao and Z. Xu, Extended hesitant fuzzy hybrid weighted aggregation operators and their application in decision making, Soft Comput. 19 (2015), 2551–2564. doi:10.1007/s00500-014-1422-6.

[15] H. Liao, Z. Xu, X. J. Zeng and D. L. Xu, An enhanced consensus reaching process in group decision making with intuitionistic fuzzy preference relations, Inform. Sci. 329 (2016), 274–286. doi:10.1016/j.ins.2015.09.024.

[16] J. Liu and M. Sun, Generalized power average operator of hesitant fuzzy numbers and its application in multiple attribute decision making, J. Comput. Inform. Syst. 9 (2013), 3051–3058.

[17] W. Liu and H. C. Liao, A bibliometric analysis of fuzzy decision research during 1970–2015, Int. J. Fuzzy Syst. 19 (2017), 1–14. doi:10.1007/s40815-016-0272-z.

[18] X. Peng and Y. Yang, Some results for Pythagorean fuzzy sets, Int. J. Intell. Syst. 30 (2015), 1133–1160. doi:10.1002/int.21738.

[19] J. J. Peng, J. Q. Wang, J. Wang and X. H. Chen, Multi-criteria decision-making approach with hesitant interval-valued intuitionistic fuzzy sets, Sci. World J. 2014 (2014), Article ID 868515. doi:10.1155/2014/868515.

[20] J. J. Peng, J. Q. Wang, X. H. Wu, H. Y. Zhang and X. H. Chen, The fuzzy cross-entropy for intuitionistic hesitant fuzzy sets and their application in multi-criteria decision-making, Int. J. Syst. Sci. 46 (2015), 2335–2350. doi:10.1080/00207721.2014.993744.

[21] G. Qian, G. H. Wang and X. Q. Feng, Generalized hesitant fuzzy sets and their application in decision support system, Knowl. Based Syst. 37 (2013), 357–365. doi:10.1016/j.knosys.2012.08.019.

[22] V. Torra, Hesitant fuzzy sets, Int. J. Intell. Syst. 25 (2010), 529–539. doi:10.1002/int.20418.

[23] M. Xia and Z. Xu, Hesitant fuzzy information aggregation in decision making, Int. J. Approx. Reason. 52 (2011), 395–407. doi:10.1016/j.ijar.2010.09.002.

[24] Z. Xu and X. Zhang, Hesitant fuzzy multi-attribute decision making based on TOPSIS with incomplete weight information, Knowl. Based Syst. 52 (2013), 53–64. doi:10.1016/j.knosys.2013.05.011.

[25] Z. S. Xu and H. C. Liao, A survey of approaches to decision making with intuitionistic fuzzy preference relations, Knowl. Based Syst. 80 (2015), 131–142. doi:10.1016/j.knosys.2014.12.034.

[26] R. R. Yager, Pythagorean fuzzy subsets, in: Proceedings of the Joint IFSA World Congress and NAFIPS Annual Meeting, pp. 57–61, Edmonton, Canada, 2013. doi:10.1109/IFSA-NAFIPS.2013.6608375.

[27] R. R. Yager, Pythagorean membership grades in multi-criteria decision making, IEEE Trans. Fuzzy Syst. 22 (2014), 958–965. doi:10.1109/TFUZZ.2013.2278989.

[28] R. R. Yager and A. M. Abbasov, Pythagorean membership grades, complex numbers and decision making, Int. J. Intell. Syst. 28 (2013), 436–452. doi:10.1002/int.21584.

[29] D. Yu and H. C. Liao, Visualization and quantitative research on intuitionistic fuzzy studies, J. Intell. Fuzzy Syst. 30 (2016), 3653–3663. doi:10.3233/IFS-162111.

[30] D. Yu, Y. Wu and W. Zhou, Multi-criteria decision making based on Choquet integral under hesitant fuzzy environment, J. Comput. Inform. Syst. 7 (2011), 4506–4513.

[31] L. A. Zadeh, Fuzzy sets, Inf. Control 8 (1965), 338–353. doi:10.1016/S0019-9958(65)90241-X.

[32] Z. Zhang, Hesitant fuzzy power aggregation operators and their application to multiple attribute group decision making, Inform. Sci. 234 (2013), 150–181. doi:10.1016/j.ins.2013.01.002.

[33] X. L. Zhang and Z. S. Xu, Extension of TOPSIS to multi-criteria decision making with Pythagorean fuzzy sets, Int. J. Intell. Syst. 29 (2014), 1061–1078. doi:10.1002/int.21676.

[34] B. Zhu, Z. Xu and M. Xia, Dual hesitant fuzzy sets, J. Appl. Math. 2012 (2012), Article ID 879629. doi:10.1155/2012/879629.

Received: 2017-05-18
Published Online: 2018-01-05

©2020 Walter de Gruyter GmbH, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 Public License.
