
Some remarks on a pair of seemingly unrelated regression models

  • Jian Hou and Yong Zhao
Published/Copyright: August 24, 2019

Abstract

Linear regression models are the foundation of current statistical theory and have been a prominent object of study in statistical data analysis and inference. A special class of linear regression models, the seemingly unrelated regression models (SURMs), allows correlated observations between different regression equations. In this article, we present a general approach to SURMs under some general assumptions: we establish closed-form expressions of the best linear unbiased predictors (BLUPs) and the best linear unbiased estimators (BLUEs) of all unknown parameters in the models, and we establish necessary and sufficient conditions for a family of equalities between the predictors and estimators under the single models and the combined model to hold. Some fundamental and valuable properties of the BLUPs and BLUEs under the SURMs are also presented.

MSC 2010: 62H12; 62J05

1 Introduction

Linear regression models are the foundation of current statistical theory and have been a prominent object of study in statistical data analysis and inference. A special class of linear regression models is the seemingly unrelated regression model (SURM), which allows correlated observations between regression equations. In this article, we consider a SURM of the form

$$\mathcal{L}_1:\ y_1=X_1\beta_1+\varepsilon_1, \quad (1.1)$$
$$\mathcal{L}_2:\ y_2=X_2\beta_2+\varepsilon_2, \quad (1.2)$$

where $y_i\in\mathbb{R}^{n_i\times 1}$ are vectors of observable response variables, $X_i\in\mathbb{R}^{n_i\times p_i}$ are known matrices of arbitrary rank, $\beta_i\in\mathbb{R}^{p_i\times 1}$ are fixed but unknown vectors, $i=1,2$, and $\varepsilon_1\in\mathbb{R}^{n_1\times 1}$ and $\varepsilon_2\in\mathbb{R}^{n_2\times 1}$ are random error vectors satisfying

$$E\begin{pmatrix}\varepsilon_1\\ \varepsilon_2\end{pmatrix}=0,\quad \operatorname{Cov}\begin{pmatrix}\varepsilon_1\\ \varepsilon_2\end{pmatrix}=\begin{pmatrix}\Sigma_{11} & \Sigma_{12}\\ \Sigma_{21} & \Sigma_{22}\end{pmatrix}:=\Sigma. \quad (1.3)$$

Under these assumptions, (1.1)–(1.3) can jointly be written as

$$\mathcal{L}:\ y=X\beta+\varepsilon,\quad E(\varepsilon)=0,\quad \operatorname{Cov}(\varepsilon)=\Sigma, \quad (1.4)$$

where

$$y=\begin{pmatrix}y_1\\ y_2\end{pmatrix},\quad X=\begin{pmatrix}X_1 & 0\\ 0 & X_2\end{pmatrix},\quad \beta=\begin{pmatrix}\beta_1\\ \beta_2\end{pmatrix},\quad \varepsilon=\begin{pmatrix}\varepsilon_1\\ \varepsilon_2\end{pmatrix}.$$
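As a small numerical illustration (not from the article, with assumed, arbitrary dimensions and data), the merged matrices above can be assembled as follows; the nonzero off-diagonal blocks of Σ are what make the two equations related:

```python
import numpy as np

# A minimal sketch (assumed dimensions and data) of merging the two equations
# into the combined model y = X*beta + eps of (1.4).
rng = np.random.default_rng(0)
n1, n2, p1, p2 = 4, 3, 2, 2
X1 = rng.standard_normal((n1, p1))
X2 = rng.standard_normal((n2, p2))

# X = diag(X1, X2): the regressors of the two equations do not interact.
X = np.block([[X1, np.zeros((n1, p2))],
              [np.zeros((n2, p1)), X2]])

# A joint covariance Sigma with nonzero off-diagonal blocks Sigma_12, Sigma_21
# links the "seemingly unrelated" equations through their disturbances.
A = rng.standard_normal((n1 + n2, n1 + n2))
Sigma = A @ A.T                      # nonnegative definite by construction

beta = rng.standard_normal(p1 + p2)  # stacked (beta_1, beta_2)
eps = np.linalg.cholesky(Sigma) @ rng.standard_normal(n1 + n2)
y = X @ beta + eps                   # combined observation vector

print(X.shape)                       # (7, 4)
print(np.allclose(X[:n1, p1:], 0))   # True: off-diagonal block of X is zero
```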

The two individual equations are in fact linked to each other, since the disturbance terms in the two models are correlated; such a pair of linear regression models is therefore usually called a seemingly unrelated regression model (SURM). It is well known that there are two main motivations for using SURMs in statistical analysis: the first is to gain efficiency in the estimation of parameters by combining information across the different equations; the second is to impose and/or test restrictions that involve parameters in different equations. Some early and seminal work in this area was presented in [1, 2, 3], and SURMs have since been treated in many papers and in chapters of econometrics monographs; e.g., a thorough treatment is given in [4], and surveys can be found in [5, 6, 7], among others.

In the statistical inference of ℒ1 and ℒ2, a main object of study is to estimate βi and predict εi, where the traditional procedure is to establish estimators and predictors of βi and εi separately. It is, however, preferable to simultaneously identify estimators and predictors of all unknown parameters in ℒ1 and ℒ2. Some recent contributions on simultaneous estimators/predictors of combined unknown parameter vectors under linear regression models can be found, e.g., in [8, 9, 10]. In this article, we construct two general vectors of the unknown vectors βi and εi in ℒ1 and ℒ2 as follows:

$$\psi_1=G_1\beta_1+H_1\varepsilon_1,\quad \psi_2=G_2\beta_2+H_2\varepsilon_2, \quad (1.5)$$

where $G_i$ and $H_i$ are given $k_i\times p_i$ and $k_i\times n_i$ matrices, respectively, $i=1,2$. Furthermore, merging the two vectors gives

$$\psi=G\beta+H\varepsilon,\quad \psi=\begin{pmatrix}\psi_1\\ \psi_2\end{pmatrix},\quad G=\begin{pmatrix}G_1 & 0\\ 0 & G_2\end{pmatrix},\quad H=\begin{pmatrix}H_1 & 0\\ 0 & H_2\end{pmatrix}. \quad (1.6)$$

In this setting,

$$E(\psi_i)=G_i\beta_i=G_iS_i\beta,\quad E(\psi)=G\beta, \quad (1.7)$$
$$\operatorname{Cov}(\psi_i)=H_i\Sigma_{ii}H_i'=H_iT_i\Sigma(H_iT_i)',\quad \operatorname{Cov}\{\psi_i,\,y_i\}=H_i\Sigma_{ii}=H_iT_i\Sigma T_i', \quad (1.8)$$
$$\operatorname{Cov}(\psi)=H\Sigma H',\quad \operatorname{Cov}\{\psi,\,y\}=H\Sigma \quad (1.9)$$

for $i=1,2$, where $S_1=[I_{p_1},\,0]$, $S_2=[0,\,I_{p_2}]$, $T_1=[I_{n_1},\,0]$, and $T_2=[0,\,I_{n_2}]$. When $G_i=X_i$ and $H_i=I_{n_i}$, (1.5) becomes $\psi_i=X_i\beta_i+\varepsilon_i=y_i$, the observed response vector in ℒ1 and ℒ2. Hence, (1.5) includes, as special cases, all the parameter and response vectors in ℒ1 and ℒ2.

Throughout this article, $\mathbb{R}^{m\times n}$ denotes the collection of all $m\times n$ real matrices, and $A'$, $r(A)$, and $\mathcal{R}(A)$ stand for the transpose, the rank, and the range (column space) of a matrix $A\in\mathbb{R}^{m\times n}$, respectively; $I_m$ denotes the identity matrix of order $m$. The Moore–Penrose inverse of $A$, denoted by $A^{+}$, is defined to be the unique solution $G$ satisfying the four matrix equations $AGA=A$, $GAG=G$, $(AG)'=AG$, and $(GA)'=GA$. Further, $P_A$, $E_A$, and $F_A$ stand for the three orthogonal projectors (symmetric idempotent matrices) $P_A=AA^{+}$, $E_A=A^{\perp}=I_m-AA^{+}$, and $F_A=I_n-A^{+}A$ induced from $A$. For two symmetric matrices $A$ and $B$ of the same size, $A\succcurlyeq B$ means that $A-B$ is nonnegative definite.
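The notation above is easy to check numerically; the sketch below (with an assumed rank-deficient example matrix) verifies the four Penrose equations and the projector properties:

```python
import numpy as np

# Numerical check of the notation: A^+ and the orthogonal projectors
# P_A = AA^+, E_A = A^perp = I - AA^+, F_A = I - A^+A. The matrix A is an
# assumed example of deficient rank (row 2 = 2 * row 1).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])
Ap = np.linalg.pinv(A)

# The four Penrose equations determining A^+ uniquely.
assert np.allclose(A @ Ap @ A, A)
assert np.allclose(Ap @ A @ Ap, Ap)
assert np.allclose((A @ Ap).T, A @ Ap)
assert np.allclose((Ap @ A).T, Ap @ A)

P_A = A @ Ap                  # orthogonal projector onto R(A)
E_A = np.eye(3) - P_A         # projector onto the orthogonal complement of R(A)
F_A = np.eye(3) - Ap @ A      # projector onto the null space of A

assert np.allclose(P_A @ P_A, P_A) and np.allclose(P_A, P_A.T)
print(np.allclose(E_A @ A, 0))   # True: E_A annihilates the columns of A
```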

Concerning the predictability of ψ in (1.6), we need the following definition.

Definition 1.1

The vector ψ in (1.6) is said to be predictable under ℒ if there exists a matrix $L\in\mathbb{R}^{k\times n}$ such that $E(Ly-\psi)=0$. In particular, $G\beta$ is said to be estimable under ℒ if there exists a matrix $L\in\mathbb{R}^{k\times n}$ such that $E(Ly-G\beta)=0$.

Definition 1.2

Let ψ be as defined in (1.6). If there exists a matrix L such that

$$\operatorname{Cov}(Ly-\psi)=\min \quad \text{s.t.} \quad E(Ly-\psi)=0 \quad (1.10)$$

holds in the Löwner partial ordering, the linear statistic $Ly$ is defined to be the best linear unbiased predictor (BLUP) of ψ under ℒ, and is denoted by

$$Ly=\operatorname{BLUP}_{\mathcal{L}}(\psi)=\operatorname{BLUP}_{\mathcal{L}}(G\beta+H\varepsilon). \quad (1.11)$$

If $H=0$ or $G=0$ in (1.6), then the $Ly$ satisfying (1.10) is called the best linear unbiased estimator (BLUE) of $G\beta$ or the BLUP of $H\varepsilon$ under ℒ, respectively, and these are denoted by

$$Ly=\operatorname{BLUE}_{\mathcal{L}}(G\beta),\quad Ly=\operatorname{BLUP}_{\mathcal{L}}(H\varepsilon), \quad (1.12)$$

respectively.

BLUPs and BLUEs are well-known objects of study in regression analysis because of their simplicity and optimality properties in statistical inference, and they are among the prominent research objects in the field of statistics and its applications. Because the BLUPs of ψi under ℒ1 and ℒ2 and the BLUPs of ψi under ℒ are not necessarily the same, it is natural to compare the BLUPs under these models and to establish possible connections between them, such as

$$\operatorname{BLUP}_{\mathcal{L}}(\psi_i)=\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i),\quad i=1,2, \quad (1.13)$$
$$\operatorname{BLUP}_{\mathcal{L}}(\psi)=\begin{pmatrix}\operatorname{BLUP}_{\mathcal{L}_1}(\psi_1)\\ \operatorname{BLUP}_{\mathcal{L}_2}(\psi_2)\end{pmatrix}. \quad (1.14)$$

This article aims at establishing necessary and sufficient conditions for the equalities to hold, and presents some consequences and applications of these equalities.

2 Preliminary results

We need the following tools in the analysis of (1.1)–(1.14).

Lemma 2.1

([11]). Let $A\in\mathbb{R}^{m\times n}$, $B\in\mathbb{R}^{m\times k}$, $C\in\mathbb{R}^{l\times n}$, and $D\in\mathbb{R}^{l\times k}$. Then

$$r[A,\,B]=r(A)+r(E_AB)=r(B)+r(E_BA), \quad (2.1)$$
$$r\begin{pmatrix}A\\ C\end{pmatrix}=r(A)+r(CF_A)=r(C)+r(AF_C). \quad (2.2)$$

If $\mathcal{R}(B)\subseteq\mathcal{R}(A)$ and $\mathcal{R}(C')\subseteq\mathcal{R}(A')$, then

$$r\begin{pmatrix}A & B\\ C & D\end{pmatrix}=r(A)+r(D-CA^{+}B). \quad (2.3)$$

In addition, the following results hold.

  (a) $r[A,\,B]=r(A) \Leftrightarrow \mathcal{R}(B)\subseteq\mathcal{R}(A) \Leftrightarrow AA^{+}B=B \Leftrightarrow E_AB=0.$

  (b) $r\begin{pmatrix}A\\ C\end{pmatrix}=r(A) \Leftrightarrow \mathcal{R}(C')\subseteq\mathcal{R}(A') \Leftrightarrow CA^{+}A=C \Leftrightarrow CF_A=0.$

Lemma 2.2

([12]). The linear matrix equation $AX=B$ is solvable for $X$ if and only if $r[A,\,B]=r(A)$, or equivalently, $AA^{+}B=B$. In this case, the general solution of the equation can be written as $X=A^{+}B+(I-A^{+}A)U$, where $U$ is an arbitrary matrix.
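Lemma 2.2 can be illustrated numerically; in the sketch below (assumed data, with A made rank deficient so the solution set is genuinely non-unique), B is constructed to make AX = B solvable, and the general solution formula is checked:

```python
import numpy as np

# Sketch of Lemma 2.2: AX = B is solvable iff AA^+B = B, in which case the
# general solution is X = A^+B + (I - A^+A)U for arbitrary U.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))
A[:, 2] = A[:, 0] + A[:, 1]          # force r(A) = 2 < 3
B = A @ rng.standard_normal((3, 2))  # guarantees R(B) ⊆ R(A), i.e., solvability

Ap = np.linalg.pinv(A)
assert np.allclose(A @ Ap @ B, B)    # solvability criterion AA^+B = B

U = rng.standard_normal((3, 2))      # an arbitrary matrix
X = Ap @ B + (np.eye(3) - Ap @ A) @ U
print(np.allclose(A @ X, B))         # True: X solves AX = B for any U
```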

In order to directly solve the matrix minimization problem in (1.10), we need the following known result.

Lemma 2.3

([9]). Let

$$f(L)=(LC+D)M(LC+D)' \quad \text{s.t.} \quad LA=B,$$

where $A\in\mathbb{R}^{p\times q}$, $B\in\mathbb{R}^{n\times q}$, $C\in\mathbb{R}^{p\times m}$, and $D\in\mathbb{R}^{n\times m}$ are given, $M\in\mathbb{R}^{m\times m}$ is nonnegative definite, and the matrix equation $LA=B$ is solvable. Then there always exists a solution $L_0$ of $L_0A=B$ such that

$$f(L)\succcurlyeq f(L_0)$$

holds for all solutions of $LA=B$. In this case, the matrix $L_0$ satisfying the above inequality is determined by the following solvable matrix equation

$$L_0[A,\;CMC'A^{\perp}]=[B,\;-DMC'A^{\perp}].$$

In this case, the general expression of $L_0$ and the corresponding $f(L_0)$ and $f(L)$ are given by

$$L_0=\operatorname*{argmin}_{LA=B}f(L)=[B,\;-DMC'A^{\perp}][A,\;CMC'A^{\perp}]^{+}+U[A,\;CMC']^{\perp},$$
$$f(L_0)=\min_{LA=B}f(L)=FMF'-FMC'TCMF',$$
$$f(L)=f(L_0)+(LC+D)MC'TCM(LC+D)'=f(L_0)+(LCMC'A^{\perp}+DMC'A^{\perp})T(LCMC'A^{\perp}+DMC'A^{\perp})',$$

where $F=BA^{+}C+D$, $T=(A^{\perp}CMC'A^{\perp})^{+}$, and $U\in\mathbb{R}^{n\times p}$ is arbitrary.

3 Exact formulas for BLUPs of all parameters under SURMs

Classic estimation/prediction problems of unknown parameters in SURMs were considered, e.g., in [13, 14, 15]. Some effective algebraic methods for deriving analytical formulas of BLUPs/BLUEs under general linear regression models have recently been proposed and used in [8, 9, 10, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26]. In this section, we first give a new derivation of exact formulas for calculating the BLUPs of ψi in (1.5), and show a variety of algebraic and statistical properties of the BLUPs. It can be seen from (1.1), (1.2), and (1.5) that

$$L_iy_i-\psi_i=L_iX_i\beta_i+L_i\varepsilon_i-G_i\beta_i-H_i\varepsilon_i=(L_iX_i-G_i)\beta_i+(L_i-H_i)\varepsilon_i=(L_iX_i-G_i)\beta_i+(L_i-H_i)T_i\varepsilon,\quad i=1,2. \quad (3.1)$$

Then, the expectations and covariance matrices of $L_iy_i-\psi_i$ can be written as

$$E(L_iy_i-\psi_i)=(L_iX_i-G_i)\beta_i, \quad (3.2)$$
$$\operatorname{Cov}(L_iy_i-\psi_i)=\operatorname{Cov}[(L_iX_i-G_i)\beta_i+(L_iT_i-H_iT_i)\varepsilon]=(L_iT_i-H_iT_i)\Sigma(L_iT_i-H_iT_i)'=f_i(L_i) \quad (3.3)$$

for $i=1,2$. Hence, the constrained covariance matrix minimization problems in (1.10) convert to the mathematical problems of minimizing the quadratic matrix-valued functions $f_i(L_i)$ subject to $(L_iX_i-G_i)\beta_i=0$, $i=1,2$. Our first main result is presented below.

Theorem 3.1

Let1 and2 be as given in (1.1) and (1.2), respectively, and denote

Ci=Cov{ψi,yi}=HiTiΣTi,i=1,2. (3.4)

Then, the parameter vectors ψi in (1.5) are predictable by yi in1 and2, respectively, if and only if

R(Xi)R(Gi),i=1,2. (3.5)

In these cases,

Cov(L^iyiψi)=mins.t.E(L^iyiψi)=0L^i[Xi,ΣiiXi]=[Gi,CiXi],i=1,2. (3.6)

The matrix equations in (3.6) are solvable under (3.5), and the general solutions i and the corresponding BLUPi(ψi) can be written as

BLUPLi(ψi)=L^iyi=L^iTiy=[Gi,CiXi][Xi,ΣiiXi]++Ui[Xi,ΣiiXi]Tiy, (3.7)

where Ui ∈ ℝki×ni are arbitrary, i = 1, 2. The corresponding fi(i) and fi(Li) under (3.1)(3.3) are given by

fi(L^i)=Cov[BLUPLi(ψi)ψi]=[Gi,CiXi][Xi,ΣiiXi]+TiHiTiΣ[Gi,CiXi][Xi,ΣiiXi]+TiHiTi, (3.8)
fi(Li)=fi(L^i)+LiTiΣTiHiTiΣTiXiTiΣTiXi+LiTiΣTiHiTiΣTi=fi(L^i)+LiΣiiCiXiΣiiXi+LiΣiiCi (3.9)

for i = 1, 2. Further, the following results hold.

  (a) $r[X_i,\;\Sigma_{ii}X_i^{\perp}]=r[X_i,\;\Sigma_{ii}]$, $\mathcal{R}[X_i,\;\Sigma_{ii}X_i^{\perp}]=\mathcal{R}[X_i,\;\Sigma_{ii}]$, and $\mathcal{R}(X_i)\cap\mathcal{R}(\Sigma_{ii}X_i^{\perp})=\{0\}$, $i=1,2$.

  (b) $\hat{L}_i$ are unique if and only if $r[X_i,\;\Sigma_{ii}]=n_i$, $i=1,2$.

  (c) $\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$ are unique with probability 1 if and only if $y_i\in\mathcal{R}[X_i,\;\Sigma_{ii}]$ holds with probability 1, $i=1,2$.

  (d) The covariance matrices of $\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$, as well as the covariance matrices between $\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$ and $\psi_i$, are unique and satisfy the formulas

    $$\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)]=[G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}\Sigma_{ii}\big([G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}\big)', \quad (3.10)$$
    $$\operatorname{Cov}\{\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i),\,\psi_i\}=[G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}C_i', \quad (3.11)$$
    $$\operatorname{Cov}(\psi_i)-\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)]=H_iT_i\Sigma(H_iT_i)'-[G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}\Sigma_{ii}\big([G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}\big)' \quad (3.12)$$

    for $i=1,2$.

  (e) The BLUPs of ψi can be decomposed as the sums

    $$\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)=\operatorname{BLUE}_{\mathcal{L}_i}(G_i\beta_i)+\operatorname{BLUP}_{\mathcal{L}_i}(H_i\varepsilon_i),\quad i=1,2. \quad (3.13)$$

  (f) If $\psi_1$ and $\psi_2$ are predictable under ℒ1 and ℒ2, respectively, then $P_1\psi_1$ and $P_2\psi_2$ are predictable under ℒ1 and ℒ2, respectively, and $\operatorname{BLUP}_{\mathcal{L}_i}(P_i\psi_i)=P_i\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$ hold for any matrices $P_i\in\mathbb{R}^{t_i\times k_i}$, $i=1,2$.

Proof

It can be seen from (1.1), (1.2), and (1.5) that

$$E(L_iy_i-\psi_i)=0 \;\Leftrightarrow\; L_iX_i\beta_i-G_i\beta_i=0 \ \text{for all}\ \beta_i \;\Leftrightarrow\; L_iX_i=G_i,\quad i=1,2.$$

From Lemma 2.2, these matrix equations are solvable, respectively, if and only if (3.5) holds. In these cases, we see from Lemma 2.2 that the first parts of (3.6) are equivalent to finding solutions $\hat{L}_i$ of the solvable matrix equations $\hat{L}_iX_i=G_i$ such that

$$f_i(L_i)\succcurlyeq f_i(\hat{L}_i)\ \text{s.t.}\ L_iX_i=G_i,\quad i=1,2 \quad (3.14)$$

hold in the Löwner partial ordering. Further, from Lemma 2.3, there always exist solutions $\hat{L}_i$ of $\hat{L}_iX_i=G_i$ such that (3.14) hold, and the $\hat{L}_i$ are determined by the matrix equations

$$\hat{L}_i[X_i,\;T_i\Sigma T_i'X_i^{\perp}]=[G_i,\;H_iT_i\Sigma T_i'X_i^{\perp}],\quad i=1,2,$$

thus establishing the matrix equations in (3.6). Solving these matrix equations by Lemma 2.2 gives the $\hat{L}_i$ in (3.7). Also from (3.3),

$$f_i(\hat{L}_i)=\operatorname{Cov}(\hat{L}_iy_i-\psi_i)=\big([G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}T_i-H_iT_i\big)\Sigma\big([G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}T_i-H_iT_i\big)',$$

as required for (3.8), $i=1,2$. Eq. (3.9) follows from Lemma 2.3.

Result (a) is well known. Results (b) and (c) follow directly from (3.7). Taking covariance matrices of (3.7) yields (3.10). From (3.4) and (3.7),

$$\operatorname{Cov}\{\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i),\,\psi_i\}=\operatorname{Cov}\{\hat{L}_iy_i,\,\psi_i\}=\hat{L}_i\operatorname{Cov}\{y_i,\,\psi_i\}=[G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}C_i',$$

thus establishing (3.11) for $i=1,2$. The equalities in (3.12) follow from (1.8) and (3.10). Results (e) and (f) are direct consequences of (3.7).□
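As a numerical sanity check of formula (3.7) (with assumed data and the arbitrary term U_i set to 0), the following sketch builds the BLUP coefficient matrix for one equation and verifies the unbiasedness equation LX = G:

```python
import numpy as np

# Sketch (assumed data) of the BLUP coefficient matrix in (3.7) for one
# equation: L = [G, C X_perp][X, Sigma X_perp]^+, taking U_i = 0. Since X has
# full column rank here, the predictability condition R(G') ⊆ R(X') holds
# automatically.
rng = np.random.default_rng(2)
n, p, k = 5, 2, 3
X = rng.standard_normal((n, p))
A = rng.standard_normal((n, n))
Sigma = A @ A.T                      # an assumed positive definite Sigma_ii
G = rng.standard_normal((k, p))
H = rng.standard_normal((k, n))
C = H @ Sigma                        # C_i = Cov{psi_i, y_i} = H_i Sigma_ii

Xperp = np.eye(n) - X @ np.linalg.pinv(X)
L = np.hstack([G, C @ Xperp]) @ np.linalg.pinv(np.hstack([X, Sigma @ Xperp]))

print(np.allclose(L @ X, G))         # True: E(Ly - psi) = (LX - G)beta = 0
```

Because the matrix equation in (3.6) is solvable under predictability, the particular solution produced by the pseudoinverse automatically satisfies both block equations.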

We next derive the BLUPs of ψ and ψi under ℒ, respectively. Note from ℒ, (1.5), and (1.6) that

$$Ky-\psi=KX\beta+K\varepsilon-G\beta-H\varepsilon=(KX-G)\beta+(K-H)\varepsilon,$$
$$K_iy-\psi_i=K_iX\beta+K_i\varepsilon-G_i\beta_i-H_i\varepsilon_i=(K_iX-G_iS_i)\beta+(K_i-H_iT_i)\varepsilon,\quad i=1,2.$$

Then the expectations and covariance matrices of $Ky-\psi$ and $K_iy-\psi_i$ can be written as

$$E(Ky-\psi)=(KX-G)\beta,\quad E(K_iy-\psi_i)=(K_iX-G_iS_i)\beta, \quad (3.15)$$
$$\operatorname{Cov}(Ky-\psi)=\operatorname{Cov}[(KX-G)\beta+(K-H)\varepsilon]=(K-H)\Sigma(K-H)'=g(K), \quad (3.16)$$
$$\operatorname{Cov}(K_iy-\psi_i)=\operatorname{Cov}[(K_iX-G_iS_i)\beta+(K_i-H_iT_i)\varepsilon]=(K_i-H_iT_i)\Sigma(K_i-H_iT_i)'=g_i(K_i) \quad (3.17)$$

for i = 1, 2. Our second main result is presented below.

Theorem 3.2

Letbe as given in (1.4), and denote

$$J=\operatorname{Cov}\{\psi,\,y\}=H\Sigma,\quad J_i=\operatorname{Cov}\{\psi_i,\,y\}=H_iT_i\Sigma,\quad i=1,2. \quad (3.18)$$

Then, the parameter vector ψ in (1.6) is predictable by y inif and only if

$$\mathcal{R}(G')\subseteq\mathcal{R}(X'),\ \text{i.e.,}\ \mathcal{R}(G_i')\subseteq\mathcal{R}(X_i'),\quad i=1,2. \quad (3.19)$$

In this case,

$$E(\hat{K}y-\psi)=0\ \text{and}\ \operatorname{Cov}(\hat{K}y-\psi)=\min \;\Leftrightarrow\; \hat{K}[X,\;\Sigma X^{\perp}]=[G,\;JX^{\perp}]. \quad (3.20)$$

The matrix equation in (3.20) is solvable under (3.19), while the general form of $\hat{K}$ and the corresponding $\operatorname{BLUP}_{\mathcal{L}}(\psi)$ can be written as

$$\operatorname{BLUP}_{\mathcal{L}}(\psi)=\hat{K}y=\big([G,\;JX^{\perp}][X,\;\Sigma X^{\perp}]^{+}+U[X,\;\Sigma]^{\perp}\big)y, \quad (3.21)$$

where $U\in\mathbb{R}^{k\times n}$ is arbitrary. The corresponding $g(\hat{K})$ and $g(K)$ in (3.16) are given by

$$g(\hat{K})=\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}}(\psi)-\psi]=\big([G,\;JX^{\perp}][X,\;\Sigma X^{\perp}]^{+}-H\big)\Sigma\big([G,\;JX^{\perp}][X,\;\Sigma X^{\perp}]^{+}-H\big)', \quad (3.22)$$
$$g(K)=g(\hat{K})+\big[(K-H)\Sigma X^{\perp}\big](X^{\perp}\Sigma X^{\perp})^{+}\big[(K-H)\Sigma X^{\perp}\big]'=g(\hat{K})+\big[(K\Sigma-J)X^{\perp}\big](X^{\perp}\Sigma X^{\perp})^{+}\big[(K\Sigma-J)X^{\perp}\big]'. \quad (3.23)$$

In particular, the parameter vectors ψi in (1.5) are predictable by y in ℒ if and only if

$$\mathcal{R}[(G_iS_i)']\subseteq\mathcal{R}(X'),\quad i=1,2. \quad (3.24)$$

In this case,

$$E(\hat{K}_iy-\psi_i)=0\ \text{and}\ \operatorname{Cov}(\hat{K}_iy-\psi_i)=\min \;\Leftrightarrow\; \hat{K}_i[X,\;\Sigma X^{\perp}]=[G_iS_i,\;J_iX^{\perp}],\quad i=1,2. \quad (3.25)$$

The matrix equations in (3.25) are solvable under (3.24), while the general forms of $\hat{K}_i$ and the corresponding $\operatorname{BLUP}_{\mathcal{L}}(\psi_i)$ can be written as

$$\operatorname{BLUP}_{\mathcal{L}}(\psi_i)=\hat{K}_iy=\big([G_iS_i,\;J_iX^{\perp}][X,\;\Sigma X^{\perp}]^{+}+U_i[X,\;\Sigma]^{\perp}\big)y, \quad (3.26)$$

where $U_i\in\mathbb{R}^{k_i\times n}$ are arbitrary, $i=1,2$. The corresponding $g_i(\hat{K}_i)$ and $g_i(K_i)$ in (3.17) are given by

$$g_i(\hat{K}_i)=\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}}(\psi_i)-\psi_i]=\big([G_iS_i,\;J_iX^{\perp}][X,\;\Sigma X^{\perp}]^{+}-H_iT_i\big)\Sigma\big([G_iS_i,\;J_iX^{\perp}][X,\;\Sigma X^{\perp}]^{+}-H_iT_i\big)', \quad (3.27)$$
$$g_i(K_i)=g_i(\hat{K}_i)+\big[(K_i-H_iT_i)\Sigma X^{\perp}\big](X^{\perp}\Sigma X^{\perp})^{+}\big[(K_i-H_iT_i)\Sigma X^{\perp}\big]'=g_i(\hat{K}_i)+\big[(K_i\Sigma-J_i)X^{\perp}\big](X^{\perp}\Sigma X^{\perp})^{+}\big[(K_i\Sigma-J_i)X^{\perp}\big]' \quad (3.28)$$

for i = 1, 2. Further, the following results hold.

  (a) $r[X,\;\Sigma X^{\perp}]=r[X,\;\Sigma]$, $\mathcal{R}[X,\;\Sigma X^{\perp}]=\mathcal{R}[X,\;\Sigma]$, and $\mathcal{R}(X)\cap\mathcal{R}(\Sigma X^{\perp})=\{0\}$.

  (b) $\hat{K}$ is unique if and only if $r[X,\;\Sigma]=n$.

  (c) $\operatorname{BLUP}_{\mathcal{L}}(\psi)$ is unique with probability 1 if and only if $y\in\mathcal{R}[X,\;\Sigma]$ holds with probability 1.

  (d) The following covariance matrix formulas

    $$\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}}(\psi)]=[G,\;JX^{\perp}][X,\;\Sigma X^{\perp}]^{+}\Sigma\big([G,\;JX^{\perp}][X,\;\Sigma X^{\perp}]^{+}\big)',$$
    $$\operatorname{Cov}\{\operatorname{BLUP}_{\mathcal{L}}(\psi),\,\psi\}=[G,\;JX^{\perp}][X,\;\Sigma X^{\perp}]^{+}J',$$
    $$\operatorname{Cov}(\psi)-\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}}(\psi)]=H\Sigma H'-[G,\;JX^{\perp}][X,\;\Sigma X^{\perp}]^{+}\Sigma\big([G,\;JX^{\perp}][X,\;\Sigma X^{\perp}]^{+}\big)',$$

    and

    $$\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}}(\psi_i)]=[G_iS_i,\;J_iX^{\perp}][X,\;\Sigma X^{\perp}]^{+}\Sigma\big([G_iS_i,\;J_iX^{\perp}][X,\;\Sigma X^{\perp}]^{+}\big)',$$
    $$\operatorname{Cov}\{\operatorname{BLUP}_{\mathcal{L}}(\psi_i),\,\psi_i\}=[G_iS_i,\;J_iX^{\perp}][X,\;\Sigma X^{\perp}]^{+}J_i',$$
    $$\operatorname{Cov}(\psi_i)-\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}}(\psi_i)]=H_iT_i\Sigma(H_iT_i)'-[G_iS_i,\;J_iX^{\perp}][X,\;\Sigma X^{\perp}]^{+}\Sigma\big([G_iS_i,\;J_iX^{\perp}][X,\;\Sigma X^{\perp}]^{+}\big)'$$

    hold for $i=1,2$.

  (e) The BLUPs of ψ and ψi satisfy the following identities:

    $$\operatorname{BLUP}_{\mathcal{L}}(\psi)=\operatorname{BLUE}_{\mathcal{L}}(G\beta)+\operatorname{BLUP}_{\mathcal{L}}(H\varepsilon),$$
    $$\operatorname{BLUP}_{\mathcal{L}}(\psi_i)=\operatorname{BLUE}_{\mathcal{L}}(G_i\beta_i)+\operatorname{BLUP}_{\mathcal{L}}(H_i\varepsilon_i),\quad i=1,2.$$

  (f) If $T\psi$ is predictable under ℒ, then $\operatorname{BLUP}_{\mathcal{L}}(T\psi)=T\operatorname{BLUP}_{\mathcal{L}}(\psi)$ holds for all matrices $T\in\mathbb{R}^{t\times k}$.

Proof

It is obvious from (3.15) that

$$E(Ky-\psi)=0 \;\Leftrightarrow\; KX\beta-G\beta=0\ \text{for all}\ \beta \;\Leftrightarrow\; KX=G,$$
$$E(K_iy-\psi_i)=0 \;\Leftrightarrow\; K_iX\beta-G_iS_i\beta=0\ \text{for all}\ \beta \;\Leftrightarrow\; K_iX=G_iS_i,\quad i=1,2.$$

From Lemma 2.2, these matrix equations are solvable if and only if (3.19) and (3.24) hold, respectively. In these cases, we see from Lemma 2.2 that the first parts of (3.20) and (3.25) are equivalent to finding solutions $\hat{K}$ of the solvable matrix equation $\hat{K}X=G$ and $\hat{K}_i$ of the solvable matrix equations $\hat{K}_iX=G_iS_i$ such that

$$g(K)\succcurlyeq g(\hat{K})\ \text{s.t.}\ KX=G, \quad (3.29)$$
$$g_i(K_i)\succcurlyeq g_i(\hat{K}_i)\ \text{s.t.}\ K_iX=G_iS_i,\quad i=1,2, \quad (3.30)$$

hold, respectively, in the Löwner partial ordering. Further, from Lemma 2.3, there always exist solutions $\hat{K}$ of $\hat{K}X=G$ and $\hat{K}_i$ of $\hat{K}_iX=G_iS_i$ such that (3.29) and (3.30) hold, respectively. Applying Lemma 2.3 to (3.29) and (3.30) leads to the conclusions in the theorem.□
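The decomposition identity of Theorem 3.2, BLUP(Gβ + Hε) = BLUE(Gβ) + BLUP(Hε), can be checked numerically: the coefficient matrix in (3.21) is linear in the pair (G, H), so splitting (G, H) into (G, 0) and (0, H) splits the predictor. The sketch below uses assumed data and takes the arbitrary term U = 0:

```python
import numpy as np

# Sketch (assumed data) of BLUP(G*beta + H*eps) = BLUE(G*beta) + BLUP(H*eps).
# coeff(G, H) returns the coefficient matrix [G, H*Sigma*X_perp][X, Sigma*X_perp]^+
# of (3.21) with U = 0; note J = H*Sigma.
rng = np.random.default_rng(3)
n, p, k = 6, 3, 2
X = rng.standard_normal((n, p))
A = rng.standard_normal((n, n))
Sigma = A @ A.T                       # an assumed positive definite Sigma
G = rng.standard_normal((k, p))
H = rng.standard_normal((k, n))

Xperp = np.eye(n) - X @ np.linalg.pinv(X)
Mp = np.linalg.pinv(np.hstack([X, Sigma @ Xperp]))

def coeff(G, H):
    # right-hand block of (3.21) is J*X_perp = H*Sigma*X_perp
    return np.hstack([G, H @ Sigma @ Xperp]) @ Mp

K_full = coeff(G, H)                          # BLUP(psi) coefficient
K_blue = coeff(G, np.zeros_like(H))           # BLUE(G*beta) coefficient
K_blup = coeff(np.zeros_like(G), H)           # BLUP(H*eps) coefficient
print(np.allclose(K_full, K_blue + K_blup))   # True by linearity
```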

4 How to establish decomposition identities between BLUPs under SURMs

The exact formulas for BLUPs and their analytical properties presented in Section 3 enable us to conduct many new and valuable statistical inferences for SURMs via various matrix analysis tools. In particular, by comparing the formulas for the BLUPs of the same unknown parameters under two different models, one can propose various types of equalities between the BLUPs. Some previous and recent work on the equivalence of BLUPs under linear regression models can be found in [27, 28, 29, 30]. In this section, we derive necessary and sufficient conditions for (1.13) and (1.14) to hold, and present some of their direct consequences.

Theorem 4.1

Assume that ψi in (1.5) are predictable under ℒ1 and ℒ2, i.e., (3.5) holds, i = 1, 2. Then, they are predictable under ℒ as well. Also let $\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$ and $\operatorname{BLUP}_{\mathcal{L}}(\psi_i)$ be as given in (3.7) and (3.26), respectively, $i=1,2$. Then, the following statements are equivalent:

  (a) $\operatorname{BLUP}_{\mathcal{L}}(\psi_i)=\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$, $i=1,2$.

  (b) $r\begin{pmatrix}X_i & \operatorname{Cov}\{y_i,\,y\}\\ 0 & X'\\ G_i & \operatorname{Cov}\{\psi_i,\,y\}\end{pmatrix}=r\begin{pmatrix}X_i & \operatorname{Cov}\{y_i,\,y\}\\ 0 & X'\end{pmatrix}$, $i=1,2$.

  (c) $r\begin{pmatrix}X_i & \operatorname{Cov}\{y_i,\,X^{\perp}y\}\\ G_i & \operatorname{Cov}\{\psi_i,\,X^{\perp}y\}\end{pmatrix}=r[X_i,\;\operatorname{Cov}\{y_i,\,X^{\perp}y\}]$, $i=1,2$.

  (d) $\mathcal{R}\big([G_i,\;\operatorname{Cov}\{\psi_i,\,X^{\perp}y\}]'\big)\subseteq\mathcal{R}\big([X_i,\;\operatorname{Cov}\{y_i,\,X^{\perp}y\}]'\big)$, $i=1,2$.

Proof

If (a) holds, the coefficient matrices of $\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$ and $\operatorname{BLUP}_{\mathcal{L}}(\psi_i)$ are the same, i.e., the coefficient matrices of $\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$ satisfy (3.25):

$$\big([G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}+U_i[X_i,\;\Sigma_{ii}]^{\perp}\big)T_i[X,\;\Sigma X^{\perp}]=[G_iS_i,\;J_iX^{\perp}],\quad i=1,2. \quad (4.1)$$

Simplifying both sides by (2.1) and elementary block matrix operations, we obtain

$$r\big([X_i,\;\Sigma_{ii}X_i^{\perp}]^{\perp}T_i[X,\;\Sigma X^{\perp}]\big)=r\big[T_i[X,\;\Sigma],\;X_i,\;\Sigma_{ii}\big]-r[X_i,\;\Sigma_{ii}]=r\big[[X_i,\,0],\;T_i\Sigma,\;X_i,\;\Sigma_{ii}\big]-r[X_i,\;\Sigma_{ii}]=r[X_i,\;\Sigma_{ii}]-r[X_i,\;\Sigma_{ii}]=0,$$

that is, $\mathcal{R}\big(T_i[X,\;\Sigma X^{\perp}]\big)\subseteq\mathcal{R}[X_i,\;\Sigma_{ii}X_i^{\perp}]$. In this case, we obtain by (2.3) that

$$
\begin{aligned}
&r\big([G_iS_i,\;J_iX^{\perp}]-[G_i,\;C_iX_i^{\perp}][X_i,\;\Sigma_{ii}X_i^{\perp}]^{+}T_i[X,\;\Sigma X^{\perp}]\big)\\
&\quad=r\begin{pmatrix}X_i & \Sigma_{ii}X_i^{\perp} & T_iX & T_i\Sigma X^{\perp}\\ G_i & C_iX_i^{\perp} & G_iS_i & J_iX^{\perp}\end{pmatrix}-r[X_i,\;\Sigma_{ii}]\\
&\quad=r\begin{pmatrix}G_i & \operatorname{Cov}\{\psi_i,\,y\}\\ X_i & \operatorname{Cov}\{y_i,\,y\}\\ 0 & X'\end{pmatrix}-r(X)-r[X_i,\;\Sigma_{ii}] \quad (\text{by elementary block matrix operations and } (2.2))\\
&\quad=r\begin{pmatrix}G_i & \operatorname{Cov}\{\psi_i,\,X^{\perp}y\}\\ X_i & \operatorname{Cov}\{y_i,\,X^{\perp}y\}\end{pmatrix}-r[X_i,\;\Sigma_{ii}] \quad (\text{by } (2.2)).
\end{aligned}
$$

Combining this equality with (4.1) leads to the equivalence of (a)–(c). The equivalence of (c) and (d) follows from Lemma 2.1 (b).□

The following results are direct consequences of Theorem 4.1.

Corollary 4.2

Let $\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$ and $\operatorname{BLUP}_{\mathcal{L}}(\psi_i)$ be as given in (3.7) and (3.26), respectively, $i=1,2$. Then, the following statements are equivalent:

  (a) $\operatorname{BLUE}_{\mathcal{L}}(X_i\beta_i)=\operatorname{BLUE}_{\mathcal{L}_i}(X_i\beta_i)$, $i=1,2$.

  (b) $\operatorname{BLUP}_{\mathcal{L}}(\varepsilon_i)=\operatorname{BLUP}_{\mathcal{L}_i}(\varepsilon_i)$, $i=1,2$.

  (c) $r\begin{pmatrix}X_i & \operatorname{Cov}\{y_i,\,y\}\\ 0 & X'\\ 0 & \operatorname{Cov}\{\varepsilon_i,\,y\}\end{pmatrix}=r\begin{pmatrix}X_i & \operatorname{Cov}\{y_i,\,y\}\\ 0 & X'\end{pmatrix}$, $i=1,2$.

  (d) $r\begin{pmatrix}X_i & \operatorname{Cov}\{y_i,\,X^{\perp}y\}\\ 0 & \operatorname{Cov}\{\varepsilon_i,\,X^{\perp}y\}\end{pmatrix}=r[X_i,\;\operatorname{Cov}\{y_i,\,X^{\perp}y\}]$, $i=1,2$.

  (e) $\mathcal{R}\big([0,\;\operatorname{Cov}\{\varepsilon_i,\,X^{\perp}y\}]'\big)\subseteq\mathcal{R}\big([X_i,\;\operatorname{Cov}\{y_i,\,X^{\perp}y\}]'\big)$, $i=1,2$.

  (f) $\mathcal{R}\big([\operatorname{Cov}\{\varepsilon_i,\,X^{\perp}y\}]'\big)\subseteq\mathcal{R}\big([\operatorname{Cov}\{X_i^{\perp}y_i,\,X^{\perp}y\}]'\big)$, $i=1,2$.

Corollary 4.3

The following statistical facts are equivalent:

  (a) $\operatorname{BLUP}_{\mathcal{L}}(\psi)=\begin{pmatrix}\operatorname{BLUP}_{\mathcal{L}_1}(\psi_1)\\ \operatorname{BLUP}_{\mathcal{L}_2}(\psi_2)\end{pmatrix}$.

  (b) $\operatorname{BLUP}_{\mathcal{L}}(\psi_1)=\operatorname{BLUP}_{\mathcal{L}_1}(\psi_1)$ and $\operatorname{BLUP}_{\mathcal{L}}(\psi_2)=\operatorname{BLUP}_{\mathcal{L}_2}(\psi_2)$.

Corollary 4.4

The following statistical facts are equivalent:

  (a) $\operatorname{BLUE}_{\mathcal{L}}(X\beta)=\begin{pmatrix}\operatorname{BLUE}_{\mathcal{L}_1}(X_1\beta_1)\\ \operatorname{BLUE}_{\mathcal{L}_2}(X_2\beta_2)\end{pmatrix}$.

  (b) $\operatorname{BLUP}_{\mathcal{L}}(\varepsilon)=\begin{pmatrix}\operatorname{BLUP}_{\mathcal{L}_1}(\varepsilon_1)\\ \operatorname{BLUP}_{\mathcal{L}_2}(\varepsilon_2)\end{pmatrix}$.

  (c) $\operatorname{BLUE}_{\mathcal{L}}(X_1\beta_1)=\operatorname{BLUE}_{\mathcal{L}_1}(X_1\beta_1)$ and $\operatorname{BLUE}_{\mathcal{L}}(X_2\beta_2)=\operatorname{BLUE}_{\mathcal{L}_2}(X_2\beta_2)$.

  (d) $\operatorname{BLUP}_{\mathcal{L}}(\varepsilon_1)=\operatorname{BLUP}_{\mathcal{L}_1}(\varepsilon_1)$ and $\operatorname{BLUP}_{\mathcal{L}}(\varepsilon_2)=\operatorname{BLUP}_{\mathcal{L}_2}(\varepsilon_2)$.

Finally, we present a group of consequences for the covariance matrix Σ in (1.3) given by $\Sigma=\operatorname{diag}(\sigma_1^2I_{n_1},\,\sigma_2^2I_{n_2})$, where $\sigma_1^2$ and $\sigma_2^2$ are unknown positive numbers. In this situation, (1.8) and (1.9) reduce to

$$\operatorname{Cov}(\psi_i)=\sigma_i^2H_iH_i',\quad \operatorname{Cov}\{\psi_i,\,y_i\}=\sigma_i^2H_i,\quad i=1,2,$$
$$\operatorname{Cov}(\psi)=\begin{pmatrix}\sigma_1^2H_1H_1' & 0\\ 0 & \sigma_2^2H_2H_2'\end{pmatrix},\quad \operatorname{Cov}\{\psi,\,y\}=\begin{pmatrix}\sigma_1^2H_1 & 0\\ 0 & \sigma_2^2H_2\end{pmatrix}.$$

Corollary 4.5

Let1 and2 be as given in (1.1) and (1.2), respectively, and assume that the parameter vectors ψi in (1.5) are predictable by yi in (1.1) and (1.2), respectively. Then

BLUPLi(ψi)=[Gi,σi2HiXi][Xi,σi2Xi]+yi=(GiXi++HiXi)yi,i=1,2.

Further, the following results hold.

  (a) The covariance matrices of $\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$, as well as the covariance matrices between $\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$ and $\psi_i$, are unique and satisfy the equalities

    $$\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)]=\sigma_i^2(G_iX_i^{+}+H_iX_i^{\perp})(G_iX_i^{+}+H_iX_i^{\perp})',$$
    $$\operatorname{Cov}\{\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i),\,\psi_i\}=\sigma_i^2(G_iX_i^{+}+H_iX_i^{\perp})H_i',$$
    $$\operatorname{Cov}(\psi_i)-\operatorname{Cov}[\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)]=\sigma_i^2H_iH_i'-\sigma_i^2(G_iX_i^{+}+H_iX_i^{\perp})(G_iX_i^{+}+H_iX_i^{\perp})'$$

    for $i=1,2$.

  (b) The BLUPs of ψi can be decomposed as the sums

    $$\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)=\operatorname{BLUE}_{\mathcal{L}_i}(G_i\beta_i)+\operatorname{BLUP}_{\mathcal{L}_i}(H_i\varepsilon_i),\quad i=1,2.$$

  (c) If $\psi_1$ and $\psi_2$ are predictable under (1.1) and (1.2), respectively, then $P_1\psi_1$ and $P_2\psi_2$ are predictable under (1.1) and (1.2), respectively, and $\operatorname{BLUP}_{\mathcal{L}_i}(P_i\psi_i)=P_i\operatorname{BLUP}_{\mathcal{L}_i}(\psi_i)$ hold for any matrices $P_i\in\mathbb{R}^{t_i\times k_i}$, $i=1,2$.
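The closed form in Corollary 4.5 can be verified numerically: when Σii = σi²I and Xi has full column rank, the two expressions for the BLUP coefficient matrix coincide. All data below are assumed for illustration:

```python
import numpy as np

# Sketch checking the identity of Corollary 4.5 (assumed data):
# [G, s2*H*X_perp][X, s2*X_perp]^+ = G*X^+ + H*X_perp when Sigma_ii = s2*I.
rng = np.random.default_rng(4)
n, p, k = 5, 2, 3
X = rng.standard_normal((n, p))      # full column rank with probability 1
G = rng.standard_normal((k, p))
H = rng.standard_normal((k, n))
s2 = 2.5                             # sigma_i^2

Xp = np.linalg.pinv(X)
Xperp = np.eye(n) - X @ Xp

L_formula = np.hstack([G, s2 * H @ Xperp]) @ np.linalg.pinv(np.hstack([X, s2 * Xperp]))
L_closed = G @ Xp + H @ Xperp        # the closed form of Corollary 4.5

print(np.allclose(L_formula, L_closed))   # True
```

Here the block matrix [X, σ²X⊥] has full row rank, so the coefficient matrix is unique and the two expressions must agree exactly.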

Corollary 4.6

Letbe as given in (1.4), and assume that ψ in (1.6) is predictable by y in ℒ. Then

$$\operatorname{BLUP}_{\mathcal{L}}(\psi)=\begin{pmatrix}\operatorname{BLUP}_{\mathcal{L}}(\psi_1)\\ \operatorname{BLUP}_{\mathcal{L}}(\psi_2)\end{pmatrix}=\begin{pmatrix}\operatorname{BLUP}_{\mathcal{L}_1}(\psi_1)\\ \operatorname{BLUP}_{\mathcal{L}_2}(\psi_2)\end{pmatrix}=\begin{pmatrix}(G_1X_1^{+}+H_1X_1^{\perp})y_1\\ (G_2X_2^{+}+H_2X_2^{\perp})y_2\end{pmatrix},$$
$$\operatorname{BLUP}_{\mathcal{L}}(\psi)=\operatorname{BLUE}_{\mathcal{L}}(G\beta)+\operatorname{BLUP}_{\mathcal{L}}(H\varepsilon),$$
$$\operatorname{BLUP}_{\mathcal{L}}(\psi_i)=\operatorname{BLUE}_{\mathcal{L}}(G_i\beta_i)+\operatorname{BLUP}_{\mathcal{L}}(H_i\varepsilon_i)=\operatorname{BLUE}_{\mathcal{L}_i}(G_i\beta_i)+\operatorname{BLUP}_{\mathcal{L}_i}(H_i\varepsilon_i),\quad i=1,2.$$
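Corollary 4.6 can likewise be checked numerically: with Σ = diag(σ₁²I, σ₂²I), the BLUP of ψ₁ computed under the combined model ℒ agrees with the one computed under ℒ₁ alone. All data below are assumed:

```python
import numpy as np

# Sketch (assumed data) of Corollary 4.6: with Sigma = diag(s1*I, s2*I),
# the combined-model BLUP coefficient of psi_1 equals the single-model
# coefficient (G1*X1^+ + H1*X1_perp) applied to y1 = T1*y.
rng = np.random.default_rng(5)
n1, n2, p1, p2, k1 = 4, 3, 2, 2, 2
X1 = rng.standard_normal((n1, p1))
X2 = rng.standard_normal((n2, p2))
G1 = rng.standard_normal((k1, p1))
H1 = rng.standard_normal((k1, n1))
s1, s2 = 1.5, 0.5

n, p = n1 + n2, p1 + p2
X = np.block([[X1, np.zeros((n1, p2))], [np.zeros((n2, p1)), X2]])
Sigma = np.diag(np.r_[s1 * np.ones(n1), s2 * np.ones(n2)])
S1 = np.hstack([np.eye(p1), np.zeros((p1, p2))])   # beta_1 = S1 @ beta
T1 = np.hstack([np.eye(n1), np.zeros((n1, n2))])   # y_1 = T1 @ y

Xperp = np.eye(n) - X @ np.linalg.pinv(X)
J1 = H1 @ T1 @ Sigma                               # J_1 = Cov{psi_1, y}
K1 = np.hstack([G1 @ S1, J1 @ Xperp]) @ np.linalg.pinv(np.hstack([X, Sigma @ Xperp]))

X1perp = np.eye(n1) - X1 @ np.linalg.pinv(X1)
L1 = G1 @ np.linalg.pinv(X1) + H1 @ X1perp         # Corollary 4.5 closed form

print(np.allclose(K1, L1 @ T1))                    # True: the BLUPs coincide
```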

Acknowledgements

The authors would like to thank referees for their helpful comments and suggestions on this article. This work is supported in part by the National Natural Science Foundation of China (Grant No. 61703248).

References

[1] Zellner A., An efficient method of estimating seemingly unrelated regressions and tests for aggregation bias, J. Amer. Statist. Assoc., 1962, 57, 348–368, doi:10.1080/01621459.1962.10480664.

[2] Zellner A., Estimators for seemingly unrelated regression equations: some exact finite sample results, J. Amer. Statist. Assoc., 1963, 58, 977–992, doi:10.1080/01621459.1963.10480681.

[3] Zellner A., Huang D.S., Further properties of efficient estimators for seemingly unrelated regression equations, Internat. Econom. Rev., 1962, 3, 300–313, doi:10.2307/2525396.

[4] Srivastava V.K., Giles D.E.A., Seemingly Unrelated Regression Equations Model: Estimation and Inference, Marcel Dekker, New York, 1987.

[5] Fiebig D.G., Seemingly Unrelated Regression, Chapter 5 in Baltagi B.H. (Ed.), A Companion to Theoretical Econometrics, Blackwell, Massachusetts, 2001.

[6] Kubáček L., Seemingly unrelated regression models, Applications Math., 2013, 58, 111–123, doi:10.1007/s10492-013-0005-7.

[7] Sun Y., Ke R., Tian Y., Some overall properties of seemingly unrelated regression models, Adv. Statist. Anal., 2013, 98, 1–18, doi:10.1007/s10182-013-0212-2.

[8] Gan S., Sun Y., Tian Y., Equivalence of predictors under real and over-parameterized linear models, Commun. Statist. Theory Methods, 2017, 46, 5368–5383, doi:10.1080/03610926.2015.1100742.

[9] Tian Y., A new derivation of BLUPs under random-effects model, Metrika, 2015, 78, 905–918, doi:10.1007/s00184-015-0533-0.

[10] Tian Y., A matrix handling of predictions under a general linear random-effects model with new observations, Electron. J. Linear Algebra, 2015, 29, 30–45, doi:10.13001/1081-3810.2895.

[11] Marsaglia G., Styan G.P.H., Equalities and inequalities for ranks of matrices, Linear Multilinear Algebra, 1974, 2, 269–292, doi:10.1080/03081087408817070.

[12] Penrose R., A generalized inverse for matrices, Proc. Cambridge Phil. Soc., 1955, 51, 406–413, doi:10.1017/S0305004100030401.

[13] Baksalary J.K., Kala R., On the prediction problem in the seemingly unrelated regression equations model, Statistics, 1979, 10, 203–208, doi:10.1080/02331887908801479.

[14] Hwang H.-A., Estimation of a linear sur model with unequal numbers of observations, Rev. Econ. Statist., 1990, 72, 510–515, doi:10.2307/2109360.

[15] Teräsvirta T., A note on predicting with seemingly unrelated regression equations, Statistics, 1973, 6, 709–711, doi:10.1080/02331937508842289.

[16] Gong L., Establishing equalities of OLSEs and BLUEs under seemingly unrelated regression models, J. Stat. Theory Pract., 2019, 13:5, doi:10.1007/s42519-018-0015-6.

[17] Hou J., Jiang B., Predictions and estimations under a group of linear models with random coefficients, Commun. Statist. Simul. Comput., 2018, 47, 510–525, doi:10.1080/03610918.2017.1283704.

[18] Jiang B., Sun Y., On the equality of estimators under a general partitioned linear model with parameter restrictions, Stat. Papers, 2019, 60, 273–292, doi:10.1007/s00362-016-0837-9.

[19] Jiang B., Tian Y., Decomposition approaches of a constrained general linear model with fixed parameters, Electron. J. Linear Algebra, 2017, 32, 232–253, doi:10.13001/1081-3810.3428.

[20] Jiang B., Tian Y., On additive decompositions of estimators under a multivariate general linear model and its two submodels, J. Multivariate Anal., 2017, 162, 193–214, doi:10.1016/j.jmva.2017.09.007.

[21] Jiang B., Tian Y., On equivalence of predictors/estimators under a multivariate general linear model with augmentation, J. Korean Stat. Soc., 2017, 46, 551–561, doi:10.1016/j.jkss.2017.04.001.

[22] Jiang B., Tian Y., Zhang X., On decompositions of estimators under a general linear model with partial parameter restrictions, Open Math., 2017, 15, 1300–1322, doi:10.1515/math-2017-0109.

[23] Lu C., Sun Y., Tian Y., A comparison between two competing fixed parameter constrained general linear models with new regressors, Statistics, 2018, 52, 769–781, doi:10.1080/02331888.2018.1469021.

[24] Lu C., Sun Y., Tian Y., Two competing linear random-effects models and their connections, Stat. Papers, 2018, 59, 1101–1115, doi:10.1007/s00362-016-0806-3.

[25] Tian Y., Some equalities and inequalities for covariance matrices of estimators under linear model, Stat. Papers, 2017, 58, 467–484, doi:10.1007/s00362-015-0707-x.

[26] Tian Y., Matrix rank and inertia formulas in the analysis of general linear models, Open Math., 2017, 15, 126–150, doi:10.1515/math-2017-0013.

[27] Haslett S.J., Isotalo J., Liu Y., Puntanen S., Equalities between OLSE, BLUE and BLUP in the linear model, Statist. Papers, 2014, 55, 543–561, doi:10.1007/s00362-013-0500-7.

[28] Haslett S.J., Puntanen S., A note on the equality of the BLUPs for new observations under two linear models, Acta Comment. Univ. Tartu. Math., 2010, 14, 27–33, doi:10.12697/ACUTM.2010.14.03.

[29] Haslett S.J., Puntanen S., Equality of BLUEs or BLUPs under two linear models using stochastic restrictions, Stat. Papers, 2010, 51, 465–475, doi:10.1007/s00362-009-0219-7.

[30] Haslett S.J., Puntanen S., On the equality of the BLUPs under two linear mixed models, Metrika, 2011, 74, 381–395, doi:10.1007/s00184-010-0308-6.

[31] Tian Y., Jiang B., An algebraic study of BLUPs under two linear random-effects models with correlated covariance matrices, Linear Multilinear Algebra, 2016, 64, 2351–2367, doi:10.1080/03081087.2016.1155533.

Received: 2018-04-15
Accepted: 2019-06-17
Published Online: 2019-08-24

© 2019 Hou and Zhao, published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.

  95. Dynamic behaviors of a Lotka-Volterra type predator-prey system with Allee effect on the predator species and density dependent birth rate on the prey species
  96. Coexistence for a kind of stochastic three-species competitive models
  97. Algebraic and qualitative remarks about the family yy′ = (αxm+k–1 + βxmk–1)y + γx2m–2k–1
  98. On the two-term exponential sums and character sums of polynomials
  99. F-biharmonic maps into general Riemannian manifolds
  100. Embeddings of harmonic mixed norm spaces on smoothly bounded domains in ℝn
  101. Asymptotic behavior for non-autonomous stochastic plate equation on unbounded domains
  102. Power graphs and exchange property for resolving sets
  103. On nearly Hurewicz spaces
  104. Least eigenvalue of the connected graphs whose complements are cacti
  105. Determinants of two kinds of matrices whose elements involve sine functions
  106. A characterization of translational hulls of a strongly right type B semigroup
  107. Common fixed point results for two families of multivalued A–dominated contractive mappings on closed ball with applications
  108. Lp estimates for maximal functions along surfaces of revolution on product spaces
  109. Path-induced closure operators on graphs for defining digital Jordan surfaces
  110. Irreducible modules with highest weight vectors over modular Witt and special Lie superalgebras
  111. Existence of periodic solutions with prescribed minimal period of a 2nth-order discrete system
  112. Injective hulls of many-sorted ordered algebras
  113. Random uniform exponential attractor for stochastic non-autonomous reaction-diffusion equation with multiplicative noise in ℝ3
  114. Global properties of virus dynamics with B-cell impairment
  115. The monotonicity of ratios involving arc tangent function with applications
  116. A family of Cantorvals
  117. An asymptotic property of branching-type overloaded polling networks
  118. Almost periodic solutions of a commensalism system with Michaelis-Menten type harvesting on time scales
  119. Explicit order 3/2 Runge-Kutta method for numerical solutions of stochastic differential equations by using Itô-Taylor expansion
  120. L-fuzzy ideals and L-fuzzy subalgebras of Novikov algebras
  121. L-topological-convex spaces generated by L-convex bases
  122. An optimal fourth-order family of modified Cauchy methods for finding solutions of nonlinear equations and their dynamical behavior
  123. New error bounds for linear complementarity problems of Σ-SDD matrices and SB-matrices
  124. Hankel determinant of order three for familiar subsets of analytic functions related with sine function
  125. On some automorphic properties of Galois traces of class invariants from generalized Weber functions of level 5
  126. Results on existence for generalized nD Navier-Stokes equations
  127. Regular Banach space net and abstract-valued Orlicz space of range-varying type
  128. Some properties of pre-quasi operator ideal of type generalized Cesáro sequence space defined by weighted means
  129. On a new convergence in topological spaces
  130. On a fixed point theorem with application to functional equations
  131. Coupled system of a fractional order differential equations with weighted initial conditions
  132. Rough quotient in topological rough sets
  133. Split Hausdorff internal topologies on posets
  134. A preconditioned AOR iterative scheme for systems of linear equations with L-matrics
  135. New handy and accurate approximation for the Gaussian integrals with applications to science and engineering
  136. Special Issue on Graph Theory (GWGT 2019)
  137. The general position problem and strong resolving graphs
  138. Connected domination game played on Cartesian products
  139. On minimum algebraic connectivity of graphs whose complements are bicyclic
  140. A novel method to construct NSSD molecular graphs
Downloaded on 9.9.2025 from https://www.degruyterbrill.com/document/doi/10.1515/math-2019-0077/html