
Note that the symmetry of $B,C$ implies $P,Q$ are symmetric also. It is given that $X^{T}BX$ is $\chi^{2}\left(r_{1}\right)$ which happens if and only if $B^{2}=B$ thanks to Theorem 39.4.5. In other words, $B$ has eigenvalues either 0 or 1. It follows that $P^{2}=P$.
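To see the eigenvalue claim concretely, here is a minimal NumPy sketch, with an arbitrary illustrative dimension and rank: it constructs a symmetric idempotent matrix and checks that its eigenvalues are all 0 or 1.

import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric idempotent matrix B of rank r1 (an orthogonal projection).
n, r1 = 6, 2
V = rng.standard_normal((n, r1))
B = V @ np.linalg.solve(V.T @ V, V.T)        # projection onto the column space of V

assert np.allclose(B, B.T)                   # B is symmetric
assert np.allclose(B @ B, B)                 # B^2 = B
print(np.round(np.linalg.eigvalsh(B), 10))   # r1 ones and n - r1 zeros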

Now multiply on the left in 39.8 by $\begin{pmatrix} P & 0 \\ 0 & 0 \end{pmatrix}$:
$$\begin{pmatrix} P & 0 \\ 0 & 0 \end{pmatrix}=\begin{pmatrix} P^{2} & 0 \\ 0 & 0 \end{pmatrix}+\begin{pmatrix} PQ & 0 \\ 0 & 0 \end{pmatrix}=\begin{pmatrix} P & 0 \\ 0 & 0 \end{pmatrix}+\begin{pmatrix} PQ & 0 \\ 0 & 0 \end{pmatrix}$$
Thus, comparing the ends, $PQ=QP=0$. It follows that $BC=0$. By Corollary 39.4.3, $X^{T}BX,X^{T}CX$ are independent. It follows from this independence

$$\begin{aligned}
E\left(\exp\left(tX^{T}AX\right)\right)&=E\left(\exp\left(tX^{T}BX+tX^{T}CX\right)\right)\\
&=E\left(\exp\left(tX^{T}BX\right)\exp\left(tX^{T}CX\right)\right)\\
&=E\left(\exp\left(tX^{T}BX\right)\right)E\left(\exp\left(tX^{T}CX\right)\right)
\end{aligned}$$
and so, by assumption,

$$\frac{1}{\left(1-2t\right)^{r/2}}=\frac{1}{\left(1-2t\right)^{r_{1}/2}}E\left(\exp\left(tX^{T}CX\right)\right)$$
showing that
$$E\left(\exp\left(tX^{T}CX\right)\right)=\frac{1}{\left(1-2t\right)^{\left(r-r_{1}\right)/2}}$$

which implies $X^{T}CX$ is $\chi^{2}\left(r-r_{1}\right)$. ■

Something should be pointed out here. It is that $x^{T}Cx\geq 0$ and this follows from linear algebra considerations. From the above argument,

$$\begin{pmatrix} I & 0 \\ 0 & 0 \end{pmatrix}=\begin{pmatrix} P & 0 \\ 0 & 0 \end{pmatrix}+\begin{pmatrix} Q & 0 \\ 0 & 0 \end{pmatrix},\quad P^{2}=P$$

Thus $P$ has eigenvalues 0 or 1. Then also, $0\leq\left(\left(I-P\right)^{2}x,x\right)=\left(\left(I-2P+P^{2}\right)x,x\right)=\left(\left(I-P\right)x,x\right)$ and so $I-P=Q$ has all nonnegative eigenvalues. Hence

$$x^{T}Cx=x^{T}U\begin{pmatrix} Q & 0 \\ 0 & 0 \end{pmatrix}U^{T}x\geq 0$$

You can extend this to more than two quadratic forms on the right.
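The conclusion of the above theorem, and the nonnegativity of $x^{T}Cx$, can also be illustrated numerically. The following sketch assumes $\sigma=1$ and uses NumPy/SciPy with arbitrary choices of $n$, $r$, $r_{1}$ and sample size; it builds $A$, $B$, $C=A-B$ as orthogonal projections with $BC=0$ and checks that $X^{T}CX$ is consistent with $\chi^{2}\left(r-r_{1}\right)$.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# A projects onto an r-dimensional subspace, B onto an r1-dimensional
# subspace of it, and C = A - B, so that BC = 0 and C is again a projection.
n, r, r1 = 8, 5, 2
Q, _ = np.linalg.qr(rng.standard_normal((n, r)))   # orthonormal columns
A = Q @ Q.T
B = Q[:, :r1] @ Q[:, :r1].T
C = A - B

reps = 100_000
X = rng.standard_normal((reps, n))          # rows are samples of X ~ n(0, I)
qB = np.einsum("ij,jk,ik->i", X, B, X)      # X^T B X for each sample
qC = np.einsum("ij,jk,ik->i", X, C, X)      # X^T C X for each sample

print(qC.min() >= 0)                        # x^T C x >= 0 always
print(np.corrcoef(qB, qC)[0, 1])            # near 0, consistent with independence
print(stats.kstest(qC, "chi2", args=(r - r1,)).pvalue)  # consistent with chi2(r - r1)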

Corollary 39.4.7 Let $A,\left\{A_{k}\right\}_{k=1}^{m}$ be real symmetric $n\times n$ matrices. Let
$$X=\begin{pmatrix} X_{1} & \cdots & X_{n} \end{pmatrix}^{T}$$
where $\left\{X_{1},\cdots,X_{n}\right\}$ is a random sample from $n\left(0,\sigma^{2}\right)$. Let
$$x^{T}Ax=\sum_{k=1}^{m}x^{T}A_{k}x \qquad (39.9)$$
and suppose $X^{T}AX$ is $\chi^{2}\left(r\right)$ and $X^{T}A_{k}X$ is $\chi^{2}\left(r_{k}\right)$ for $k\leq m-1$, where $\sum_{k=1}^{m-1}r_{k}<r$. Then the random variables $\left\{X^{T}A_{k}X\right\}_{k=1}^{m-1}$ on the right are independent and $X^{T}A_{m}X$ is $\chi^{2}\left(r-\sum_{k=1}^{m-1}r_{k}\right)$.
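Corollary 39.4.7 can be checked in the same way. The sketch below, again assuming $\sigma=1$ and arbitrary illustrative ranks, splits a rank-$r$ projection $A$ into $m=3$ pieces with mutually orthogonal ranges and compares $X^{T}A_{3}X$ with $\chi^{2}\left(r-r_{1}-r_{2}\right)$.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Split a rank-r projection A into A1 + A2 + A3 with mutually orthogonal ranges.
n, r, r1, r2 = 10, 7, 2, 3                  # A3 then has rank r - r1 - r2 = 2
Q, _ = np.linalg.qr(rng.standard_normal((n, r)))
A1 = Q[:, :r1] @ Q[:, :r1].T
A2 = Q[:, r1:r1 + r2] @ Q[:, r1:r1 + r2].T
A3 = Q[:, r1 + r2:] @ Q[:, r1 + r2:].T
A = A1 + A2 + A3                            # X^T A X is chi2(r) when sigma = 1

reps = 100_000
X = rng.standard_normal((reps, n))
q3 = np.einsum("ij,jk,ik->i", X, A3, X)     # X^T A3 X for each sample

# Kolmogorov-Smirnov comparison with chi2(r - r1 - r2); a large p-value is expected.
print(stats.kstest(q3, "chi2", args=(r - r1 - r2,)).pvalue)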
