822 CHAPTER 39. STATISTICAL TESTS
$(A^2 - A)v_k = 0 - 0 = 0$. Thus $A^2 - A = 0$ because this matrix sends every vector in a basis to $0$.

Conversely, suppose $A^2 = A$. Why are all eigenvalues $1$ or $0$? Say $Av = \lambda v$ for $v$ an eigenvector and suppose $\lambda \neq 0$, so that $Av = \lambda v \neq 0$. Then $\lambda Av = A^2 v = Av$, and so $(1-\lambda)Av = 0$. If $\lambda \neq 1$, then $Av = 0$, which is assumed not to be so. Hence $\lambda = 1$. Thus all eigenvalues are either $0$ or $1$. $\blacksquare$
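The eigenvalue characterization just proved is easy to probe numerically. The following is a minimal sketch (the matrix and helper names are my own, not from the text): it builds a symmetric idempotent matrix, the projection of $\mathbb{R}^2$ onto the line spanned by $(1,1)$, confirms $A^2 = A$, and recovers its eigenvalues from the characteristic polynomial.

```python
import math

# Projection A = v v^T with v = (1, 1)/sqrt(2), so A is symmetric and A^2 = A.
A = [[0.5, 0.5],
     [0.5, 0.5]]

def matmul(B, C):
    """Multiply two 2x2 matrices."""
    return [[sum(B[i][k] * C[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Check idempotence: A^2 = A.
A2 = matmul(A, A)
assert all(abs(A2[i][j] - A[i][j]) < 1e-12 for i in range(2) for j in range(2))

# Eigenvalues of a 2x2 matrix solve lambda^2 - tr(A) lambda + det(A) = 0.
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(tr * tr - 4 * det)
eigs = sorted([(tr - disc) / 2, (tr + disc) / 2])
print(eigs)  # the eigenvalues of an idempotent matrix: [0.0, 1.0]
```

As the proposition predicts, the only eigenvalues are $0$ and $1$; their multiplicities record the rank of the projection.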
This proves the following interesting theorem.
Theorem 39.4.5 Let $X_1, \cdots, X_n$ be independent and $n(0, \sigma^2)$. Let $A$ be a real symmetric matrix. Then for $X = \begin{pmatrix} X_1 & \cdots & X_n \end{pmatrix}^T$, $\frac{X^T A X}{\sigma^2}$ is $\chi^2(r)$ for some $r \leq n$ if and only if $A^2 = A$, if and only if the eigenvalues of $A$ are $0$ or $1$. In fact, $r$ is the rank of $A$.
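The theorem can be sanity-checked by simulation. The sketch below (sample size, seed, and variable names are my own choices) takes $A = vv^T$ for a unit vector $v$, a symmetric idempotent matrix of rank $1$, and estimates the mean of $X^T A X / \sigma^2$, which should be about $1$, the mean of a $\chi^2(1)$ variable.

```python
import random

random.seed(0)
sigma = 2.0
n = 3
v = [1 / 3 ** 0.5] * n          # unit vector in R^3
# A = v v^T is symmetric with A^2 = A and rank(A) = 1, so by the theorem
# X^T A X / sigma^2 should be chi-squared with 1 degree of freedom.

trials = 20000
total = 0.0
for _ in range(trials):
    X = [random.gauss(0.0, sigma) for _ in range(n)]
    vx = sum(vi * xi for vi, xi in zip(v, X))   # v . X, so X^T A X = (v . X)^2
    total += vx * vx / sigma ** 2

mean = total / trials
print(mean)  # should be close to 1, the mean of chi-squared(1)
```

Repeating the experiment with a projection of rank $r$ would give a sample mean near $r$, in line with the last sentence of the theorem.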
At this point, it might be good to recall that the distribution of
$$\frac{nS^2}{\sigma^2} \equiv \sum_{k=1}^{n} \frac{(X_k - \bar{X})^2}{\sigma^2}$$
is $\chi^2(n-1)$ where
$$\bar{X} = \frac{1}{n} \sum_{k=1}^{n} X_k$$
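This fact, too, is easy to probe by simulation. The following sketch (the parameter values and seed are arbitrary choices of mine) estimates the mean of $nS^2/\sigma^2$, which should be close to $n-1$, the mean of a $\chi^2(n-1)$ variable.

```python
import random

random.seed(1)
mu, sigma, n = 5.0, 2.0, 6
trials = 20000
total = 0.0
for _ in range(trials):
    X = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(X) / n
    nS2 = sum((xk - xbar) ** 2 for xk in X)   # n S^2 = sum of (X_k - Xbar)^2
    total += nS2 / sigma ** 2

mean = total / trials
print(mean)  # should be close to n - 1 = 5, the mean of chi-squared(n-1)
```

Note that one degree of freedom is lost relative to $\sum_k (X_k - \mu)^2/\sigma^2$, which is $\chi^2(n)$; the derivation that follows accounts for exactly that missing degree of freedom.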
In showing this, first there was some algebra.
$$\overbrace{\sum_{k=1}^{n} \frac{(X_k - \mu)^2}{\sigma^2}}^{\chi^2(n)} = \sum_{k=1}^{n} \frac{\left((X_k - \bar{X}) + (\bar{X} - \mu)\right)^2}{\sigma^2}$$
After some simple manipulations,
$$= \sum_{k=1}^{n} \frac{(X_k - \bar{X})^2}{\sigma^2} + \sum_{k=1}^{n} \frac{(\bar{X} - \mu)^2}{\sigma^2} = \frac{nS^2}{\sigma^2} + \sum_{k=1}^{n} \frac{\left(\frac{1}{n}\sum_{j=1}^{n}(X_j - \mu)\right)^2}{\sigma^2}$$
$$= \frac{nS^2}{\sigma^2} + \sum_{k=1}^{n}\left(\frac{1}{n}\sum_{j=1}^{n}\frac{X_j - \mu}{\sigma}\right)^2 = \overbrace{\left(\sum_{j=1}^{n}\frac{X_j - \mu}{\sqrt{n}\,\sigma}\right)^2}^{\chi^2(1)} + \frac{nS^2}{\sigma^2}$$
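The decomposition above is a pointwise algebraic identity, so it can be verified on any fixed list of numbers, no randomness needed. A minimal sketch (the sample values and parameters are arbitrary):

```python
mu, sigma = 1.0, 2.0
x = [0.3, 1.7, 2.2, -0.5, 1.1]          # arbitrary sample values
n = len(x)
xbar = sum(x) / n

# Left side: sum of (x_k - mu)^2 / sigma^2.
lhs = sum((xk - mu) ** 2 for xk in x) / sigma ** 2

# Right side: (sum of (x_j - mu)/(sqrt(n) sigma))^2 + n S^2 / sigma^2.
nS2 = sum((xk - xbar) ** 2 for xk in x)
z = sum((xj - mu) / (n ** 0.5 * sigma) for xj in x)
rhs = z * z + nS2 / sigma ** 2

print(abs(lhs - rhs))  # essentially zero: the identity holds exactly
```

The two sides agree up to floating-point rounding, confirming that the algebra splits the $\chi^2(n)$ sum into the $nS^2/\sigma^2$ term and a single squared standardized mean.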
Then it was proved that the two random variables at the end are independent. This was done by using the special form of the normal distribution. Then from this, by looking at the moment generating functions, we obtained
$$\left(\frac{1}{1-2t}\right)^{n/2} = E\left(\exp\left(\frac{tnS^2}{\sigma^2}\right)\right) E\left(\exp\left(t\left(\sum_{j=1}^{n}\frac{X_j - \mu}{\sqrt{n}\,\sigma}\right)^2\right)\right) = E\left(\exp\left(\frac{tnS^2}{\sigma^2}\right)\right)\frac{1}{(1-2t)^{1/2}}$$
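The step that finishes the argument can be made explicit (a one-line completion, using the fact that the moment generating function determines the distribution): dividing both sides by $(1-2t)^{-1/2}$ gives

```latex
E\left(\exp\left(\frac{tnS^2}{\sigma^2}\right)\right)
  = \frac{(1-2t)^{-n/2}}{(1-2t)^{-1/2}}
  = \left(\frac{1}{1-2t}\right)^{(n-1)/2},
```

which is the moment generating function of $\chi^2(n-1)$, so $nS^2/\sigma^2$ is $\chi^2(n-1)$ as claimed.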