and so
\[
\hat{\alpha}=\frac{1}{n}\sum_{k}X_{k}\equiv \bar{X}
\]
Then also
\[
\sum_{k=1}^{n}X_{k}\left(t_{k}-\bar{t}\right)-\beta \sum_{k}\left(t_{k}-\bar{t}\right)^{2}=0
\]
and so
\[
\hat{\beta}=\frac{\sum_{k=1}^{n}X_{k}\left(t_{k}-\bar{t}\right)}{\sum_{k}\left(t_{k}-\bar{t}\right)^{2}}
=\frac{\sum_{k=1}^{n}\left(X_{k}-\bar{X}\right)\left(t_{k}-\bar{t}\right)}{\sum_{k}\left(t_{k}-\bar{t}\right)^{2}}
\]
because $\sum_{k=1}^{n}\bar{X}\left(t_{k}-\bar{t}\right)=0$.
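As a numerical illustration of these formulas (not part of the derivation), the following sketch simulates data from the model with made-up values of $n$, $\alpha$, $\beta$, $\sigma$ and of the times $t_{k}$, computes $\hat{\alpha}$ and both forms of $\hat{\beta}$, and cross-checks them against an ordinary least squares fit. The variable names are my own.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n, alpha, beta, sigma = 50, 2.0, 1.5, 0.7      # illustrative values
t = np.linspace(0.0, 10.0, n)
t_bar = t.mean()
X = alpha + beta * (t - t_bar) + rng.normal(0.0, sigma, n)

# Maximum likelihood estimates derived above
alpha_hat = X.mean()                           # alpha_hat = X_bar
beta_hat_1 = np.sum(X * (t - t_bar)) / np.sum((t - t_bar) ** 2)
beta_hat_2 = np.sum((X - X.mean()) * (t - t_bar)) / np.sum((t - t_bar) ** 2)
# beta_hat_1 equals beta_hat_2 because the sum of X_bar*(t_k - t_bar) is 0

# Independent cross-check: least squares on the columns [1, t - t_bar]
A = np.column_stack([np.ones(n), t - t_bar])
coef, *_ = np.linalg.lstsq(A, X, rcond=None)
print(alpha_hat, beta_hat_1, beta_hat_2)       # the entries of coef match these
print(coef)
\end{verbatim}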
It remains to find the maximum likelihood estimate for $\sigma^{2}$. Using 39.10,
\[
\frac{n}{\sigma}-\frac{\sum_{k=1}^{n}\left(X_{k}-\left(\hat{\alpha}+\hat{\beta}\left(t_{k}-\bar{t}\right)\right)\right)^{2}}{\sigma^{3}}=0
\]
and so
\[
\hat{\sigma}^{2}=\frac{1}{n}\sum_{k=1}^{n}\left(X_{k}-\left(\hat{\alpha}+\hat{\beta}\left(t_{k}-\bar{t}\right)\right)\right)^{2}
\]
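Continuing in the same illustrative spirit, here is a short self-contained sketch of $\hat{\sigma}^{2}$: it is the average of the squared residuals, dividing by $n$ rather than by $n-2$ as the usual unbiased estimator for simple linear regression would. The simulated values below are arbitrary.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
n, alpha, beta, sigma = 200, 2.0, 1.5, 0.7     # illustrative values
t = np.linspace(0.0, 10.0, n)
t_bar = t.mean()
X = alpha + beta * (t - t_bar) + rng.normal(0.0, sigma, n)

alpha_hat = X.mean()
beta_hat = np.sum((X - X.mean()) * (t - t_bar)) / np.sum((t - t_bar) ** 2)

residuals = X - (alpha_hat + beta_hat * (t - t_bar))
sigma2_mle = np.mean(residuals ** 2)           # the 1/n estimator derived here
sigma2_unbiased = np.sum(residuals ** 2) / (n - 2)
print(sigma2_mle, sigma2_unbiased, sigma ** 2) # both near the true 0.49
\end{verbatim}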
Now consider
\[
\frac{\sum_{k=1}^{n}\left(X_{k}-\left(\alpha+\beta\left(t_{k}-\bar{t}\right)\right)\right)^{2}}{\sigma^{2}} \tag{39.11}
\]
I will add and subtract $\hat{\alpha}+\hat{\beta}\left(t_{k}-\bar{t}\right)$ and then write this as a sum of quadratic forms. First of all, note that it is the sum of the squares of independent random variables, each of which is $n\left(0,1\right)$, and so it is $\chi^{2}\left(n\right)$. It equals
\[
\frac{\sum_{k=1}^{n}\left(X_{k}-\left(\hat{\alpha}+\hat{\beta}\left(t_{k}-\bar{t}\right)\right)+\left(\left(\hat{\alpha}+\hat{\beta}\left(t_{k}-\bar{t}\right)\right)-\left(\alpha+\beta\left(t_{k}-\bar{t}\right)\right)\right)\right)^{2}}{\sigma^{2}} \tag{39.12}
\]
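Before expanding, a quick Monte Carlo sketch can confirm the $\chi^{2}\left(n\right)$ claim about 39.11: under the model assumptions, and with arbitrary illustrative parameter values of my own choosing, its sample mean and variance should be close to $n$ and $2n$, the mean and variance of a $\chi^{2}\left(n\right)$ random variable.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(2)
n, alpha, beta, sigma = 30, 2.0, 1.5, 0.7      # illustrative values
t = np.linspace(0.0, 10.0, n)
t_bar = t.mean()
mean_vec = alpha + beta * (t - t_bar)

reps = 20000
stats = np.empty(reps)
for i in range(reps):
    X = mean_vec + rng.normal(0.0, sigma, n)
    stats[i] = np.sum((X - mean_vec) ** 2) / sigma ** 2   # the quantity in 39.11

print(stats.mean(), n)        # about 30
print(stats.var(), 2 * n)     # about 60
\end{verbatim}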
Expression 39.12 will be expanded. I need to consider the mixed term, in which I will use the above descriptions of $\hat{\alpha}$ and $\hat{\beta}$.
\begin{align*}
&\sum_{k}\left(X_{k}-\left(\hat{\alpha}+\hat{\beta}\left(t_{k}-\bar{t}\right)\right)\right)\left(\left(\hat{\alpha}+\hat{\beta}\left(t_{k}-\bar{t}\right)\right)-\left(\alpha+\beta\left(t_{k}-\bar{t}\right)\right)\right)\\
&=\sum_{k}\left[\left(X_{k}-\bar{X}\right)-\hat{\beta}\left(t_{k}-\bar{t}\right)\right]\left[\left(\bar{X}+\hat{\beta}\left(t_{k}-\bar{t}\right)\right)-\left(\alpha+\beta\left(t_{k}-\bar{t}\right)\right)\right]
\end{align*}
First note that
\[
\sum_{k}\left(X_{k}-\bar{X}\right)\bar{X}=\sum_{k}\left(X_{k}-\bar{X}\right)\alpha=\sum_{k}\hat{\beta}\left(t_{k}-\bar{t}\right)\bar{X}=\sum_{k}\hat{\beta}\left(t_{k}-\bar{t}\right)\alpha=0
\]
Thus the mixed term is
\begin{align*}
&\sum_{k}\left(X_{k}-\bar{X}\right)\hat{\beta}\left(t_{k}-\bar{t}\right)-\beta\sum_{k}\left(X_{k}-\bar{X}\right)\left(t_{k}-\bar{t}\right)-\hat{\beta}^{2}\sum_{k}\left(t_{k}-\bar{t}\right)^{2}+\hat{\beta}\beta\sum_{k}\left(t_{k}-\bar{t}\right)^{2}\\
&=\left(\hat{\beta}-\beta\right)\sum_{k}\left(X_{k}-\bar{X}\right)\left(t_{k}-\bar{t}\right)+\hat{\beta}\left(\beta-\hat{\beta}\right)\sum_{k}\left(t_{k}-\bar{t}\right)^{2}\\
&=\left(\hat{\beta}-\beta\right)\hat{\beta}\sum_{j}\left(t_{j}-\bar{t}\right)^{2}+\hat{\beta}\left(\beta-\hat{\beta}\right)\sum_{k}\left(t_{k}-\bar{t}\right)^{2}=0
\end{align*}
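The vanishing of this mixed term is purely algebraic, so it can be checked numerically with arbitrary data and an arbitrary choice of $\alpha,\beta$. The sketch below (illustrative only, with names of my own choosing) does exactly that: the cross product of the two brackets in 39.12, summed over $k$, comes out as zero up to rounding error.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(3)
n = 40
t = rng.uniform(0.0, 10.0, n)
t_bar = t.mean()
X = rng.normal(0.0, 1.0, n)        # arbitrary data: the identity is algebraic
alpha, beta = rng.normal(size=2)   # arbitrary "true" parameters

alpha_hat = X.mean()
beta_hat = np.sum((X - X.mean()) * (t - t_bar)) / np.sum((t - t_bar) ** 2)

fitted = alpha_hat + beta_hat * (t - t_bar)
mean_true = alpha + beta * (t - t_bar)
cross = np.sum((X - fitted) * (fitted - mean_true))
print(cross)                       # zero up to rounding error
\end{verbatim}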