Now this is similar to $A$ and the eigenvalues are close to the eigenvalues obtained from the two blocks on the diagonal,
$$
\begin{pmatrix}
-.83412 & -4.1682 \\
1.05 & .14514
\end{pmatrix},\quad
\begin{pmatrix}
-.85029 & -.61608 \\
-1.8263\times 10^{-2} & .53939
\end{pmatrix}
$$
since $4.0264\times 10^{-4}$ is small. After routine computations involving the quadratic formula, these are seen to be
$$
-.85834,\quad .54744,\quad -.34449-2.0339i,\quad -.34449+2.0339i
$$
When these are plugged into the polynomial equation, you see that each is close to being a solution of the equation.
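As a quick numerical check, not part of the original computation, one can ask a numerical package for the eigenvalues of the two diagonal blocks; the following sketch, assuming NumPy is available, reproduces the four approximate values above.

```python
import numpy as np

# The two diagonal blocks of the nearly block upper triangular matrix above.
B1 = np.array([[-0.83412, -4.1682],
               [ 1.05,     0.14514]])
B2 = np.array([[-0.85029,   -0.61608],
               [-1.8263e-2,  0.53939]])

# Since the entry coupling the blocks (about 4.0264e-4) is small, the
# eigenvalues of these blocks are close to the eigenvalues of the matrix.
print(np.linalg.eigvals(B1))  # roughly -0.34449 + 2.0339j, -0.34449 - 2.0339j
print(np.linalg.eigvals(B2))  # roughly -0.85834 and 0.54744
```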
15.3.4 Upper Hessenberg Matrices

It seems like most of the attention to the QR algorithm has to do with finding ways to get it to “converge” faster. Great and marvelous are the clever tricks which have been proposed to do this but my intent is to present the basic ideas, not to go into the numerous refinements of this algorithm. However, there is one thing which should be done. It involves reducing to the case of an upper Hessenberg matrix, which is one which is zero below the main subdiagonal. The following shows that any square matrix is unitarily similar to such an upper Hessenberg matrix.
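For example, in the $4\times 4$ case an upper Hessenberg matrix has the pattern
$$
\begin{pmatrix}
\ast & \ast & \ast & \ast \\
\ast & \ast & \ast & \ast \\
0 & \ast & \ast & \ast \\
0 & 0 & \ast & \ast
\end{pmatrix}
$$
with every entry below the first subdiagonal equal to zero.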
Let $A$ be an invertible $n\times n$ matrix. Let $Q_1'$ be a unitary matrix
$$
Q_1'\begin{pmatrix}
a_{21} \\
\vdots \\
a_{n1}
\end{pmatrix}
=\begin{pmatrix}
\sqrt{\sum_{j=2}^{n}\left\vert a_{j1}\right\vert ^{2}} \\
0 \\
\vdots \\
0
\end{pmatrix}
\equiv\begin{pmatrix}
a \\
0 \\
\vdots \\
0
\end{pmatrix}
$$
The vector which $Q_1'$ is multiplying is just the bottom $n-1$ entries of the first column of $A$. Then let $Q_1$ be
$$
\begin{pmatrix}
1 & 0 \\
0 & Q_1'
\end{pmatrix}
$$
It follows
$$
Q_1AQ_1^{\ast}=\begin{pmatrix}
1 & 0 \\
0 & Q_1'
\end{pmatrix}AQ_1^{\ast}=\begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a & & & \\
\vdots & & A_1' & \\
0 & & &
\end{pmatrix}\begin{pmatrix}
1 & 0 \\
0 & Q_1'^{\ast}
\end{pmatrix}
$$
$$
=\begin{pmatrix}
\ast & \ast & \cdots & \ast \\
a & & & \\
\vdots & & A_1 & \\
0 & & &
\end{pmatrix}
$$
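Repeating this step on the smaller matrix $A_1$, and then on the matrix it produces, eventually yields an upper Hessenberg matrix unitarily similar to $A$. As an illustration only, not the notation used above, here is a sketch of the whole reduction for a real matrix in Python with NumPy, realizing each $Q_1'$ as a Householder reflection; the sign of the subdiagonal entry it produces may differ from the $a=\sqrt{\sum_{j}\left\vert a_{j1}\right\vert^{2}}$ above because the reflection is chosen for numerical stability.

```python
import numpy as np

def to_upper_hessenberg(A):
    """Return H = Q A Q^T (Q real orthogonal) with H upper Hessenberg,
    i.e. zero below the first subdiagonal.  Each pass plays the role of
    the unitary Q_1' above, realized as a Householder reflection."""
    H = np.array(A, dtype=float)
    n = H.shape[0]
    for k in range(n - 2):
        # x is the part of column k below the diagonal; for k = 0 this is
        # the vector (a_21, ..., a_n1)^T of the construction above.
        x = H[k + 1:, k].copy()
        norm_x = np.linalg.norm(x)
        if norm_x == 0.0:
            continue  # this column is already zero below the subdiagonal
        # Householder vector v, so (I - 2 v v^T) x = -+ ||x|| e_1.
        v = x.copy()
        v[0] += np.copysign(norm_x, x[0])
        v /= np.linalg.norm(v)
        # Similarity transform: apply the reflection on the left and right.
        H[k + 1:, k:] -= 2.0 * np.outer(v, v @ H[k + 1:, k:])
        H[:, k + 1:] -= 2.0 * np.outer(H[:, k + 1:] @ v, v)
        H[k + 2:, k] = 0.0  # these entries are now zero up to roundoff
    return H

# The eigenvalues are unchanged because H is unitarily similar to A.
A = np.random.rand(5, 5)
H = to_upper_hessenberg(A)
print(np.allclose(np.tril(H, -2), 0))          # True: Hessenberg shape
print(np.sort_complex(np.linalg.eigvals(A)))
print(np.sort_complex(np.linalg.eigvals(H)))   # same spectrum, up to roundoff
```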