
Now this is similar to $A$ and the eigenvalues are close to the eigenvalues obtained from the two blocks on the diagonal,
\[
\left(
\begin{array}{cc}
-.83412 & -4.1682 \\
1.05 & .14514
\end{array}
\right),\qquad\left(
\begin{array}{cc}
-.85029 & -.61608 \\
-1.8263\times 10^{-2} & .53939
\end{array}
\right)
\]
since $4.0264\times 10^{-4}$ is small. After routine computations involving the quadratic formula, these are seen to be

\[
-.85834,\quad .54744,\quad -.34449-2.0339i,\quad -.34449+2.0339i
\]

When these are plugged into the polynomial equation, you see that each is close to being a solution of the equation.
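For example, here is where the last two of these numbers come from. The eigenvalues of a $2\times 2$ matrix $\left(\begin{array}{cc} a & b \\ c & d \end{array}\right)$ are $\frac{a+d}{2}\pm\sqrt{\left(\frac{a-d}{2}\right)^{2}+bc}$, and so for the first block,
\[
\lambda=\frac{-.83412+.14514}{2}\pm\sqrt{\left(\frac{-.83412-.14514}{2}\right)^{2}+\left(-4.1682\right)\left(1.05\right)}=-.34449\pm 2.0339i.
\]
The second block yields $-.85834$ and $.54744$ in the same way.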

15.3.4 Upper Hessenberg Matrices

It seems like most of the attention to the QR algorithm has to do with finding ways to get it to “converge” faster. Great and marvelous are the clever tricks which have been proposed to do this but my intent is to present the basic ideas, not to go into the numerous refinements of this algorithm. However, there is one thing which should be done. It involves reducing to the case of an upper Hessenberg matrix, which is one which is zero below the main subdiagonal. The following shows that any square matrix is unitarily similar to such an upper Hessenberg matrix.
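As a quick numerical illustration of what this reduction produces (a sketch only, assuming NumPy and SciPy are available; it plays no role in the argument that follows), scipy.linalg.hessenberg returns an upper Hessenberg matrix $H$ together with a unitary $Q$ satisfying $A=QHQ^{\ast}$:

import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))          # any square matrix will do

H, Q = hessenberg(A, calc_q=True)        # A = Q H Q^*, Q unitary

print(np.allclose(np.tril(H, -2), 0))          # zero below the first subdiagonal
print(np.allclose(Q @ H @ Q.conj().T, A))      # unitarily similar to A
print(np.allclose(Q.conj().T @ Q, np.eye(6)))  # Q is unitary

Each of the three checks should print True.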

Let $A$ be an invertible $n\times n$ matrix. Let $Q_{1}'$ be a unitary matrix
\[
Q_{1}'\left(
\begin{array}{c}
a_{21} \\
\vdots \\
a_{n1}
\end{array}
\right)=\left(
\begin{array}{c}
\sqrt{\sum_{j=2}^{n}\left|a_{j1}\right|^{2}} \\
0 \\
\vdots \\
0
\end{array}
\right)\equiv\left(
\begin{array}{c}
a \\
0 \\
\vdots \\
0
\end{array}
\right)
\]

The vector $Q_{1}'$ is multiplying is just the bottom $n-1$ entries of the first column of $A$. Then let $Q_{1}$ be
\[
Q_{1}=\left(
\begin{array}{cc}
1 & 0 \\
0 & Q_{1}'
\end{array}
\right)
\]
It follows

\[
Q_{1}AQ_{1}^{\ast}=\left(
\begin{array}{cc}
1 & 0 \\
0 & Q_{1}'
\end{array}
\right)AQ_{1}^{\ast}=\left(
\begin{array}{cccc}
a_{11} & a_{12} & \cdots & a_{1n} \\
a &  &  &  \\
\vdots &  & A_{1}' &  \\
0 &  &  &
\end{array}
\right)\left(
\begin{array}{cc}
1 & 0 \\
0 & Q_{1}'^{\ast}
\end{array}
\right)
\]
\[
=\left(
\begin{array}{cccc}
\ast & \ast & \cdots & \ast \\
a &  &  &  \\
\vdots &  & A_{1} &  \\
0 &  &  &
\end{array}
\right)
\]

