15.3. THE QR ALGORITHM

Proof: Let $Q = (q_1, \cdots, q_n)$, $Q^{(k)} = \left(q_1^k, \cdots, q_n^k\right)$ where the $q$ are the columns. Also denote by $r_{ij}^k$ the $ij^{th}$ entry of $R_k$. Thus
$$
Q^{(k)} R_k = \left(q_1^k, \cdots, q_n^k\right)
\begin{pmatrix}
r_{11}^k & & \ast \\
 & \ddots & \\
0 & & r_{nn}^k
\end{pmatrix}
$$
It follows $r_{11}^k q_1^k \to q_1$ and so $r_{11}^k = \left| r_{11}^k q_1^k \right| \to 1$. Therefore, $q_1^k \to q_1$. Next consider the second column.
$$
r_{12}^k q_1^k + r_{22}^k q_2^k \to q_2
$$
Taking the inner product of both sides with $q_1^k$, it follows
$$
\lim_{k \to \infty} r_{12}^k = \lim_{k \to \infty} \left( q_2 \cdot q_1^k \right) = \left( q_2 \cdot q_1 \right) = 0.
$$
Therefore, $\lim_{k \to \infty} r_{22}^k q_2^k = q_2$ and since $r_{22}^k > 0$, it follows, as in the first part, that $r_{22}^k \to 1$. Hence $\lim_{k \to \infty} q_2^k = q_2$. Continuing this way, it follows $\lim_{k \to \infty} r_{ij}^k = 0$ for all $i \neq j$ and
$$
\lim_{k \to \infty} r_{jj}^k = 1, \quad \lim_{k \to \infty} q_j^k = q_j.
$$

Thus $R_k \to I$ and $Q^{(k)} \to Q$. This proves the first part of the lemma.

The second part follows immediately. If $QR = Q'R' = A$ where $A^{-1}$ exists, then $Q^* Q' = R\left(R'\right)^{-1}$ and I need to show both sides of the above are equal to $I$. The left side of the above is unitary and the right side is upper triangular having positive entries on the diagonal. This is because the inverse of such an upper triangular matrix having positive entries on the main diagonal is still upper triangular having positive entries on the main diagonal, and the product of two such upper triangular matrices gives another of the same form having positive entries on the main diagonal. Suppose then that $Q = R$ where $Q$ is unitary and $R$ is upper triangular having positive entries on the main diagonal. Let $Q_k = I$ and $R_k = R$. Then $Q_k R_k = R = Q$, a unitary matrix, and so from the first part, $R_k \to I$; but $R_k = R$ and so $R = I$. Thus applying this to $Q^* Q' = R\left(R'\right)^{-1}$ yields that both sides equal $I$. ■
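To see the uniqueness statement in action, here is a small numerical sketch (not part of the text; it assumes NumPy, and the helper names `positive_diagonal` and `gram_schmidt_qr` are illustrative). Two independently computed QR factorizations of the same invertible matrix are rescaled so that $R$ has positive diagonal entries, and the resulting factors then agree, as the lemma asserts.

```python
import numpy as np

def positive_diagonal(Q, R):
    """Rescale a QR factorization so that R has positive diagonal entries.

    If D is diagonal with entries +/-1, then (QD)(DR) = QR and DR is still
    upper triangular; choosing D = diag(sign(diag(R))) makes the diagonal
    of DR positive.
    """
    d = np.sign(np.diag(R))
    d[d == 0] = 1.0                      # invertible input should not give zeros
    return Q * d, d[:, None] * R         # column-scale Q by D, row-scale R by D

def gram_schmidt_qr(A):
    """A second, independent QR factorization via classical Gram-Schmidt."""
    n = A.shape[1]
    Q = np.zeros_like(A, dtype=float)
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]  # projection coefficient q_i . a_j
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))          # almost surely invertible

Q1, R1 = positive_diagonal(*np.linalg.qr(A))
Q2, R2 = positive_diagonal(*gram_schmidt_qr(A))

# Both factorizations have R with positive diagonal, so by the lemma they coincide.
print(np.allclose(Q1, Q2), np.allclose(R1, R2))   # True True
```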

A case of all this is of great interest. Suppose $A$ has a largest eigenvalue $\lambda$ which is real. Then $A^n$ is of the form $\left( A^{n-1} a_1, \cdots, A^{n-1} a_n \right)$ and so likely each of these columns will be pointing roughly in the direction of an eigenvector of $A$ which corresponds to this eigenvalue. Then when you do the QR factorization of this, it follows from the fact that $R$ is upper triangular that the first column of $Q$ will be a multiple of $A^{n-1} a_1$ and so will end up being roughly parallel to the eigenvector desired. Also this will require that the entries below the top in the first column of $A_n = Q^T A Q$ all be small, because they will be of the form $q_i^T A q_1 \approx \lambda q_i^T q_1 = 0$. Therefore, $A_n$ will be of the form
$$
\begin{pmatrix}
\lambda' & a \\
e & B
\end{pmatrix}
$$
where $e$ is small. It follows that $\lambda'$ will be close to $\lambda$ and $q_1$ will be close to an eigenvector for $\lambda$. Then if you like, you could do the same thing with the matrix $B$ to obtain approximations for the other eigenvalues. Finally, you could use the shifted inverse power method to get more exact solutions.
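The observation above can be checked numerically. The following sketch is an illustration under stated assumptions, not the text's algorithm: it assumes NumPy and builds a symmetric test matrix with a known, clearly dominant eigenvalue. It factors $A^n = QR$, forms $A_n = Q^T A Q$, and prints the entries below the top of the first column of $A_n$ (which come out small) together with the $(1,1)$ entry, which approximates the dominant eigenvalue.

```python
import numpy as np

# Build a symmetric matrix with a known spectrum so the dominant eigenvalue (5)
# is well separated from the rest.
rng = np.random.default_rng(1)
n_dim = 6
eigenvalues = np.array([5.0, 2.0, 1.5, 1.0, 0.5, 0.2])
Q0, _ = np.linalg.qr(rng.standard_normal((n_dim, n_dim)))   # random orthogonal matrix
A = Q0 @ np.diag(eigenvalues) @ Q0.T

# Factor A**n = QR and form A_n = Q^T A Q as described in the text.
n = 15
Q, R = np.linalg.qr(np.linalg.matrix_power(A, n))
A_n = Q.T @ A @ Q

lam_approx = A_n[0, 0]     # close to the dominant eigenvalue 5
q1 = Q[:, 0]               # close to a corresponding eigenvector (up to sign)

print("first column of A_n below the top:", A_n[1:, 0])     # small entries
print("approximate dominant eigenvalue:", lam_approx)
print("eigenvector residual |A q1 - lambda q1|:",
      np.linalg.norm(A @ q1 - lam_approx * q1))
```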
