
where $Q'$ is essentially equal to $Q$ but might have some of the columns multiplied by $-1$. This is because $Q_k \to I$ and so $Q_k\Lambda \to \Lambda$. Now by Lemma 15.3.4, it follows

\[ Q^{(k)} \to Q', \qquad R^{(k)}G_k^{-1} \to I. \]

It remains to verify that $A_k$ converges to an upper triangular matrix. Recall that from 15.9 and the definition below it ($S = QR$),

\[ A = SDS^{-1} = (QR)D(QR)^{-1} = QRDR^{-1}Q^T = QTQ^T \]

where $T$ is an upper triangular matrix. This is because it is the product of upper triangular matrices $R, D, R^{-1}$. Thus $Q^TAQ = T$. If you replace $Q$ with $Q'$ in the above, it still results in an upper triangular matrix $T'$ having the same diagonal entries as $T$. This is because

\[ T = Q^TAQ = \left(Q'\Lambda\right)^T A\left(Q'\Lambda\right) = \Lambda Q'^TAQ'\Lambda \]

and considering the $ii^{th}$ entry yields

\[ \left(Q^TAQ\right)_{ii} \equiv \sum_{j,k}\Lambda_{ij}\left(Q'^TAQ'\right)_{jk}\Lambda_{ki} = \Lambda_{ii}\Lambda_{ii}\left(Q'^TAQ'\right)_{ii} = \left(Q'^TAQ'\right)_{ii}. \]
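As a quick numerical sanity check of this diagonal-entry argument (an illustration only, not part of the proof), one can conjugate an arbitrary upper triangular matrix by a diagonal matrix of $\pm 1$ entries and observe that the diagonal is unchanged; the matrices below are arbitrary choices.

```python
import numpy as np

# Conjugating by a diagonal matrix Lam of +-1 entries fixes every diagonal entry:
# (Lam M Lam)_{ii} = Lam_{ii} * M_{ii} * Lam_{ii} = M_{ii}.
Lam = np.diag([1.0, -1.0, -1.0, 1.0])        # arbitrary choice of signs
M = np.triu(np.random.randn(4, 4))           # arbitrary upper triangular matrix
print(np.allclose(np.diag(Lam @ M @ Lam), np.diag(M)))   # True
```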

Recall from Lemma 15.3.3, $A_k = Q^{(k)T}AQ^{(k)}$. Thus taking a limit and using the first part,

\[ A_k = Q^{(k)T}AQ^{(k)} \to Q'^TAQ' = T'. \;\blacksquare \]
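The theorem can be illustrated numerically. The sketch below is only an illustration under the stated hypotheses: the particular $S$ and $D$ are arbitrary choices giving eigenvalues of strictly decreasing absolute value, and the LU hypothesis on $S^{-1}$ is not checked by the code. It runs the unshifted QR iteration $A_k = Q_kR_k$, $A_{k+1} = R_kQ_k$.

```python
import numpy as np

# Build A = S D S^{-1} with eigenvalues 4 > 2 > 1, so |lambda_1| > |lambda_2| > |lambda_3| > 0.
D = np.diag([4.0, 2.0, 1.0])
S = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])        # an invertible S (det = 2), chosen arbitrarily
A = S @ D @ np.linalg.inv(S)

# Unshifted QR algorithm: factor A_k = Q_k R_k, then set A_{k+1} = R_k Q_k.
Ak = A.copy()
for _ in range(100):
    Qk, Rk = np.linalg.qr(Ak)
    Ak = Rk @ Qk

print(np.round(Ak, 6))           # approximately upper triangular
print(np.round(np.diag(Ak), 6))  # the eigenvalues 4, 2, 1 should appear on the diagonal
```

The subdiagonal entries decay roughly like powers of the ratios of consecutive eigenvalues, so a few dozen iterations already give an essentially upper triangular $A_k$.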

An easy case is for $A$ symmetric. Recall Corollary 13.1.6. By this corollary, there exists an orthogonal (real unitary) matrix $Q$ such that

\[ Q^TAQ = D \]

where $D$ is diagonal, having the eigenvalues on the main diagonal decreasing in size from the upper left corner to the lower right.

Corollary 15.3.6 Let $A$ be a real symmetric $n\times n$ matrix having eigenvalues

\[ \lambda_1 > \lambda_2 > \cdots > \lambda_n > 0 \]

and let $Q$ be defined by
\[ QDQ^T = A, \quad D = Q^TAQ, \tag{15.11} \]

where $Q$ is orthogonal and $D$ is a diagonal matrix having the eigenvalues on the main diagonal decreasing in size from the upper left corner to the lower right. Let $Q^T$ have an LU factorization. Then in the QR algorithm, the matrices $Q^{(k)}$ converge to $Q'$ where $Q'$ is the same as $Q$ except having some columns multiplied by $(-1)$. Thus the columns of $Q'$ are eigenvectors of $A$. The matrices $A_k$ converge to $D$.

Proof: This follows from Theorem 15.3.5. Here $S = Q$, $S^{-1} = Q^T$. Thus

\[ Q = S = QR \]

and $R = I$. By Theorem 15.3.5 and Lemma 15.3.3,

\[ A_k = Q^{(k)T}AQ^{(k)} \to Q'^TAQ' = Q^TAQ = D. \;\blacksquare \]
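As with the theorem, a short numerical sketch can illustrate the corollary. The symmetric matrix below is an arbitrary example whose eigenvalues are distinct and positive (about 4.73, 3, and 1.27); the LU hypothesis on $Q^T$ is not checked here. The accumulated product $Q^{(k)} = Q_1Q_2\cdots Q_k$ should approach an orthogonal matrix whose columns are eigenvectors of $A$ up to sign, and $A_k$ should approach $D$.

```python
import numpy as np

# An arbitrary symmetric matrix with distinct positive eigenvalues (about 4.73, 3, 1.27).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

Ak = A.copy()
Qacc = np.eye(3)                     # accumulates Q^{(k)} = Q_1 Q_2 ... Q_k
for _ in range(100):
    Qk, Rk = np.linalg.qr(Ak)
    Ak = Rk @ Qk
    Qacc = Qacc @ Qk

print(np.round(Ak, 6))               # approximately diagonal: A_k -> D
w, V = np.linalg.eigh(A)             # reference eigenvectors; eigenvalues in ascending order
# Columns of Qacc should match columns of V (reordered to descending) up to sign,
# so the absolute value of this product should be close to the identity matrix.
print(np.round(np.abs(Qacc.T @ V[:, ::-1]), 6))
```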
