Theorem 14.4.6 Suppose $\rho\left(B^{-1}C\right) < 1$. Then the iterates described above converge to the unique solution of $Ax = b$.
Proof: Consider the above iterates. Let $Tx = B^{-1}Cx + B^{-1}b$. Then
\[
\left| T^{k}x - T^{k}y \right| = \left| \left(B^{-1}C\right)^{k}x - \left(B^{-1}C\right)^{k}y \right| \le \left\| \left(B^{-1}C\right)^{k} \right\| \left| x - y \right|.
\]
Here $\left\| \cdot \right\|$ refers to any of the operator norms. It doesn't matter which one you pick because they are all equivalent. I am writing the proof to indicate the operator norm taken with respect to the usual norm on $E$. Since $\rho\left(B^{-1}C\right) < 1$, it follows from Gelfand's theorem, Theorem 14.2.4 on Page 392, that there exists $N$ such that if $k \ge N$, then
\[
\left\| \left(B^{-1}C\right)^{k} \right\| \le r < 1.
\]
Consequently,
\[
\left| T^{N}x - T^{N}y \right| \le r \left| x - y \right|.
\]
Also $\left| Tx - Ty \right| \le \left\| B^{-1}C \right\| \left| x - y \right|$, and so Corollary 14.4.5 applies and gives the conclusion of this theorem. $\blacksquare$
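To make the iteration in the proof concrete, here is a minimal numerical sketch (my own illustration, not from the text) of repeatedly applying the map $Tx = B^{-1}Cx + B^{-1}b$; the function name and iteration count are hypothetical choices.

```python
import numpy as np

def splitting_iterate(B, C, b, x0, iters=100):
    """Apply Tx = B^{-1}Cx + B^{-1}b repeatedly, starting from x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Solve B x_{k+1} = C x_k + b instead of forming B^{-1} explicitly.
        x = np.linalg.solve(B, C @ x + b)
    return x
```

When $\rho\left(B^{-1}C\right) < 1$, these iterates approach the solution of $Ax = b$ with $A = B - C$, as the theorem asserts.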
In the Jacobi method, you have
\[
A = \begin{pmatrix} * & \cdots & * \\ \vdots & \ddots & \vdots \\ * & \cdots & * \end{pmatrix}
\]
and you let $B$ be the diagonal matrix whose diagonal entries are those of $A$ and you let $C$ be $(-1)$ times the matrix obtained from $A$ by making the diagonal entries $0$ and retaining all the other entries of $A$. Thus
\[
B = \begin{pmatrix} * & & 0 \\ & \ddots & \\ 0 & & * \end{pmatrix}, \qquad C = -\begin{pmatrix} 0 & & * \\ & \ddots & \\ * & & 0 \end{pmatrix}.
\]
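As a sketch of this splitting (my own illustration, with a hypothetical helper name, not the author's code):

```python
import numpy as np

def jacobi_split(A):
    """Jacobi splitting A = B - C: B = diag(A), C = (-1) times A with zero diagonal."""
    B = np.diag(np.diag(A))  # keep only the diagonal of A
    C = -(A - B)             # (-1) times the off-diagonal part of A
    return B, C
```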
In the Gauss Seidel method, you let
\[
B = \begin{pmatrix} * & & 0 \\ & \ddots & \\ * & & * \end{pmatrix}, \qquad C = -\begin{pmatrix} 0 & & * \\ & \ddots & \\ 0 & & 0 \end{pmatrix}.
\]
Thus you keep the entries of $A$ which are on or below the main diagonal in order to get $B$. To get $C$, you take $-1$ times the matrix obtained from $A$ by replacing all entries below and on the main diagonal with zeros.
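Analogously, a sketch of the Gauss Seidel splitting (again a hypothetical helper, not from the text): $B$ keeps the lower triangle of $A$ including the diagonal, and $C$ negates the strict upper triangle.

```python
import numpy as np

def gauss_seidel_split(A):
    """Gauss Seidel splitting A = B - C: B = lower triangle, C = -(strict upper triangle)."""
    B = np.tril(A)           # entries of A on or below the main diagonal
    C = -np.triu(A, k=1)     # (-1) times the entries strictly above the diagonal
    return B, C
```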
Observation 14.4.7 Note that if $A$ is diagonally dominant, meaning
\[
\left| a_{ii} \right| > \sum_{j \ne i} \left| a_{ij} \right|,
\]
then in both cases above, $\rho\left(B^{-1}C\right) < 1$, so the two iterative procedures will converge.
To see this, suppose $B^{-1}Cx = \lambda x$ with $|\lambda| \ge 1$. Then you get $(\lambda B - C)x = 0$. However, in either the case of Jacobi iteration or Gauss Seidel iteration, the matrix $\lambda B - C$ will be diagonally dominant. (For example, in the Jacobi case the $i$th diagonal entry of $\lambda B - C$ is $\lambda a_{ii}$ while the off-diagonal entries are $a_{ij}$, and $\left| \lambda a_{ii} \right| \ge \left| a_{ii} \right| > \sum_{j \ne i} \left| a_{ij} \right|$.) By Gerschgorin's theorem, $\lambda B - C$ therefore has no zero eigenvalues, which requires that this matrix be one to one. Thus there are no eigenvectors for such $\lambda$ and hence $\rho\left(B^{-1}C\right) < 1$.
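One can check this observation numerically; the following sketch (my own, using a hypothetical $3 \times 3$ diagonally dominant matrix) computes $\rho\left(B^{-1}C\right)$ for both splittings and finds each below $1$.

```python
import numpy as np

def spectral_radius(M):
    """Largest absolute value among the eigenvalues of M."""
    return np.max(np.abs(np.linalg.eigvals(M)))

# A hypothetical diagonally dominant matrix: |a_ii| > sum over j != i of |a_ij|.
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])

B_jac = np.diag(np.diag(A)); C_jac = -(A - B_jac)      # Jacobi splitting
B_gs  = np.tril(A);          C_gs  = -np.triu(A, k=1)  # Gauss Seidel splitting

print(spectral_radius(np.linalg.solve(B_jac, C_jac)))  # prints a value < 1
print(spectral_radius(np.linalg.solve(B_gs, C_gs)))    # prints a value < 1
```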