Chapter 7
Canonical Forms

Linear algebra is really about linear transformations, and the fundamental question is whether a given matrix comes from some linear transformation with respect to some basis. In other words, do two matrices come from the same linear transformation? As proved above, this happens if and only if the two matrices are similar. Canonical forms allow one to answer this question. There are two main kinds of canonical form: the Jordan canonical form, for the case where the minimum polynomial splits, and the rational canonical form in the other case. Of the two, the Jordan canonical form is the one used most in applied mathematics, but the rational canonical form is also quite interesting. In what follows, V and W will denote vector spaces over the field of scalars F.
7.1 Reduction to Diagonal Matrix

This is a really interesting result on diagonalization, following an approach used in Jacobsen [26]. It concerns matrices whose entries are polynomials with coefficients in F and is an application of row operations and division of polynomials.
Recall the elementary matrices, which are obtained by doing a row operation to the identity matrix. The elementary matrices which involve switching two rows or adding a multiple of one row to another are invertible. Similarly, the two corresponding column operations may be accomplished by multiplying on the right by an elementary matrix which involves adding a multiple of one column to another or switching two columns. See Problem 40 on Page 99. It all works just as well if the multiple is an element of a commutative ring. In the theorem which follows, the entries will be polynomials. δ(q) will denote the degree of the nonzero polynomial q(x); it is undefined if q = 0. Also, Aij will be a polynomial. When we write AB, we mean the matrix whose ijth entry is ∑k AikBkj, which may be a polynomial. The identity matrix is the same as usual, and an inverse is also the same as before: PP⁻¹ = I. Recall that if α, β ∈ F[x], the polynomials with coefficients in F, and β ≠ 0, there exist κ, ρ such that
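To make the correspondence concrete, the following sketch (with integer entries for simplicity, and helper names of my own) shows a row operation carried out by multiplying on the left by an elementary matrix, and a column switch carried out by multiplying on the right:

```python
def mat_mul(A, B):
    # naive matrix product: (AB)ij = sum over k of Aik * Bkj
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def eye(n):
    # the n x n identity matrix
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

A = [[1, 2],
     [3, 4],
     [5, 6]]

# E: add -3 times row 0 to row 1 (done to the identity matrix),
# applied by multiplying on the left
E = eye(3)
E[1][0] = -3
assert mat_mul(E, A) == [[1, 2], [0, -2], [5, 6]]

# C: switch columns 0 and 1, applied by multiplying on the right
C = [[0, 1],
     [1, 0]]
assert mat_mul(A, C) == [[2, 1], [4, 3], [6, 5]]
```

Both E and C are invertible, which is why applying them does not change which matrices are obtainable from A.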
α = κβ + ρ,   δ(ρ) < δ(β) or else ρ = 0
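This division algorithm is easy to carry out mechanically. Here is a minimal sketch over Q (the function name poly_div is my own; polynomials are coefficient lists, highest degree first, with [] as the zero polynomial):

```python
from fractions import Fraction

def poly_div(alpha, beta):
    """Return (kappa, rho) with alpha = kappa*beta + rho, where
    rho == [] or deg(rho) < deg(beta).  beta must be nonzero."""
    kappa = []
    rho = [Fraction(c) for c in alpha]
    while rho and len(rho) >= len(beta):
        c = rho[0] / Fraction(beta[0])     # next quotient coefficient
        kappa.append(c)
        for i in range(len(beta)):         # subtract c * x^k * beta
            rho[i] -= c * Fraction(beta[i])
        rho.pop(0)                         # leading term cancels exactly
    while rho and rho[0] == 0:             # strip leading zeros of remainder
        rho.pop(0)
    return kappa, rho

# x^3 - 1 = (x^2 + x + 1)(x - 1) + 0
kappa, rho = poly_div([1, 0, 0, -1], [1, -1])
assert kappa == [1, 1, 1] and rho == []
```

Exact rational arithmetic (Fraction) matters here: with floating point, remainders that should vanish may survive as roundoff noise.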
Theorem 7.1.1 Let A be an m × n matrix whose entries are polynomials. Then there are invertible matrices P, Q of the right size such that PAQ = B where B is a diagonal matrix.
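The proof which follows is constructive, and the whole reduction can be sketched in code. The following is a minimal, self-contained sketch over Q[x] (polynomials as coefficient lists, highest degree first; all function names are my own, and the invertible matrices P, Q are not tracked):

```python
from fractions import Fraction

# Polynomials over Q as coefficient lists, highest degree first; [] is zero.

def strip(p):
    i = 0
    while i < len(p) and p[i] == 0:
        i += 1
    return p[i:]

def sub(a, b):
    # a - b, padding the shorter list with leading zeros
    n = max(len(a), len(b))
    a = [Fraction(0)] * (n - len(a)) + list(a)
    b = [Fraction(0)] * (n - len(b)) + list(b)
    return strip([x - y for x, y in zip(a, b)])

def mul(a, b):
    # product of two polynomials
    if not a or not b:
        return []
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += Fraction(x) * Fraction(y)
    return strip(out)

def pdiv(a, b):
    # a = q*b + r with r == [] or deg r < deg b, for b != 0
    q, r = [], [Fraction(c) for c in a]
    while r and len(r) >= len(b):
        c = r[0] / Fraction(b[0])
        q.append(c)
        for i in range(len(b)):
            r[i] -= c * Fraction(b[i])
        r.pop(0)                       # the leading term cancels exactly
    while r and r[0] == 0:
        r.pop(0)
    return strip(q), r

def diagonalize(A):
    """Reduce a matrix of polynomials to diagonal form by the row and
    column operations used in the proof of Theorem 7.1.1."""
    A = [[strip([Fraction(c) for c in e]) for e in row] for row in A]
    m, n = len(A), len(A[0])
    for t in range(min(m, n)):
        while True:
            # find a nonzero entry of least degree in the trailing block
            best = None
            for i in range(t, m):
                for j in range(t, n):
                    if A[i][j] and (best is None
                                    or len(A[i][j]) < len(A[best[0]][best[1]])):
                        best = (i, j)
            if best is None:
                return A               # the remaining block is zero
            bi, bj = best
            A[t], A[bi] = A[bi], A[t]  # row switch
            for row in A:              # column switch
                row[t], row[bj] = row[bj], row[t]
            for i in range(t + 1, m):  # clear below the pivot: row_i -= q*row_t
                if A[i][t]:
                    q, _ = pdiv(A[i][t], A[t][t])
                    for j in range(t, n):
                        A[i][j] = sub(A[i][j], mul(q, A[t][j]))
            for j in range(t + 1, n):  # clear right of the pivot: col_j -= q*col_t
                if A[t][j]:
                    q, _ = pdiv(A[t][j], A[t][t])
                    for i in range(t, m):
                        A[i][j] = sub(A[i][j], mul(q, A[i][t]))
            # any leftover remainders have strictly smaller degree than the
            # pivot did, so repeating with a new minimal pivot must terminate
            if all(not A[i][t] for i in range(t + 1, m)) and \
               all(not A[t][j] for j in range(t + 1, n)):
                break
    return A

# [[x, x+1], [x-1, x^2]] reduces to a diagonal matrix
D = diagonalize([[[1, 0], [1, 1]],
                 [[1, -1], [1, 0, 0]]])
assert D[0][1] == [] and D[1][0] == []
```

Termination is the key point, exactly as in the proof: whenever a division leaves a nonzero remainder in the pivot row or column, that remainder has strictly smaller degree than the current pivot, so the minimal degree in the block keeps decreasing and the inner loop must stop.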
Proof: If A = 0 there is nothing to show; just let P, Q be appropriate identity matrices. Assume then that A ≠ 0, and begin with P and Q equal to appropriately sized identity matrices. Let δ(Aij) be the smallest of the degrees of all nonzero entries of A. By switching rows and columns, modifying P, Q accordingly, we obtain a matrix B with B11 = Aij. Consider Bi1, the first entry in the ith row. By the Euclidean algorithm,
Bi1 = B11q + ri1,   δ(ri1) < δ(B11)
or else ri1 = 0. Add −q times the first row of B to the ith row; this places ri1 in the i1 position in place of Bi1 and involves adjusting P to obtain the new B. The goal is a 0 in the i1 position, which would have happened if ri1 were 0. Otherwise, among all entries of the new matrix B, an entry Brs with δ(Brs) smallest lies in the ith row, say Bis, and δ(Bis) ≤ δ(ri1). Switch rows and columns until Bis is in the 11 position. Now repeat the argument just given, replacing the first entry of the ith row with a remainder r′ where it