5.4. EXERCISES
24. Show that if L ∈ L(V,W) is a linear transformation, where V and W are vector spaces, and Ly_p = f for some y_p ∈ V, then the general solution of Ly = f is of the form ker(L) + y_p.
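A small illustration of the conclusion in Problem 24 (not part of the original exercise): if L is multiplication by the 1 × 2 matrix [1 1] and f = 2, then y_p = (2, 0)^T is one particular solution, ker(L) = span{(1, −1)^T}, and the full solution set is
{(2, 0)^T + t (1, −1)^T : t ∈ R} = y_p + ker(L).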
25. Suppose Ax = b has a solution. Explain why the solution is unique precisely when Ax = 0 has only the trivial (zero) solution.
26. Let L : R^n → R be linear. Show that there exists a vector a ∈ R^n such that Ly = a^T y.
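For a concrete instance of Problem 26 (an illustration only, not the requested proof): the functional L(y) = 2y_1 − y_2 on R^3 is of this form, since Ly = a^T y for a = (2, −1, 0)^T.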
27. Let the linear transformation T be determined by
    Tx = \begin{bmatrix} 1 & 0 & -5 & -7 \\ 0 & 1 & -3 & -9 \\ 1 & 1 & -8 & -16 \end{bmatrix} x.
Find the rank of this transformation.
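A quick numerical sanity check for Problem 27, sketched here assuming NumPy is available; it only confirms an answer obtained by row reduction and does not replace the hand computation.

```python
import numpy as np

# Matrix of the transformation in Problem 27.
A = np.array([[1, 0, -5,  -7],
              [0, 1, -3,  -9],
              [1, 1, -8, -16]])

# matrix_rank computes the rank numerically (via the SVD); it should
# agree with the rank found by row reducing A by hand.
print(np.linalg.matrix_rank(A))
```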
28. Let T f = (D^2 + 5D + 4) f for f in the vector space of polynomials of degree no more than 3, where we consider T to map into the same vector space. Find the rank of T. You might want to use Proposition 4.3.6.
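One way to explore Problem 28 (a sketch assuming SymPy is available, not the intended pencil-and-paper argument) is to apply T to the basis 1, x, x^2, x^3 and inspect the images that result.

```python
import sympy as sp

x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2, x**3]

# T f = f'' + 5 f' + 4 f on polynomials of degree at most 3.
def T(f):
    return sp.diff(f, x, 2) + 5*sp.diff(f, x) + 4*f

# The images of the basis vectors give the columns of the matrix of T,
# from which the rank can be read off (or compared with Proposition 4.3.6).
for f in basis:
    print(f, '->', sp.expand(T(f)))
```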
29. (Extra important) Let A be an n × n matrix. The trace of A, trace(A), is defined as ∑_i A_{ii}. It is just the sum of the entries on the main diagonal. Show that trace(A) = trace(A^T). Suppose A is m × n and B is n × m. Show that trace(AB) = trace(BA). Now show that if A and B are similar n × n matrices, then trace(A) = trace(B). Recall that A is similar to B means A = S^{-1}BS for some invertible matrix S.
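A numerical spot check of the identity trace(AB) = trace(BA) for rectangular A and B, sketched assuming NumPy; random matrices of course prove nothing, but they make the statement concrete.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))   # m x n
B = rng.standard_normal((5, 3))   # n x m

# AB is 3 x 3 while BA is 5 x 5, yet the two traces agree
# (up to floating-point roundoff).
print(np.trace(A @ B))
print(np.trace(B @ A))
```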
30. Suppose you have a monic polynomial φ(λ) which is irreducible over F, the field of scalars. Remember that this means that no polynomial divides it except scalar multiples of φ(λ) and scalars. Say
    φ(λ) = a_0 + a_1 λ + ··· + a_{d−1} λ^{d−1} + λ^d.
Now consider A ∈ L(V,V) where V is a vector space. Consider ker(φ(A)) and suppose it is not 0. For x ∈ ker(φ(A)), x ≠ 0, let β_x = {x, Ax, ··· , A^{d−1} x}. Show that β_x is an independent set of vectors if x ≠ 0.
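A concrete case of Problem 30 to keep in mind (an illustration, not the requested proof): take F = R, φ(λ) = λ^2 + 1, which is irreducible over R, V = R^2, and A the rotation by 90°. Then φ(A) = A^2 + I = 0, so ker(φ(A)) = R^2, and for x = e_1 one gets β_x = {e_1, Ae_1} = {e_1, e_2}, which is indeed independent.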
31. ↑Let V be a finite dimensional vector space and let A ∈ L(V,V). Also let W be a subspace of V such that A(W) ⊆ W. We call such a subspace an A invariant subspace. Say {w_1, ··· , w_s} is a basis for W. Also let x ∈ U∖W where U is an A invariant subspace which is contained in ker(φ(A)). Then you know that {w_1, ··· , w_s, x} is linearly independent. Show that in fact {w_1, ··· , w_s} ∪ β_x is linearly independent, where β_x is given in the above problem. Hint: Suppose you have
    ∑_{k=1}^{s} a_k w_k + ∑_{j=1}^{d} b_j A^{j−1} x = 0.   (*)
You need to verify that the second sum is 0. From this it will follow that each b_j is 0 and then each a_k = 0. Let S = ∑_{j=1}^{d} b_j A^{j−1} x. Observe that β_S ⊆ span(β_x), and if S ≠ 0, then β_S is linearly independent by the above problem, so span(β_S) and span(β_x) have the same dimension. You will argue that span(β_S) ⊆ W ∩ span(β_x) ⊆ span(β_x) and then use Problem 6 on Page 74.