78 CHAPTER 4. MATRICES
the existence of an additive identity,
$$A + (-A) = 0, \tag{4.4}$$
the existence of an additive inverse. Also, for scalars $\alpha, \beta$, the following hold:
$$\alpha (A + B) = \alpha A + \alpha B, \tag{4.5}$$
$$(\alpha + \beta) A = \alpha A + \beta A, \tag{4.6}$$
$$\alpha (\beta A) = (\alpha \beta) A, \tag{4.7}$$
$$1A = A. \tag{4.8}$$
These properties are just the vector space axioms discussed earlier, and the fact that the $m \times n$ matrices satisfy these axioms is what is meant by saying that this set of matrices, with addition and scalar multiplication as defined above, forms a vector space.
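The scalar axioms (4.5)–(4.8) can be checked numerically on concrete matrices. The sketch below is illustrative only: it represents a matrix as a Python list of rows, a convention chosen here and not part of the text.

```python
# Check the scalar axioms (4.5)-(4.8) on concrete 2x2 matrices,
# stored as lists of rows (an illustrative representation).

def add(A, B):
    # entrywise sum of two matrices of the same size
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scale(alpha, A):
    # multiply every entry of A by the scalar alpha
    return [[alpha * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
alpha, beta = 2, 3

assert scale(alpha, add(A, B)) == add(scale(alpha, A), scale(alpha, B))  # (4.5)
assert scale(alpha + beta, A) == add(scale(alpha, A), scale(beta, A))    # (4.6)
assert scale(alpha, scale(beta, A)) == scale(alpha * beta, A)            # (4.7)
assert scale(1, A) == A                                                  # (4.8)
```

Of course, passing on one example does not prove the axioms; they follow from the corresponding properties of the entries, which are real or complex numbers.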
Definition 4.0.2 Matrices which are $n \times 1$ or $1 \times n$ are called vectors and are often denoted by a bold letter. Thus
$$\mathbf{x} = \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}$$
is an $n \times 1$ matrix, also called a column vector, while a $1 \times n$ matrix of the form $\begin{pmatrix} x_1 & \cdots & x_n \end{pmatrix}$ is referred to as a row vector.
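In the list-of-rows convention used in the sketches here (an illustrative choice, not part of the text), the same entries give visibly different objects as a column vector and as a row vector:

```python
# The same three entries as an n x 1 column vector versus a 1 x n
# row vector, stored as lists of rows.

col = [[1], [2], [3]]    # 3 x 1: three rows, one entry each
row = [[1, 2, 3]]        # 1 x 3: one row with three entries

assert len(col) == 3 and len(col[0]) == 1   # shape (3, 1)
assert len(row) == 1 and len(row[0]) == 3   # shape (1, 3)
```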
All the above is fine, but the real reason for considering matrices is that they can be multiplied. This is where things quit being banal. The following is the definition of multiplying an $m \times n$ matrix times an $n \times 1$ vector. Then after this, the product of two matrices is considered.
Definition 4.0.3 First of all, define the product of a $1 \times n$ matrix and an $n \times 1$ matrix:
$$\begin{pmatrix} x_1 & \cdots & x_n \end{pmatrix} \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix} = \sum_{i} x_i y_i.$$
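The product of a row vector and a column vector in Definition 4.0.3 is just the sum $\sum_i x_i y_i$. A minimal sketch, using plain Python lists for the entries:

```python
# The 1 x n times n x 1 product of Definition 4.0.3: a row vector
# times a column vector of the same length n gives a scalar.

def row_times_column(x, y):
    # x and y are lists of equal length n; return sum of x_i * y_i
    if len(x) != len(y):
        raise ValueError("vectors must have the same length")
    return sum(xi * yi for xi, yi in zip(x, y))

print(row_times_column([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```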
If you have $A$ an $m \times n$ matrix and $B$ an $n \times p$ matrix, then $AB$ will be an $m \times p$ matrix whose $ij^{th}$ entry is the product of the $i^{th}$ row of $A$ on the left with the $j^{th}$ column of $B$ on the right. Thus
$$(AB)_{ij} \equiv \sum_{k=1}^{n} A_{ik} B_{kj},$$
and if $B = \begin{pmatrix} \mathbf{b}_1 & \cdots & \mathbf{b}_p \end{pmatrix}$, then $AB = \begin{pmatrix} A\mathbf{b}_1 & \cdots & A\mathbf{b}_p \end{pmatrix}$. You can do $(m \times n) \times (n \times p)$, but in order to multiply, you must have the number of columns of the matrix on the left equal to the number of rows of the matrix on the right, or else the rule just given makes no sense.
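The formula $(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}$ translates directly into code. The sketch below again uses the illustrative list-of-rows representation, and raises an error when the inner dimensions do not match, just as the rule makes no sense in that case:

```python
# Direct transcription of (AB)_{ij} = sum_k A_{ik} B_{kj} for an
# m x n matrix A and an n x p matrix B, stored as lists of rows.

def matmul(A, B):
    m, n = len(A), len(A[0])
    if len(B) != n:
        # columns of A must equal rows of B for the product to exist
        raise ValueError("columns of A must equal rows of B")
    p = len(B[0])
    # entry (i, j) is the i-th row of A times the j-th column of B
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

A = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]           # 3 x 2
print(matmul(A, B))      # [[58, 64], [139, 154]], a 2 x 2 matrix
```

Note that each column of the result is $A$ applied to the corresponding column of $B$, which is exactly the statement $AB = \begin{pmatrix} A\mathbf{b}_1 & \cdots & A\mathbf{b}_p \end{pmatrix}$ above.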