Linear Algebra/Mechanics of Matrix Multiplication
In this subsection we consider matrix multiplication as a mechanical process, putting aside for the moment any implications about the underlying maps. As described earlier, the striking thing about matrix multiplication is the way rows and columns combine. The $i,j$ entry of the matrix product is the dot product of row $i$ of the left matrix with column $j$ of the right one. For instance, a second row and a third column combine to make a $2,3$ entry.
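The worked product that belongs here did not survive extraction; the following stand-in (the entries are chosen only for illustration) shows a second row and a third column combining to make the $2,3$ entry.
\[
\begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix}
\begin{pmatrix} 4 & 5 & 6 \\ 7 & 8 & 9 \\ 1 & 2 & 3 \end{pmatrix}
=
\begin{pmatrix} 11 & 13 & 15 \\ 8 & 10 & 12 \end{pmatrix}
\qquad\text{since } 0\cdot 6 + 1\cdot 9 + 1\cdot 3 = 12 .
\]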
We can view this as the left matrix acting by multiplying its rows, one at a time, into the columns of the right matrix. Of course, another perspective is that the right matrix uses its columns to act on the left matrix's rows. Below, we will examine actions from the left and from the right for some simple matrices.
The first case, the action of a zero matrix, is very easy.
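For instance (the sizes and entries here are only illustrative), a zero matrix multiplied against anything gives a zero matrix.
\[
\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}
\begin{pmatrix} 1 & 3 & 2 \\ -1 & 1 & -1 \end{pmatrix}
=
\begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}
\]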
After zero matrices, the matrices whose actions are easiest to understand are the ones with a single nonzero entry.
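A representative example (not necessarily the original one): acting from the left, a matrix whose single nonzero entry is in position $i,j$ takes row $j$ of the other matrix, scales it by that entry, and places it in row $i$ of the result, leaving the other rows zero.
\[
\begin{pmatrix} 0 & 2 \\ 0 & 0 \end{pmatrix}
\begin{pmatrix} 1 & 3 \\ 5 & 7 \end{pmatrix}
=
\begin{pmatrix} 10 & 14 \\ 0 & 0 \end{pmatrix}
\]
Here the $2$ in position $1,2$ puts twice row $2$ of the right-hand matrix into row $1$ of the product.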
Next in complication are matrices with two nonzero entries. There are two cases. If a left-multiplier's two nonzero entries are in different rows then their actions don't interact.
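For instance (illustrative entries), each nonzero entry produces its own row of the result, independently of the other.
\[
\begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}
\begin{pmatrix} 1 & 3 \\ 5 & 7 \end{pmatrix}
=
\begin{pmatrix} 2 & 6 \\ 15 & 21 \end{pmatrix}
\]
The $2$ alone gives the first row of the result, twice row $1$; the $3$ alone gives the second row, three times row $2$.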
But if the left-multiplier's nonzero entries are in the same row then that row of the result is a combination of the rows of the right-hand matrix.
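A sketch of the kind of example that belongs here: both nonzero entries now contribute to the same row of the product.
\[
\begin{pmatrix} 2 & 3 \\ 0 & 0 \end{pmatrix}
\begin{pmatrix} 1 & 3 \\ 5 & 7 \end{pmatrix}
=
\begin{pmatrix} 17 & 27 \\ 0 & 0 \end{pmatrix}
\qquad\text{first row } = 2\cdot(1\;\;3) + 3\cdot(5\;\;7).
\]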
Right-multiplication acts in the same way, with columns.
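For instance (again with illustrative entries), a right-multiplier whose nonzero entries lie in one column produces that column of the result as a combination of the columns of the left-hand matrix.
\[
\begin{pmatrix} 1 & 3 \\ 5 & 7 \end{pmatrix}
\begin{pmatrix} 2 & 0 \\ 3 & 0 \end{pmatrix}
=
\begin{pmatrix} 11 & 0 \\ 31 & 0 \end{pmatrix}
\qquad\text{first column } = 2\cdot\begin{pmatrix}1\\5\end{pmatrix} + 3\cdot\begin{pmatrix}3\\7\end{pmatrix}.
\]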
These observations about matrices that are mostly zeroes extend to arbitrary matrices.
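One way to state the extension (a sketch of the statement that presumably appeared here, not a verbatim quotation): if $G$ is $m\times r$ and $H$ is $r\times n$ then each row of $GH$ is a combination of the rows of $H$, with coefficients taken from the corresponding row of $G$, and each column of $GH$ is a combination of the columns of $G$, with coefficients from the corresponding column of $H$. In symbols,
\[
(GH)_{i,j} = g_{i,1}h_{1,j} + g_{i,2}h_{2,j} + \cdots + g_{i,r}h_{r,j}
\]
so row $i$ of $GH$ equals $g_{i,1}\cdot(\text{row }1\text{ of }H) + \cdots + g_{i,r}\cdot(\text{row }r\text{ of }H)$.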
An application of those observations is that there is a matrix that just copies out the rows and columns.
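That matrix is the identity matrix, with ones down its main diagonal and zeroes elsewhere. For instance (illustrative entries),
\[
\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}
\begin{pmatrix} 1 & 3 \\ 5 & 7 \end{pmatrix}
=
\begin{pmatrix} 1 & 3 \\ 5 & 7 \end{pmatrix}
=
\begin{pmatrix} 1 & 3 \\ 5 & 7 \end{pmatrix}
\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}
\]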
In short, an identity matrix is the identity element of the set of $n\times n$ matrices with respect to the operation of matrix multiplication.
We next see two ways to generalize the identity matrix.
The first is that if the ones are relaxed to arbitrary real numbers, the resulting matrix will rescale whole rows or columns.
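Such a matrix is a diagonal matrix. A representative example (not necessarily the original one): multiplying from the left rescales rows,
\[
\begin{pmatrix} 2 & 0 \\ 0 & -1 \end{pmatrix}
\begin{pmatrix} 2 & 1 & 4 \\ -1 & 3 & 4 \end{pmatrix}
=
\begin{pmatrix} 4 & 2 & 8 \\ 1 & -3 & -4 \end{pmatrix}
\]
while multiplying from the right rescales columns.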
The second generalization of identity matrices is that we can put a single one in each row and column in ways other than putting them down the diagonal.
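These are the permutation matrices; acting from the left they rearrange rows, and acting from the right they rearrange columns. An illustrative example:
\[
\begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}
\begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix}
=
\begin{pmatrix} 7 & 8 & 9 \\ 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}
\]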
We finish this subsection by applying these observations to get matrices that perform Gauss' method and Gauss-Jordan reduction.
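The two examples referred to just below were lost; representative stand-ins (the particular entries are illustrative) are a left-multiplier that rescales the second row by a factor of three and one that swaps the first and third rows.
\[
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} 1 & 2 & 1 \\ 0 & 1/3 & 1 \\ 1 & 0 & 2 \end{pmatrix}
=
\begin{pmatrix} 1 & 2 & 1 \\ 0 & 1 & 3 \\ 1 & 0 & 2 \end{pmatrix}
\qquad
\begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}
\begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix}
=
\begin{pmatrix} 5 & 6 \\ 3 & 4 \\ 1 & 2 \end{pmatrix}
\]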
To see how to perform a pivot, we observe something about those two examples. The matrix that rescales the second row by a factor of three arises in this way from the identity.
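Writing $3\rho_2$ here for the row operation of tripling the second row (the notation is ours; the original arrow diagram did not survive),
\[
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}
\;\xrightarrow{\;3\rho_2\;}\;
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 1 \end{pmatrix}
\]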
Similarly, the matrix that swaps first and third rows arises in this way.
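A stand-in for the lost diagram, with $\rho_1\leftrightarrow\rho_3$ denoting the swap of the first and third rows:
\[
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}
\;\xrightarrow{\;\rho_1\leftrightarrow\rho_3\;}\;
\begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}
\]
And a matrix that performs a pivot, say adding $-2$ times the second row to the third, arises from the identity by that same operation (an illustrative case):
\[
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & -2 & 1 \end{pmatrix}
\]
Acting from the left, it adds $-2$ times row $2$ to row $3$ of any suitably sized matrix.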
We have observed the following result, which we shall use in the next subsection.
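A sketch of the result presumably intended here (stated in our own words, not quoted): performing a Gaussian row operation on a matrix $H$ gives the same answer as left-multiplying $H$ by the matrix obtained by performing that operation on an identity matrix; such left-multipliers are the elementary reduction matrices. In particular, for any matrix $H$ there are elementary reduction matrices $R_1, \ldots, R_r$ such that
\[
R_r \, R_{r-1} \cdots R_1 \, H
\]
is in reduced echelon form.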
Until now we have taken the point of view that our primary objects of study are vector spaces and the maps between them, and have adopted matrices only for computational convenience. This subsection shows that this point of view isn't the whole story. Matrix theory is a fascinating and fruitful area.
In the rest of this book we shall continue to focus on maps as the primary objects, but we will be pragmatic: if the matrix point of view gives a clearer picture then we shall use it.