At long last (with this section plus the final paper extra credit), we will understand why matrix multiplication has the odd but extremely effective definition that it does. Aside from matrices of the same size forming a vector space and their action on column vectors being a linear transformation, it isn’t obvious what either of the last two sections really means for matrix theory and the study of real column vectors. This penultimate section will finally get at the relationship between matrices and linear transformations before applying this isomorphism in Section 7 to look at the eigenvalue problem, which is exploited by Google to the tune of all of their earthly riches.
So if you understand these next two sections, you are in line to receive all of Google’s money.*
Coordinatization of vector spaces
All $n$-dimensional real vector spaces are isomorphic to $\mathbb{R}^n$.
No doubt you have already had some inkling that in the case of polynomial vector spaces, only the coefficients matter, and in matrix vector spaces, only the entries matter, and in column vector spaces, only the components matter, and so on. First, we will formalize this notion: if $V$ is a vector space of finite dimension $n$ and ordered (the order matters) basis $\mathcal{B} = \{v_1, v_2, \ldots, v_n\}$, then $V$ is isomorphic to $\mathbb{R}^n$ by the map
$$[\,\cdot\,]_{\mathcal{B}} : V \to \mathbb{R}^n, \qquad [c_1 v_1 + c_2 v_2 + \cdots + c_n v_n]_{\mathcal{B}} = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix}.$$
Note that $[v_i]_{\mathcal{B}} = e_i$, the $i$-th standard basis vector of $\mathbb{R}^n$. Since $[\,\cdot\,]_{\mathcal{B}}$ sends a basis to a basis, per the preceding section we know it is an isomorphism. This theorem gives mathematical rigor to our growing feeling that it’s really only the coefficients of the basis vectors that matter.
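e.g. As a quick worked illustration (the space and basis here are just one convenient choice), take $V = P_2$, the polynomials of degree at most 2, with ordered basis $\mathcal{B} = \{1, x, x^2\}$. Then
$$[\,4 - x + 3x^2\,]_{\mathcal{B}} = \begin{pmatrix} 4 \\ -1 \\ 3 \end{pmatrix} \in \mathbb{R}^3,$$
and the polynomial is completely recoverable from this column of coefficients.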
e.g. By the preceding theorem and the transitivity of isomorphism, vector spaces of the same finite dimension, such as $P_3$ (polynomials of degree at most 3), $\mathbb{R}^{2 \times 2}$, and $\mathbb{R}^4$, are all isomorphic, or algebraically indistinguishable as vector spaces.
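e.g. For one more concrete coordinate map (the basis here is simply the standard one, chosen for convenience): the matrix units $E_{11}, E_{12}, E_{21}, E_{22}$ form an ordered basis of the $2 \times 2$ real matrices, and with respect to it
$$\left[\begin{pmatrix} a & b \\ c & d \end{pmatrix}\right] = \begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix} \in \mathbb{R}^4.$$
Since $\{1, x, x^2, x^3\}$ coordinatizes $P_3$ into $\mathbb{R}^4$ the same way, composing one coordinate map with the inverse of the other gives an explicit isomorphism between the two spaces.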
Coordinatization of linear maps
First, we note that for vector spaces $V$ and $W$, the set
$$\mathcal{L}(V, W) = \{\, T : V \to W \mid T \text{ is linear} \,\}$$
is itself a vector space. (The operations are the natural pointwise ones: $(S + T)(v) = S(v) + T(v)$ and $(cT)(v) = c\,T(v)$, with the zero map as the zero vector.) Verify for yourself, or with your class notes, that this is in fact a vector space. This fact is required to make sense of the following statement:
Every linear map between finite-dimensional vector spaces is a matrix.
If $V$ and $W$ have dimensions $n$ and $m$ respectively, then $\mathcal{L}(V, W) \cong \mathbb{R}^{m \times n}$, the space of $m \times n$ matrices. The map that gives the isomorphism is defined as follows. Let $V$ have ordered basis $\mathcal{B} = \{v_1, \ldots, v_n\}$ and $W$ have ordered basis $\mathcal{C} = \{w_1, \ldots, w_m\}$. Then $T \in \mathcal{L}(V, W)$ is assigned the matrix $[T]_{\mathcal{C} \leftarrow \mathcal{B}}$ whose $j$-th column is $[T(v_j)]_{\mathcal{C}}$, or $T$ applied to $v_j$ written in the basis $\mathcal{C}$.
Observe that there are $n$ members of $\mathcal{B}$, so $[T]_{\mathcal{C} \leftarrow \mathcal{B}}$ has $n$ columns, and that each $[T(v_j)]_{\mathcal{C}}$ has $m$ components, so the matrix has $m$ rows.
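e.g. Here is one small worked example (the particular map and bases are chosen purely for illustration). Let $T : P_2 \to P_1$ be differentiation, with ordered bases $\mathcal{B} = \{1, x, x^2\}$ for the domain and $\mathcal{C} = \{1, x\}$ for the codomain. Since $T(1) = 0$, $T(x) = 1$, and $T(x^2) = 2x$, the columns are $[0]_{\mathcal{C}}$, $[1]_{\mathcal{C}}$, and $[2x]_{\mathcal{C}}$, giving
$$[T]_{\mathcal{C} \leftarrow \mathcal{B}} = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 2 \end{pmatrix},$$
a matrix with $3$ columns (one per basis vector of the domain) and $2$ rows (the dimension of the codomain), exactly as promised.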
e.g. The identity map $\mathrm{id} : V \to V$, where both the domain and codomain are written in the same $n$-element basis, corresponds to the identity matrix $I_n$. If the domain and codomain are written in different bases, then the corresponding matrix is called a change of basis matrix and will play an integral role in diagonalizing matrices in the next section.
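e.g. A tiny change of basis example (again, just one illustrative choice of bases): in $\mathbb{R}^2$, let $\mathcal{B} = \{e_1, e_2\}$ be the standard basis and $\mathcal{C} = \{(1,1)^T, (1,-1)^T\}$. Writing each standard basis vector in terms of $\mathcal{C}$ gives $e_1 = \tfrac{1}{2}(1,1)^T + \tfrac{1}{2}(1,-1)^T$ and $e_2 = \tfrac{1}{2}(1,1)^T - \tfrac{1}{2}(1,-1)^T$, so
$$[\mathrm{id}]_{\mathcal{C} \leftarrow \mathcal{B}} = \begin{pmatrix} \tfrac{1}{2} & \tfrac{1}{2} \\[2pt] \tfrac{1}{2} & -\tfrac{1}{2} \end{pmatrix}.$$
Multiplying this matrix by a column of standard coordinates produces that same vector's coordinates in the basis $\mathcal{C}$.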
Consequences in matrix algebra
Many facts from matrix algebra are verified in the light of linear transformations. First of all, if $T \in \mathcal{L}(V, W)$, $S \in \mathcal{L}(W, U)$, and $V$, $W$, $U$ have dimensions $n$, $m$, $p$ with respective bases $\mathcal{B}$, $\mathcal{C}$, and $\mathcal{D}$, then
$$[S \circ T]_{\mathcal{D} \leftarrow \mathcal{B}} = [S]_{\mathcal{D} \leftarrow \mathcal{C}}\,[T]_{\mathcal{C} \leftarrow \mathcal{B}}.$$
Notice that $[S]_{\mathcal{D} \leftarrow \mathcal{C}}$ is a matrix with $p$ rows and $m$ columns and that $[T]_{\mathcal{C} \leftarrow \mathcal{B}}$ is a matrix with $m$ rows and $n$ columns. Therefore, matrix multiplication is possible if and only if the corresponding composition of linear maps makes sense.
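To see where the multiplication formula itself comes from, here is a sketch of the computation that extra credit 4 asks you to carry out in full. Write $A = [S]_{\mathcal{D} \leftarrow \mathcal{C}}$ and $B = [T]_{\mathcal{C} \leftarrow \mathcal{B}}$, where $\mathcal{C} = \{w_1, \ldots, w_m\}$ and $\mathcal{D} = \{u_1, \ldots, u_p\}$, and chase a basis vector $v_j$ of $V$ through the composition:
$$(S \circ T)(v_j) = S\Big(\sum_{i=1}^{m} B_{ij}\, w_i\Big) = \sum_{i=1}^{m} B_{ij}\, S(w_i) = \sum_{i=1}^{m} B_{ij} \sum_{k=1}^{p} A_{ki}\, u_k = \sum_{k=1}^{p} \Big(\sum_{i=1}^{m} A_{ki} B_{ij}\Big) u_k.$$
The coefficient of $u_k$ is exactly the $(k, j)$ entry of the product $AB$, which is why matrix multiplication is defined by that otherwise odd row-times-column rule.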
Here are a few more connections between matrix algebra and linear transformations. Let $A$ be the matrix of a linear map $T$ between finite-dimensional vector spaces, with respect to some choice of ordered bases.
- If $A$ is a square matrix, then $T$ is a linear transformation whose domain and codomain have the same dimension and are hence isomorphic.
- The invertibility of $A$ corresponds exactly to the invertibility of $T$.
- The kernel of $T$ is nontrivial if and only if the determinant of $A$ is zero (a quick numerical check of this appears below).
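e.g. A quick sanity check of that last bullet on a made-up $2 \times 2$ example: the matrix
$$A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$$
has determinant $1 \cdot 4 - 2 \cdot 2 = 0$, and indeed its kernel is nontrivial, since $A \begin{pmatrix} -2 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.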
In extra credit 4, you will venture into the vortex and verify, through a frankly disgusting (but helpful) amount of computation, that matrix multiplication makes absolutely perfect sense.
* This is a lie.