One way to think of functions, as you are accustomed, is as input-output boxes: one value goes in, exactly one corresponding value comes out. Another way to think about them is as relationships between sets. If we are dealing with pure sets and the only thing worth talking about is their size, then a bijection can tell us, say, that there are as many rational numbers as there are integers (wait, what?). If we are dealing with additional structure, then we have ways of seeing how two algebras are the same. A monumental concept in mathematics is the idea of a structure-preserving map between two objects. In short, if $A$ and $B$ are objects with features you care about, and $f : A \to B$ is a function whose outputs carry those features whenever its inputs do, then $f$ is said to preserve whatever the structure is. In particular, $f$ can tell us in what ways $A$ and $B$ are the same and in what ways they are different.
You are no doubt aware that we are interested in linear structure, namely the ability to add and stretch vectors in a "nice" way (i.e., in a way that follows the vector space axioms). Therefore, functions that preserve our linear structure will preserve the addition and scaling operations. These functions turn out to be wicked nice to deal with, especially in the case that our vector space is finite-dimensional (as it always goes in one's first linear algebra class). Let's get started.
In everything that follows, let $V$ and $W$ be vector spaces and let $T : V \to W$ be a function between them.
We say that $T$ is a linear transformation (or function, or map) if it preserves addition and scalar multiplication, which means that for all $u, v \in V$ and all scalars $c$, we have

$$T(u + v) = T(u) + T(v) \qquad \text{and} \qquad T(cv) = c\,T(v).$$
e.g. We have seen loads of linear transformations throughout math classes. Examples include scalar multiplication, matrix multiplication, differentiation, and integration.
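To make the definition concrete, here is a quick numerical sketch in NumPy (the matrix `A` below is an arbitrary example, not one from the text) checking both defining properties for a matrix map:

```python
import numpy as np

# An arbitrary 2x3 example matrix; T(v) = A @ v maps R^3 to R^2.
A = np.array([[1.0,  2.0, 0.0],
              [0.0, -1.0, 3.0]])

def T(v):
    return A @ v

u = np.array([1.0, 0.0, 2.0])
v = np.array([-1.0, 4.0, 0.5])
c = 2.5

# T preserves addition and scalar multiplication...
assert np.allclose(T(u + v), T(u) + T(v))
assert np.allclose(T(c * v), c * T(v))
# ...and, as a consequence, the origin: T(0) = 0.
assert np.allclose(T(np.zeros(3)), np.zeros(2))
```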
If $T$ is linear, then it preserves the origin. In other words, let $0_V$ be the zero vector in $V$ and let $0_W$ be the zero vector in $W$. Then $T(0_V) = 0_W$. (Indeed, $T(0_V) = T(0 \cdot 0_V) = 0 \cdot T(0_V) = 0_W$.) This means that linear transformations never move, or translate, the space. They rotate, or stretch, or shear, or reflect it instead.
Composition of linear transformations
Let $T : V \to W$ and $S : W \to U$ be linear maps. (I don't need to say that $U$ is a vector space; were it not, then it would be nonsense to say that $S$ is a linear transformation.) Their composition, recall from precalculus, is the map

$$S \circ T : V \to U, \qquad (S \circ T)(v) = S(T(v)).$$
The map $S \circ T$ takes $v$, does $T$ to it, and then does $S$ to the result. We read from the inside out, just like the order of operations from algebra. First of all, $S \circ T$ is linear. Second, $T \circ S$ may not even exist, let alone be equal to $S \circ T$. (Sound familiar?) For $T \circ S$ to exist, we would need $U = V$, since $T$ would have to pick up where $S$ ended.
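Continuing the numerical sketch (again with arbitrary example matrices), composing the maps corresponds to multiplying their matrices, while the other order of composition fails for dimension reasons:

```python
import numpy as np

# T: R^3 -> R^2 with matrix A; S: R^2 -> R^4 with matrix B (arbitrary examples).
A = np.array([[1.0,  0.0, 2.0],
              [3.0, -1.0, 0.0]])
B = np.array([[ 1.0, 0.0],
              [ 0.0, 2.0],
              [ 1.0, 1.0],
              [-1.0, 0.5]])

T = lambda v: A @ v
S = lambda w: B @ w

v = np.array([1.0, 2.0, -1.0])

# The composition S o T : R^3 -> R^4 exists, and its matrix is the product B @ A.
assert np.allclose(S(T(v)), (B @ A) @ v)

# T o S does not exist here: S lands in R^4, but T expects inputs from R^3.
# Numerically this shows up as a shape mismatch (a 2x3 matrix times a 4-vector).
```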
The rank-nullity theorem
Two subspaces tell us what information from $V$ the map $T$ keeps and what it destroys. As with all functions, the image of $T$ is the set of all values that it takes:

$$\operatorname{im}(T) = \{\, T(v) : v \in V \,\} \subseteq W.$$
Special to algebraic structures (since regular sets don't have an idea of an additive identity $0$), we have the kernel of $T$, which is everything in $V$ that gets sent to $0_W$:

$$\ker(T) = \{\, v \in V : T(v) = 0_W \,\} \subseteq V.$$
Importantly, since the whole point of linear transformations is that they preserve linear combinations, $\operatorname{im}(T)$ is a subspace of $W$ and $\ker(T)$ is a subspace of $V$. The rank of $T$ is the dimension of $\operatorname{im}(T)$, and the nullity of $T$ is the dimension of $\ker(T)$.
It makes sense that if you add up what you destroy with what you save, what you get is what you started with, right? This is basically the point of the rank-nullity theorem, which is the tip of an iceberg of a more powerful, more complicated algebraic idea (called the first isomorphism theorem). In short,

$$\operatorname{rank}(T) + \operatorname{nullity}(T) = \dim V.$$
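The theorem can be checked numerically for a small example (a sketch in NumPy; the matrix below is a made-up rank-deficient example):

```python
import numpy as np

# Matrix of a map T: R^4 -> R^3; the third row is the sum of the first two,
# so T has rank less than 3 and a nontrivial kernel.
A = np.array([[1.0, 2.0, 0.0,  1.0],
              [0.0, 1.0, 1.0, -1.0],
              [1.0, 3.0, 1.0,  0.0]])

rank = np.linalg.matrix_rank(A)  # dim im(T)

# Count kernel dimensions independently: the rows of V^T (from the SVD)
# that A sends to zero form an orthonormal basis for ker(T).
_, _, Vt = np.linalg.svd(A)
nullity = sum(1 for row in Vt if np.allclose(A @ row, 0))

# Rank-nullity: what T keeps plus what T destroys is the whole domain.
assert rank + nullity == A.shape[1]   # 2 + 2 == 4
```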
You may have noticed that when viewed as members of their respective vector spaces, quadratic polynomials $a + bx + cx^2$ behave a lot like 3-vectors $(a, b, c)$. In both cases, when we add and scale, all that really matters is the coefficients $a$, $b$, and $c$. Polynomials and column vectors are different objects, though, so the vector spaces aren't equal. How can we quantify this "sameness" of $P_2$ and $\mathbb{R}^3$ with a notion weaker than equality?
If you guessed that linear transformations would do the trick, good on you. An isomorphism (Greek for "same form") is a linear map $T : V \to W$ that is also a bijection. In looser language, $V$ has exactly as many elements as $W$ (that's the bijection part), and those elements add and scale the same way (that's the linear map part). If $V$ and $W$ admit an isomorphism between them, then we say they are isomorphic and write $V \cong W$.
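For instance, the correspondence between quadratic polynomials and 3-vectors can be sketched numerically (the helper `poly_to_vec` is a hypothetical name for illustration):

```python
import numpy as np

def poly_to_vec(a, b, c):
    """Hypothetical helper: identify a + b x + c x^2 with (a, b, c) in R^3."""
    return np.array([a, b, c], dtype=float)

p = (1.0, -2.0, 3.0)   # the polynomial 1 - 2x + 3x^2
q = (0.0,  5.0, 1.0)   # the polynomial 5x + x^2

# Adding polynomials coefficient-wise matches adding their coordinate vectors,
# so the identification preserves addition...
assert np.allclose(poly_to_vec(*p) + poly_to_vec(*q), poly_to_vec(1.0, 3.0, 4.0))
# ...and it preserves scaling too.
assert np.allclose(2.0 * poly_to_vec(*p), poly_to_vec(2.0, -4.0, 6.0))
```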
There are two equivalent, more illustrative ways to think of isomorphisms rather than verifying directly that a linear map is 1-1 and onto. In both cases, $T : V \to W$ must be a linear map:
- If $\dim V = \dim W$ and $\ker(T) = \{0_V\}$, then $T$ is an isomorphism.
Why? Well, first, $V$ and $W$ are the same size. Second, $T$ doesn't destroy any information (its kernel is trivial, i.e. just the zero vector). That means that every vector in $V$ must be sent to a unique vector in $W$, and $\operatorname{im}(T)$ is just big enough that it totally fills up $W$ in this way.
- If $T$ sends a basis for $V$ to a basis for $W$, then $T$ is an isomorphism.
More formally, let $\{v_1, \dots, v_n\}$ be a basis for $V$. We suppose that $\{T(v_1), \dots, T(v_n)\}$ is a basis for $W$. This implies that $\dim V = \dim W = n$. If we can also show that $\ker(T)$ is trivial, then by the previous interpretation we are done.
Suppose that $v \in \ker(T)$ with $v \neq 0_V$. There exist scalars $c_1, \dots, c_n$, not all zero, such that $v = c_1 v_1 + \cdots + c_n v_n$, since the $v_i$ form a basis for $V$. If $T(v) = 0_W$, then

$$0_W = T(v) = c_1 T(v_1) + \cdots + c_n T(v_n),$$
which contradicts the hypothesis that $\{T(v_1), \dots, T(v_n)\}$ is a linearly independent set.
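Both criteria can be sketched numerically for a matrix map (an arbitrary invertible example matrix; full rank is what certifies a trivial kernel here):

```python
import numpy as np

# T(v) = A @ v as a candidate isomorphism R^3 -> R^3 (arbitrary example).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# First criterion: domain and codomain have equal dimension (both 3), and the
# kernel is trivial -- full column rank means A v = 0 forces v = 0.
assert A.shape[0] == A.shape[1]
assert np.linalg.matrix_rank(A) == A.shape[1]

# Second criterion: T sends the standard basis e1, e2, e3 to the columns of A,
# and full rank says those three columns are themselves a basis for R^3.
e = np.eye(3)
images = np.column_stack([A @ e[:, i] for i in range(3)])
assert np.allclose(images, A)
assert np.linalg.matrix_rank(images) == 3
```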
Of course, when any function is a bijection, that means it can be inverted. If $T : V \to W$ is an isomorphism, then there exists a map $T^{-1} : W \to V$ such that $T^{-1} \circ T$ is equal to the identity map on $V$ and $T \circ T^{-1}$ is equal to the identity map on $W$.
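In matrix terms (a small sketch; the matrix below is an arbitrary invertible example), the inverse map undoes $T$ from either side:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])      # invertible: det(A) = 1
A_inv = np.linalg.inv(A)        # matrix of the inverse map

v = np.array([3.0, -4.0])       # a vector in V = R^2
w = A @ v                       # its image in W = R^2

# Inverse-then-map is the identity on V; map-then-inverse is the identity on W.
assert np.allclose(A_inv @ (A @ v), v)
assert np.allclose(A @ (A_inv @ w), w)
```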
In general, an equivalence relation is a set of properties that, when held, mean that a relationship is like equality for all intents and purposes. Isomorphism carries these properties: it is an algebraic, equality-type relationship that doesn't require that two spaces have the same elements. These properties are:
- Reflexivity: All vector spaces are isomorphic to themselves: $V \cong V$.
- Symmetry: If $V \cong W$ by the isomorphism $T$, then $W \cong V$ by the isomorphism $T^{-1}$.
- Transitivity: If $U \cong V$ by the isomorphism $S$ and $V \cong W$ by the isomorphism $T$, then $U \cong W$ by the isomorphism $T \circ S$.
Change of basis map
We end on a useful isomorphism that we will return to in sections 6 and 7. Suppose that $V$ has a basis $\mathcal{B} = \{b_1, \dots, b_n\}$ and a basis $\mathcal{C} = \{c_1, \dots, c_n\}$. Then a change of basis map is the isomorphism $T : V \to V$ (called an operator when the domain equals the codomain) such that

$$T(b_i) = c_i \qquad \text{for } i = 1, \dots, n.$$
Per the preceding argument, this is an isomorphism since it sends one basis to another basis.
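As a final sketch (with two arbitrary example bases for $\mathbb{R}^2$, stored as matrix columns), the operator's matrix can be computed from the condition that it sends each basis vector of one basis to the corresponding vector of the other:

```python
import numpy as np

# Two bases for R^2, one basis vector per column (arbitrary examples).
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # basis {b1, b2}
C = np.array([[2.0, 0.0],
              [1.0, 1.0]])      # basis {c1, c2}

# T b_i = c_i for each i means T @ B = C, so T = C @ B^{-1}.
T = C @ np.linalg.inv(B)

for i in range(2):
    assert np.allclose(T @ B[:, i], C[:, i])

# T sends a basis to a basis, so it is an isomorphism (invertible).
assert abs(np.linalg.det(T)) > 1e-12
```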