I’ve heard many times that “a vector space is naturally isomorphic to its double dual,” with reference to the “canonical isomorphism” (evaluation). In the same breath, however, it is said that there is no “easy” way to construct an isomorphism from a vector space to its usual dual (in a sense that can be made formal). This is really pretty sad, since in finite dimensions they are always isomorphic. But, on the other hand, if one works in the category of inner product spaces, a finite dimensional real vector space *is* canonically isomorphic to its dual space. In fact, the isomorphism is exactly what we would expect:

**Proposition 1:** Given a finite dimensional real vector space $V$ equipped with inner product $\langle \cdot, \cdot \rangle$, there is an isomorphism $\Phi : V \to V^*$ given by $\Phi(v) = \langle v, \cdot \rangle$.

First note that this makes sense, since the function $\langle v, \cdot \rangle$ is actually a linear functional, and all is good with the world. On the other hand, notice that the dimension of both vector spaces is the same, and $\Phi$ is injective since, for any nonzero vector $v$, we have $\Phi(v)(v) = \langle v, v \rangle = \|v\|^2 > 0$ by the definition of the norm. One can see a different isomorphism, which is essentially a representation theorem (Riesz for finite dimensions), and this is achieved by keeping track of values on a given basis. I’m not sure if that one is canonical, because the proof I’m aware of makes explicit reference to basis elements; perhaps I’m wrong about this.
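As a quick sanity check, here is a minimal NumPy sketch of $\Phi$ with the standard dot product on $\mathbb{R}^3$ (the names `Phi`, `f`, `u1`, `u2` are mine, not anything standard): it confirms that $\Phi(v)$ really is linear and that it takes a positive value at $v$ itself, the injectivity witness from the argument above.

```python
import numpy as np

def Phi(v):
    """Send v to the linear functional <v, .>, using the standard dot product."""
    return lambda u: float(np.dot(v, u))

v = np.array([3.0, -1.0, 2.0])
f = Phi(v)

# f is linear: f(a*u1 + b*u2) = a*f(u1) + b*f(u2)
u1, u2 = np.array([1.0, 0.0, 4.0]), np.array([0.0, 2.0, 1.0])
assert np.isclose(f(2 * u1 + 3 * u2), 2 * f(u1) + 3 * f(u2))

# Injectivity witness: f(v) = <v, v> = ||v||^2 > 0 for nonzero v
assert f(v) > 0
```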

There is this really interesting thing that falls out: given a morphism $T : V \to W$, one can consider its induced map $T^t : W^* \to V^*$ (which is contravariant) given by $T^t(f) = f \circ T$. My previous notation was intentionally suggestive: in matrix form, $T^t$ is indeed the transpose of $T$. But it turns out that there is one more guy to consider:

**Definition:** Let $V$ be a real inner product space. The *adjoint* of a linear map $T : V \to V$ is the unique map $T^* : V \to V$ so that $\langle Tv, w \rangle = \langle v, T^*w \rangle$ for all $v, w \in V$.
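In coordinates with the standard dot product, the adjoint of a matrix is just its transpose, so the defining identity is easy to check numerically. A small sketch (random square matrix, my own variable names):

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3))   # a linear map T : R^3 -> R^3
v = rng.standard_normal(3)
w = rng.standard_normal(3)

# With the standard dot product, the adjoint T* is the transpose T.T:
lhs = np.dot(T @ v, w)      # <Tv, w>
rhs = np.dot(v, T.T @ w)    # <v, T*w>
assert np.isclose(lhs, rhs)
```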

That was really a “definition/theorem” since I haven’t shown existence, uniqueness, or even that it is well-defined, but I think it’s commonplace enough to skip this. Here is the treat:

The adjoint and induced map of $T$ are isomorphic as linear maps (there exists some isomorphism $\Phi$ so that $\Phi \circ T^* = T^t \circ \Phi$) and so are essentially the same thing.

I’ll write a proof here, because I am too lazy to tex up a diagram:

Consider the isomorphism $\Phi : V \to V^*$, $\Phi(v) = \langle v, \cdot \rangle$ (no surprise here) from Proposition 1. On one hand, $(T^t \circ \Phi)(w) = \Phi(w) \circ T = \langle w, T(\cdot) \rangle$, and on the other hand, $(\Phi \circ T^*)(w) = \langle T^*w, \cdot \rangle$, which are exactly the same by the definition of the adjoint.
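The chase above can also be verified numerically. A minimal sketch, again assuming the standard dot product on $\mathbb{R}^3$ (so the adjoint is the transpose); `Phi`, `T_t`, and `T_adj` are my own names for the isomorphism, the induced map, and the adjoint:

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))   # T : R^3 -> R^3
T_adj = T.T                       # adjoint w.r.t. the standard dot product

Phi = lambda x: (lambda u: float(np.dot(x, u)))  # x |-> <x, .>
T_t = lambda f: (lambda v: f(T @ v))             # induced map: f |-> f o T

# Check (T^t o Phi)(w) == (Phi o T*)(w) as functionals, on a few test vectors
w = rng.standard_normal(3)
for v in rng.standard_normal((5, 3)):
    assert np.isclose(T_t(Phi(w))(v), Phi(T_adj @ w)(v))
```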

There it is, just a fun little nugget for the day.