
Given finite-dimensional vector spaces $V,W$, there is an isomorphism $\text{Hom}(V,W) \rightarrow V^* \otimes W$. In particular, any linear map $\phi : V \rightarrow W$ has a tensor expansion $\sum v^*_i \otimes w_i$ where $v^*_i \in V^*, w_i \in W$.

For example, if one chooses dual bases $\{x_i\}, \{e_i\}$ of $V^*$ and $V$, then $\sum x_i \otimes e_i \in V^* \otimes V$ is a tensor expansion for the identity map on $V$.

What's the best way to intuitively understand tensor expansions of linear maps?
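For concreteness, the identity-map example can be checked numerically. This is just a sketch in numpy; the dimension $4$ and the variable names are arbitrary choices:

```python
import numpy as np

n = 4
# Standard basis e_i of V = R^n; the dual basis x_i picks out the i-th
# coordinate, so as a row vector x_i has the same entries as e_i.
basis = np.eye(n)

# Sum the rank-one pieces e_i x_i (outer products). Each piece sends
# v to x_i(v) e_i, i.e. it keeps only the i-th coordinate of v.
identity = sum(np.outer(basis[i], basis[i]) for i in range(n))

assert np.allclose(identity, np.eye(n))
```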

3 Answers


Not entirely sure what you're looking for so I'll just throw out some thoughts:

Recall that $V^{*}$ is the set of linear maps from $V$ to $\mathbb{R}$. Tensoring with $W$ effectively replaces the $\mathbb{R}$ with $W$. So the star on $V$ represents that the maps are coming from $V$; the lack of one on $W$ represents that you're getting elements of $W$. Likewise, I'd expect that elements of $\text{Hom}(V,W)$ would covary with a change of basis of $V$ and contravary with one of $W$ (if you're unfamiliar with these terms, see the Wikipedia article on covariance and contravariance of vectors).

For a bit more rigor, recall that you have a natural pairing between $V$ and $V^{*}$, which is just a map $$\langle -,- \rangle: V\times V^*\rightarrow\mathbb{R}$$ defined by $\langle v,\alpha\rangle=\alpha(v)$. As you can check, this is invariant under change of basis (hence the "natural") -- this actually follows from $V$ and $V^{*}$ varying oppositely. This also has the property that if $v_i,\alpha^i$ are dual bases for $V$ and $V^{*}$, then $\langle v_i,\alpha^i\rangle=1$, and $\langle v_i,\alpha^j\rangle=0$ when $i\ne j$.
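Here is a small numerical check of that basis-invariance. This is a numpy sketch; the convention that the columns of $P$ are the new basis vectors, and all variable names, are my own choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
v = rng.standard_normal(n)        # coordinates of a vector in the old basis
alpha = rng.standard_normal(n)    # coordinates of a covector (row) in the old dual basis

P = rng.standard_normal((n, n))   # change-of-basis matrix (invertible with probability 1)

# Vector coordinates transform by P^{-1}, covector coordinates by P:
v_new = np.linalg.inv(P) @ v
alpha_new = alpha @ P

# The pairing <v, alpha> = alpha(v) is the same number in both bases.
assert np.isclose(alpha @ v, alpha_new @ v_new)
```

The two transformation rules cancel exactly, which is the "varying oppositely" in the answer.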

So now choose bases $v_i,w_j$ for $V$ and $W$ and a dual basis $\alpha^i$ for $V^{*}$. Given an element $v=\sum b^iv_i$ of $V$ and an element $f=\sum a_i^j\alpha^i\otimes w_j$ of $V^*\otimes W$, we get $$f(v)=\sum_{i,j,k}(a_i^j\alpha^i\otimes w_j)(b^kv_k)=\sum_{i,j,k}(a_i^jb^k\langle v_k,\alpha^i\rangle\otimes w_j).$$ Since the pairing is nonzero only for $k=i$, this reduces to $$f(v)=\sum_{i,j}a_i^jb^i\langle v_i,\alpha^i\rangle\otimes w_j=\sum_{i,j}a_i^jb^iw_j,$$ where the tensor product has disappeared because the output of the pairing is just a real number.

But this is just a vector in $W$! And conversely, given a linear map $V\rightarrow W$, we can pick bases and write the map as a matrix $(a_i^j)$, which then corresponds to the tensor $\sum a_i^j\alpha^i\otimes w_j$ above.
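The triple sum above can be traced numerically. In this numpy sketch (dimensions and names are arbitrary choices), the pairing $\langle v_k,\alpha^i\rangle=\delta_{ik}$ collapses the sum over $k$, leaving the usual matrix-vector product:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal((2, 3))   # a[j, i] = a_i^j, the matrix of f; dim V = 3, dim W = 2
b = rng.standard_normal(3)        # b[i] = b^i, coordinates of v in the basis v_i

v_basis = np.eye(3)               # coordinate vectors of v_1, v_2, v_3
alpha = np.eye(3)                 # rows are the dual basis alpha^1, alpha^2, alpha^3

# Expand the full triple sum, pairing included.
fv = np.zeros(2)
for i in range(3):
    for j in range(2):
        for k in range(3):
            pairing = alpha[i] @ v_basis[k]   # <v_k, alpha^i> = delta_{ik}
            fv[j] += a[j, i] * b[k] * pairing

# Only the k = i terms survive, giving sum_i a_i^j b^i: the matrix-vector product.
assert np.allclose(fv, a @ b)
```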

I found these notes very helpful when I was learning this stuff. You can change the 5 in the URL to other numbers to read all the notes, though I believe they stop at 9.


Here's a very "low-tech" answer, in terms of coordinates. With respect to the dual bases (I take $\dim V=\dim W=2$ for simplicity), the linear transformation with matrix $$\begin{pmatrix} a&b \\ c&d \end{pmatrix}$$ can be expanded as $$a \begin{pmatrix} 1 \\ 0 \end{pmatrix} \begin{pmatrix} 1 & 0 \end{pmatrix} + b \begin{pmatrix} 1 \\ 0 \end{pmatrix} \begin{pmatrix} 0 & 1 \end{pmatrix} + c \begin{pmatrix} 0 \\ 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \end{pmatrix} + d \begin{pmatrix} 0 \\ 1 \end{pmatrix} \begin{pmatrix} 0 & 1 \end{pmatrix}.$$ Here the column vectors are coordinate vectors for elements in $W$, while the row vectors are coordinate vectors for elements in $V^*$.
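The same expansion can be verified with outer products. This is a numpy sketch of the $2\times 2$ case, with arbitrary sample values for $a,b,c,d$:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])          # the matrix (a b; c d) with a=1, b=2, c=3, d=4
a, b = A[0]
c, d = A[1]

e1 = np.array([1., 0.])           # coordinate vectors; used as columns (W) on the
e2 = np.array([0., 1.])           # left of np.outer and as rows (V*) on the right

# The four rank-one terms from the answer, summed.
expansion = (a * np.outer(e1, e1) + b * np.outer(e1, e2)
             + c * np.outer(e2, e1) + d * np.outer(e2, e2))

assert np.allclose(expansion, A)
```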


OK, let's place your question in a more general setting. Throughout, assume that at least one of the vector spaces involved is finite-dimensional.

Let $f\colon U\to U'$ and $g\colon V\to V'$ be linear maps. We can define the "tensor product" of $f$ and $g$ to be the map $f\otimes g$ such that $$ (f\otimes g)(u\otimes v)=f(u)\otimes g(v) $$ for all $u\in U$ and $v\in V$. So we obtain a linear map $$ \lambda\colon\hom(U,U')\otimes \hom(V,V')\to \hom(U\otimes V, U'\otimes V'),$$ which is easily checked to be an isomorphism, provided at least one of the pairs $(U,U')$, $(V,V')$, $(U,V)$ consists of finite-dimensional vector spaces.

With an intelligent choice of the spaces, the isomorphism $\lambda$ lets us prove all sorts of useful identities relating tensor products of spaces and their duals. Try putting $U'=V'=\mathbb{K}$ (the underlying field): you'll obtain $(U\otimes V)^\star\cong U^\star\otimes V^\star$. Try putting $(U,U')=(\mathbb{K},V)$ and $(V,V')=(U,\mathbb{K})$ and relabeling: you'll obtain $$\lambda_{UV}\colon V\otimes U^\star\cong \hom(U,V).$$

Now, let $f\colon U\to V$ be a linear map. Choosing bases $(u_j)$ for $U$ and $(v_i)$ for $V$, we have $$f(u_j) = \sum_i f_i^j v_i$$ for a family $(f_i^j)$ of scalars. Try proving that $$ f = \lambda_{UV}\Big(\sum_{i,j} f_i^j\, v_i\otimes u^j\Big), $$ where $(u^j)$ is the dual basis of $(u_j)$.
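The exercise can at least be verified in coordinates. In this numpy sketch (dimensions and names are arbitrary choices), $\lambda_{UV}(v_i\otimes u^j)$ is the rank-one map $x\mapsto u^j(x)\,v_i$, whose matrix is an outer product:

```python
import numpy as np

rng = np.random.default_rng(3)
F = rng.standard_normal((3, 4))   # matrix of f : U -> V, so f(u_j) = sum_i F[i, j] v_i

m, n = F.shape
V_basis = np.eye(m)               # coordinate vectors of v_1, ..., v_m
U_dual = np.eye(n)                # coordinate (row) vectors of u^1, ..., u^n

# lambda_{UV}(v_i (x) u^j) is the map x -> u^j(x) v_i, with matrix
# equal to the outer product of v_i (column) and u^j (row).
rebuilt = sum(F[i, j] * np.outer(V_basis[i], U_dual[j])
              for i in range(m) for j in range(n))

assert np.allclose(rebuilt, F)
```

Summing the rank-one pieces weighted by $f_i^j$ recovers the matrix of $f$ exactly.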

  • I'm quite fond of these identities, from which many others spring. (2010-10-12)