
When trying to deeply visualize the meaning of the entries of a basis transformation matrix, I realized that a (forward) change of basis matrix can be written in a tensor form like this:

$$Q=Q^i_j \; \vec{e_i}\otimes\tilde{\epsilon}^j$$

For completeness' sake, the backwards transformation matrix could be written like this:

$$P=P^i_j \; \vec{\tilde{e}_i}\otimes\epsilon^j$$

where the tildes denote the new basis, and $\epsilon$ stands for the respective dual bases.

Strangely enough, but appropriately, the basis for the matrix is a tensor product of a basis vector and a basis covector (though from different bases!). In this way, it seems an isomorphism can be made between linear maps and basis transformations, but that they are not strictly one and the same (i.e. basis transformations are not linear maps, and linear maps are certainly not basis transformations).

Does this seem correct? Or is this an abuse of notation?

  • I suppose you need to restrict to invertible linear maps? Commented Aug 24, 2023 at 4:34
  • Basis transformations are not linear maps? Think about the identity and its representations. [Btw, any tensor product of the form you chose is isomorphic to a space of linear maps.] Commented May 7 at 15:33
  • @AndreasCompagnoni Yes, now I know more and can see this. Commented May 17 at 4:24

2 Answers


I will use $a,b$ to denote abstract indices and $i,j,k$ to denote numerical indices.

A change of basis changes how a vector is represented, but it does not change the vector itself; it is the same geometric object before and after the change of basis:

$$v^i \: e_i = \overline v^i \: \overline e_i = v^a$$

A transformation which always outputs the same object as its input is the identity transformation; therefore all change of basis transformations are the identity, including the ones you've written, although it may not be immediately apparent.

The identity transformation for vectors can be expressed as,

$$I^a{}_b = \delta^i{}_j \: e_i \otimes \epsilon^j = e_i \otimes \epsilon^i$$

Contracting with any vector returns the same vector,

$$I^a{}_b \: v^b = v^j \: e_i \otimes \epsilon^i (e_j) = v^j \: e_i \: \delta^i{}_j = v^i \: e_i = v^a$$
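As a quick numerical sanity check of the identity-as-outer-products claim, here is a minimal NumPy sketch (the basis and test vector are arbitrary examples, not from the answer):

```python
import numpy as np

# Identity as a sum of outer products e_i (x) eps^i over a single basis;
# with the standard basis this is literally the identity matrix.
e = np.eye(3)
I = sum(np.outer(e[:, i], e[i, :]) for i in range(3))
assert np.allclose(I, np.eye(3))

# Contracting with any vector returns the same vector.
v = np.array([1.0, -2.0, 0.5])
assert np.allclose(I @ v, v)
```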

If two bases $\{e_i\}$ and $\{\overline e_i\}$ for the same vector space are related by,

$$\overline e_i = Q^j{}_i \: e_j$$

Then we have,

$$e_i = P^j{}_i \: \overline e_j$$

$$\overline \epsilon^i = P^i{}_j \: \epsilon^j$$

$$\epsilon^i = Q^i{}_j \: \overline \epsilon^j$$

where $P^i{}_k \: Q^k{}_j = \delta^i{}_j$.
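These relations are easy to verify numerically. In this sketch (an illustrative example, not from the answer) the columns of a generic random matrix $Q$ are taken as the new basis vectors expressed in the old (standard) basis, so $P = Q^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A generic random matrix is invertible (with probability 1); its
# columns are the new basis vectors in old coordinates: e_bar_i = Q^j_i e_j.
Q = rng.normal(size=(3, 3))
P = np.linalg.inv(Q)        # P^i_k Q^k_j = delta^i_j

e_bar = Q                   # columns = new basis vectors (old coordinates)
eps_bar = P                 # rows    = new dual-basis covectors

# Duality check: eps_bar^i (e_bar_j) = delta^i_j
assert np.allclose(eps_bar @ e_bar, np.eye(3))

# Old basis recovered from the new one: e_i = P^j_i e_bar_j
assert np.allclose(e_bar @ P, np.eye(3))
```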

We can use these relations to show that your P and Q tensors are indeed the identity and that the identity is well defined,

$$I^a{}_b = e_i \otimes \epsilon^i = e_i \otimes (Q^i{}_j \: \overline \epsilon^j) = Q^i{}_j \: e_i \otimes \overline \epsilon^j = Q^a{}_b$$

$$I^a{}_b = e_j \otimes \epsilon^j = (P^i{}_j \: \overline e_i) \otimes \epsilon^j = P^i{}_j \: \overline e_i \otimes \epsilon^j = P^a{}_b$$

$$I^a{}_b = e_i \otimes \epsilon^i = (P^j{}_i \: \overline e_j) \otimes (Q^i{}_k \: \overline \epsilon^k) = P^j{}_i \: Q^i{}_k \: \overline e_j \otimes \overline \epsilon^k = \delta^j{}_k \: \overline e_j \otimes \overline \epsilon^k = \overline e_j \otimes \overline \epsilon^j = I^a{}_b$$
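The same computations can be checked concretely: summing outer products of old basis vectors with new dual covectors, weighted by $Q^i{}_j$ (or new basis vectors with old covectors, weighted by $P^i{}_j$), reproduces the identity matrix. A NumPy sketch, with an arbitrary invertible $Q$ standing in for the change of basis:

```python
import numpy as np

rng = np.random.default_rng(1)
Q = rng.normal(size=(3, 3))   # generic, hence invertible
P = np.linalg.inv(Q)

e = np.eye(3)                 # old basis (columns) and old dual basis (rows)
e_bar = Q                     # new basis vectors as columns
eps_bar = P                   # new dual-basis covectors as rows

# Q^i_j  e_i (x) eps_bar^j : mixed-basis expansion of the identity
I_from_Q = sum(Q[i, j] * np.outer(e[:, i], eps_bar[j, :])
               for i in range(3) for j in range(3))
assert np.allclose(I_from_Q, np.eye(3))

# P^i_j  e_bar_i (x) eps^j : the other mixed-basis expansion
I_from_P = sum(P[i, j] * np.outer(e_bar[:, i], e[j, :])
               for i in range(3) for j in range(3))
assert np.allclose(I_from_P, np.eye(3))
```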

Applying the identity to a vector $v^a = \overline v^k \: \overline e_k$,

$$I^a{}_b \: v^b = Q^i{}_j \: \overline v^k \: e_i \otimes \overline \epsilon^j (\overline e_k) = Q^i{}_j \: \overline v^k \: e_i \: \delta^j{}_k = Q^i{}_j \: \overline v^j \: e_i = v^i \: e_i = v^a$$

We can see that this successfully expresses the vector in a different basis. While the components of the identity may be given by the Kronecker delta when expressed as a linear combination of tensor products of basis vectors and covectors from the same basis, that is not necessarily the case when it is expressed as a linear combination of tensor products of basis vectors from one basis and basis covectors from another.
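To see this numerically: the components transform, $\overline v = P v$, while the geometric vector reassembled from either basis is unchanged. A short sketch under the same assumptions as above (random invertible $Q$, arbitrary test vector):

```python
import numpy as np

rng = np.random.default_rng(2)
Q = rng.normal(size=(3, 3))   # generic, hence invertible
P = np.linalg.inv(Q)

v = np.array([1.0, 2.0, 3.0])  # components v^i in the old basis
v_bar = P @ v                  # components in the new basis: v_bar^i = P^i_j v^j

# Same geometric object: v^i e_i == v_bar^i e_bar_i  (old coords throughout)
assert np.allclose(np.eye(3) @ v, Q @ v_bar)
```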

To answer your question: yes, it is valid, and it is the only way to make sense of the seemingly contradictory facts that a change of basis should not change the vector and yet may have components which are not given by the Kronecker delta.


Wheeler popularized the 'northwest-southeast' index placement for linear basis transformations in Einstein notation in differential geometry, applied here to the tangent and cotangent bases of two coordinate systems.

$$dx^{i'} = \Lambda^{i'}_{\ \ k} dx^k$$

$$ e_k =\partial_k = \Lambda^{i'}_{\ \ k} \ e_{i'} = \Lambda^{i'}_{\ \ k} \ \partial_{i'}$$

$${\Lambda^{-1}}^{i'}_{\ \ k} = \Lambda^k_{\ \ i'}$$

As an intertwiner between two different spaces, the Jacobian matrix can be viewed as the coefficient matrix of

$$ \Lambda = \Lambda^{i'}_{ \ \ k} \ \ e_{i'} \otimes dx^k $$
in the direct sum of all four tangent spaces, replacing the Einstein index summation convention by explicit contraction of mixed tensor products.
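As a concrete illustration of the $\Lambda^{-1}$ relation above (a hypothetical example not taken from the answer: polar coordinates $x = r\cos\theta$, $y = r\sin\theta$), the forward and inverse Jacobians evaluated at one point are mutually inverse:

```python
import numpy as np

r, th = 2.0, 0.7   # arbitrary sample point

# Lambda: d(x, y)/d(r, th) -- forward Jacobian of the coordinate change
L = np.array([[np.cos(th), -r * np.sin(th)],
              [np.sin(th),  r * np.cos(th)]])

# Inverse transform r = sqrt(x^2 + y^2), th = atan2(y, x), differentiated:
L_inv = np.array([[ np.cos(th),      np.sin(th)    ],
                  [-np.sin(th) / r,  np.cos(th) / r]])

# Lambda^{-1} Lambda = identity, as the index relation requires
assert np.allclose(L_inv @ L, np.eye(2))
```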

$\endgroup$
  • So it looks like someone had the same idea as me, only applied to differential geometry! Where can I read more about this and this notation? Commented Aug 25, 2023 at 18:41
  • The influential original is Gravitation (1972) by Misner, Charles W.; Thorne, Kip S.; Wheeler, John Archibald. New York: Freeman, XXVI, 1279 p. Commented Aug 26, 2023 at 5:37
