Jean Marie
Let $V=\begin{pmatrix}a\\b\\c\end{pmatrix}$.

$[V]_{\times}$ denotes the antisymmetric (skew-symmetric) matrix associated with the linear operation "taking the cross product with $V$", i.e., for $X=\begin{pmatrix}x\\y\\z\end{pmatrix}$,

$$[V]_{\times}X:=V \times X=\begin{pmatrix} -cy+bz\\ \ \ \ cx-az\\-bx+ay \end{pmatrix},$$

which means that $$[V]_{\times}=\begin{pmatrix}\ \ \ 0&-c& \ \ \ b\\ \ \ \ c& \ \ \ 0&-a\\-b& \ \ \ a& \ \ \ 0\end{pmatrix}.$$

(This notation is described at https://en.wikipedia.org/wiki/Cross_product.)
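The definition above is easy to check numerically. Below is a minimal sketch (the function name `skew` is my own choice, not standard terminology from the answer) that builds $[V]_{\times}$ from the components of $V$ and verifies both the defining identity $[V]_{\times}X = V\times X$ and the antisymmetry:

```python
import numpy as np

def skew(v):
    """Return [v]_x, the 3x3 antisymmetric matrix with skew(v) @ x == np.cross(v, x)."""
    a, b, c = v
    return np.array([[0.0,  -c,   b],
                     [c,   0.0,  -a],
                     [-b,    a, 0.0]])

V = np.array([1.0, 2.0, 3.0])
X = np.array([4.0, 5.0, 6.0])

# Defining property: multiplying by [V]_x is the same as crossing with V.
assert np.allclose(skew(V) @ X, np.cross(V, X))

# Antisymmetry: [V]_x^T = -[V]_x.
assert np.allclose(skew(V), -skew(V).T)
```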

Remarks:

  1. Very often $V$ is assumed to be a unit vector ($a^2+b^2+c^2=1$).

  2. Here is a very interesting paper (https://arxiv.org/pdf/1312.0788v1.pdf) where this notation is used, with many of its properties collected in its Appendix A.

  3. This notation doesn't look very old; one could post the question of its origin (robotics?) on the History of Science and Mathematics site of Stack Exchange.

Edit:

  1. I had a glance at the source paper you indicated after writing my answer.

In the block-defined matrix, $\hat R^n_{b,k}f_k^b$ has to be a vector. There are two ways to consider this notation:

  • a) either as a single matrix $\hat R^n_{b,k}$, or $\left(\hat R^n_{b}\right)_k$, applied to a vector $f_k^b$, yielding a vector;

  • b) or (much less likely) a linear combination of matrices applied to vectors, with a hidden summation sign according to the Einstein notation (https://en.wikipedia.org/wiki/Einstein_notation): $\sum_{b=1}^m \hat R^n_{b,k}f_k^b$, where the summation is over the index that appears once in a lower and once in an upper position, here the index $b$.
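Interpretation b) can be illustrated numerically. In the sketch below, `R` is a hypothetical stack of $m$ matrices standing in for $\hat R^n_{b,k}$ (with $k$ fixed) and `f` a matching stack of vectors standing in for $f_k^b$; `np.einsum` performs the hidden summation over the repeated index $b$ exactly as the Einstein convention prescribes:

```python
import numpy as np

m = 4
rng = np.random.default_rng(0)
R = rng.standard_normal((m, 3, 3))  # hypothetical matrices indexed by b = 1..m
f = rng.standard_normal((m, 3))     # hypothetical vectors indexed by b = 1..m

# Implicit summation over the repeated index b (Einstein convention).
einsum_result = np.einsum('bij,bj->i', R, f)

# The same quantity with the summation sign written out explicitly.
explicit = sum(R[b] @ f[b] for b in range(m))

assert np.allclose(einsum_result, explicit)
```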

  2. See also the recent document https://people.eecs.berkeley.edu/~wkahan/MathH110/Cross.pdf and the older one (1989) http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.825.1726&rep=rep1&type=pdf.
