Let $V=\begin{pmatrix}a\\b\\c\end{pmatrix}$.
$[V]_{\times}$ denotes the antisymmetric matrix associated with the linear map "take the cross product with $V$": for $X=\begin{pmatrix}x\\y\\z\end{pmatrix}$,
$$[V]_{\times}X:=V \times X=\begin{pmatrix} -cy+bz\\ \ \ \ cx-az\\-bx+ay \end{pmatrix},$$
which means that $$[V]_{\times}=\begin{pmatrix}\ \ \ 0&-c& \ \ \ b\\ \ \ \ c& \ \ \ 0&-a\\-b& \ \ \ a& \ \ \ 0\end{pmatrix}.$$
(This notation is described at https://en.wikipedia.org/wiki/Cross_product.)
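The definition above is easy to check numerically. Here is a minimal NumPy sketch (the function name `cross_matrix` is my own choice) verifying that the matrix acts exactly like the cross product:

```python
import numpy as np

def cross_matrix(v):
    """Return the antisymmetric matrix [v]_x with [v]_x @ x == np.cross(v, x)."""
    a, b, c = v
    return np.array([[0.0,  -c,   b],
                     [  c, 0.0,  -a],
                     [ -b,   a, 0.0]])

v = np.array([1.0, 2.0, 3.0])
x = np.array([4.0, 5.0, 6.0])
assert np.allclose(cross_matrix(v) @ x, np.cross(v, x))
```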
Remarks:
Very often $V$ is assumed to be a unit vector ($a^2+b^2+c^2=1$).
Here is a very interesting paper (https://arxiv.org/pdf/1312.0788v1.pdf) where this notation is used, with many of its properties collected in Appendix A.
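A few of the standard identities of $[V]_{\times}$ can be checked numerically; the ones below are classical properties of the skew-symmetric cross-product matrix (my own selection, not necessarily the exact list in that paper's Appendix A):

```python
import numpy as np

def cross_matrix(v):
    """Antisymmetric matrix of the cross product with v."""
    a, b, c = v
    return np.array([[0.0,  -c,   b],
                     [  c, 0.0,  -a],
                     [ -b,   a, 0.0]])

rng = np.random.default_rng(0)
v = rng.normal(size=3)
M = cross_matrix(v)

# Antisymmetry: [V]_x^T = -[V]_x
assert np.allclose(M.T, -M)
# [V]_x V = V x V = 0
assert np.allclose(M @ v, 0.0)
# [V]_x^2 = V V^T - (V.V) I
assert np.allclose(M @ M, np.outer(v, v) - v.dot(v) * np.eye(3))

# For a unit vector: [V]_x^3 = -[V]_x
u = v / np.linalg.norm(v)
Mu = cross_matrix(u)
assert np.allclose(Mu @ Mu @ Mu, -Mu)
```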
This notation doesn't look very old; perhaps the question of its origin (robotics?) should be posted on the "History of Science and Mathematics" SE site.
Edit:
- After writing my answer, I had a glance at the source paper you indicated.
In the block-defined matrix, $\hat R^n_{b,k}f_k^b$ has to be a vector. There are two ways to consider this notation:
a) either as a single matrix $\hat R^n_{b,k}$ (or $\left(\hat R^n_{b}\right)_k$) applied to a vector $f_k^b$, yielding a vector;
b) or (much less likely) as a linear combination of matrices applied to vectors, with a hidden summation sign according to the Einstein convention (https://en.wikipedia.org/wiki/Einstein_notation): $\sum_{b=1}^m \hat R^n_{b,k}f_k^b$, where the summation is over the index that appears once in a lower and once in an upper position, here the index $b$.
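The two readings can be contrasted in a short NumPy sketch (the shapes, the value $m=4$, and treating $k$ as fixed are hypothetical assumptions, chosen only for illustration):

```python
import numpy as np

# Hypothetical data: m rotation-like matrices R[b] (3x3) and m vectors f[b] (3,),
# indexed by b = 0..m-1; the index k is treated as a fixed label here.
m = 4
rng = np.random.default_rng(1)
R = rng.normal(size=(m, 3, 3))
f = rng.normal(size=(m, 3))

# Reading a): one matrix applied to one vector, for a single chosen b.
single = R[2] @ f[2]

# Reading b): an implicit sum over the repeated index b (Einstein convention).
summed = np.einsum('bij,bj->i', R, f)
assert np.allclose(summed, sum(R[b] @ f[b] for b in range(m)))
```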
- See also this recent document (https://people.eecs.berkeley.edu/~wkahan/MathH110/Cross.pdf) and an older one from 1989 (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.825.1726&rep=rep1&type=pdf).