While working on a problem in quantum mechanics, I rediscovered the notion of a product integral, which I was not familiar with before. In the simple case where the integrand is a real-valued scalar function, taking its logarithm reduces the product integral in a straightforward way to an ordinary integral, which can be evaluated in Mathematica using Integrate or NIntegrate. The problem apparently becomes more complicated when the integrand is a matrix function. In particular, I am interested in a matrix product integral of the form: $$\prod_0^1\exp\!\left(\left[ \begin{array}{cc} 0 & -1 \\ f(x) & 0\\ \end{array} \right]\right)^{\!dx}=\lim_{\Delta x\to 0^+} \left(\,\prod _{x=0,\,\Delta x,\,2\Delta x,\,\dots}^1 \exp\left(\Delta x\left[ \begin{array}{cc} 0 & -1 \\ f(x) & 0\\ \end{array} \right]\right)\right),$$ where $\prod$ denotes the matrix product (Dot), $\exp(\cdot)$ denotes the matrix exponential (MatrixExp), and $f(x)$ is a real-valued, piecewise-smooth function (its particular form depends on the problem we are trying to solve). The notation on the left is borrowed from here; as usual, we write $dx$ to hint at an infinitesimal $\Delta x$.
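For concreteness, the limit above can be approximated by a finite ordered product. Here is a minimal sketch in Python/NumPy (the same loop translates directly to Fold, MatrixExp, and Dot in Mathematica); the function name `product_integral` and the test integrand $f(x)=x$ are my own choices for illustration, and I assume the usual left ordering, where each new factor multiplies on the left:

```python
import numpy as np
from scipy.linalg import expm

def product_integral(f, a=0.0, b=1.0, n=1000):
    """Approximate the left-ordered product of exp(dx * M(x)) over [a, b],
    where M(x) = [[0, -1], [f(x), 0]]."""
    dx = (b - a) / n
    P = np.eye(2)
    for k in range(n):
        x = a + k * dx
        M = np.array([[0.0, -1.0], [f(x), 0.0]])
        P = expm(dx * M) @ P  # new factor multiplies on the left
    return P

# Illustrative integrand f(x) = x
P = product_integral(lambda x: x, n=2000)
```

Since $M(x)$ is traceless, each factor has unit determinant, so $\det P = 1$ serves as a cheap consistency check; convergence in $n$ is only first order, which is why I am asking for something better below.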
As far as I can see, this kind of integral is not directly supported by Mathematica. Symbolic evaluation of such integrals seems to be a difficult problem even for very simple functions $f(x)$ — I could not find any developed theory for it. For now, I would like to find a way to evaluate matrix product integrals numerically to arbitrary precision. Could you suggest a way to do this?
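One standard fact that may help: the product integral over $[a,b]$ equals $Y(b)$, where $Y$ is the fundamental solution of the linear ODE $Y'(x) = M(x)\,Y(x)$, $Y(a) = I$, so a high-order ODE solver with tight tolerances is one route to high accuracy (in Mathematica, NDSolve with WorkingPrecision). A sketch of this idea in Python/SciPy, with the function name and tolerances my own:

```python
import numpy as np
from scipy.integrate import solve_ivp

def product_integral_ode(f, a=0.0, b=1.0, rtol=1e-12):
    # The product integral is the fundamental solution Y(b) of
    # Y'(x) = M(x) Y(x), Y(a) = I, with M(x) = [[0, -1], [f(x), 0]].
    def rhs(x, y):
        Y = y.reshape(2, 2)
        M = np.array([[0.0, -1.0], [f(x), 0.0]])
        return (M @ Y).ravel()
    sol = solve_ivp(rhs, (a, b), np.eye(2).ravel(), rtol=rtol, atol=1e-14)
    return sol.y[:, -1].reshape(2, 2)

Y = product_integral_ode(lambda x: x)
```

For constant $f(x)=c$ the result must reduce to a single matrix exponential, which gives a simple correctness check.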
Note: The matrix exponential appearing in the product can be evaluated as follows: $$\small\exp\left(\left[ \begin{array}{cc} 0 & -1 \\ f(x) & 0\\ \end{array} \right]\right)^{\!h}=\exp\left(h\left[ \begin{array}{cc} 0 & -1 \\ f(x) & 0\\ \end{array} \right]\right)=\left[ \begin{array}{cc} \cos \left(h\sqrt{f(x)}\right) & -h\operatorname{sinc} \left(h\sqrt{f(x)}\right) \\ \sqrt{f(x)} \sin \left(h\sqrt{f(x)}\right) & \cos \left(h \sqrt{f(x)}\right) \\ \end{array} \right]$$ Note: Originally I posted a matrix that had $1$ rather than $-1$ in its top-right corner. Now I’ve fixed this error, and the expression above now correctly contains trigonometric rather than hyperbolic functions.
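As a sanity check (a small sketch, not part of the Mathematica workflow itself), the trigonometric closed form of a single factor can be verified numerically against a generic matrix exponential; the helper name and the sample values $h=0.3$, $f(x)=2$ are arbitrary choices of mine:

```python
import numpy as np
from scipy.linalg import expm

def factor_closed_form(h, f0):
    # exp(h * [[0, -1], [f0, 0]]) for f0 > 0, via the trigonometric formula;
    # note -h*sinc(h*sqrt(f0)) == -sin(h*sqrt(f0))/sqrt(f0).
    s = np.sqrt(f0)
    return np.array([
        [np.cos(h * s),     -np.sin(h * s) / s],
        [s * np.sin(h * s),  np.cos(h * s)],
    ])

h, f0 = 0.3, 2.0
A = factor_closed_form(h, f0)
B = expm(h * np.array([[0.0, -1.0], [f0, 0.0]]))
```

The formula follows from $M^2 = -f(x)\,I$, so $\exp(hM) = \cos(h\sqrt{f})\,I + \frac{\sin(h\sqrt{f})}{\sqrt{f}}\,M$.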
Update: I found a closed-form solution for a quadratic polynomial $f(x)=ax^2+bx+c$; the notation $D_\nu(z)$ stands for ParabolicCylinderD.
(see source)