Here is a proof.
In the following, we fix a commutative ring $\mathbb{K}$. All matrices are over $\mathbb{K}$.
Theorem 1. Let $n\in\mathbb{N}$. Let $S$ be an $n\times n$-matrix. Let $A$ be an alternating $n\times n$-matrix. (This means that $A^{T}=-A$ and that the diagonal entries of $A$ are $0$.) Then, each entry of the matrix $\left( \operatorname*{adj}S\right) \cdot A\cdot\left( \operatorname*{adj}\left( S^{T}\right) \right) $ is divisible by $\det S$ (in $\mathbb{K}$).
[UPDATE: A slight modification of the below proof of Theorem 1 can be found in the solution to Exercise 6.42 in my Notes on the combinatorial fundamentals of algebra, version of 10 January 2019. More precisely, said Exercise 6.42 claims that each entry of the matrix $\left(\operatorname{adj} S\right)^T \cdot A \cdot \left(\operatorname{adj} S\right)$ is divisible by $\det S$; now it remains to substitute $S^T$ for $S$ and recall that $\left(\operatorname{adj} S\right)^T = \operatorname{adj} \left(S^T\right)$, and this immediately yields Theorem 1 above. Still, the following (shorter) version of this proof might be useful as well.]
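[As a sanity check (not part of the proof), Theorem 1 can be verified numerically. The following Python sketch implements the determinant by naive cofactor expansion and the adjugate via the formula $\operatorname{adj} S = \left(\left(-1\right)^{i+j}\det\left(S_{\sim j,\sim i}\right)\right)_{i,j}$; the concrete matrices $S$ and $A$ are my own illustrative choices.]

```python
def det(M):
    # Determinant by cofactor expansion along the first row
    # (fine for small matrices; the empty matrix has determinant 1).
    if not M:
        return 1
    return sum((-1) ** c * M[0][c] * det([row[:c] + row[c + 1:] for row in M[1:]])
               for c in range(len(M)))

def crossed(M, u, v):
    # S_{~u,~v}: M with the u-th row and the v-th column removed (1-indexed).
    return [row[:v - 1] + row[v:] for r, row in enumerate(M, 1) if r != u]

def adj(M):
    # adj S = ((-1)^(i+j) det(S_{~j,~i}))_{1<=i<=n, 1<=j<=n}.
    n = len(M)
    return [[(-1) ** (i + j) * det(crossed(M, j, i)) for j in range(1, n + 1)]
            for i in range(1, n + 1)]

def matmul(A, B):
    return [[sum(A[r][k] * B[k][c] for k in range(len(B)))
             for c in range(len(B[0]))] for r in range(len(A))]

def transpose(M):
    return [list(col) for col in zip(*M)]

S = [[2, 0, 1], [1, 3, 2], [0, 1, 4]]      # det S = 21 (nonzero and not a unit in Z)
A = [[0, 1, -2], [-1, 0, 3], [2, -3, 0]]   # alternating: A^T = -A, zero diagonal
P = matmul(matmul(adj(S), A), adj(transpose(S)))
assert det(S) == 21
assert all(entry % det(S) == 0 for row in P for entry in row)
```

Over $\mathbb{K} = \mathbb{Z}$, divisibility by $\det S$ is a literal statement about integers, which is why an integer matrix with $\det S \neq \pm 1$ makes for a meaningful check.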
The main workhorse of the proof of Theorem 1 is the following result, which is essentially (up to some annoying switching of rows and columns) the Desnanot–Jacobi identity used in Dodgson condensation:
Theorem 2. Let $n\in\mathbb{N}$. Let $S$ be an $n\times n$-matrix. For every $u\in\left\{ 1,2,\ldots,n\right\} $ and $v\in\left\{ 1,2,\ldots ,n\right\} $, we let $S_{\sim u,\sim v}$ be the $\left( n-1\right) \times\left( n-1\right) $-matrix obtained by crossing out the $u$-th row and the $v$-th column in $S$. (Thus, $\operatorname*{adj}S=\left( \left( -1\right) ^{i+j}\det\left( S_{\sim j,\sim i}\right) \right) _{1\leq i\leq n,\ 1\leq j\leq n}$.) For every four elements $u$, $u^{\prime}$, $v$ and $v^{\prime}$ of $\left\{ 1,2,\ldots,n\right\} $ with $u\neq u^{\prime}$ and $v\neq v^{\prime}$, we let $S_{\left( \sim u,\sim u^{\prime}\right) ,\left( \sim v,\sim v^{\prime }\right) }$ be the $\left( n-2\right) \times\left( n-2\right) $-matrix obtained by crossing out the $u$-th and $u^{\prime}$-th rows and the $v$-th and $v^{\prime}$-th columns in $S$. Let $u$, $i$, $v$ and $j$ be four elements of $\left\{ 1,2,\ldots,n\right\} $ with $u\neq v$ and $i\neq j$. Then, \begin{align} & \det\left( S_{\sim i,\sim u}\right) \cdot\det\left( S_{\sim j,\sim v}\right) -\det\left( S_{\sim i,\sim v}\right) \cdot\det\left( S_{\sim j,\sim u}\right) \\ & = \left( -1\right) ^{\left[ i<j\right] +\left[ u<v\right] }\det S\cdot\det\left( S_{\left( \sim i,\sim j\right) ,\left( \sim u,\sim v\right) }\right) . \end{align} Here, we use the Iverson bracket notation (that is, we write $\left[ \mathcal{A}\right] $ for the truth value of a statement $\mathcal{A}$; this is defined by $\left[ \mathcal{A}\right] = \begin{cases} 1, & \text{if }\mathcal{A}\text{ is true;}\\ 0, & \text{if }\mathcal{A}\text{ is false} \end{cases} $).
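[Theorem 2, signs included, can be checked numerically for all admissible quadruples $(i,j,u,v)$ on a small matrix. The Python sketch below does exactly this; the $4\times 4$ integer matrix is an arbitrary illustrative choice of mine.]

```python
def det(M):
    # Determinant by cofactor expansion along the first row;
    # the empty (0 x 0) matrix has determinant 1.
    if not M:
        return 1
    return sum((-1) ** c * M[0][c] * det([row[:c] + row[c + 1:] for row in M[1:]])
               for c in range(len(M)))

def crossed(M, u, v):
    # S_{~u,~v}: remove the u-th row and the v-th column (1-indexed).
    return [row[:v - 1] + row[v:] for r, row in enumerate(M, 1) if r != u]

def crossed2(M, i, j, u, v):
    # S_{(~i,~j),(~u,~v)}: remove rows i, j and columns u, v (1-indexed).
    return [[x for c, x in enumerate(row, 1) if c not in (u, v)]
            for r, row in enumerate(M, 1) if r not in (i, j)]

S = [[1, 2, 0, 1], [3, 1, 4, 1], [5, 0, 2, 3], [1, 1, 1, 2]]
n = len(S)
for i in range(1, n + 1):
    for j in range(1, n + 1):
        for u in range(1, n + 1):
            for v in range(1, n + 1):
                if i == j or u == v:
                    continue
                lhs = (det(crossed(S, i, u)) * det(crossed(S, j, v))
                       - det(crossed(S, i, v)) * det(crossed(S, j, u)))
                # (i < j) and (u < v) are the Iverson brackets [i<j] and [u<v].
                rhs = ((-1) ** ((i < j) + (u < v)) * det(S)
                       * det(crossed2(S, i, j, u, v)))
                assert lhs == rhs
```

Note that no invertibility of $\det S$ is needed here; the identity is an equality in $\mathbb{K}$, so the check is meaningful even when $\det S = 0$.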
There are several ways to prove Theorem 2. I am aware of one argument that derives it from the Plücker relations (the simplest ones, where just one column is being shuffled around).

There is at least one combinatorial argument that proves Theorem 2 in the case when $i = 1$, $j = n$, $u = 1$ and $v = n$ (see Zeilberger's paper); the general case can be reduced to this case by permuting rows and columns (although it is quite painful to track how the signs change under these permutations). (See also a paper by Berliner and Brualdi for a generalization of Theorem 2, with a combinatorial proof too.)

There is at least one short algebraic proof of Theorem 2 (again in the case when $i = 1$, $j = n$, $u = 1$ and $v = n$ only) which relies on "formal" division by $\det S$; that is, it proves that \begin{align} & \det S \cdot \left(\det\left( S_{\sim 1,\sim 1}\right) \cdot\det\left( S_{\sim n,\sim n}\right) -\det\left( S_{\sim 1,\sim n}\right) \cdot\det\left( S_{\sim n,\sim 1}\right) \right) \\ & = \left(\det S\right)^2 \cdot\det\left( S_{\left( \sim 1,\sim n\right) ,\left( \sim 1,\sim n\right) }\right) , \end{align} and then argues that $\det S$ can be cancelled because the determinant of a "generic" square matrix is invertible. (This proof appears in Bressoud's Proofs and Confirmations; a French version can also be found in lecture notes by Yoann Gelineau.)

Unfortunately, none of these proofs seems to spare the reader the annoyance of dealing with the signs. Maybe exterior powers are the best thing to use here, but I do not see how. I have written up a division-free (but laborious and annoying) proof of Theorem 2 in my determinant notes; more precisely, I have written up the proof of the case when $i < j$ and $u < v$, but the general case can easily be obtained from it as follows:
Proof of Theorem 2. We need to prove the equality \begin{align} & \det\left( S_{\sim i,\sim u}\right) \cdot\det\left( S_{\sim j,\sim v}\right) -\det\left( S_{\sim i,\sim v}\right) \cdot\det\left( S_{\sim j,\sim u}\right) \\ & = \left( -1\right) ^{\left[ i<j\right] +\left[ u<v\right] }\det S\cdot\det\left( S_{\left( \sim i,\sim j\right) ,\left( \sim u,\sim v\right) }\right) . \label{darij.eq.1} \tag{1} \end{align} If we interchange $u$ with $v$, then the left hand side of this equality gets multiplied by $-1$ (because its subtrahend and its minuend switch places), whereas the right hand side also gets multiplied by $-1$ (since $S_{\left( \sim i,\sim j\right) ,\left( \sim u,\sim v\right)}$ does not change, but $\left[ u<v\right]$ either changes from $0$ to $1$ or changes from $1$ to $0$). Hence, if we interchange $u$ with $v$, then the equality \eqref{darij.eq.1} does not change its truth value. Thus, we can WLOG assume that $u \leq v$ (since otherwise we can just interchange $u$ with $v$). Assume this. For similar reasons, we can WLOG assume that $i \leq j$; assume this too. From $u \leq v$ and $u \neq v$, we obtain $u < v$. From $i \leq j$ and $i \neq j$, we obtain $i < j$. Thus, Theorem 6.126 in my Notes on the combinatorial fundamentals of algebra, version of 10 January 2019 (applied to $A=S$, $p=i$ and $q=j$) shows that \begin{align} & \det S\cdot\det\left( S_{\left( \sim i,\sim j\right) ,\left( \sim u,\sim v\right) }\right) \\ & = \det\left( S_{\sim i,\sim u}\right) \cdot\det\left( S_{\sim j,\sim v}\right) -\det\left( S_{\sim i,\sim v}\right) \cdot\det\left( S_{\sim j,\sim u}\right) \label{darij.eq.2} \tag{2} \end{align} (indeed, what I am calling $S_{\left( \sim i,\sim j\right) ,\left( \sim u,\sim v\right) }$ here is what I am calling $\operatorname{sub}^{1,2,\ldots,\widehat{u},\ldots,\widehat{v},\ldots,n}_{1,2,\ldots,\widehat{i},\ldots,\widehat{j},\ldots,n} A$ in my notes).
But both $\left[i < j\right]$ and $\left[u < v\right]$ equal $1$ (since $i < j$ and $u < v$). Thus, $\left( -1\right) ^{\left[ i<j\right] +\left[ u<v\right] } = \left(-1\right)^{1+1} = 1$. Therefore \begin{align} & \underbrace{\left( -1\right) ^{\left[ i<j\right] +\left[ u<v\right] }}_{=1}\det S\cdot\det\left( S_{\left( \sim i,\sim j\right) ,\left( \sim u,\sim v\right) }\right) \\ & = \det S\cdot\det\left( S_{\left( \sim i,\sim j\right) ,\left( \sim u,\sim v\right) }\right) \\ & = \det\left( S_{\sim i,\sim u}\right) \cdot\det\left( S_{\sim j,\sim v}\right) -\det\left( S_{\sim i,\sim v}\right) \cdot\det\left( S_{\sim j,\sim u}\right) \end{align} (by \eqref{darij.eq.2}). This proves Theorem 2. $\blacksquare$
Finally, here is an obvious lemma:
Lemma 3. Let $n\in\mathbb{N}$. For every $i\in\left\{ 1,2,\ldots ,n\right\} $ and $j\in\left\{ 1,2,\ldots,n\right\} $, let $E_{i,j}$ be the $n\times n$-matrix whose $\left( i,j\right) $-th entry is $1$ and all of whose other entries are $0$. (This is often called a matrix unit, or an elementary matrix.) Then, every alternating $n\times n$-matrix is a $\mathbb{K}$-linear combination of the matrices $E_{i,j}-E_{j,i}$ for pairs $\left( i,j\right) $ of integers satisfying $1\leq i<j\leq n$.
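[For concreteness: the coefficient of $E_{i,j}-E_{j,i}$ in the decomposition of Lemma 3 is simply the $(i,j)$-th entry of $A$ for $i<j$. A small Python sketch, with an alternating matrix of my own choosing:]

```python
def E(n, i, j):
    # E_{i,j}: the n x n matrix with a 1 in position (i, j) (1-indexed), 0 elsewhere.
    return [[1 if (r, c) == (i, j) else 0 for c in range(1, n + 1)]
            for r in range(1, n + 1)]

n = 3
A = [[0, 5, -2], [-5, 0, 7], [2, -7, 0]]   # alternating: A^T = -A, zero diagonal
# Reconstruct A as sum over i < j of A[i][j] * (E_{i,j} - E_{j,i}).
B = [[0] * n for _ in range(n)]
for i in range(1, n + 1):
    for j in range(i + 1, n + 1):
        coeff = A[i - 1][j - 1]            # the (i, j)-th entry of A, with i < j
        Eij, Eji = E(n, i, j), E(n, j, i)
        for r in range(n):
            for c in range(n):
                B[r][c] += coeff * (Eij[r][c] - Eji[r][c])
assert B == A
```

The assertion works precisely because $A$ is alternating: the $(j,i)$-th entry of the sum is $-A_{i,j} = A_{j,i}$, and the diagonal entries are $0$ on both sides.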
Proof of Theorem 1. We shall use the notation $E_{i,j}$ defined in Lemma 3.
We need to prove that every entry of the matrix $\left( \operatorname*{adj} S\right) \cdot A\cdot\left( \operatorname*{adj}\left( S^{T}\right) \right) $ is divisible by $\det S$. In other words, we need to prove that, for every $\left( u,v\right) \in\left\{ 1,2,\ldots,n\right\} ^{2}$, the $\left( u,v\right) $-th entry of the matrix $\left( \operatorname*{adj} S\right) \cdot A\cdot\left( \operatorname*{adj}\left( S^{T}\right) \right) $ is divisible by $\det S$. So, fix $\left( u,v\right) \in\left\{ 1,2,\ldots,n\right\} ^{2}$.
We need to show that the $\left( u,v\right) $-th entry of the matrix $\left( \operatorname*{adj}S\right) \cdot A\cdot\left( \operatorname*{adj} \left( S^{T}\right) \right) $ is divisible by $\det S$. This statement is clearly $\mathbb{K}$-linear in $A$ (in the sense that if $A_{1}$ and $A_{2}$ are two alternating $n\times n$-matrices such that this statement holds both for $A=A_{1}$ and for $A=A_{2}$, and if $\lambda_{1}$ and $\lambda_{2}$ are two elements of $\mathbb{K}$, then this statement also holds for $A=\lambda_{1}A_{1}+\lambda_{2}A_{2}$). Thus, we can WLOG assume that $A$ has the form $E_{i,j}-E_{j,i}$ for a pair $\left( i,j\right) $ of integers satisfying $1\leq i<j\leq n$ (according to Lemma 3). Assume this, and consider this pair $\left( i,j\right) $.
We have $\operatorname*{adj}S=\left( \left( -1\right) ^{x+y}\det\left( S_{\sim y,\sim x}\right) \right) _{1\leq x\leq n,\ 1\leq y\leq n}$ and \begin{align} \operatorname*{adj}\left( S^{T}\right) & = \left( \left( -1\right) ^{x+y}\det\left( \underbrace{\left( S^{T}\right) _{\sim y,\sim x} }_{=\left( S_{\sim x,\sim y}\right) ^{T}}\right) \right) _{1\leq x\leq n,\ 1\leq y\leq n} \\ & = \left( \left( -1\right) ^{x+y}\underbrace{\det\left( \left( S_{\sim x,\sim y}\right) ^{T}\right) }_{=\det\left( S_{\sim x,\sim y}\right) }\right) _{1\leq x\leq n,\ 1\leq y\leq n} \\ & = \left( \left( -1\right) ^{x+y}\det\left( S_{\sim x,\sim y}\right) \right) _{1\leq x\leq n,\ 1\leq y\leq n} . \end{align} Hence, \begin{align} & \underbrace{\left( \operatorname*{adj}S\right) }_{=\left( \left( -1\right) ^{x+y}\det\left( S_{\sim y,\sim x}\right) \right) _{1\leq x\leq n,\ 1\leq y\leq n}}\cdot\underbrace{A}_{=E_{i,j}-E_{j,i}}\cdot \underbrace{\left( \operatorname*{adj}\left( S^{T}\right) \right) }_{=\left( \left( -1\right) ^{x+y}\det\left( S_{\sim x,\sim y}\right) \right) _{1\leq x\leq n,\ 1\leq y\leq n}} \\ & =\left( \left( -1\right) ^{x+y}\det\left( S_{\sim y,\sim x}\right) \right) _{1\leq x\leq n,\ 1\leq y\leq n}\cdot\left( E_{i,j}-E_{j,i}\right) \\ & \qquad \qquad \cdot\left( \left( -1\right) ^{x+y}\det\left( S_{\sim x,\sim y}\right) \right) _{1\leq x\leq n,\ 1\leq y\leq n} \\ & = \left( \left( -1\right) ^{x+i}\det\left( S_{\sim i,\sim x}\right) \cdot\left( -1\right) ^{j+y}\det\left( S_{\sim j,\sim y}\right) \right. \\ & \qquad \qquad \left. -\left( -1\right) ^{x+j}\det\left( S_{\sim j,\sim x}\right) \cdot\left( -1\right) ^{i+y}\det\left( S_{\sim i,\sim y}\right) \right) _{1\leq x\leq n,\ 1\leq y\leq n} . 
\end{align} Hence, the $\left( u,v\right) $-th entry of the matrix $\left( \operatorname*{adj}S\right) \cdot A\cdot\left( \operatorname*{adj}\left( S^{T}\right) \right) $ is \begin{align} & \left( -1\right) ^{u+i}\det\left( S_{\sim i,\sim u}\right) \cdot\left( -1\right) ^{j+v}\det\left( S_{\sim j,\sim v}\right) -\left( -1\right) ^{u+j}\det\left( S_{\sim j,\sim u}\right) \cdot\left( -1\right) ^{i+v} \det\left( S_{\sim i,\sim v}\right) \\ & = \left( -1\right) ^{i+j+u+v}\left( \det\left( S_{\sim i,\sim u}\right) \cdot\det\left( S_{\sim j,\sim v}\right) -\det\left( S_{\sim i,\sim v}\right) \cdot\det\left( S_{\sim j,\sim u}\right) \right) . \label{darij.eq.3} \tag{3} \end{align} We need to prove that this is divisible by $\det S$. If $u=v$, then this is obvious (because if $u=v$, then the right hand side of \eqref{darij.eq.3} is $0$). Hence, we WLOG assume that $u\neq v$. Thus, \eqref{darij.eq.3} shows that the $\left( u,v\right) $-th entry of the matrix $\left( \operatorname*{adj}S\right) \cdot A\cdot\left( \operatorname*{adj}\left( S^{T}\right) \right) $ is \begin{align} & \left( -1\right) ^{i+j+u+v}\underbrace{\left( \det\left( S_{\sim i,\sim u}\right) \cdot\det\left( S_{\sim j,\sim v}\right) -\det\left( S_{\sim i,\sim v}\right) \cdot\det\left( S_{\sim j,\sim u}\right) \right) }_{\substack{=\left( -1\right) ^{\left[ i<j\right] +\left[ u<v\right] }\det S\cdot\det\left( S_{\left( \sim i,\sim j\right) ,\left( \sim u,\sim v\right) }\right) \\\text{(by Theorem 2)}}} \\ & = \left( -1\right) ^{i+j+u+v}\left( -1\right) ^{\left[ i<j\right] +\left[ u<v\right] }\det S\cdot\det\left( S_{\left( \sim i,\sim j\right) ,\left( \sim u,\sim v\right) }\right) , \end{align} which is clearly divisible by $\det S$. Theorem 1 is thus proven. $\blacksquare$
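[The closing formula for the $(u,v)$-th entry can likewise be confirmed numerically. The Python sketch below (with illustrative matrices of my own choosing) fixes a pair $i<j$, builds $A = E_{i,j}-E_{j,i}$, and checks every entry of $\left(\operatorname{adj} S\right)\cdot A\cdot\operatorname{adj}\left(S^T\right)$ against $\left(-1\right)^{i+j+u+v}\left(-1\right)^{\left[i<j\right]+\left[u<v\right]}\det S\cdot\det\left(S_{\left(\sim i,\sim j\right),\left(\sim u,\sim v\right)}\right)$, and against $0$ when $u=v$.]

```python
def det(M):
    # Determinant by cofactor expansion along the first row; det of 0 x 0 is 1.
    if not M:
        return 1
    return sum((-1) ** c * M[0][c] * det([row[:c] + row[c + 1:] for row in M[1:]])
               for c in range(len(M)))

def crossed(M, u, v):
    # S_{~u,~v}: remove the u-th row and the v-th column (1-indexed).
    return [row[:v - 1] + row[v:] for r, row in enumerate(M, 1) if r != u]

def crossed2(M, i, j, u, v):
    # S_{(~i,~j),(~u,~v)}: remove rows i, j and columns u, v (1-indexed).
    return [[x for c, x in enumerate(row, 1) if c not in (u, v)]
            for r, row in enumerate(M, 1) if r not in (i, j)]

def adj(M):
    # adj S = ((-1)^(x+y) det(S_{~y,~x}))_{1<=x<=n, 1<=y<=n}.
    n = len(M)
    return [[(-1) ** (x + y) * det(crossed(M, y, x)) for y in range(1, n + 1)]
            for x in range(1, n + 1)]

def matmul(A, B):
    return [[sum(A[r][k] * B[k][c] for k in range(len(B)))
             for c in range(len(B[0]))] for r in range(len(A))]

def transpose(M):
    return [list(col) for col in zip(*M)]

n = 4
S = [[1, 2, 0, 1], [3, 1, 4, 1], [5, 0, 2, 3], [1, 1, 1, 2]]
i, j = 2, 4                                # a pair with i < j, as in the proof
A = [[0] * n for _ in range(n)]
A[i - 1][j - 1], A[j - 1][i - 1] = 1, -1   # A = E_{i,j} - E_{j,i}
P = matmul(matmul(adj(S), A), adj(transpose(S)))
for u in range(1, n + 1):
    for v in range(1, n + 1):
        if u == v:
            expected = 0
        else:
            expected = ((-1) ** (i + j + u + v) * (-1) ** ((i < j) + (u < v))
                        * det(S) * det(crossed2(S, i, j, u, v)))
        assert P[u - 1][v - 1] == expected
```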