I am trying to optimize an objective function $L(\theta)$ in which some of the parameters that I aim to recover belong to a covariance matrix, $\Sigma$. $\Sigma$ has a particular structure: ones on the diagonal (only the lower triangle is shown; $\Sigma$ is symmetric): $$ \Sigma = \begin{pmatrix} 1 & & \\ \sigma_{21} & 1 & \\ \sigma_{31} & \sigma_{32} & 1\\ \end{pmatrix} $$
If $\Sigma$ had no such restrictions, one would typically include the entries of the lower-triangular Cholesky factor, $L$, in $\theta$, pass them to the objective function, and recreate $\Sigma = L L'$ inside the objective function. This ensures that, regardless of the values the optimizer tries, you always have a valid covariance matrix (symmetric and positive semidefinite).
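For the unrestricted case, the reconstruction step can be sketched as follows (a minimal NumPy sketch; the function name and the ordering of `theta` are illustrative, not part of the question):

```python
import numpy as np

def sigma_from_theta(theta):
    """Rebuild a 3x3 covariance matrix from the 6 free entries of its
    lower-triangular Cholesky factor L. Sigma = L @ L.T is symmetric
    and positive semidefinite for any real-valued theta."""
    L = np.zeros((3, 3))
    L[np.tril_indices(3)] = theta  # fill the lower triangle row by row
    return L @ L.T

# any real vector yields a valid covariance matrix
theta = np.array([1.0, 0.5, 1.2, -0.3, 0.4, 0.9])
Sigma = sigma_from_theta(theta)
```

Here the optimizer can vary all six entries of `theta` freely over $\mathbb{R}^6$, which is exactly the property lost once the unit-diagonal restriction is imposed below.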
In my case, because $\Sigma$ has ones on the diagonal, only three parameters are identified, which means that I can only pass three parameters from the Cholesky decomposition to my objective function. Let the Cholesky decomposition be written as $$ L = \begin{pmatrix} x_{11} & 0 & 0\\ x_{21} & x_{22} & 0 \\ x_{31} & x_{32} & x_{33} \\ \end{pmatrix} $$
One option is to pass the off-diagonal parameters of $L$, $x_{21}, x_{31}, x_{32}$, and then set $x_{11} = 1$, $x_{22} = \sqrt{1 - x_{21}^2}$, and $x_{33} = \sqrt{1 - x_{31}^2 - x_{32}^2}$ inside the objective function. If I did this, then $LL' = \Sigma$. However, this is suboptimal from an optimization standpoint, because $x_{21}, x_{31}, x_{32}$ cannot vary freely over the parameter space, but rather must satisfy a set of nonlinear constraints $(x_{21}^2 < 1, \; x_{31}^2 + x_{32}^2 < 1)$.
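Concretely, that parametrization with the derived diagonal could look like this (again an illustrative sketch; note that the `sqrt` calls are undefined whenever the optimizer proposes values violating the constraints):

```python
import numpy as np

def sigma_from_offdiag(x21, x31, x32):
    """Rebuild Sigma with unit diagonal from the three off-diagonal
    Cholesky entries. Requires x21**2 < 1 and x31**2 + x32**2 < 1;
    otherwise the square roots below take a negative argument."""
    x22 = np.sqrt(1.0 - x21**2)
    x33 = np.sqrt(1.0 - x31**2 - x32**2)
    L = np.array([
        [1.0, 0.0, 0.0],
        [x21, x22, 0.0],
        [x31, x32, x33],
    ])
    return L @ L.T

Sigma = sigma_from_offdiag(0.3, 0.2, 0.4)  # unit diagonal by construction
```

An unconstrained optimizer stepping to, say, `x21 = 1.1` would make `x22` a NaN, which is exactly the failure mode described below.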
What is the best way of passing three parameters to the objective function, such that I can reconstruct $\Sigma$ inside it without running into problems arising from taking the square root of a negative number?