Is recasting a multivariate linear regression model as a multiple linear regression entirely equivalent? What is different? If they are indeed equivalent, why bother even having resources on multivariate regression, since the literature on multiple regression is much more detailed?
I have read in a few places (Bayesian Data Analysis -- Gelman et al., and Multivariate Old School -- Marden) that a multivariate linear model can easily be reparameterized as a multiple regression. However, neither source elaborates on this at all; they essentially just mention it, then continue using the multivariate model. Mathematically, I'll write the multivariate version first,
$$ \underset{n \times t}{\mathbf{Y}} = \underset{n \times k}{\mathbf{X}} \hspace{2mm}\underset{k \times t}{\mathbf{B}} + \underset{n \times t}{\mathbf{R}},
$$
where the bold variables are matrices with their dimensions shown below them. As usual, $\mathbf{Y}$ is the data, $\mathbf{X}$ is the design matrix, $\mathbf{R}$ is the matrix of normally distributed residuals (rows i.i.d. $\mathrm{N}_t(\mathbf{0}, \boldsymbol{\Sigma})$), and $\mathbf{B}$ is what we are interested in making inferences about.
To reparameterize this as the familiar multiple linear regression, one simply rewrites the variables as:
$$ \underset{nt \times 1}{\mathbf{y}} = \underset{nt \times kt}{\mathbf{D}} \hspace{2mm} \underset{kt \times 1}{\boldsymbol{\beta}} + \underset{nt \times 1}{\mathbf{r}},
$$
where the reparameterizations used are $\mathbf{y} = row(\mathbf{Y})$, $\boldsymbol\beta = row(\mathbf{B})$, $\mathbf{r} = row(\mathbf{R})$, and $\mathbf{D} = \mathbf{X} \otimes \mathbf{I}_{t}$. Here $row()$ means the rows of the matrix are arranged end to end into a single column vector, and $\otimes$ is the Kronecker product.
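As a quick numerical sanity check of this identity, here's a numpy sketch (the dimensions $n$, $k$, $t$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, t = 8, 3, 4                     # arbitrary sizes for illustration

X = rng.standard_normal((n, k))
B = rng.standard_normal((k, t))
R = rng.standard_normal((n, t))
Y = X @ B + R                         # the multivariate model

row = lambda M: M.reshape(-1)         # stack rows end to end (C order)

y    = row(Y)                         # length nt
beta = row(B)                         # length kt
r    = row(R)                         # length nt
D    = np.kron(X, np.eye(t))          # nt x kt stacked design

assert np.allclose(y, D @ beta + r)   # the two parameterizations agree
```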
So, if this is so easy, why bother writing books on multivariate models, test statistics for them, etc.? It seems most effective to just transform the variables first and use common univariate techniques. I'm sure there is a good reason; I'm just having a hard time thinking of one, at least in the case of a linear model. Are there situations with the multivariate linear model and normally distributed errors where this reparameterization does not apply, or where it limits the analyses you can undertake?
EDIT:
Some examples:
1. Say you want to do a global test, like Pillai's trace or Roy's largest root. Why not transform from multivariate to multiple and use an F distribution? It is nice and exact, unlike the approximations used for the multivariate statistics. (See the sketch after this list.)
2. Correcting for heteroscedasticity. I can find many approaches to address this for multiple regression. Not so much for multivariate.
3. Basically, most techniques for multiple regression get very complicated when you try to generalize them to multivariate data. I can't find a reason not to reparameterize my models and then use common regression techniques.
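To make example 1 concrete, here is a minimal numpy/scipy sketch (simulated data under the null) of the route I mean: vectorize, then run an ordinary F test of $H_0\!: \mathbf{B} = \mathbf{0}$ on the stacked model instead of computing Pillai's trace:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, k, t = 50, 2, 3

X = rng.standard_normal((n, k))
Y = rng.standard_normal((n, t))        # simulate under H0: B = 0

D = np.kron(X, np.eye(t))              # nt x kt stacked design
y = Y.reshape(-1)                      # nt stacked responses

# Ordinary no-intercept OLS F test of H0: beta = 0 on the stacked model;
# this treats the nt stacked errors as i.i.d.
beta_hat = np.linalg.lstsq(D, y, rcond=None)[0]
rss = np.sum((y - D @ beta_hat) ** 2)  # full-model residual SS
tss = np.sum(y ** 2)                   # restricted (beta = 0) residual SS
p, N = k * t, n * t
F = ((tss - rss) / p) / (rss / (N - p))
print(F, stats.f.sf(F, p, N - p))      # F(p, N-p) under i.i.d. errors
```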
Obviously, I'm not looking for an explanation of every possible difference. I just can't think of a situation where it makes sense *not* to reparameterize.
Thank you!
NOTE: What I'm asking about is not the same as running $t$ separate multiple regressions. Also, as far as I can tell, the covariance among the DVs isn't destroyed by this reparameterization (if it is, how?).
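To illustrate the note: if the rows of $\mathbf{R}$ are i.i.d. $\mathrm{N}_t(\mathbf{0}, \boldsymbol\Sigma)$, then $row(\mathbf{R}) \sim \mathrm{N}(\mathbf{0}, \mathbf{I}_n \otimes \boldsymbol\Sigma)$, so the DV covariance survives as block-diagonal error covariance in the stacked model. A small simulation checks this ($\boldsymbol\Sigma$ is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(2)
n, t, reps = 4, 2, 200_000

Sigma = np.array([[1.0, 0.8],
                  [0.8, 2.0]])         # arbitrary DV covariance
L = np.linalg.cholesky(Sigma)

# reps draws of the stacked residual vector row(R), each of length n*t
r = (rng.standard_normal((reps, n, t)) @ L.T).reshape(reps, -1)

emp    = np.cov(r, rowvar=False)       # empirical nt x nt covariance
theory = np.kron(np.eye(n), Sigma)     # block-diagonal I_n (x) Sigma
print(np.abs(emp - theory).max())      # ~0 up to Monte Carlo error
```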