
Questions tagged [reduced-rank-regression]

Multivariate multiple linear regression with the constraint that the coefficient matrix has low rank.

0 votes • 1 answer • 120 views

So with reduced rank regression we identify response variables associated with our outcome of interest (Y) and model (e.g., via PROC PLS) the relationships between the independent variables, say dietary data, vs. the ...
asked by Patsy
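
For readers landing on this tag, here is a minimal sketch of least-squares reduced-rank regression in Python (not the poster's SAS/PROC PLS workflow): fit OLS, then project the coefficients onto the top right singular directions of the fitted responses. The data shapes, the simulated matrices, and the rank `r=2` are illustrative assumptions.

```python
import numpy as np

def reduced_rank_regression(X, Y, r):
    """Least-squares reduced-rank regression (identity error covariance).

    Fits B_ols by OLS, then projects it onto the top-r right singular
    directions of the fitted values Y_hat = X @ B_ols.
    """
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)    # p x q OLS coefficients
    Y_hat = X @ B_ols                                # fitted responses
    _, _, Vt = np.linalg.svd(Y_hat, full_matrices=False)
    P = Vt[:r].T @ Vt[:r]                            # rank-r projector in response space
    return B_ols @ P                                 # coefficient matrix of rank <= r

# toy usage with simulated data (shapes are illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
B_true = rng.normal(size=(10, 2)) @ rng.normal(size=(2, 6))   # rank-2 truth
Y = X @ B_true + 0.1 * rng.normal(size=(200, 6))
B_hat = reduced_rank_regression(X, Y, r=2)
print(np.linalg.matrix_rank(B_hat))  # 2
```
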
3 votes • 0 answers • 394 views

A main benefit of Gaussian process regression is that we not only get a prediction but also a variance that we might use as an indication of prediction confidence. While Bayesian linear regression ...
asked by 0-_-0
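
As a hedged illustration of the predictive variance the question refers to, scikit-learn's GaussianProcessRegressor can return a per-point standard deviation alongside the mean; the kernel choice and the toy data below are assumptions, not anything from the original post.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# toy 1-D data (illustrative)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(30, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=30)

# RBF kernel plus a noise term; hyperparameters are tuned by maximizing the marginal likelihood
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)

X_new = np.linspace(0, 10, 100).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)  # predictive mean and per-point std
```
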
0 votes • 0 answers • 147 views

I have two multidimensional datasets, X and Y. I thought I would build a model that explains the relationship between the two datasets using canonical correlation analysis (CCA). The first correlation ...
asked by kiddwill
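
A minimal sketch of fitting CCA to two multidimensional datasets with scikit-learn, assuming simulated X and Y that share a low-dimensional latent structure (the shapes and noise levels are illustrative):

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# two coupled multidimensional datasets (simulated for illustration)
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 2))
X = latent @ rng.normal(size=(2, 8)) + 0.5 * rng.normal(size=(500, 8))
Y = latent @ rng.normal(size=(2, 5)) + 0.5 * rng.normal(size=(500, 5))

cca = CCA(n_components=2)
X_c, Y_c = cca.fit_transform(X, Y)

# correlation of the first pair of canonical variates
first_corr = np.corrcoef(X_c[:, 0], Y_c[:, 0])[0, 1]
print(first_corr)
```
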
2 votes • 1 answer • 84 views

In sparse optimization, I am trying to solve the problem $$ \min_{x\in \mathbb R^{n}} \quad f(x) + \|x\|_1 $$ and at optimality, $x^*$ may be sparse. If I define the sparse manifold as $\mathcal M = ...
asked by Y. S.
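
For the special case $f(x) = \tfrac12\|Ax-b\|_2^2$, a minimal proximal-gradient (ISTA) sketch shows how the $\ell_1$ term produces a sparse $x^*$ via soft-thresholding; the regularization weight `lam` and the step-size rule are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam=1.0, n_iter=500):
    """Proximal gradient for min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x                                 # typically sparse at convergence
```
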
2 votes • 0 answers • 122 views

I am trying to optimize a panel regression $G = \beta G + e$, where $G \in \mathbb{R}^{N\times T}$ and $\beta \in \mathbb{R}^{N\times N}$ is an unknown coefficient matrix, constrained to $\operatorname{diag}(\beta) = 0$ and reduced rank $\operatorname{rank}(\beta) \leq r$. ...
asked by kangyin ye
0 votes • 1 answer • 1k views

I am interested in running a partial least squares analysis using PROC PLS in SAS 9.4. I understand that, by default, the predictors and response variables in PLS are centered to a mean of 0 and scaled ...
asked by Mark G
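
Not SAS, but for comparison: in scikit-learn's PLSRegression the analogous `scale=True` option (the default) centers each column of X and Y and scales it to unit variance before extracting components. The data below are simulated placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=3.0, size=(100, 6))
Y = X[:, :2] @ rng.normal(size=(2, 3)) + rng.normal(size=(100, 3))

# scale=True (the default) centers X and Y and scales each column to unit variance
pls = PLSRegression(n_components=2, scale=True)
pls.fit(X, Y)

T = pls.transform(X)    # latent X scores, shape (100, 2)
Y_pred = pls.predict(X) # predictions on the original (unscaled) response scale
```
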
1 vote • 0 answers • 78 views

Consider a rank-$k$ matrix, call it $M$, of size $n \times m$. All the elements are non-negative. Now take a noisy observation of it and assume independent Poisson errors (the error on element $M_{ij}$ is ...
asked by Patrick
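
A small sketch of the setup, assuming the goal is just to see how a naive rank-$k$ truncated-SVD estimate recovers $M$ from Poisson-corrupted entries (this is not a Poisson maximum-likelihood fit, and all sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 200, 100, 3

# non-negative rank-k matrix M (illustrative construction)
M = rng.uniform(0.5, 2.0, size=(n, k)) @ rng.uniform(0.5, 2.0, size=(k, m))

# noisy observation: each entry is Poisson with mean M_ij
N = rng.poisson(M)

# naive estimate: truncate the SVD of the observation to rank k
U, s, Vt = np.linalg.svd(N, full_matrices=False)
M_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

print(np.linalg.norm(M_hat - M) / np.linalg.norm(M))  # relative recovery error
```
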
6 votes • 1 answer • 425 views

In grad school, I was always taught the general linear model $$\mathbf{y} = \mathbf{X}\boldsymbol\beta + \boldsymbol\epsilon\tag{1}$$ where $\mathbf{y}$ is a vector, $\mathbf{X}$ is some matrix, $\...
asked by Clarinetist
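
For reference, the least-squares estimate under model (1), assuming $\mathbf{X}$ has full column rank, is

$$\hat{\boldsymbol\beta} = (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top\mathbf{y}, \qquad \hat{\mathbf{y}} = \mathbf{X}\hat{\boldsymbol\beta} = \mathbf{X}(\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top\mathbf{y}.$$

The multivariate-response version stacks one such coefficient column per outcome into a matrix, which is where the reduced-rank constraint of this tag enters.
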
3 votes • 1 answer • 577 views

I am trying to find dietary patterns related to a disease outcome. Unfortunately, I only have the binary outcome "disease yes/no". I tried to perform PCA on the data, but the dietary ...
asked by user131483
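
As a hedged sketch of one common workaround (not necessarily what the poster wants): run PCA on the standardized dietary matrix, then relate the component scores to the binary outcome with logistic regression. The placeholder data and the choice of 3 components are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# placeholder dietary matrix (subjects x food items) and binary disease outcome
rng = np.random.default_rng(0)
X_diet = rng.poisson(3.0, size=(300, 20)).astype(float)
y_disease = rng.integers(0, 2, size=300)

# standardize, reduce to a few dietary "patterns", then model the binary outcome
model = make_pipeline(StandardScaler(), PCA(n_components=3), LogisticRegression())
model.fit(X_diet, y_disease)
print(model.predict_proba(X_diet[:5]))
```
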
3 votes • 0 answers • 2k views

What is meant by the term "meta-parameter"? Can a definition, informal and/or formal, be provided? For example, in reduced-rank regression, the rank ($r$) can be referred to as a meta-parameter of ...
asked by Graeme Walsh
10 votes • 1 answer • 861 views

This question results from the discussion following a previous question: What is the connection between partial least squares, reduced rank regression, and principal component regression? For ...
asked by Minkov
23 votes • 1 answer • 6k views

Are reduced rank regression and principal component regression just special cases of partial least squares? This tutorial (Page 6, "Comparison of Objectives") states that when we do partial least ...
asked by Minkov
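
For context, one standard way to line up the three methods for a single direction $w$ with $\|w\| = 1$ (stated here as a summary, not a quote from that tutorial):

$$
\begin{aligned}
\text{PCR:}&\quad \max_{\|w\|=1}\ \operatorname{Var}(\mathbf{X}w),\\
\text{PLS:}&\quad \max_{\|w\|=1}\ \operatorname{Cov}^2(\mathbf{y},\mathbf{X}w) \;=\; \operatorname{Corr}^2(\mathbf{y},\mathbf{X}w)\,\operatorname{Var}(\mathbf{X}w)\,\operatorname{Var}(\mathbf{y}),\\
\text{RRR/CCA:}&\quad \max_{\|w\|=1}\ \operatorname{Corr}^2(\mathbf{y},\mathbf{X}w).
\end{aligned}
$$

Since $\operatorname{Var}(\mathbf{y})$ is a constant factor, PLS trades off the correlation criterion of RRR/CCA against the variance criterion of PCR.
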
5 votes • 1 answer • 3k views

Given two vectors of random variables $X$ and $Y$, Canonical Correlation Analysis (CCA) finds the transformation matrices $A$ and $B$ so that $\operatorname{corr}(A_{1*} X, B_{1*} Y)$ is first maximal,...
asked by statotito
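
A compact way to construct such $A$ and $B$ (a sketch, assuming $\Sigma_{XX}$ and $\Sigma_{YY}$ are invertible) is through the SVD of the whitened cross-covariance:

$$\Sigma_{XX}^{-1/2}\,\Sigma_{XY}\,\Sigma_{YY}^{-1/2} = U D V^\top, \qquad A = U^\top\Sigma_{XX}^{-1/2}, \quad B = V^\top\Sigma_{YY}^{-1/2},$$

so that $\operatorname{corr}(A_{i*} X, B_{i*} Y) = D_{ii}$, with the first pair attaining the largest canonical correlation.
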
45 votes • 2 answers • 25k views

I have been reading The Elements of Statistical Learning and I could not understand what Section 3.7 "Multiple outcome shrinkage and selection" is all about. It talks about RRR (reduced-rank ...
asked by cgo
13 votes • 1 answer • 5k views

I am trying to learn Reduced-Rank Regression (RRR) from The Elements of Statistical Learning. I find the writing and the mathematics a little too prohibitive. Do any of you have a resource/text/...
asked by cgo
