This was a question on a past final that we can't figure out. Take the least squares problem
$$\min_x \|Ax-b\|_2\, ,$$
where $A\in\mathbb{R}^{m\times n}$, $m<n$, and $A$ is full rank. $A$ has $\mathcal{O}(n)$ non-zero entries. Assume you only have $\mathcal{O}(n)$ computer memory.
- Suggest an algorithm for approximating the solution to the least squares problem
- What is the expected number of iterations and overall computational cost?
- What is the expected error in the solution $x$? (Provide an appropriate definition of error for this problem.)
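For concreteness, here is a small sketch (in Python with SciPy) of the kind of instance described above: a sparse $A$ with $m<n$ and roughly $\mathcal{O}(n)$ non-zeros, plus one possible definition of error (relative distance to a dense reference solution). The sizes, sparsity pattern, and error measure are my own choices for testing candidate algorithms, not part of the exam question.

```python
# Hypothetical test instance for the setup above: sparse A in R^{m x n}, m < n,
# with O(n) nonzero entries. Dimensions, sparsity pattern, and the
# relative-error definition are my own choices, not part of the exam question.
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(1)
m, n = 500, 2000                       # m < n, underdetermined

# Roughly 3 nonzeros per column -> O(n) nonzeros total; a random sparse matrix
# like this is almost surely full rank.
nnz = 3 * n
rows = rng.integers(0, m, size=nnz)
cols = rng.integers(0, n, size=nnz)
vals = rng.standard_normal(nnz)
A = sp.coo_matrix((vals, (rows, cols)), shape=(m, n)).tocsr()

b = rng.standard_normal(m)

# Dense reference (only feasible at small test sizes): the minimum-norm
# least-squares solution from LAPACK, used here just to define an error.
x_ref, *_ = np.linalg.lstsq(A.toarray(), b, rcond=None)

def rel_error(x, x_ref=x_ref):
    """One possible error measure: relative distance to the reference solution."""
    return np.linalg.norm(x - x_ref) / np.linalg.norm(x_ref)
```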
We've only learned a little about iterative methods (Rayleigh Quotient Iteration, Inverse Iteration, Power Iteration, and Conjugate Gradient), and none of these seems to help here. Any ideas would be much appreciated.
Edit: I'm now wondering: what if we take $m$ random columns so that a square matrix can be formed, and then use CG on that system? And then repeat this with various random column choices and average the solutions?
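To pin down what I mean, here is a rough sketch in Python/SciPy (the function name and parameters are just placeholders). Since plain CG needs a symmetric positive definite matrix, the sketch applies CG to the normal equations of each sampled square system; components of $x$ outside the sampled columns are left at zero, and the sampled submatrix may well be singular for some draws, so this only illustrates the idea, with no claim that it works.

```python
# Sketch of the column-sampling idea from the edit above. The helper name and
# parameters are hypothetical. CG is applied to the normal equations of each
# m x m sampled system, since plain CG requires a symmetric positive definite
# matrix; unsampled components of x are left at zero before averaging.
import numpy as np
from scipy.sparse.linalg import cg

def sampled_cg_lsq(A, b, n_samples=20, seed=0):
    m, n = A.shape                      # A is m x n with m < n, sparse
    rng = np.random.default_rng(seed)
    x_avg = np.zeros(n)
    for _ in range(n_samples):
        cols = rng.choice(n, size=m, replace=False)   # m random columns
        A_s = A[:, cols]                              # m x m square submatrix
        # Normal equations of the square system: (A_s^T A_s) y = A_s^T b
        M = (A_s.T @ A_s).tocsc()
        y, info = cg(M, A_s.T @ b)                    # info != 0 if CG fails
        x = np.zeros(n)
        x[cols] = y                                   # embed back into R^n
        x_avg += x
    return x_avg / n_samples
```

With the toy instance sketched after the questions above, `rel_error(sampled_cg_lsq(A, b))` would be one way to check how well (or poorly) this actually does.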