Given vectors $x_1, \dots, x_n \in \mathbb{R}^d$, I am seeking a solution to the following optimization problem:
$$ \underset{v \in \mathbb{R}^d}{\arg\max} \sum_{i=1}^n \left| \langle v, x_i \rangle \right| \quad\text{subject to}\quad \lVert v \rVert = 1 $$
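(Equivalently, just restating the objective compactly: if $X \in \mathbb{R}^{n \times d}$ denotes the matrix whose rows are the $x_i^\top$, the problem is
$$ \underset{\lVert v \rVert = 1}{\arg\max} \ \lVert X v \rVert_1, $$
i.e., maximizing the $\ell_1$ norm of $Xv$ over the Euclidean unit sphere; $X$ is only introduced here for brevity.)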
So I'm trying to find a unit-length vector $v$ that maximizes the sum of absolute dot products. I thought about solving this with Lagrange multipliers, but I don't think that's the way to go. Anyway, here's the derivative (valid wherever none of the inner products is zero):
$$\frac{\partial}{\partial v} \sum_{i=1}^n |\langle v, x_i \rangle| = \sum_{i=1}^n \text{sgn}(\langle v, x_i \rangle) \cdot x_i$$
Finding a root of this gradient turns out to be a combinatorial problem: $v$ has to be chosen so that the signs it assigns to the $x_i$ make the sum vanish, so quite possibly no root exists at all. On second thought, the objective is a piecewise linear function of $v$, so it didn't make sense to try gradient-based methods in the first place.
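To spell out where the combinatorics comes from: for any fixed $v$ we have $\sum_i |\langle v, x_i \rangle| = \max_{s \in \{-1,+1\}^n} \langle v, \sum_i s_i x_i \rangle$ (attained by $s_i = \operatorname{sgn}(\langle v, x_i \rangle)$), so swapping the two maximizations gives
$$ \max_{\lVert v \rVert = 1} \sum_{i=1}^n |\langle v, x_i \rangle| = \max_{s \in \{-1,+1\}^n} \max_{\lVert v \rVert = 1} \Big\langle v, \sum_{i=1}^n s_i x_i \Big\rangle = \max_{s \in \{-1,+1\}^n} \Big\lVert \sum_{i=1}^n s_i x_i \Big\rVert, $$
with the inner maximum attained at $v = \sum_i s_i x_i / \lVert \sum_i s_i x_i \rVert$. So, unless I'm mistaken, the problem amounts to choosing, among $2^n$ candidates, the sign pattern $s$ that maximizes the length of the signed sum.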
So what class of optimization problems does this fall into? And are there related problems with known solutions?
I could think of a heuristic approach: sort all $x_i$ by length (assume the $x_i$ are sorted in descending order of length), initialize $v = \frac{x_1}{\lVert x_1 \rVert}$, and then let each subsequent $x_i$ turn $v$ a little bit towards (or away from) its direction according to its 'importance', i.e., its length. Something like this: $$ v^{(i)} := v^{(i-1)} + \text{sgn}(\langle v^{(i-1)}, x_i \rangle) \cdot \frac{x_i}{\lVert x_i \rVert} \cdot w(\lVert x_i \rVert) \quad \text{with some appropriate weight function $w$}$$ $$ v^{(i)} := \frac{v^{(i)}}{\lVert v^{(i)} \rVert} \quad \text{(normalize $v$)}$$
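In code, a minimal NumPy sketch of that heuristic might look as follows (the function names and the default weight $w(r) = r$ are just placeholder choices of mine):

```python
import numpy as np

def greedy_direction(xs, w=lambda r: r):
    """One pass of the heuristic above.

    xs : array of shape (n, d), rows are the x_i
    w  : weight function applied to the length of x_i
         (w(r) = r is only a placeholder assumption)
    """
    xs = np.asarray(xs, dtype=float)
    norms = np.linalg.norm(xs, axis=1)
    order = np.argsort(-norms)                 # sort by length, descending
    xs, norms = xs[order], norms[order]

    v = xs[0] / norms[0]                       # start from the longest vector
    for x, r in zip(xs[1:], norms[1:]):
        s = np.sign(v @ x)
        if s == 0:                             # arbitrary choice when <v, x_i> = 0
            s = 1.0
        v = v + s * (x / r) * w(r)             # pull v towards (or away from) x_i
        v = v / np.linalg.norm(v)              # keep v on the unit sphere
    return v

def objective(v, xs):
    """The quantity to maximize: sum_i |<v, x_i>|."""
    return np.abs(np.asarray(xs) @ v).sum()
```

A quick sanity check would be to compare `objective(greedy_direction(xs), xs)` against the values attained by random unit vectors, or against brute force over sign patterns for small $n$.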
Considering D.W.'s pointer to a similar problem, there may be a connection to support vector machines. Or maybe the connection is to the corresponding dual problem, in which case the objective and constraints would need to be rewritten.
Or maybe one could instead minimize $\lVert v \rVert$ subject to $|\langle v, x_i \rangle| \geq 1$ for all $i$. I'm not sure how that relates to the original problem, though, or whether it is equivalent.