You want the least-squares solution of $$ \left[\begin{array}{rrr}1 & 0 & 2 \\ 0 & 1 & 3 \\ -1 & 1 & 1 \\ 0 & -1 & -3 \end{array}\right]x =\left[\begin{array}{r}1 \\ 0 \\ 0 \\ 1\end{array}\right]. $$ That is, you want to minimize the Euclidean distance between the two sides. Let $c_{1}$, $c_{2}$, $c_{3}$ denote the column vectors of the coefficient matrix. Applying Gram–Schmidt, the following is orthogonal to $c_1$: $$ c_{2}' = c_{2}-\frac{(c_2,c_1)}{(c_1,c_1)}c_1=c_2+\frac{1}{2}c_1 =\left[\begin{array}{c} 1/2 \\ 1 \\ 1/2 \\ -1 \end{array}\right]. $$ And the following is orthogonal to both $c_1$ and $c_2'$: $$ c_{3}'=c_3-\frac{(c_3,c_1)}{(c_1,c_1)}c_1 -\frac{(c_3,c_2')}{(c_2',c_2')}\left(c_2+\frac{1}{2}c_1\right) \\ = c_{3}-\frac{1}{2}c_1-\frac{15/2}{5/2}\left(c_2+\frac{1}{2}c_1\right) \\ = c_{3}-2c_1 -3c_2 = \left[\begin{array}{c} 0\\0\\0\\0\end{array}\right]. $$ So $c_3=2c_1+3c_2$, which means $x_1=-2$, $x_2=-3$, $x_3=1$ is in the null space of the coefficient matrix, and the least-squares solution is not unique. We'll therefore choose the particular least-squares solution that minimizes $x_1^{2}+x_2^{2}+x_3^{2}$. In other words, two minimization problems are required to solve this problem.
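The Gram–Schmidt steps above are easy to check numerically. Here's a quick sketch with NumPy (the matrix `A` is just the coefficient matrix above; `c2p` and `c3p` stand for $c_2'$ and $c_3'$):

```python
import numpy as np

# Columns of the coefficient matrix.
A = np.array([[1, 0, 2],
              [0, 1, 3],
              [-1, 1, 1],
              [0, -1, -3]], dtype=float)
c1, c2, c3 = A.T

# Component of c2 orthogonal to c1: c2' = c2 + (1/2) c1.
c2p = c2 - (c2 @ c1) / (c1 @ c1) * c1

# Component of c3 orthogonal to c1 and c2' -- it vanishes,
# so c3 = 2 c1 + 3 c2 and (-2, -3, 1) lies in the null space.
c3p = c3 - (c3 @ c1) / (c1 @ c1) * c1 - (c3 @ c2p) / (c2p @ c2p) * c2p

print(c2p)                                         # [ 0.5  1.   0.5 -1. ]
print(np.allclose(c3p, 0))                         # True
print(np.allclose(A @ np.array([-2, -3, 1]), 0))   # True
```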
Let $c_{4}$ denote the column vector on the right. The residual of a best solution $x$ is the component of $c_4$ orthogonal to the column space: $$ c_{4} - \frac{(c_4,c_1)}{(c_1,c_1)}c_1-\frac{(c_4,c_2')}{(c_2',c_2')}\left(c_2+\frac{1}{2}c_1\right) \\ = c_4 -\frac{1}{2}c_1-\frac{-1/2}{5/2}\left(c_2+\frac{1}{2}c_1\right) \\ = c_4 -\frac{2}{5}c_1+\frac{1}{5}c_2. $$ The projection of $c_4$ onto the column space is therefore $\frac{2}{5}c_1-\frac{1}{5}c_2$, so a least-squares solution is $$ x = \left[\begin{array}{c}2/5 \\ -1/5 \\ 0\end{array}\right]. $$ But we may vary this over the null space of the matrix: $$ x=\left[\begin{array}{c}2/5 \\ -1/5 \\ 0\end{array}\right] +\alpha\left[\begin{array}{c}2 \\ 3 \\ -1\end{array}\right]. $$ The solution vector of smallest length is obtained by choosing $\alpha$ so that the result is orthogonal to the null-space vector $(2,3,-1)^{T}$, namely $\alpha=-\frac{1/5}{14}$: $$ x=\left[\begin{array}{c}2/5 \\ -1/5 \\ 0\end{array}\right]-\frac{1/5}{14}\left[\begin{array}{c}2 \\ 3 \\ -1\end{array}\right] = \left[\begin{array}{r}13/35 \\ -17/70 \\ 1/70\end{array}\right]. $$
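As a sanity check: for a rank-deficient matrix, `numpy.linalg.lstsq` returns precisely the minimum-norm least-squares solution, so it should reproduce the final vector (names `A`, `b`, `x0` are just for this sketch):

```python
import numpy as np

A = np.array([[1, 0, 2],
              [0, 1, 3],
              [-1, 1, 1],
              [0, -1, -3]], dtype=float)
b = np.array([1, 0, 0, 1], dtype=float)

# lstsq (SVD-based) gives the minimum-norm least-squares solution.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)  # approximately [13/35, -17/70, 1/70]

# It differs from the particular solution (2/5, -1/5, 0) by the
# null-space multiple computed above: alpha = -(x0 . v) / (v . v).
x0 = np.array([2/5, -1/5, 0])
v = np.array([2.0, 3.0, -1.0])
print(np.allclose(x, x0 - (x0 @ v) / (v @ v) * v))  # True
```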