
Suppose you are given $6$ one-dimensional points: $3$ with negative labels, $x_1 = -1$, $x_2 = 0$, $x_3 = 1$, and $3$ with positive labels, $x_4 = -3$, $x_5 = -2$, $x_6 = 3$. In this question, we first compare the performance of a linear classifier with and without a kernel. Then we solve for the maximum-margin classifier using an SVM.

Consider a linear classifier of the form $f(x) = \operatorname{sign}(w_1x+w_0)$. Write down an optimal value of $w = (w_1, w_0)$ and its classification accuracy on the above $6$ points. There may be more than one optimal solution; writing down one of them is enough.

My attempt:
I understand that the data isn't linearly separable and that there will be some error, but I don't see how to find the optimal value of $w$. Do I minimize $f(x)$? But how do I take the derivative of $f(x)$? Any guidance would be appreciated; I'm a little lost.


1 Answer


The best we can do is classify $5$ of the $6$ points correctly and sacrifice one point.

We want to classify $-1, 0, 1$ as negative and $-3, -2$ as positive (we have to sacrifice the point $x_6 = 3$).

The decision boundary with maximum margin lies midway between $-2$ and $-1$, that is, at $x = -\frac32$.

$$f(x) = \operatorname{sign}(-(2x+3)) = \operatorname{sign}(-2x-3),$$

so one optimal solution is $w_1 = -2$, $w_0 = -3$, with classification accuracy $\frac56$ (only $x_6 = 3$ is misclassified).
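As a quick sanity check (an illustrative sketch, not part of the original solution), we can evaluate this classifier on the six labeled points and confirm it gets $5$ of $6$ correct:

```python
# Labeled points: (x, label), with labels -1 (negative) and +1 (positive).
points = [(-1, -1), (0, -1), (1, -1),   # x1, x2, x3: negative labels
          (-3, +1), (-2, +1), (3, +1)]  # x4, x5, x6: positive labels

def f(x, w1=-2.0, w0=-3.0):
    """Linear classifier sign(w1*x + w0); sign(0) is treated as +1."""
    return 1 if w1 * x + w0 >= 0 else -1

correct = sum(f(x) == y for x, y in points)
print(f"{correct}/{len(points)} correct")  # only x6 = 3 is misclassified
```

The boundary $-2x - 3 = 0$ sits exactly at $x = -\frac32$, matching the maximum-margin location derived above.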

