The following is the cost function of logistic regression, derived via Maximum Likelihood Estimation:

$$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Big[\,y^{(i)}\log h_{\theta}(x^{(i)}) + \big(1 - y^{(i)}\big)\log\big(1 - h_{\theta}(x^{(i)})\big)\Big]$$

which, for a single example, reduces to

$$\mathrm{Cost}\big(h_{\theta}(x), y\big) = \begin{cases} -\log h_{\theta}(x) & \text{if } y = 1 \\ -\log\big(1 - h_{\theta}(x)\big) & \text{if } y = 0 \end{cases}$$
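For context, this is the standard Bernoulli-likelihood derivation behind that cost (sketched here briefly): each label is modelled as

$$p(y \mid x; \theta) = h_{\theta}(x)^{\,y}\,\big(1 - h_{\theta}(x)\big)^{1-y},$$

so the log-likelihood of the training set is

$$\ell(\theta) = \sum_{i=1}^{m}\Big[\,y^{(i)}\log h_{\theta}(x^{(i)}) + \big(1 - y^{(i)}\big)\log\big(1 - h_{\theta}(x^{(i)})\big)\Big],$$

and maximizing $\ell(\theta)$ is equivalent to minimizing $J(\theta) = -\ell(\theta)/m$ above.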

  • If $y = 1$ (positive): i) cost = 0 if the prediction is correct (i.e. $h_{\theta}(x) = 1$), ii) cost $\rightarrow \infty$ if $h_{\theta}(x) \rightarrow 0$.
  • If $y = 0$ (negative): i) cost = 0 if the prediction is correct (i.e. $h_{\theta}(x) = 0$), ii) cost $\rightarrow \infty$ if $\big(1 - h_{\theta}(x)\big) \rightarrow 0$.

The intuition is that larger mistakes should get larger penalties. Further reading: 1, 2, 3, 4.
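A minimal numerical sketch of that intuition (plain Python with NumPy; the `cost` helper below is just for illustration and is not part of the original answer):

```python
import numpy as np

def cost(h, y):
    """Per-example logistic regression cost: -log(h) if y = 1, -log(1 - h) if y = 0."""
    return -np.log(h) if y == 1 else -np.log(1.0 - h)

# y = 1: the further h_theta(x) drifts from 1 towards 0, the larger the penalty
for h in [0.99, 0.9, 0.5, 0.1, 0.01]:
    print(f"y=1, h={h:.2f} -> cost = {cost(h, 1):.3f}")

# y = 0: symmetric behaviour, the penalty blows up as h_theta(x) approaches 1
for h in [0.01, 0.1, 0.5, 0.9, 0.99]:
    print(f"y=0, h={h:.2f} -> cost = {cost(h, 0):.3f}")
```

Running it shows the cost rising smoothly from near 0 for confident correct predictions to very large values for confident wrong ones.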

The following is the cost function of the Logistic Regression derived via Maximum Likelihood Estimation: enter image description here enter image description here

  • If y = 1 (positive): i) cost = 0 if prediction is correct (i.e. h=1), ii) cost $-> \infty $ if $h_{\theta}(x)->0$.
  • If y = 0 (negative): i) cost = 0 if prediction is correct (i.e. h=0), ii) cost $-> \infty$ if $(1-h_{\theta}(x))->0$.

The intuition is that larger mistakes should get larger penalties. Further readings, 1,2,3,4.

The cost function of the Logistic Regression derived via Maximum Likelihood Estimation: enter image description here enter image description here

  • If y = 1 (positive): i) cost = 0 if prediction is correct (i.e. h=1), ii) cost $\rightarrow \infty $ if $h_{\theta}(x)\rightarrow 0$.
  • If y = 0 (negative): i) cost = 0 if prediction is correct (i.e. h=0), ii) cost $\rightarrow \infty$ if $(1-h_{\theta}(x))\rightarrow 0$.

The intuition is that larger mistakes should get larger penalties. Further readings, 1,2,3,4.

I fixed a typo in the answer. Instead of infinity sign I have used zero, now all is correct.
Source Link
TwinPenguins
  • 4.4k
  • 3
  • 22
  • 54

The following is the cost function of the Logistic Regression derived via Maximum Likelihood Estimation: enter image description here enter image description here

  • If y = 1 (positive): i) cost = 0 if prediction isis correct (i.e. h=1), ii) cost $->$ 0$-> \infty $ if $h_{\theta}(x)->0$.
  • If y = 0 (negative): i) cost = 0 if prediction isis correct (i.e. h=0), ii) cost $->$ 0$-> \infty$ if $(1-h_{\theta}(x))->0$.

The intuition is that larger mistakes should get larger penalties. Further readings, 1,2,3,4.

The following is the cost function of the Logistic Regression derived via Maximum Likelihood Estimation: enter image description here enter image description here

  • If y = 1 (positive): i) cost = 0 if prediction is correct (i.e. h=1), ii) cost $->$ 0 if $h_{\theta}(x)->0$.
  • If y = 0 (negative): i) cost = 0 if prediction is correct (i.e. h=0), ii) cost $->$ 0 if $(1-h_{\theta}(x))->0$.

The intuition is that larger mistakes should get larger penalties. Further readings, 1,2,3,4.

The following is the cost function of the Logistic Regression derived via Maximum Likelihood Estimation: enter image description here enter image description here

  • If y = 1 (positive): i) cost = 0 if prediction is correct (i.e. h=1), ii) cost $-> \infty $ if $h_{\theta}(x)->0$.
  • If y = 0 (negative): i) cost = 0 if prediction is correct (i.e. h=0), ii) cost $-> \infty$ if $(1-h_{\theta}(x))->0$.

The intuition is that larger mistakes should get larger penalties. Further readings, 1,2,3,4.

Source Link
TwinPenguins
  • 4.4k
  • 3
  • 22
  • 54

The following is the cost function of the Logistic Regression derived via Maximum Likelihood Estimation: enter image description here enter image description here

  • If y = 1 (positive): i) cost = 0 if prediction is correct (i.e. h=1), ii) cost $->$ 0 if $h_{\theta}(x)->0$.
  • If y = 0 (negative): i) cost = 0 if prediction is correct (i.e. h=0), ii) cost $->$ 0 if $(1-h_{\theta}(x))->0$.

The intuition is that larger mistakes should get larger penalties. Further readings, 1,2,3,4.