The following is the cost function of logistic regression, derived via Maximum Likelihood Estimation:
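
Written out per example (standard form, with $h_{\theta}(x) = \sigma(\theta^{T}x)$ the predicted probability), the cost is:

$$
\mathrm{Cost}(h_{\theta}(x), y) =
\begin{cases}
-\log(h_{\theta}(x)) & \text{if } y = 1 \\
-\log(1 - h_{\theta}(x)) & \text{if } y = 0
\end{cases}
$$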

- If y = 1 (positive): i) cost = 0 if the prediction is correct (i.e. $h_{\theta}(x) = 1$), ii) cost $\rightarrow \infty$ if $h_{\theta}(x) \rightarrow 0$.
- If y = 0 (negative): i) cost = 0 if the prediction is correct (i.e. $h_{\theta}(x) = 0$), ii) cost $\rightarrow \infty$ if $(1-h_{\theta}(x)) \rightarrow 0$.
The intuition is that larger mistakes should receive larger penalties. Further readings: 1, 2, 3, 4.
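This penalty behavior can be sketched in a few lines of Python (the `cost` helper below is a hypothetical name, not from a library):

```python
import math

def cost(h, y):
    """Per-example logistic regression cost (cross-entropy).

    h: predicted probability h_theta(x), strictly in (0, 1)
    y: true label, 0 or 1
    """
    # -log(h) penalizes low h when y = 1; -log(1 - h) penalizes high h when y = 0
    return -math.log(h) if y == 1 else -math.log(1.0 - h)

# A confident correct prediction costs almost nothing,
# while a confident wrong prediction is penalized heavily:
print(cost(0.99, 1))  # near 0
print(cost(0.01, 1))  # large
```

Note that the cost grows without bound as the prediction approaches the wrong extreme, matching the $\rightarrow \infty$ behavior described above.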