I am learning Maximum Likelihood Estimation.
Per this post, the log of the PDF for a normal distribution looks like this:
$$ \log{\left(f\left(x_i;\,\mu,\sigma^2\right)\right)} = - \frac{n}{2} \log{\left(2 \pi\right)} - \frac{n}{2} \log{\left(\sigma^2\right)} - \frac{1}{2 \sigma^2} \sum_{i=1}^{n}{{\left(x_i - \mu\right)}^2} \tag{1} $$
According to any probability theory textbook, the PDF for a normal distribution is: $$ \frac {1}{\sigma \sqrt {2\pi}} e^{-\frac {(x - \mu)^2}{2\sigma ^2}} \rlap{\qquad \text{where}~-\infty <x<\infty} \tag{2} $$
Taking the natural log of Equation 2 produces
\begin{align} \ln\left(\frac{1}{\sigma \sqrt {2\pi}} e^{-\frac{\left(x - \mu\right)^2}{2\sigma ^2}}\right) &= \ln\left(\frac {1}{\sigma \sqrt {2\pi}}\right)+\ln{\left(e^{-\frac {(x - \mu)^2}{2\sigma ^2}}\right)} \tag{3} \\[5px] &=-\ln\left(\sigma\right)-\frac{1}{2} \ln\left(2\pi\right) - \frac{\left(x - \mu\right)^2}{2\sigma ^2} \tag{4} \end{align}
which is very different from Equation 1.
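To rule out an algebra slip on my side, here is a quick numerical sanity check (the values `x = 0.7`, `mu = 1.5`, `sigma = 2.0` are arbitrary) confirming that my Equation 4 really is the log of the density in Equation 2:

```python
import math

def normal_pdf(x, mu, sigma):
    # Equation (2): the normal density
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def log_pdf_eq4(x, mu, sigma):
    # Equation (4): my hand-derived log-density
    return -math.log(sigma) - 0.5 * math.log(2 * math.pi) - (x - mu) ** 2 / (2 * sigma ** 2)

x, mu, sigma = 0.7, 1.5, 2.0
print(math.log(normal_pdf(x, mu, sigma)))  # log of (2) computed directly
print(log_pdf_eq4(x, mu, sigma))           # my derived expression (4)
# the two printed values agree to floating-point precision
```

So Equation 4 matches a direct evaluation of $\ln f(x)$, yet it still doesn't look like Equation 1.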
Is Equation 1 right? What am I missing?