I'm working through Statistical Physics Methods in Optimization and Machine Learning by Krzakala and Zdeborova, which can be accessed here: https://sphinxteam.github.io/EPFLDoctoralLecture2021/Notes.pdf.
On page 10 (22 of 287), the authors consider the Curie-Weiss model with zero field, $h=0$. They then claim that
$$\mathbb{P}(\bar{S}=m, h = 0) = \frac{\sum_{\{S_i\}} e^{-\beta \mathcal{H}_N^0} I(\bar{S}=m)}{Z_N(\beta, h = 0)} = \frac{\sum_{\{S_i\}} e^{-\beta \mathcal{H}_N^0} I(\bar{S}=m)}{e^{N \log \Phi_N(\beta, 0)}} \asymp e^{-NI_0^*(m)}$$
where $I_0^*(m)$ is the true large-deviation rate at zero external field, and $\log$ denotes the natural logarithm. My problem is that Theorem 1 on page 6 (18 of 287) defines
$$\Phi_N(\beta, h) = \frac{1}{N} \log Z_N(\beta, h)$$
which would imply
$$Z_N(\beta, h = 0) = e^{N \Phi_N(\beta, h = 0)}$$
in the denominator, rather than the $e^{N \log \Phi_N(\beta, h=0)} = \left(\Phi_N(\beta, h=0)\right)^N$ written above.
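To make the discrepancy concrete, here is a minimal brute-force check for small $N$. The Hamiltonian normalization $\mathcal{H}_N^0(\{S\}) = -\frac{1}{N}\sum_{i<j} S_i S_j$ and the particular values of $N$ and $\beta$ are my own choices and don't affect the point; it only uses the definition of $\Phi_N$ from Theorem 1.

```python
import itertools
import numpy as np

# Assumed zero-field Curie-Weiss Hamiltonian: H_N^0(S) = -(1/N) * sum_{i<j} S_i S_j.
# N and beta are arbitrary small choices for the check.
N = 8
beta = 1.2

energies = []
magnetizations = []
for config in itertools.product([-1, 1], repeat=N):
    S = np.array(config)
    # -(1/N) * sum_{i<j} S_i S_j, using sum_{i<j} S_i S_j = ((sum_i S_i)^2 - N) / 2
    energies.append(-(S.sum() ** 2 - N) / (2 * N))
    magnetizations.append(S.mean())

energies = np.array(energies)
magnetizations = np.array(magnetizations)

Z = np.exp(-beta * energies).sum()   # Z_N(beta, h=0) by exhaustive enumeration
Phi = np.log(Z) / N                  # Phi_N = (1/N) log Z_N, as in Theorem 1

print("Z_N                =", Z)
print("exp(N * Phi_N)     =", np.exp(N * Phi))            # equals Z_N by definition
print("exp(N * log Phi_N) =", np.exp(N * np.log(Phi)))    # a different number

# Dividing by Z_N gives a genuine probability distribution over m ...
total_with_Z = sum(
    np.exp(-beta * energies[np.isclose(magnetizations, m)]).sum() / Z
    for m in np.unique(magnetizations)
)
# ... while dividing by exp(N log Phi_N) does not normalize to one.
total_with_logPhi = sum(
    np.exp(-beta * energies[np.isclose(magnetizations, m)]).sum() / np.exp(N * np.log(Phi))
    for m in np.unique(magnetizations)
)
print("sum over m, denominator Z_N              :", total_with_Z)
print("sum over m, denominator exp(N log Phi_N) :", total_with_logPhi)
```

With $Z_N = e^{N\Phi_N}$ in the denominator the weights over $m$ sum to one, as they must for a probability distribution, whereas with $e^{N\log\Phi_N}$ they do not.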
Is this a typo, or am I missing something important here?