Let $f : \mathbb{R}^n \rightarrow \mathbb{R}$ be continuously differentiable. We assume that there exists $L > 0$ such that \begin{equation} \|\nabla f(x)-\nabla f(x')\| \leq L\|x - x'\| \qquad \forall (x,x') \in \mathbb{R}^n \times \mathbb{R}^n. \end{equation}
Show that \begin{equation} |f(x + h) - f(x) - \langle \nabla f(x), h\rangle | \leq \frac{L}{2} \|h\|^2 \qquad \forall (x, h) \in \mathbb{R}^n \times \mathbb{R}^n. \end{equation}
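As a hedged numerical sanity check of this inequality, one can take the quadratic $f(x) = \frac{1}{2}x^\top A x$ with $A$ symmetric positive semidefinite, whose gradient $Ax$ is Lipschitz with constant $L = \|A\|_2$ (the spectral norm), and test the bound on random points; the function and constant here are illustrative choices, not part of the original problem:

```python
import numpy as np

# Sanity check of |f(x+h) - f(x) - <grad f(x), h>| <= (L/2)||h||^2
# for f(x) = 0.5 x^T A x, whose gradient A x is L-Lipschitz with L = ||A||_2.
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T                       # symmetric positive semidefinite matrix
L = np.linalg.norm(A, 2)          # spectral norm = Lipschitz constant of the gradient

f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

for _ in range(1000):
    x = rng.standard_normal(n)
    h = rng.standard_normal(n)
    lhs = abs(f(x + h) - f(x) - grad(x) @ h)
    rhs = 0.5 * L * (h @ h)
    assert lhs <= rhs + 1e-9, (lhs, rhs)
```

For this particular $f$ the left-hand side equals $\frac{1}{2}|h^\top A h|$, so the bound is tight when $h$ is an eigenvector for the largest eigenvalue of $A$.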
I do not understand part of the proof, which I reproduce here:
Since $f$ is continuously differentiable, Taylor's formula at order zero with integral remainder gives \begin{equation} f(x + h) = f(x) + \int^1_0 \langle \nabla f(x + th), h \rangle \, dt. \tag{1} \end{equation}
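For context, one way to read (1) is as the fundamental theorem of calculus applied to the one-variable function $g(t) = f(x + th)$ along the segment from $x$ to $x+h$; a sketch:

```latex
% Define g(t) = f(x + th) for t in [0,1]. By the chain rule,
% g'(t) = <\nabla f(x + th), h>, so the fundamental theorem of
% calculus for g gives formula (1):
\[
f(x+h) - f(x) = g(1) - g(0)
  = \int_0^1 g'(t)\,dt
  = \int_0^1 \langle \nabla f(x + th),\, h\rangle\, dt.
\]
```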
But when I apply Taylor's formula at order zero with integral remainder, I get \begin{equation} f(x + h) = f(x) + \int_x^{x+h} \nabla f(t) \, dt. \tag{2} \end{equation} What is the integration step that allows us to go from (2) to (1)?