
Given $f(x) = \sqrt[n]{x}$, prove using the formal definition of a derivative that:

$$\frac{d}{dx} (\sqrt[n]{x}) = \frac{x^{\frac{1-n}{n}}}{n}$$

Now this would be ridiculously easy to show using the Power Rule, but alas, that is not the goal of this question.


Using the formal definition of the derivative we get:

\begin{equation} \begin{split} f'(x) & = \lim_{h \ \to \ 0} \frac{f(x+h)-f(x)}{h} \\ & = \lim_{h \ \to \ 0} \frac{\sqrt[n]{x+h}-\sqrt[n]{x}}{h} \\ & = \lim_{h \ \to \ 0} \frac{(x+h)^{\frac{1}{n}}-(x)^{\frac{1}{n}}}{h} \end{split} \end{equation}

But it is unclear to me how to proceed next. Essentially, all we need to do to get this limit into a determinate form (it is currently in an indeterminate form) is to factor an $h$ out of the numerator, but there doesn't seem to be an obvious way to do so.

What algebraic technique would you use to factor an $h$ out of the numerator in this case? For $n=2$ you could easily multiply the fraction by the conjugate to get the limit into a determinate form, and for $n=3$ you could do the same with the help of a few identities, but how would you go about this in the general case stated above?

This question is the general $n$th case of finding the derivative using the formal definition, for functions such as $f(x) = \sqrt{x}$, $f(x) = \sqrt[3]{x}$, and so forth, and is aimed at finding the best algebraic technique for manipulating the limit into a determinate form.
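Before proving it, the claimed formula can be sanity-checked numerically against the difference quotient. This is only a spot check, not a proof; the helper names below are my own:

```python
def nth_root(x, n):
    # x**(1/n) for x > 0
    return x ** (1.0 / n)

def difference_quotient(x, n, h):
    # (f(x+h) - f(x)) / h for f(x) = x**(1/n)
    return (nth_root(x + h, n) - nth_root(x, n)) / h

def claimed_derivative(x, n):
    # the formula to be proven: x^((1-n)/n) / n
    return x ** ((1 - n) / n) / n

# the difference quotient approaches the claimed formula for several n
for n in (2, 3, 5):
    x = 2.0
    assert abs(difference_quotient(x, n, 1e-6) - claimed_derivative(x, n)) < 1e-5
```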

  • Hint: Use $ x^n - y^n = (x-y) \sum_{i+j = n-1} x^i y^j $. Commented May 10, 2016 at 17:14
  • If you don't know how to manipulate the summations, you will find that solving the problem is rather difficult. Commented May 10, 2016 at 17:37
  • Perhaps this problem is easy with the power rule, so maybe the goal should be showing why the power rule works and applying that to your problem. The general method should work just fine. Commented May 10, 2016 at 17:38
  • See my answer to the math.stackexchange question Differentiation using first principles with rational powers. Commented May 12, 2016 at 19:20
  • Another derivative limit: $$f'(x)=\lim_{h\to x}\frac{f(x)-f(h)}{x-h}$$ Try that; it seems simpler. Commented May 13, 2016 at 17:23

5 Answers


The question is definitely not trivial. +1 for OP. The solution follows from the following theorem:

Theorem: If $a > 0$ and $n$ is a rational number then $$\lim_{x \to a}\frac{x^{n} - a^{n}}{x - a} = na^{n - 1}\tag{1}$$ This is one of the standard limits which can be used to evaluate many limits involving algebraic functions.

The proof of the above theorem is easy if $n$ is an integer. For positive integers we can simply use $$x^{n} - a^{n} = (x - a)\sum_{i = 0}^{n - 1}x^{n - 1 - i}a^{i}$$ For $n = 0$ the result is obvious. For a negative integer $n = -m$ we can use $x^{n} = 1/x^{m}$ and the fact that the result holds for positive integers. Similarly, if the result holds for a positive rational number $n$, we can show that it holds for negative rational $n$ as well.

Thus we need to show that if $n = p/q$ with integers $p > 0, q > 1$ then the formula $(1)$ holds. Let $b = a^{1/q}$ so that $a = b^{q}$. We know that $$\lim_{y \to b}\frac{y^{q} - b^{q}}{y - b} = qb^{q - 1}\tag{2}$$ From $(2)$ it follows that the ratio $(y^{q} - b^{q})/(y - b)$ is bounded and stays away from $0$ as $y \to b$. Hence its reciprocal is also bounded and stays away from $0$ as $y \to b$. Also note that when $y \to b$ then $x = y^{q} \to b^{q} = a$ (and vice versa, because $f(y) = y^{q}$ is strictly monotone on $[0, \infty)$). Thus the ratio $(x^{1/q} - a^{1/q})/(x - a)$ is bounded as $x = y^{q} \to a$; since $x - a \to 0$, it follows that $x^{1/q} - a^{1/q} \to 0$, i.e. $$\lim_{x \to a}x^{1/q} = a^{1/q}\tag{3}$$ (this, by the way, proves the continuity of $x^{1/q}$).

Now we have \begin{align} L &= \lim_{x \to a}\frac{x^{n} - a^{n}}{x - a}\notag\\ &= \lim_{x \to a}\frac{x^{p/q} - a^{p/q}}{x - a}\notag\\ &= \lim_{t \to b}\frac{t^{p} - b^{p}}{t^{q} - b^{q}}\text{ (putting }x = t^{q}, a = b^{q}\text{ and using (3))}\notag\\ &= \lim_{t \to b}\dfrac{\dfrac{t^{p} - b^{p}}{t - b}}{\dfrac{t^{q} - b^{q}}{t - b}}\notag\\ &= \frac{pb^{p - 1}}{qb^{q - 1}}\notag\\ &= \frac{p}{q}b^{p - q}\notag\\ &= na^{n - 1}\notag \end{align} There is another way to prove this (via inequalities and squeeze theorem) without using the continuity of $x^{1/q}$. Let me know if you are interested in that version.
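Formula $(1)$ is easy to spot-check numerically for a fractional exponent; this is only a sanity check with a function name of my own choosing:

```python
def power_ratio(x, a, n):
    # (x^n - a^n) / (x - a)
    return (x ** n - a ** n) / (x - a)

a, n = 2.0, 1.5           # n = p/q = 3/2, a > 0
limit = n * a ** (n - 1)  # the value claimed by formula (1)

# the ratio approaches n * a^(n-1) from both sides as x -> a
assert abs(power_ratio(a + 1e-6, a, n) - limit) < 1e-4
assert abs(power_ratio(a - 1e-6, a, n) - limit) < 1e-4
```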


Update: On request of OP I am providing a proof of formula $(1)$ based on Squeeze Theorem. The credit for this proof must go to G. H. Hardy!

In what follows all the numbers are positive (whether they are integers, rationals or reals will be mentioned as and when needed).

Let $a, b$ be real numbers with $a > 1 > b > 0$. Let $r$ be a positive integer. Clearly we have $a^{r} > a^{i}$ for all $i = 0, 1, 2, \ldots, r - 1$. Hence, adding these inequalities, we get $$ra^{r} > 1 + a + a^{2} + \cdots + a^{r - 1}$$ Multiplying by $(a - 1) > 0$ we get $$ra^{r}(a - 1) > a^{r} - 1$$ Adding $r(a^{r} - 1)$ to both sides, and dividing by $r(r + 1)$, we obtain $$\frac{a^{r + 1} - 1}{r + 1} > \frac{a^{r} - 1}{r}\tag{4}$$ Similarly we can prove that $$\frac{1 - b^{r + 1}}{r + 1} < \frac{1 - b^{r}}{r}\tag{5}$$ It follows that if $r, s$ are positive integers with $r > s$ then $$\frac{a^{r} - 1}{r} > \frac{a^{s} - 1}{s},\,\frac{1 - b^{r}}{r} < \frac{1 - b^{s}}{s}\tag{6}$$ If we put $s = 1$ we get $$a^{r} - 1 > r(a - 1),\, 1 - b^{r} < r(1 - b)\tag{7}$$ for $r > 1$.
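Inequalities $(4)$ and $(5)$ can be spot-checked numerically for sample values of $a$ and $b$ (a quick sketch, not part of the proof):

```python
a, b = 1.7, 0.6  # a > 1 > b > 0, as in the text
for r in range(1, 20):
    # inequality (4): (a^{r+1} - 1)/(r + 1) > (a^r - 1)/r
    assert (a ** (r + 1) - 1) / (r + 1) > (a ** r - 1) / r
    # inequality (5): (1 - b^{r+1})/(r + 1) < (1 - b^r)/r
    assert (1 - b ** (r + 1)) / (r + 1) < (1 - b ** r) / r
```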

Next we show that the inequalities $(6), (7)$ hold when $r, s$ are positive rational numbers with $r > s$. Let $r = k/l$ and $s = m/n$; then $r > s$ implies that $kn > lm$. Let $c = a^{1/ln}$ so that $c > 1$. In the first inequality of $(6)$ we can replace $a$ by $c$, $r$ by $kn$ and $s$ by $lm$ to get $$\frac{c^{kn} - 1}{kn} > \frac{c^{lm} - 1}{lm}$$ or $$\frac{a^{r} - 1}{r} > \frac{a^{s} - 1}{s}$$ In a similar manner we can prove that the other inequalities also hold when $r, s$ are rational numbers. Now that $r, s$ are rational, it is possible to take $r = 1$ in $(6)$ to get $$a^{s} - 1 < s(a - 1),\,1 - b^{s} > s(1 - b)\tag{8}$$ for rational $s$ with $0 < s < 1$. Thus we have inequalities $(6)-(8)$ for all positive rational numbers $r, s$ with $r > 1 > s$.

In what follows we will assume that $a, b$ are real with $a > 1 > b > 0$ (same as before) and $r, s$ are rational with $r > 1 > s > 0$. Clearly $1/b > 1$ and hence replacing $a$ by $1/b$ and $b$ by $1/a$ in $(7)$ we get $$a^{r} - 1 < ra^{r - 1}(a - 1),\, 1 - b^{r} > rb^{r - 1}(1 - b)\tag{9}$$ Similarly from $(8)$ we get $$a^{s} - 1 > sa^{s - 1}(a - 1),\, 1 - b^{s} < sb^{s - 1}(1 - b)\tag{10}$$ Combining $(7)$ and $(9)$ we get $$ra^{r - 1}(a - 1) > a^{r} - 1 > r(a - 1)\tag{11}$$ Writing $a = x/y$ we get $$rx^{r - 1}(x - y) > x^{r} - y^{r} > ry^{r - 1}(x - y)\tag{12}$$ for $x > y > 0$. Similarly from $(8)$ and $(10)$ we get $$sx^{s - 1}(x - y) < x^{s} - y^{s} < sy^{s - 1}(x - y)\tag{13}$$ for $x > y > 0$.
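Inequalities $(12)$ and $(13)$ can likewise be spot-checked for sample values (again only a numerical sanity check, not part of the argument):

```python
x, y = 3.0, 2.0   # x > y > 0
r, s = 1.5, 0.5   # rational exponents with r > 1 > s > 0

# inequality (12): r*x^{r-1}(x - y) > x^r - y^r > r*y^{r-1}(x - y)
assert r * x ** (r - 1) * (x - y) > x ** r - y ** r > r * y ** (r - 1) * (x - y)

# inequality (13): s*x^{s-1}(x - y) < x^s - y^s < s*y^{s-1}(x - y)
assert s * x ** (s - 1) * (x - y) < x ** s - y ** s < s * y ** (s - 1) * (x - y)
```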

From the above inequalities it is clear that the function $f(x) = x^{r}$ is continuous for $x > 0$. Taking reciprocals, it is easy to see that the function $f(x)$ is continuous even if $r$ is a negative rational number. Further, if we divide by $(x - y) > 0$ and let $x \to y^{+}$, we get via the Squeeze Theorem the fundamental result $$\lim_{x \to y^{+}}\frac{x^{r} - y^{r}}{x - y} = ry^{r - 1}$$ for all positive rational numbers $r$ and $y > 0$. Interchanging the roles of $x, y$, it is easy to see that the limit holds for $x \to y^{-}$. This proves formula $(1)$ for positive rational values of $n$.

This is the way Hardy proves the formula $$\frac{d}{dx}(x^{n}) = nx^{n - 1}$$ for rational $n$ in his classic text "A Course of Pure Mathematics".

  • Thanks for the excellent answer to this question. I'm currently at university now, but I have a few further comments/questions on your answer which I'll post in a few hours. If possible, I would also be really interested in the version using inequalities and the squeeze theorem. Commented May 12, 2016 at 12:06
  • Is there a more general analog of Theorem $(1)$, for all $n, a \in \mathbb{R}$? Commented May 12, 2016 at 16:04
  • @Perturbative: Yes, it holds for all $n \in \mathbb{R}$, but then $a$ must be strictly positive. The idea is that it holds when $na^{n - 1}$ is a real number. Also, I am writing a proof via the squeeze theorem. You will see it after some time (maybe 20 min). Commented May 12, 2016 at 16:16
  • @Perturbative: It took me much longer than I expected, thanks to an Internet Explorer hang (and all the draft of the edit lost). I did it again in Firefox. Commented May 12, 2016 at 19:24
  • Thank you again for this all-round superb answer, and for the proof via the Squeeze Theorem; if there were a way I could upvote this more, I would. Your post has been hugely helpful. Commented May 12, 2016 at 19:33

Using the identity

$$ a^n - b^n = (a - b)(a^{n-1} + a^{n-2}b + \dots + ab^{n-2} + b^{n-1}) $$

you have

$$ \frac{(x+h)^{\frac{1}{n}} - x^{\frac{1}{n}}}{h} \cdot \frac{(x+h)^{\frac{n-1}{n}} + (x+h)^{\frac{n-2}{n}}x+\dots+(x+h)^{\frac{1}{n}}x^{\frac{n-2}{n}}+x^{\frac{n-1}{n}}}{(x+h)^{\frac{n-1}{n}} + (x+h)^{\frac{n-2}{n}}x+\dots+(x+h)^{\frac{1}{n}}x^{\frac{n-2}{n}}+x^{\frac{n-1}{n}}} = \frac{1}{(x+h)^{\frac{n-1}{n}} + (x+h)^{\frac{n-2}{n}}x+\dots+(x+h)^{\frac{1}{n}}x^{\frac{n-2}{n}}+x^{\frac{n-1}{n}}} \xrightarrow[h \to 0]{} \frac{1}{x^{1-\frac{1}{n}}+x^{1-\frac{2}{n}}x^{\frac{1}{n}}+\dots +x^{\frac{1}{n}}x^{1-\frac{2}{n}}+x^{1-\frac{1}{n}}}=\frac{1}{n x^{1 - \frac{1}{n}}}=\frac{x^{\frac{1}{n}-1}}{n}. $$
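A small numerical check of the telescoping identity, and of the value the denominator collapses to at $h = 0$ (the function name is my own):

```python
def telescoping(a, b, n):
    # a^{n-1} + a^{n-2} b + ... + a b^{n-2} + b^{n-1}
    return sum(a ** (n - 1 - i) * b ** i for i in range(n))

# the identity a^n - b^n = (a - b)(a^{n-1} + ... + b^{n-1})
a, b, n = 1.3, 0.7, 5
assert abs((a - b) * telescoping(a, b, n) - (a ** n - b ** n)) < 1e-12

# at h = 0 the denominator becomes n identical terms x^{(n-1)/n},
# so the whole expression tends to 1/(n x^{(n-1)/n}) = x^{1/n - 1}/n
x = 2.0
denom = telescoping(x ** (1 / n), x ** (1 / n), n)
assert abs(1 / denom - x ** (1 / n - 1) / n) < 1e-12
```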

  • XD Love following through with the lines, but it feels like this was overly complicated. +1 Commented May 10, 2016 at 17:26

Multiply top and bottom by $$\sum\limits_{k=0}^{n-1} (x+h)^{\frac{k}{n}}x^{\frac{n-1-k}{n}}$$

This gives you

$$\lim\limits_{h\to 0} \dfrac{(x+h)-x}{h\sum\limits_{k=0}^{n-1} (x+h)^{\frac{k}{n}}x^{\frac{n-1-k}{n}}}$$

Now evaluate as $h$ goes to $0$: each of the $n$ terms in the sum tends to $x^{\frac{n-1}{n}}$, so the limit is $\frac{1}{nx^{\frac{n-1}{n}}}=\frac{x^{\frac{1}{n}-1}}{n}$.
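Numerically, the multiplied expression does approach $\frac{x^{\frac{1}{n}-1}}{n}$ for small $h$ (a sketch with a name of my own choosing):

```python
def quotient(x, n, h):
    # ((x + h) - x) / (h * sum_{k=0}^{n-1} (x+h)^{k/n} * x^{(n-1-k)/n})
    denom = sum((x + h) ** (k / n) * x ** ((n - 1 - k) / n) for k in range(n))
    return ((x + h) - x) / (h * denom)

x, n = 2.0, 3
# as h -> 0 this tends to 1/(n * x^{(n-1)/n}) = x^{1/n - 1}/n
assert abs(quotient(x, n, 1e-6) - x ** (1 / n - 1) / n) < 1e-6
```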

  • Could you explain how that simplifies down to $\frac{x^{\frac1n-1}}n$? Commented May 10, 2016 at 17:36
  • @SimpleArt Where am I losing you? Commented May 12, 2016 at 15:41
  • @DougM How do you evaluate the summation after taking the limit? Commented May 12, 2016 at 20:14
  • As $h \to 0$, $(x+h)^{\frac{k}{n}}(x)^{\frac{n-1-k}{n}} \to x^{\frac{n-1}{n}}$, and we sum $n$ identical terms, so the denominator tends to $nx^{\frac{n-1}{n}}$, giving us $\frac{1}{nx^{\frac{n-1}{n}}}=\frac{x^{\frac{1}{n}-1}}{n}$. Commented May 12, 2016 at 20:39

If $f(x) = x^{1/n}$ then $f(x)^n = x$. Now differentiate both sides using the chain rule and solve the result for $f'$.

If you really need to use the limit of a difference quotient, note that the above reasoning is really the same as doing the following:

Write $$ f'(x) = \lim_{y\to x}\frac{\sqrt[n]{y}-\sqrt[n]{x}}{y-x} $$ with the substitution $t = \sqrt[n]{x}$, $s=\sqrt[n]{y}$, which then gives $$ f'(x) = \lim_{s\to t}\frac{s-t}{s^n-t^n} = \left(\lim_{s\to t}\frac{s^n-t^n}{s-t}\right)^{-1} = (nt^{n-1})^{-1} = \frac{1}{n}x^{-\tfrac{n-1}{n}} $$ (where we used that the derivative of $t^n$ is $nt^{n-1}$).
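A quick numerical check of the substitution argument (the helper name is hypothetical):

```python
n, x = 3, 2.0
t = x ** (1 / n)

def recip_ratio(s, t, n):
    # (s - t)/(s^n - t^n), the reciprocal of the standard limit
    return (s - t) / (s ** n - t ** n)

# as s -> t the ratio tends to 1/(n t^{n-1}) = (1/n) x^{-(n-1)/n}
expected = x ** (-(n - 1) / n) / n
assert abs(recip_ratio(t + 1e-7, t, n) - expected) < 1e-6
```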

  • @TheGreatDuck Actually, it is really the limit definition in disguise (see the edit) and no, you don't need the binomial theorem (really not: the above way uses the derivative of $t^n$, but this can be proven recursively via the product rule without the binomial theorem). Commented May 11, 2016 at 5:43

$$\frac{d}{dx}x^n=\lim_{h\to0}\frac{(x+h)^n-x^n}h$$

$$=\lim_{h\to0}\frac{\left[x^n+nx^{n-1}h+\frac{n(n-1)x^{n-2}}{2!}h^2+\dots\right]-x^n}h$$ by the binomial expansion. $$=\lim_{h\to0}\frac{nx^{n-1}h+\frac{n(n-1)x^{n-2}}{2!}h^2+\dots}{h}$$

$$=\lim_{h\to0}\left[nx^{n-1}+\frac{n(n-1)x^{n-2}}{2!}h+\dots\right]$$

$$=nx^{n-1}+0+0+\dots=nx^{n-1}$$

Just use $n=\frac1m$ for your case.
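As the comments point out, for non-integer exponents this expansion is an infinite series (Euler's generalized binomial theorem), valid for $|h/x| < 1$, so extra care is needed when passing to the limit. Here is a numerical check of that series for a fractional exponent (a sketch, with a helper name of my own):

```python
from math import factorial

def gen_binom(alpha, k):
    # generalized binomial coefficient C(alpha, k) for real alpha
    num = 1.0
    for i in range(k):
        num *= alpha - i
    return num / factorial(k)

# Euler's series: (1 + u)^alpha = sum_k C(alpha, k) u^k for |u| < 1
alpha, u = 0.5, 0.3
series = sum(gen_binom(alpha, k) * u ** k for k in range(60))
assert abs(series - (1 + u) ** alpha) < 1e-10
```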

  • It would be nice if the binomial theorem held for non-integers. Commented May 11, 2016 at 5:46
  • Doesn't it hold for most complex numbers? @Dirk Commented May 12, 2016 at 11:07
  • @YoTengoUnLCD You get an infinite series and hence you can't pull the limit through without further arguments. Commented May 12, 2016 at 13:16
  • @Dirk Euler extended the binomial theorem to non-integers. In fact, the above expansion is his expansion. And you can pull through with the limit. You simply put in $h=0$ and it solves everything for you. The infinite series doesn't seem to have any problem with that, nor does it diverge. Commented May 12, 2016 at 20:16
  • Like others have mentioned in the comments, taking the limit of an infinite series is complicated (far more so than proving this simple limit, and hence it is not given in introductory calculus). Also, the binomial expansion itself requires the limit formula for its proof. It is important to understand that most of the proofs given in the early phases of calculus lacked rigor. It was only when people like Cauchy arrived that rigorous calculus came into the picture. And the final piece was added by Cantor and Dedekind via their theory of real numbers, which made calculus fully rigorous. Commented May 13, 2016 at 4:32
