Thomas Andrews

There is a lovely way of motivating the "existence" of the complex numbers just by using a little calculus on the real numbers. I found this in *Visual Complex Analysis*, and it tickled me, so I thought I'd share it here, despite the lateness of the answer.

If $r_1,\dots,r_n$ are real numbers, define:

$$f(x)= \frac{1}{(x-r_1)(x-r_2)\cdots(x-r_n)}$$

When $a\notin \{r_1,\dots,r_n\}$ we can form the Taylor series of $f$ around $a$:

$$\sum_{k=0}^\infty \frac{f^{(k)}(a)}{k!}(x-a)^k$$

The question is, for what (real) $x$ does this series converge to $f(x)$?

As it turns out, if we let $R=\min_{k} |a-r_k|$, then this series converges to $f(x)$ when $|x-a|<R$, and it fails to converge when $|x-a|>R$.

So, in a sense, the $r_i$ "block" the ability of the Taylor series to converge around them.
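This is easy to watch happen numerically. Below is a quick sketch (my own example, not part of the original answer) using $f(x)=\frac{1}{(x-1)(x-3)}$ around $a=0$, so $R=\min(|0-1|,|0-3|)=1$; the coefficient formula comes from the partial-fraction expansion $f = \frac{-1/2}{x-1} + \frac{1/2}{x-3}$ together with $\frac{1}{x-r} = -\sum_k x^k/r^{k+1}$:

```python
# Sketch (my own example): f(x) = 1/((x-1)(x-3)), expanded around a = 0,
# where the nearest "blocking" root is r = 1, so R = 1.
# From the partial fractions f = (-1/2)/(x-1) + (1/2)/(x-3) and
# 1/(x-r) = -sum_k x^k / r^(k+1), the k-th Taylor coefficient is:
def coeff(k):
    return 0.5 / 1 ** (k + 1) - 0.5 / 3 ** (k + 1)

def f(x):
    return 1.0 / ((x - 1.0) * (x - 3.0))

def partial_sum(x, n):
    return sum(coeff(k) * x ** k for k in range(n))

# Inside the radius (|x| = 0.9 < R = 1): partial sums converge to f(x).
print(abs(partial_sum(0.9, 300) - f(0.9)))  # essentially zero

# Outside the radius (|x| = 1.5 > R = 1): the terms c_k x^k blow up.
print(abs(coeff(300) * 1.5 ** 300))  # astronomically large
```

Moving $a$ around moves $R$ with it: the series always reaches exactly as far as the nearest root $r_k$.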

Now, what about the Taylor series for $g(x)=\frac{1}{x^2+1}$?

Given an $a$, this function has no "real" blockages: it is defined on all of $\mathbb R$. Yet the Taylor series for $g(x)$ around $a$ has a similar critical value $R$, and that value is $R=\sqrt{1+a^2}$, a value that can be computed entirely with real-number calculations.

It then looks as if there is some geometric obstruction to the Taylor series, an obstruction not on the real line, but a unit distance from $0$ in a direction perpendicular to the line: the distance from $a$ to such a point would be exactly $\sqrt{1+a^2}$. It "looks like" an "imaginary" root of $x^2+1=0$.
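With hindsight, one can repeat the numerical experiment for $g$, letting Python's built-in complex numbers play the role of the invisible roots $\pm i$ (the choice $a=1$, giving $R=\sqrt 2$, and all function names are mine). The coefficients come out real, yet the series stops converging precisely at distance $\sqrt{1+a^2}$ from $a$:

```python
# Sketch (my own illustration): Taylor coefficients of g(x) = 1/(x^2+1)
# around a = 1, via the complex partial fractions
#   g = (1/2i)/(x - i) - (1/2i)/(x + i)
# and 1/(x - r) = -sum_k (x-a)^k / (r-a)^(k+1).
A = 1 / 2j          # residue of g at x = i
a = 1.0

def coeff(k):
    c = -A / (1j - a) ** (k + 1) + A / (-1j - a) ** (k + 1)
    return c.real   # the two terms are conjugates, so imaginary parts cancel

def g(x):
    return 1.0 / (x * x + 1.0)

def partial_sum(x, n):
    return sum(coeff(k) * (x - a) ** k for k in range(n))

# |x - a| = 1 < R = sqrt(1 + a^2) = sqrt(2): converges to g(2) = 0.2.
print(abs(partial_sum(2.0, 300) - g(2.0)))  # essentially zero

# |x - a| = 1.5 > sqrt(2): the terms c_k (x-a)^k grow without bound.
print(abs(coeff(300) * 1.5 ** 300))  # very large
```

Every quantity the real-variable calculus student can measure here is real, but the machinery producing those real coefficients lives at $\pm i$, a unit distance off the line.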

Post Made Community Wiki by Thomas Andrews