
I’ve read similar questions, but the answers I found were either contradictory or too advanced for my current level. I’m a computer engineering student, and I’m trying to understand a point in Terence Tao’s Analysis I.

Tao distinguishes two concepts:

  1. Mathematical induction: a method for proving that a property holds for all natural numbers.

  2. Recursive definitions: a way to define sequences or operations, building each value from previously defined ones.

When he uses induction in proofs, he introduces the inductive step with “suppose inductively,” which is clear to me because it’s the second step of the mathematical induction defined earlier.

What confuses me is that when he gives recursive definitions—for instance, defining addition and multiplication—he uses the same logical structure as induction and even the same phrase “suppose inductively.” But in this context he isn’t proving anything; he’s defining a new operation from scratch. I don’t understand why the same formulation is used if mathematical induction and recursive definition are supposed to be different notions.

Some answers I’ve seen say that recursion and induction are “the same thing,” which only adds to my confusion. If they refer to the same underlying idea, why use two different words? Why not just call recursive definitions “inductive definitions”?

Thanks in advance.

Comments:

  • Recursion and induction describe the same mathematical phenomenon, but from different points of view. It wouldn't be the first time a thing has multiple equivalent descriptions. (Commented Nov 15 at 11:09)
  • I think this can be answered, but can you give a specific reference to the book to make sure the answer addresses what you are seeing? (Commented Nov 15 at 11:11)
  • Hi, welcome to Math SE. That recursive definitions uniquely specify a well-defined object often requires proof by induction. To see what goes wrong if you don't (or can't) use induction, read this example. This sidebar question is also relevant. (Commented Nov 15 at 11:14)
  • Induction can be used both to prove that a given sequence of objects satisfies some property (the usual use of induction in proofs) or, given some property, to construct a sequence of objects that satisfies it (e.g. a recursive definition), which can be seen as a proof of existence anyway. (Commented Nov 15 at 11:15)
  • @CarlMummert here’s a link to the book: tiu-edu.uz/media/books/2024/05/28/1664976801.pdf The definitions that I’m referring to can be found on pages 19, 23, and 25. (Commented Nov 15 at 11:26)

5 Answers

Answer 1 (score 5)

This is how Tao states the definition of addition in the text cited, using $n++$ to mean $n+1$.

Definition 2.2.1 (Addition of natural numbers). Let $m$ be a natural number. To add zero to $m$, we define $0 + m := m$. Now suppose inductively that we have defined how to add $n$ to $m$. Then we can add $n{++}$ to $m$ by defining $(n{++}) + m := (n + m){++}$.
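
Tao's two clauses translate directly into a short recursive program. Here is a sketch in Python (the names `succ` and `add` are mine, not the book's, and Python's built-in integers stand in for Tao's natural numbers):

```python
def succ(k: int) -> int:
    """Tao's successor operation n++."""
    return k + 1

def add(n: int, m: int) -> int:
    """Addition following the two clauses of Definition 2.2.1."""
    if n == 0:                  # base clause: 0 + m := m
        return m
    # recursive clause: (n++) + m := (n + m)++, reading n as (n-1)++
    return succ(add(n - 1, m))
```

For example, `add(3, 4)` unfolds to `succ(succ(succ(add(0, 4))))` and returns `7`.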

Remember that induction is the principle that, for each property $P(n)$ of a natural number $n$, we have $$ (P(0) \land (\forall n)[P(n)\to P(n+1)]) \to (\forall n)P(n) $$

The easiest way to read Tao's statement as an induction proof is to let $P(n)$ be "we can add $n$ to $m$", where $m$ is a fixed natural number that we have already selected. Tao argues that we can add $0$ to $m$ and that, if we can add $n$ to $m$ then we can add $n+1$ to $m$.

Here "we can add $n$ to $m$" is intended to mean "there is a unique number $k_n$ that we have identified as the sum $m+n$". In that language, Tao shows that $k_0 = m$ and $k_{n+1} = k_{n} + 1$. Then, by induction, we know that $k_n$ is defined (and uniquely specified) for all $n$, so for all $n$ we can add $n$ to $m$.

So this does use mathematical induction, but the inductive hypothesis $P$ is not just an equation, it is an assertion that $k_n$ is uniquely defined. This is the typical way to read and write constructions like this in ordinary mathematics. It is true that the sequence $k_n$ is being recursively defined. We verify that the recursive definition works for all $n$ using mathematical induction.

In mathematical logic, there are some cases where we do distinguish a "recursive definition" from a proof by induction. But the two concepts are tightly connected because every recursive definition creates an induction principle.

In normal mathematics, outside formal logic, there's not usually any benefit to the task at hand if we try to emphasize a distinction between "definitions by induction" and "definitions by recursion". The distinction is especially small in cases like this when the definition uses a set like the set of natural numbers that we already have at hand.

Comments:

  • Thank you for your answer. The fact that we use induction on top of the recursive definition made everything clearer, even though I’m not sure I fully understand how this works. I would like to ask what you mean by: “But the two concepts are tightly connected because every recursive definition creates an induction principle.” Does this mean that even if we can define sequences like $k_n$ without using induction, we still need induction to prove that our recursive definition works for all $n$? Or is it a different concept? (Commented Nov 15 at 16:49)
  • @Federico Tecleme: Here is an example. Suppose we start with $\mathbb{R}$ and we define the natural numbers as a set $N \subseteq \mathbb{R}$ so that (i) $0 \in N$, (ii) if $x \in N$ then $x+1 \in N$, and (iii) a real $x$ is in $N$ if and only if it is required to be in $N$ by clauses (i) and (ii). This is called either a recursive definition of the set $N$ or an inductive definition of the set $N$, depending on context. Now, from the definition, we get an induction principle: if $M$ is any set of reals so that $0 \in M$ and $M$ is closed under $x \mapsto x + 1$, then $N \subseteq M$. (Commented Nov 15 at 20:37)
  • Might be worth mentioning that (iii) means "$N$ is the intersection of all sets $S\subseteq\mathbb{R}$ such that $0\in S$ and for all $x\in S$, $x+1\in S$". Then using $S=\{n\in N: P(n)\}$, the mentioned induction principle implies $S=N$, and from that induction itself follows. (Commented Nov 16 at 18:36)
Answer 2 (score 5)

Recursion can be seen (and defined) as the non-dependent case of induction.

Inductively defined types like the natural numbers come with a general induction principle or eliminator, which, in the case of the natural numbers, looks like

$$ \mathrm{ind}_\mathbb{N} : (P : \mathbb{N} \to \mathcal{U}) \to P\,0 \to ((n : \mathbb{N}) \to P\,n \to P(n+1)) \to (n : \mathbb{N}) \to P\,n $$

Specialising this to the case where $P = \lambda n.\, A$ is a constant (non-dependent) family of types, we get the recursion principle for the natural numbers instead (primitive recursion):

$$ \begin{align} &\mathrm{rec}_\mathbb{N} : (A : \mathcal{U}) \to A \to (\mathbb{N} \to A \to A) \to \mathbb{N} \to A \\ &\mathrm{rec}_\mathbb{N}\,A = \mathrm{ind}_\mathbb{N}(\lambda n.\, A) \end{align} $$
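
The dependent eliminator $\mathrm{ind}_\mathbb{N}$ cannot be expressed in a language without dependent types, but its non-dependent specialisation $\mathrm{rec}_\mathbb{N}$ is just primitive recursion, which can be sketched in ordinary Python (the names are mine; a loop stands in for the usual recursion):

```python
def rec_nat(base, step, n):
    """Non-dependent recursor rec_N: returns `base` at 0,
    and step(k, value at k) at k + 1."""
    acc = base
    for k in range(n):
        acc = step(k, acc)
    return acc

# Addition as in Tao's Definition 2.2.1, via the recursor:
def add(n, m):
    return rec_nat(m, lambda k, prev: prev + 1, n)
```

For instance, `rec_nat(1, lambda k, prev: prev * (k + 1), 5)` computes $5! = 120$.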

For more on this perspective, see the HoTT book (this is first spelled out in detail for the case of product types in section 1.5, but applies more generally).

Comments:

  • Thank you, but I looked at the book you sent and it seems a little too technical for me at the moment. (Commented Nov 15 at 17:18)
Answer 3 (score 5)

Let me copy and edit a passage from your post, unveiling the parallels that are hidden in what you wrote by being more explicit about the mathematical nature of a sequence:

  1. Mathematical induction: a method for proving a property $P_n$ for all natural numbers $n$.

  2. Recursive definition: a method for defining a sequence $x_n$ for all natural numbers $n$.

Perhaps the evident parallels in these outlines might lead one to suspect parallels in the full description.

So what are these two descriptions?

For mathematical induction, you prove that $P_1$ is true, and then, for any natural number $k$, you assume that $P_k$ is true and use that to prove $P_{k+1}$.

For a recursive definition, you define $x_1$ outright, and then, for any natural number $k$, you assume that $x_k$ is defined and use that to define $x_{k+1}$.

Does this mean that they are "the same thing"? Well, no, but clearly they are analogous things.

But this invites another question: Is there some mathematical rigor underlying the analogy? There is indeed, as described well in the other answers.
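
The analogy can even be displayed as two code templates with the same shape. Below is a small Python sketch; the sequence $x_k$ and property $P_k$ are my own toy example ($x_k = k^2$, built as the sum of the first $k$ odd numbers). A computer can only check finitely many instances, but the base-case/step structure is identical on both sides:

```python
def x(k: int) -> int:
    """Recursive definition: x_1 outright, then each x_k from x_{k-1}."""
    if k == 1:
        return 1                   # x_1 is defined outright
    return x(k - 1) + 2 * k - 1    # x_k is defined from x_{k-1}

def P(k: int) -> bool:
    """The property P_k to prove: x_k = k^2."""
    return x(k) == k * k

assert P(1)                        # base case
for k in range(1, 100):            # inductive steps, checked for k < 100
    assert (not P(k)) or P(k + 1)
```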

Answer 4 (score 1)

Tao's definition of a function $f:X\to Y$ consists of two sets $X$ and $Y$ along with a predicate $P(x,y)$ such that for all $x\in X$ there is a unique $y\in Y$ such that $P(x,y)$. (Side note: some books instead define a function to be a subset $G\subseteq X\times Y$ such that for all $x\in X$ there is a unique $y\in Y$ such that $(x,y)\in G$. These are basically equivalent: given a predicate you get a set $G=\{(x,y)\in X\times Y : P(x,y)\}$, and given a set $G$ you get a predicate $P(x,y):(x,y)\in G$.)

Let's use the example of defining an addition function (2.2.1, given in this answer). We want a function $a:\mathbb{N}\times\mathbb{N}\to\mathbb{N}$ with the rules $a(0,m)=m$ and $a(n{++},m)=a(n,m){++}$ for all $n,m\in\mathbb{N}$. This means we need to find a predicate that defines a function with these rules.

For non-recursive definitions, it's easy to find a predicate. For example, with $s:\mathbb{N}\to\mathbb{N}$ defined by $s(n)=n{++}$, you can use $P(n,y):y=n{++}$. Defining functions by cases is easy too. However, recursive rules are trickier. How do you know that the rules actually define a function? They might never terminate when you apply them. There are even simple recursive rules like those of the Collatz conjecture where we don't even know yet whether they define a function or not!
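
To make the Collatz remark concrete, here is a sketch of those recursive rules in Python. Nobody has proved that the `while` loop below terminates for every starting value, which is exactly the sense in which it is unknown whether the rules define a total function:

```python
def collatz_steps(n: int) -> int:
    """Number of Collatz steps from n down to 1.
    Not known to terminate for every n >= 1!"""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps
```

For example, starting from 6 the orbit is 6, 3, 10, 5, 16, 8, 4, 2, 1, so `collatz_steps(6)` is 8.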

Motto: Induction is the reasoning principle for recursively-defined data.

The natural numbers are recursively defined: you have $0$, and for every $n$, you have a natural number $n{++}$. Also, every natural number can be obtained from these. This last fact is encoded via the principle of induction, that if you prove $P(0)$ and that, for all $n'$, if $P(n')$ then $P(n'{++})$, then $P(n)$ is true for all $n$. Proofs by induction are sort of recursive proofs; when $n\geq 1$, the theorem is allowed to "call itself" on $n-1$. Calling itself on smaller inputs guarantees the argument terminates, so to speak.

Back to addition: we need to build a predicate $P((n,m),y)$. Reading the rules for $a$, the predicate ought to have the following properties:

  • for all $m$, $P((0,m),m)$ is true
  • for all $n$, $m$, and $y$, if $P((n,m),y)$ then $P((n{++},m),y{++})$.

This second rule is a one-direction version of the equality, but it's the only one we need. (Intuitively, it says we can calculate $a(n{++},m)$ from $a(n,m)$, so the recursive call $a(n,m)$ is using smaller inputs, guaranteeing termination.)

Theorem. There exists a predicate $P(x,y)$ with domain $(\mathbb{N}\times\mathbb{N})\times\mathbb{N}$ that defines a function $a:\mathbb{N}\times\mathbb{N}\to\mathbb{N}$ such that $a(0,m)=m$ and $a(n{++},m)=a(n,m){++}$ for all $n,m\in\mathbb{N}$.

Proof. It suffices to prove that for all $N\in\mathbb{N}$ there exists a unique $G\subseteq (\mathbb{N}\times\mathbb{N})\times\mathbb{N}$ such that:

  • for all $n>N$ and $m,y\in\mathbb{N}$, $((n,m),y)\not\in G$;
  • for all $n\leq N$ and $m\in\mathbb{N}$ there is a unique $y\in\mathbb{N}$ such that $((n,m),y)\in G$;
  • for all $m$, $((0,m),m)\in G$; and
  • for all $n<N$ and $m,y\in\mathbb{N}$, if $((n,m),y)\in G$ then $((n{++},m),y{++})\in G$.

This is because we can define the predicate $P$ by $$ P((n,m),y): \forall G\subseteq (\mathbb{N}\times\mathbb{N})\times\mathbb{N}\ \forall N\geq n\ (Q(G,N)\to ((n,m),y)\in G) $$ where $Q(G,N)$ is the predicate defined by the above four bulleted properties. The properties guarantee that $G$ defines a function up to a particular point. One can also see that if $N\leq N'$ and $G_N$ and $G_{N'}$ are the corresponding sets that uniquely exist, then $G_N\subseteq G_{N'}$, so the choice of $N$ doesn't matter; this is because if you take $\{((n,m),y)\in G_{N'} : n\leq N\}$, this set satisfies the above properties for $N$, and by uniqueness it equals $G_N$.

Let $N\in \mathbb{N}$ be arbitrary and proceed by induction on $N$.

Base case. $G=\{((0,m),m):m\in\mathbb{N}\}$ works.

Inductive step. Let $N'\in\mathbb{N}$ be arbitrary and suppose $G'$ is the unique set satisfying the above for $N'$. Let $$ G = G' \cup \{((n{++},m),y{++}) : n,m,y\in\mathbb{N}\text{ and } ((n,m),y)\in G'\text{ and }n{++}=N'\} $$ By construction, $G$ satisfies the first two properties. The third property holds because it holds for $G'$. The fourth property holds because we added the necessary elements. This $G$ is unique because the fourth property forces these newly added elements to be what they are.

This completes the induction. □

That's fairly technical, but the main idea here is "supposing the rules so far define a well-defined function when you restrict the first argument to $\{0,\dots,N\}$, we can apply the recursive rule to extend the function to $\{0,\dots,N+1\}$." Then, strangely, we don't actually need to extend this to all of $\mathbb{N}$, because we can say "to compute $a(n,m)$, first find the function $\{0,\dots,n\}\times\mathbb{N}\to \mathbb{N}$, then evaluate it at $(n,m)$." It might seem inefficient from a computer engineering perspective, but math doesn't see the inefficiency.
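
That two-part recipe can be sketched in Python (my own illustration, not Tao's notation): build the finite approximations $G_0, G_1, \dots, G_n$ as explicit sets of pairs $((n,m),y)$, then evaluate. To keep the sets finite, $m$ is restricted to a finite range here, whereas in the proof it ranges over all of $\mathbb{N}$:

```python
def G(N: int, m_max: int) -> set:
    """The unique set of pairs ((n, m), y) with n <= N and m <= m_max."""
    g = {((0, m), m) for m in range(m_max + 1)}   # base: ((0, m), m) in G
    for level in range(N):                        # extend one level at a time
        g |= {((n + 1, m), y + 1) for ((n, m), y) in g if n == level}
    return g

def a(n: int, m: int) -> int:
    """Compute a(n, m) by building G up to level n, then evaluating."""
    g = G(n, m)
    return next(y for ((n2, m2), y) in g if (n2, m2) == (n, m))
```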


One last idea: in the logical setup of that book, when you construct things, technically speaking, there's an implicit "there exists" in front of it. Notice even functions only say "for every $x$ there's a unique $y$". The value of $y$ only exists, it's not a concrete object. We're in the realm of logic.

Induction is the logical principle for reasoning about recursive data. Thus, it's what you must use to define a recursive function — that's to say, to prove the existence of that function.

Comments:

  • Thank you for your answer; it really completes Carl Mummert’s one. (Commented Nov 18 at 11:19)
Answer 5 (score -1)

What recursive definitions and inductive proofs have in common is the idea:

  • start with $n = 0$,
  • move up the natural numbers in steps of one,
  • eventually handle every single natural number as a result.

Other than that, they are formally distinct notions:

  • recursive definitions allow you to define sequences,
  • inductive proofs allow you to prove statements about natural numbers.

Take the recursive definition of the addition of natural numbers as an example. Recall that (a representative variant of) the Recursion theorem states:

Given any functions $f : A \to B$, $g : \mathbb{N} \times A \times B \to B$, there is a unique $h : \mathbb{N} \times A \to B$ such that $$\begin{cases} h(0, a) = f(a) & \text{for all } a \in A, \\ h(n+1, a) = g(n, a, h(n, a)) & \text{for all } n \in \mathbb{N}, a \in A \end{cases}$$

Then in order to define $+ : \mathbb{N} \times \mathbb{N} \to \mathbb{N}$, theoretically we should just invoke the Recursion theorem with

  • $f: \mathbb{N} \to \mathbb{N}$, $f(a) = a$;
  • $g: \mathbb{N} \times \mathbb{N} \times \mathbb{N} \to \mathbb{N}$, $g(n, a, b) = \text{the successor of } b$;

and let $n+m := h(n, m)$.
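
Since the Recursion theorem's clauses determine $h(n, a)$ from the bottom up, $h$ can be sketched directly in Python (names follow the statement above; this is an illustration of the theorem's content, not a proof of it):

```python
def h(n, a, f, g):
    """The unique h with h(0, a) = f(a) and h(k+1, a) = g(k, a, h(k, a))."""
    b = f(a)                 # h(0, a) = f(a)
    for k in range(n):
        b = g(k, a, b)       # h(k + 1, a) = g(k, a, h(k, a))
    return b

# The instantiation above: f(a) = a, and g(n, a, b) = the successor of b
def plus(n, m):
    return h(n, m, lambda a: a, lambda k, a, b: b + 1)
```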

In particular it makes no formal sense to "assume" anything "inductively" here.

Then why does Tao use inductive language in a recursive construction?

Formally it makes no sense, but I see at least two good reasons behind this:

  1. Explicit invocation of the Recursion theorem can be a bit dry to read. It is often more convenient (although not formally correct) to think of recursion as repeatedly (i) assuming that $x_n$ has already been defined, (ii) defining $x_{n+1}$ using this fact. This way of thinking so much resembles the inductive proofs that it becomes tempting to use the corresponding language as well.

    In short, it is a metaphor that mildly sacrifices the formal correctness for the significant boost in clarity.

  2. In some more complex recursive definitions the sensibility of the definition of $x_{n+1}$ depends on some unproven fact about the term $x_n$ (or even on the terms further back). We can not prove this fact in advance because the sequence has not been defined yet. The formal way to solve this seemingly circular dependency would be to:

    • define $x_{n+1}$ as $$x_{n+1} = \begin{cases} \text{the desired value} & \text{if } x_n \text{ has the desired property}, \\ \text{rubbish} & \text{otherwise}, \end{cases}$$

    • inductively prove that every $x_n$ has the desired property, which at the same time proves that the "desired" branch was used in the recursive definition and so none of the terms is actually "rubbish".

    This is a completely standard rigorous way to deal with this issue, but again not very pleasant to read. For this reason a metaphor is often used instead: we simultaneously define $x_{n+1}$ and prove that it has the desired property, both in one go. This is also not formally correct, but understood by everybody (well, those with formal awareness at least) to be a sort of "syntactic sugar" for the rigorous construction.

    In this metaphor the recursion and induction are coupled so tightly that the use of inductive language becomes natural. Once you've done sufficiently many such proofs, you may get used to it to the point where you start using the inductive language even in the recursive constructions that do not rely on induction at all.
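
Here is a toy instance (my own, not from the text above) of the "rubbish branch" pattern in Python: defining $x_{n+1} = 1/x_n$ only makes sense if $x_n \neq 0$, which cannot be cited before the sequence exists, so the definition is made total with a rubbish value, and a separate induction ($x_n = 1$ for all $n$, hence nonzero) shows the rubbish branch is never taken:

```python
def x(n: int) -> float:
    """x_0 = 1 and x_{n+1} = 1 / x_n, made total via a rubbish branch."""
    if n == 0:
        return 1.0
    prev = x(n - 1)
    if prev != 0:            # the desired property of x_n holds
        return 1.0 / prev    # the desired value
    return 0.0               # rubbish (provably never reached)
```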

Nonetheless, I think it is a good habit to always remember what is formally happening "behind the scenes" - that is the only way you can guarantee that you are not going too far with the metaphor in a way that no longer makes mathematical sense.

