Tao's definition of a function $f:X\to Y$ consists of two sets $X$ and $Y$ along with a predicate $P(x,y)$ such that for all $x\in X$ there is a unique $y\in Y$ such that $P(x,y)$. (Side note. Sometimes books define a function to be a subset $G\subseteq X\times Y$ such that for all $x\in X$ there is a unique $y\in Y$ such that $(x,y)\in G$. These are basically equivalent: given a predicate you get a set $G=\{(x,y)\in X\times Y : P(x,y)\}$, and given a set $G$ you get a predicate $P(x,y):(x,y)\in G$.)
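To make the graph picture concrete, here is a small Python sketch (the names `is_function_graph` and `evaluate` are mine, not from the book) that checks the "for every $x$ a unique $y$" condition on finite sets:

```python
def is_function_graph(G, X, Y):
    """Check the graph condition: G is a set of pairs (x, y) with x in X,
    y in Y, and every x in X appearing in exactly one pair."""
    return (all(x in X and y in Y for (x, y) in G)
            and all(sum(1 for (x2, y) in G if x2 == x) == 1 for x in X))

def evaluate(G, x):
    """Evaluate the function given by graph G at x; the unpacking fails
    unless there is exactly one matching y."""
    (y,) = {y for (x2, y) in G if x2 == x}
    return y
```

Dropping or duplicating a pair for some $x$ makes `is_function_graph` return `False`, which is exactly the uniqueness clause of the definition.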
Let's use the example of defining an addition function (2.2.1, given in this answer). We want a function $a:\mathbb{N}\times\mathbb{N}\to\mathbb{N}$ with the rules $a(0,m)=m$ and $a(n{++},m)=a(n,m){++}$ for all $n,m\in\mathbb{N}$. This means we need to find a predicate that defines a function with these rules.
For non-recursive definitions, it's easy to find a predicate. For example, with $s:\mathbb{N}\to\mathbb{N}$ defined by $s(n)=n{++}$, you can use $P(n,y):y=n{++}$. Defining functions by cases is easy too. However, recursive rules are trickier. How do you know that the rules actually define a function? They might never terminate when you apply them. There are simple recursive rules, like those of the Collatz conjecture, where we don't yet know whether they define a function or not!
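As a programmer's illustration (not part of the formal development), the addition rules translate directly into a recursive procedure whose termination is visible, while the Collatz rules give a procedure that, for all we know, might loop forever on some input:

```python
def add(n, m):
    """Addition via the rules a(0, m) = m and a(n++, m) = a(n, m)++.
    Terminates because each call uses a smaller first argument."""
    if n == 0:
        return m
    return add(n - 1, m) + 1

def collatz_steps(n):
    """Number of steps to reach 1 under the Collatz rules. It is an open
    problem whether this terminates for every n >= 1, i.e. whether these
    rules define a (total) function at all."""
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps
```

The point is that writing down rules is not the same as proving they define a function; for `add` we can prove it, for `collatz_steps` nobody can yet.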
Motto: Induction is the reasoning principle for recursively-defined data.
The natural numbers are recursively defined: you have $0$, and for every $n$, you have a natural number $n{++}$. Also, every natural number can be obtained from these. This last fact is encoded via the principle of induction, that if you prove $P(0)$ and that, for all $n'$, if $P(n')$ then $P(n'{++})$, then $P(n)$ is true for all $n$. Proofs by induction are sort of recursive proofs; when $n\geq 1$, the theorem is allowed to "call itself" on $n-1$. Calling itself on smaller inputs guarantees the argument terminates, so to speak.
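In code, this correspondence between induction and recursion can be sketched as a little combinator (hypothetical, purely for intuition): a "proof" for $0$ plus a way to pass from $n$ to $n{++}$ yields a result for every $n$:

```python
def induct(base, step, n):
    """Induction as a recursion scheme: start from the base case and
    apply the inductive step n times."""
    value = base
    for k in range(n):
        value = step(k, value)  # from the case k to the case k + 1
    return value

# Example: building up the sum 0 + 1 + ... + n one step at a time.
triangular = lambda n: induct(0, lambda k, v: v + (k + 1), n)
```

The loop terminates for every $n$ for the same reason an inductive proof covers every natural number: each step only depends on the case just below it.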
Back to addition: we need to build a predicate $P((n,m),y)$. Reading the rules for $a$, the predicate ought to have the following properties:
- for all $m$, $P((0,m),m)$ is true
- for all $n$, $m$, and $y$, if $P((n,m),y)$ then $P((n{++},m),y{++})$.
This second rule is a one-directional version of the equality, but it's the only direction we need. (Intuitively, it says we can calculate $a(n{++},m)$ from $a(n,m)$, so the recursive call $a(n,m)$ is using smaller inputs, guaranteeing termination.)
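As a finite sanity check (a sketch only: the ranges of $n$, $m$, $y$ are truncated, since Python can't quantify over all of $\mathbb{N}$), we can represent a candidate predicate as a set of triples and test it against these two properties:

```python
def satisfies_rules(P, n_max, m_max):
    """Check the two bulleted properties of P on a finite fragment, where
    P is a set of pairs ((n, m), y)."""
    # For all m: P((0, m), m).
    base = all(((0, m), m) in P for m in range(m_max + 1))
    # If P((n, m), y) then P((n++, m), y++), within the truncated range.
    step = all(((n + 1, m), y + 1) in P
               for ((n, m), y) in P if n < n_max and m <= m_max)
    return base and step

# The intended predicate, truncated to n, m <= 4.
P = {((n, m), n + m) for n in range(5) for m in range(5)}
```

Removing any triple forced by the rules makes the check fail, which is the finite shadow of "the rules pin the predicate down."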
Theorem. There exists a predicate $P(x,y)$ with domain $(\mathbb{N}\times\mathbb{N})\times\mathbb{N}$ that defines a function $a:\mathbb{N}\times\mathbb{N}\to\mathbb{N}$ such that $a(0,m)=m$ and $a(n{++},m)=a(n,m){++}$ for all $n,m\in\mathbb{N}$.
Proof. It suffices to prove that for all $N\in\mathbb{N}$ there exists a unique $G\subseteq (\mathbb{N}\times\mathbb{N})\times\mathbb{N}$ such that:
- for all $n>N$ and $m,y\in\mathbb{N}$, $((n,m),y)\not\in G$;
- for all $n\leq N$ and $m\in\mathbb{N}$ there is a unique $y\in\mathbb{N}$ such that $((n,m),y)\in G$;
- for all $m$, $((0,m),m)\in G$; and
- for all $n<N$ and $m,y\in\mathbb{N}$, if $((n,m),y)\in G$ then $((n{++},m),y{++})\in G$.
This is because we can define the predicate $P$ by $$ P((n,m),y): \forall G\subseteq (\mathbb{N}\times\mathbb{N})\times\mathbb{N}\ \forall N\geq n\ (Q(G,N)\to ((n,m),y)\in G) $$ where $Q(G,N)$ is the predicate defined by the above four bulleted properties. The properties guarantee that $G$ defines a function up to a particular point. Moreover, the choice of $N$ doesn't matter: if $N\leq N'$ and $G_N$ and $G_{N'}$ are the corresponding sets that uniquely exist, then $G_N\subseteq G_{N'}$, because the set $\{((n,m),y)\in G_{N'} : n\leq N\}$ satisfies the above properties for $N$, and by uniqueness it equals $G_N$.
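Here is a finite model of $Q(G,N)$ (with $m$ truncated to $0..m_{\max}$, an assumption to keep the check finite), useful for convincing yourself that exactly one $G$ passes for each $N$:

```python
def Q(G, N, m_max):
    """The four bulleted properties, checked on a finite fragment.
    G is a set of pairs ((n, m), y)."""
    ms = range(m_max + 1)
    # 1. Nothing above N in the first coordinate.
    prop1 = all(n <= N for ((n, m), y) in G)
    # 2. For n <= N, exactly one y per (n, m).
    prop2 = all(sum(1 for ((n2, m2), y) in G if (n2, m2) == (n, m)) == 1
                for n in range(N + 1) for m in ms)
    # 3. Base case elements are present.
    prop3 = all(((0, m), m) in G for m in ms)
    # 4. Closure under the recursive rule below N.
    prop4 = all(((n + 1, m), y + 1) in G
                for ((n, m), y) in G if n < N)
    return prop1 and prop2 and prop3 and prop4

# The unique passing set for N = 2 (with m truncated to 0..3).
G_example = {((n, m), n + m) for n in range(3) for m in range(4)}
```

Adding a stray triple breaks uniqueness (property 2), and shrinking the truncation bound $N$ breaks property 1, matching the "$G$ defines a function up to a particular point" intuition.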
We proceed by induction on $N$.
Base case ($N=0$). $G=\{((0,m),m):m\in\mathbb{N}\}$ works: the third property forces these elements to be present, and the first two properties rule out anything else, so it's unique.
Inductive step. Let $N'\in\mathbb{N}$ be arbitrary and suppose $G'$ is the unique set satisfying the above for $N'$; we construct the set for $N'{++}$. Let $$ G = G' \cup \{((n{++},m),y{++}) : n,m,y\in\mathbb{N}\text{ and } ((n,m),y)\in G'\text{ and }n=N'\} $$ By construction, $G$ satisfies the first two properties. The third property holds because it holds for $G'$. The fourth property holds because we added exactly the necessary elements. This $G$ is unique because the fourth property forces the newly added elements to be what they are.
This completes the induction. □
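The proof's construction is effectively an algorithm. Here is a sketch (with $m$ truncated to a finite range, an assumption the actual proof doesn't need): build $G_0$ from the base case, repeatedly apply the inductive step, then evaluate by lookup:

```python
def build_G(N, m_max):
    """Build the graph G_N from the proof, with m restricted to 0..m_max."""
    G = {((0, m), m) for m in range(m_max + 1)}   # base case
    for Np in range(N):                           # inductive step: N' -> N'++
        G |= {((n + 1, m), y + 1) for ((n, m), y) in G if n == Np}
    return G

def a(n, m):
    """Evaluate addition: build G_n, then find the unique y with
    ((n, m), y) in it; the unpacking fails unless y is unique."""
    (y,) = {y for ((n2, m2), y) in build_G(n, m) if (n2, m2) == (n, m)}
    return y
```

Note how `a` mirrors the proof: it never needs the whole infinite graph, only the finite-stage $G_n$ containing the input it's asked about.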
That's fairly technical, but the main idea here is "supposing the rules so far define a well-defined function when you restrict the first argument to $\{0,\dots,N\}$, we can apply the recursive rule to extend the function to $\{0,\dots,N+1\}$." Then, strangely, we don't actually need to extend this to all of $\mathbb{N}$, because we can say "to compute $a(n,m)$, first find the function $\{0,\dots,n\}\times\mathbb{N}\to \mathbb{N}$, then evaluate it at $(n,m)$." It might seem inefficient from a computer engineering perspective, but math doesn't see the inefficiency.
One last idea: in the logical setup of that book, when you construct things, technically speaking, there's an implicit "there exists" in front of the construction. Notice that even functions only say "for every $x$ there's a unique $y$". The value $y$ merely exists; it's not a concrete object. We're in the realm of logic.
Induction is the logical principle for reasoning about recursive data. Thus, it's what you must use to define a recursive function — which is to say, to prove that such a function exists.