
Due to some context, I have reason to believe that S(K(SII)) and SSI are actually equivalent CL terms. This is my attempt at a proof (assuming a and b to be arbitrary CL terms):

$$\text{S(K(SII))ab = K(SII)b(ab) = SII(ab) = I(ab)(I(ab)) = ab(ab)}$$

$$\text{SSIab = Sa(Ia)b = Saab = ab(ab)}$$

Since both simplify to the same form, can we say that S(K(SII)) and SSI are equivalent terms?
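
To sanity-check the two reductions above mechanically, here is a minimal Haskell sketch (the term representation and the names are just an ad hoc choice of mine, nothing standard): it applies the contraction rules $Ix \to x$, $Kxy \to x$, $Sxyz \to xz(yz)$ leftmost-outermost until no redex remains, and confirms that both terms normalise to $ab(ab)$.

```haskell
-- Toy CL terms over free variables (ad hoc representation).
data Term = S | K | I | Var String | App Term Term
  deriving (Eq, Show)

-- One leftmost-outermost contraction step, if a redex exists:
--   I x     -> x
--   K x y   -> x
--   S x y z -> x z (y z)
step :: Term -> Maybe Term
step (App I x)                 = Just x
step (App (App K x) _)         = Just x
step (App (App (App S x) y) z) = Just (App (App x z) (App y z))
step (App f x)                 = case step f of
                                   Just f' -> Just (App f' x)
                                   Nothing -> App f <$> step x
step _                         = Nothing

-- Reduce to normal form (both terms below reach one).
normalize :: Term -> Term
normalize t = maybe t normalize (step t)

main :: IO ()
main = do
  let a   = Var "a"
      b   = Var "b"
      sii = App (App S I) I
      lhs = foldl1 App [S, App K sii, a, b]  -- S(K(SII))ab
      rhs = foldl1 App [S, S, I, a, b]       -- SSIab
  -- Both print App (App (Var "a") (Var "b")) (App (Var "a") (Var "b")), i.e. ab(ab).
  print (normalize lhs)
  print (normalize rhs)
  print (normalize lhs == normalize rhs)     -- True
```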

[This problem arose when I was looking to describe the Y combinator using only S, K and I. The Wikipedia definition starts with S(K(SII)), while mine starts with SSI.]

  • I don't see any room for doubt. You have not used any specific arguments, only general ones (only variables), so syntactically they are different but essentially they are the same: you have proved their extensional equality. BTW, I've recently found yet another encoding for Y, one $SK$-letter shorter than the previous one; we get it by an $S$-rule contraction from the longer one, the one starting with $S(K(SII))$. :) And if we allow $I$, Turing's $\Theta$ is the absolute shortest, at just 9 symbols (not counting the parentheses). Commented May 21 at 9:58
  • @WillNess Thanks for providing the keyword ‘extensionality’ (I was unaware of that term, and am not very well versed in lambda calculus). I guess I was actually asking whether extensional equality implies ‘equivalence’ of the CL terms: a pretty naïve question, now that I think about it, because I was unclear about which particular equivalence I meant. Extensional equality does imply $\eta$-equivalence. Commented May 21 at 12:23
  • This Wikipedia section discusses this further; there, the term “equivalent” is defined to mean the same as “extensionally equal”. Nice work on your new representations for $Y$, btw. Already upvoted :) Commented May 21 at 12:28
  • You're welcome! (and thank you :) ) $\eta$-equivalence is exactly right; I was contemplating adding another comment about that. It means that in combinatory calculus we can always freely add as many fresh arguments as needed on both sides of our equations, and of course remove them again if they do not appear anywhere else (the rule is spelled out just after these comments). Combinatory calculus ought to be definable fully independently of lambda calculus, I think; I'm no mathematician myself, unfortunately. :) Commented May 21 at 12:34
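
To make that rule explicit (the formulation below is mine, not from the commenters): extensional equality is the rule that lets a fresh argument be introduced on, or discharged from, both sides of an equation,

$$\frac{M\,x = N\,x \qquad x \notin \mathrm{FV}(M)\cup\mathrm{FV}(N)}{M = N}\;(\mathrm{ext})$$

which is exactly what the proof in the question uses: both $S(K(SII))ab$ and $SSIab$ reduce to $ab(ab)$ for fresh $a$ and $b$, so discharging $a$ and $b$ gives $S(K(SII)) = SSI$ extensionally.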

1 Answer


Your proof seems to work, but if you want to be certain you can translate the combinators into lambda calculus and show a $\beta$-equivalence there; that settles it for sure. With
$$I:=\lambda x.x, \quad K:=\lambda x. \lambda y. x, \quad S := \lambda x. \lambda y. \lambda z.\, xz(yz)$$
the first term reduces to
$$ SII = \lambda z. (\lambda x.x)\, z\, ((\lambda x.x)\, z) \rightarrow_\beta \lambda z.zz $$
$$ S(K(SII)) = \lambda y . \lambda z. ( \lambda y . \lambda z . zz)\, z\, (yz) \rightarrow_\beta \lambda y . \lambda z.\, yz(yz) $$
and the second to (keep in mind to rename conflicting bound variables!)
$$ SS = \lambda y . \lambda z. ( \lambda x. \lambda y . \lambda z.\, xz(yz))\, z\, (yz) \rightarrow_\beta \lambda y . \lambda z. \lambda z'.\, z z' (yz z') $$
$$ SSI = \lambda z . \lambda z' .\, z z' (z z') =_\alpha \lambda y . \lambda z.\, yz(yz). $$
Since both reach the same $\beta$-normal form (up to $\alpha$-renaming), $SSI$ and $S(K(SII))$ are $\beta$-equivalent.
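
If you want a mechanical confirmation of this, here is a minimal sketch (my own ad hoc encoding, nothing the argument depends on): lambda terms in de Bruijn notation, so $\alpha$-renaming is handled automatically, reduced to $\beta$-normal form in normal order. It checks that the translations of $S(K(SII))$ and $SSI$ share the normal form $\lambda y.\lambda z.\,yz(yz)$.

```haskell
-- Untyped lambda terms with de Bruijn indices (ad hoc names).
data L = V Int | Lam L | Ap L L
  deriving (Eq, Show)

-- Shift free indices >= c by d.
shift :: Int -> Int -> L -> L
shift d c (V k)    = V (if k >= c then k + d else k)
shift d c (Lam t)  = Lam (shift d (c + 1) t)
shift d c (Ap f x) = Ap (shift d c f) (shift d c x)

-- Substitute s for index j in t.
subst :: Int -> L -> L -> L
subst j s (V k)    = if k == j then s else V k
subst j s (Lam t)  = Lam (subst (j + 1) (shift 1 0 s) t)
subst j s (Ap f x) = Ap (subst j s f) (subst j s x)

-- Contract the redex (\ . t) s.
beta :: L -> L -> L
beta t s = shift (-1) 0 (subst 0 (shift 1 0 s) t)

-- One normal-order (leftmost-outermost) step, if any redex remains.
step :: L -> Maybe L
step (Ap (Lam t) s) = Just (beta t s)
step (Ap f x)       = case step f of
                        Just f' -> Just (Ap f' x)
                        Nothing -> Ap f <$> step x
step (Lam t)        = Lam <$> step t
step _              = Nothing

-- Beta-normal form (both terms below are normalising).
nf :: L -> L
nf t = maybe t nf (step t)

-- The lambda encodings of I, K and S from above.
i, k, s :: L
i = Lam (V 0)                                 -- \x. x
k = Lam (Lam (V 1))                           -- \x y. x
s = Lam (Lam (Lam (Ap (Ap (V 2) (V 0))        -- \x y z. x z (y z)
                      (Ap (V 1) (V 0)))))

main :: IO ()
main = do
  let lhs = Ap s (Ap k (Ap (Ap s i) i))  -- S(K(SII))
      rhs = Ap (Ap s s) i                -- SSI
  print (nf lhs)            -- Lam (Lam (Ap (Ap (V 1) (V 0)) (Ap (V 1) (V 0))))
  print (nf lhs == nf rhs)  -- True
```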

Hope this helps :)

