  • Compilers have a front end and a back end. The concepts you mention are used in front-end stages, lexical analysis and parsing, often via tools like lex/flex and yacc/bison. Back ends, doing optimization and code generation, use less formal methods. Interestingly, I believe LaTeX, based on TeX, is a macro system, and I'm not sure whether it uses a parser at all... Commented Feb 22, 2020 at 15:22
  • @Daniel No, I suppose it wouldn't. But the result produced by this compiler is not LaTeX (nor TeX macro expansions), so the tokens produced by the lexing stage have to be parsed somehow, no? Commented Feb 22, 2020 at 16:07
  • 1
    $\begingroup$ If your thesis is focused on a specific compiler (can you share which it is?) it may use very ad-hoc or fully formal techniques - you have to dig into the code. If you can expand discussion to compilation in general, you are on safe ground, as the theory is very much applied in real implementations. $\endgroup$ Commented Feb 22, 2020 at 16:52
  • 1
    $\begingroup$ @DanielMGessel, TeX uses an ad hoc hand-written parser (surprisingly, as Knuth did fundamental work in parsing, and his professors were famously miffed that as an undergraduate he did more working on compilers during break than they did in a full year). $\endgroup$ Commented Feb 23, 2020 at 16:37
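The comments above distinguish the two front-end stages (lexical analysis, then parsing) and mention the hand-written, recursive-descent style of parser attributed here to TeX. As a toy illustration only (this is a sketch I am adding, not code from any compiler discussed; all names are invented), here is a minimal lexer and an ad hoc recursive-descent parser for arithmetic expressions in Python:

```python
import re

# Lexical analysis: split raw text into (kind, value) tokens.
# Integers become NUM tokens; any other non-space character becomes an OP token.
TOKEN_RE = re.compile(r"\s*(?:(\d+)|(\S))")

def lex(text):
    tokens = []
    for number, op in TOKEN_RE.findall(text):
        tokens.append(("NUM", int(number)) if number else ("OP", op))
    return tokens

def parse(tokens):
    """A hand-written recursive-descent parser (and evaluator) for
    '+' and '*' expressions, with '*' binding tighter than '+'."""
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else (None, None)

    def factor():
        # factor := NUM
        nonlocal pos
        kind, value = tokens[pos]
        assert kind == "NUM", "expected a number"
        pos += 1
        return value

    def term():
        # term := factor ('*' factor)*
        nonlocal pos
        value = factor()
        while peek() == ("OP", "*"):
            pos += 1
            value *= factor()
        return value

    def expr():
        # expr := term ('+' term)*
        nonlocal pos
        value = term()
        while peek() == ("OP", "+"):
            pos += 1
            value += term()
        return value

    return expr()

print(parse(lex("2 + 3 * 4")))  # 14
```

Tools like lex/flex and yacc/bison generate the equivalent of `lex` and `parse` from declarative token and grammar specifications; the version above shows the same two stages written out by hand.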