Building a parser (Part I)

David Rodrigues · Feb 26, 2012 · Viewed 34.7k times

I'm making my own JavaScript-based programming language (yeah, it's crazy, but it's just for learning... maybe?). Well, I'm reading about parsers, and the first step is to convert the source code into tokens, like:

if(x > 5)
  return true;

The tokenizer turns this into:

T_IF          "if"
T_LPAREN      "("
T_IDENTIFIER  "x"
T_GT          ">"
T_NUMBER      "5"
T_RPAREN      ")"
T_IDENTIFIER  "return"
T_TRUE        "true"
T_TERMINATOR  ";"

I don't know if my logic is correct so far. My parser goes a step further (or does it?) and translates to this (yeah, a multidimensional array):

T_IF             "if"
  T_EXPRESSION     ...
    T_IDENTIFIER     "x"
    T_GT             ">"
    T_NUMBER         "5"
  T_CLOSURE        ...
    T_IDENTIFIER     "return"
    T_TRUE           "true"

I have some questions:

  1. Is my way better or worse than the original way? Note that my code will be read and compiled (translated to another language, like PHP), not interpreted all the time.
  2. After I tokenize, what exactly do I need to do? I'm really lost on this step!
  3. Are there any good tutorials to learn how to do it?

Well, that's it. Bye!

Answer

Jon Purdy · Feb 26, 2012

Generally, you want to separate the functions of the tokeniser (also called a lexer) from other stages of your compiler or interpreter. The reason for this is basic modularity: each pass consumes one kind of thing (e.g., characters) and produces another one (e.g., tokens).
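
For instance, the character-to-token pass can be quite small. Here’s a minimal sketch using Python’s re module (the token names follow your example, but the regexes and the tokenize() helper are my own assumptions, and I treat return as a keyword rather than an identifier):

import re

# Keyword patterns come before T_IDENTIFIER so that "if" is not
# swallowed as an identifier; a real lexer would also report any
# characters that match nothing at all.
TOKEN_SPEC = [
  ("T_IF",         r"\bif\b"),
  ("T_RETURN",     r"\breturn\b"),
  ("T_TRUE",       r"\btrue\b"),
  ("T_NUMBER",     r"\d+"),
  ("T_IDENTIFIER", r"[A-Za-z_]\w*"),
  ("T_GT",         r">"),
  ("T_LPAREN",     r"\("),
  ("T_RPAREN",     r"\)"),
  ("T_TERMINATOR", r";"),
  ("SKIP",         r"\s+"),   # whitespace is matched, then dropped
]
MASTER = re.compile("|".join("(?P<%s>%s)" % pair for pair in TOKEN_SPEC))

def tokenize(source):
  for match in MASTER.finditer(source):
    if match.lastgroup != "SKIP":
      yield (match.lastgroup, match.group())

# list(tokenize("if(x > 5)\n  return true;")) begins with
# ("T_IF", "if"), ("T_LPAREN", "("), ("T_IDENTIFIER", "x"), ...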

So you’ve converted your characters to tokens. Now you want to convert your flat list of tokens to meaningful nested expressions, and this is what is conventionally called parsing. For a JavaScript-like language, you should look into recursive descent parsing. For parsing expressions with infix operators of different precedence levels, Pratt parsing is very useful, and you can fall back on ordinary recursive descent parsing for special cases.
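
To make the Pratt idea concrete, here’s a minimal sketch of the expression side as precedence climbing, a simple special case of Pratt parsing. The ExprParser class, the PRECEDENCE table, and the tuple tree nodes are all illustrative assumptions, not a fixed design:

# Higher numbers bind more tightly, so "*" beats "+".
PRECEDENCE = {">": 10, "<": 10, "+": 20, "-": 20, "*": 30, "/": 30}

class ExprParser:
  def __init__(self, tokens):
    self.tokens = tokens   # e.g. ["x", ">", "5"]
    self.pos = 0

  def peek(self):
    return self.tokens[self.pos] if self.pos < len(self.tokens) else None

  def advance(self):
    tok = self.tokens[self.pos]
    self.pos += 1
    return tok

  def expression(self, min_prec=0):
    # Start with a primary expression (just a number or identifier
    # here; a real parser would also handle parentheses and unary
    # operators), then keep consuming infix operators for as long
    # as they bind at least as tightly as min_prec.
    left = self.advance()
    while self.peek() in PRECEDENCE and PRECEDENCE[self.peek()] >= min_prec:
      op = self.advance()
      # Recursing with precedence + 1 makes operators left-associative.
      right = self.expression(PRECEDENCE[op] + 1)
      left = (op, left, right)
    return left

# ExprParser(["x", ">", "5"]).expression() returns (">", "x", "5"), and
# ExprParser(["a", "+", "b", "*", "c"]).expression() returns
# ("+", "a", ("*", "b", "c")) — i.e., "*" binds tighter than "+".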

Just to give you a more concrete example based on your case, I’ll assume you can write two functions: accept(token), which consumes the next token in the stream you’ve created if it matches, and expect(token), which does the same but raises an error if it doesn’t match (both are sketched at the end of this answer). You’ll make a function for each type of statement or expression in the grammar of your language. Here’s Pythonish pseudocode for a statement() function, for instance:

def statement():

  # 'if' statement: a condition expression followed by a body statement.
  if accept("if"):
    x = expression()
    y = statement()
    return IfStatement(x, y)

  # 'return' statement: just the expression whose value is returned.
  elif accept("return"):
    x = expression()
    return ReturnStatement(x)

  # Block: semicolon-separated statements wrapped in braces.
  elif accept("{"):
    xs = []
    while True:
      xs.append(statement())
      if not accept(";"):
        break
    expect("}")
    return Block(xs)

  else:
    error("Invalid statement!")

This gives you what’s called an abstract syntax tree (AST) of your program, which you can then manipulate (optimisation and analysis), output (compilation), or run (interpretation).
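
And in case it helps, here’s one minimal way accept() and expect() might be implemented over a flat token list. This is a sketch only: the module-level state is my assumption, and wrapping the same logic in a small parser class works just as well.

class ParseError(Exception):
  pass

tokens = []  # fill this from your tokenizer, e.g. ["if", "(", "x", ...]
pos = 0

def accept(token):
  # Consume the next token and return True if it matches; otherwise
  # leave the stream untouched and return False.
  global pos
  if pos < len(tokens) and tokens[pos] == token:
    pos += 1
    return True
  return False

def expect(token):
  # Like accept(), but failing to match is a syntax error.
  if not accept(token):
    raise ParseError("expected %r" % token)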