Lexing - Tokenization
← Back to Compilation Pipeline
The first stage of compilation, converting raw source code text into a sequence of tokens (keywords, identifiers, operators, literals). The lexer strips whitespace and comments, producing a flat stream of the language's meaningful elements for the parser to consume.
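The idea can be sketched with a minimal regex-based lexer. The token set, keyword list, and comment syntax below are hypothetical, chosen only for illustration, not taken from any particular language:

```python
import re

# Ordered token spec: earlier patterns win ties (hypothetical token set).
TOKEN_SPEC = [
    ("COMMENT", r"//[^\n]*"),      # line comment -- stripped
    ("WS",      r"[ \t\n]+"),      # whitespace -- stripped
    ("NUMBER",  r"\d+"),           # integer literal
    ("IDENT",   r"[A-Za-z_]\w*"),  # identifier or keyword
    ("OP",      r"[+\-*/=;()]"),   # single-character operators
]
KEYWORDS = {"let", "if", "else"}  # hypothetical keyword list
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def lex(source):
    """Convert source text into a flat list of (kind, text) tokens."""
    tokens = []
    pos = 0
    while pos < len(source):
        m = MASTER.match(source, pos)
        if not m:
            raise SyntaxError(f"unexpected character {source[pos]!r} at {pos}")
        kind, text = m.lastgroup, m.group()
        pos = m.end()
        if kind in ("WS", "COMMENT"):
            continue  # whitespace and comments never reach the parser
        if kind == "IDENT" and text in KEYWORDS:
            kind = "KEYWORD"  # reclassify reserved words
        tokens.append((kind, text))
    return tokens

print(lex("let x = 42; // the answer"))
# → [('KEYWORD', 'let'), ('IDENT', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', ';')]
```

Note that the whitespace and the trailing comment vanish entirely; only the five meaningful tokens survive as the flat stream handed to the next stage.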