lexer

package
v0.0.0-...-cacd4db
Published: Oct 24, 2024 License: MIT Imports: 6 Imported by: 0

Documentation

Index

Constants

View Source
const (
	VALID_CHARS   = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz_"
	NUMS          = "0123456789"
	SPECIAL_CHARS = "(){}[]"
	WHITESPACE    = " \t\r"
)
View Source
const EOF rune = -1

Variables

This section is empty.

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	Errors error
	// contains filtered or unexported fields
}

func New

func New(src string) (*Lexer, <-chan token.Token)

New creates and returns a lexer and a read-only token channel.
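The shape of this API can be sketched with a self-contained miniature (the `Token` fields and `newLexer` name are assumptions for illustration, not the package's actual definitions): lexing runs in its own goroutine, and the caller ranges over a channel that is closed when lexing finishes.

```go
package main

import (
	"fmt"
	"strings"
)

// Token stands in for token.Token (the real type lives in the package's
// token subpackage; these fields are assumptions for the sketch).
type Token struct {
	Kind  string
	Value string
}

// newLexer mimics the shape of New: lexing runs in its own goroutine and
// the caller gets a read-only channel that is closed when lexing ends.
// Here each whitespace-separated word becomes one token.
func newLexer(src string) <-chan Token {
	tokens := make(chan Token)
	go func() {
		defer close(tokens)
		for _, f := range strings.Fields(src) {
			tokens <- Token{Kind: "WORD", Value: f}
		}
	}()
	return tokens
}

func main() {
	for tok := range newLexer("let x = 1") {
		fmt.Printf("%s %q\n", tok.Kind, tok.Value)
	}
}
```

Because the channel is closed when lexing ends, consumers need no extra done-signaling: the `range` loop simply terminates.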

func (*Lexer) Current

func (l *Lexer) Current() string

Current returns the value being analyzed at this moment.

func (*Lexer) Emit

func (l *Lexer) Emit(k token.TokenKind)

Emit receives a token kind and pushes a new token, carrying the currently analyzed value, onto the tokens channel.

func (*Lexer) Ignore

func (l *Lexer) Ignore()

Ignore clears the rewind stack and then sets the current beginning position to the current position in the source, effectively ignoring the section of the source being analyzed.
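The start/pos bookkeeping implied by Current, Emit, and Ignore can be sketched as follows (field and method names are illustrative, not the package's internals):

```go
package main

import "fmt"

// sketchLexer models the start/pos bookkeeping implied by Current, Emit,
// and Ignore (field names are illustrative, not the package's internals).
type sketchLexer struct {
	source string
	start  int // beginning of the value currently being analyzed
	pos    int // current position in the source
}

// current mirrors Current: the span accumulated since the last Emit/Ignore.
func (l *sketchLexer) current() string { return l.source[l.start:l.pos] }

// ignore mirrors Ignore: drop the accumulated span by moving start up to
// pos. Emit would advance start the same way after pushing current()
// onto the token channel.
func (l *sketchLexer) ignore() { l.start = l.pos }

func main() {
	l := &sketchLexer{source: "  foo", pos: 2} // two whitespace runes read
	fmt.Printf("before: %q\n", l.current())    // "  "
	l.ignore()
	fmt.Printf("after: %q\n", l.current()) // ""
}
```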

func (*Lexer) NewError

func (l *Lexer) NewError(e string)

NewError records the given error message; part of a partial yyLexer (goyacc) implementation.

func (*Lexer) Next

func (l *Lexer) Next() rune

Next pulls the next rune from the Lexer and returns it, moving the pos forward in the source.

func (*Lexer) Peek

func (l *Lexer) Peek() rune

Peek performs a Next operation immediately followed by a Rewind, returning the peeked rune.
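That Next-then-Rewind relationship can be shown with a minimal sketch. Note the real lexer keeps a rewind stack so multiple rewinds can be undone; this sketch allows only a single rewind for brevity, and all names are illustrative:

```go
package main

import "fmt"

const eof rune = -1 // mirrors the package's EOF constant

// miniLexer sketches Next/Rewind; the real lexer keeps a rewind stack so
// several rewinds are possible, this sketch allows just one for brevity.
type miniLexer struct {
	src   []rune
	pos   int
	width int // how far the last Next advanced, so Rewind can step back
}

func (l *miniLexer) Next() rune {
	if l.pos >= len(l.src) {
		l.width = 0
		return eof
	}
	r := l.src[l.pos]
	l.pos++
	l.width = 1
	return r
}

func (l *miniLexer) Rewind() {
	l.pos -= l.width
	l.width = 0
}

// Peek is literally Next followed by Rewind.
func (l *miniLexer) Peek() rune {
	r := l.Next()
	l.Rewind()
	return r
}

func main() {
	l := &miniLexer{src: []rune("ab")}
	fmt.Println(string(l.Peek())) // "a", position unchanged
	fmt.Println(string(l.Next())) // "a", position advanced
}
```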

func (*Lexer) Rewind

func (l *Lexer) Rewind()

Rewind steps back over the last rune read (if any). Rewinds can occur more than once per call to Next, but you can never rewind past the last point a token was emitted.

func (*Lexer) Take

func (l *Lexer) Take(chars string)

Take receives a string containing all acceptable characters and will continue over each consecutive character in the source until a character not in the given string is encountered. This should be used to quickly pull token parts.
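The consuming loop behind Take can be sketched as a standalone function over an explicit position (the `take` helper here is hypothetical, written against the `NUMS` constant defined above):

```go
package main

import (
	"fmt"
	"strings"
)

// take sketches Take over an explicit position: consume consecutive
// characters that appear in chars and return the position just past the
// run. The sketch indexes bytes, which is fine for ASCII sets like NUMS.
func take(src string, pos int, chars string) int {
	for pos < len(src) && strings.ContainsRune(chars, rune(src[pos])) {
		pos++
	}
	return pos
}

func main() {
	const nums = "0123456789"
	fmt.Println(take("42abc", 0, nums)) // 2: the digit run "42" was consumed
}
```

A state function would call this with `VALID_CHARS`, `NUMS`, and so on to gather a whole identifier or number before emitting it as one token.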

type StateFunc

type StateFunc func(*Lexer) StateFunc

func IdentifierState

func IdentifierState(l *Lexer) StateFunc

func NumberState

func NumberState(l *Lexer) StateFunc

func SpecialState

func SpecialState(l *Lexer) StateFunc

func TextState

func TextState(l *Lexer) StateFunc

func WhitespaceState

func WhitespaceState(l *Lexer) StateFunc
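State functions like those above follow the channel-based "lexical scanning in Go" pattern: each state lexes one class of input and hands back the next state. A self-contained sketch of the driver loop such a lexer presumably runs in its goroutine (all names here are illustrative):

```go
package main

import "fmt"

// lexState is a stand-in for *Lexer; out records what each state did.
type lexState struct{ out []string }

// stateFunc mirrors StateFunc: a state lexes one class of input and
// returns the next state, or nil when lexing is done.
type stateFunc func(*lexState) stateFunc

// run is the driver loop: keep calling the current state until one
// returns nil, then stop (the real lexer would close its channel here).
func run(l *lexState, start stateFunc) {
	for state := start; state != nil; state = state(l) {
	}
}

func stateA(l *lexState) stateFunc { l.out = append(l.out, "A"); return stateB }
func stateB(l *lexState) stateFunc { l.out = append(l.out, "B"); return nil }

func main() {
	l := &lexState{}
	run(l, stateA)
	fmt.Println(l.out) // [A B]
}
```

Encoding the next state as a return value keeps transitions explicit: WhitespaceState, for example, can inspect the next rune and return IdentifierState, NumberState, or SpecialState accordingly.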
