| Name | Date | Size | #Lines | LOC |
|---|---|---|---|---|
| README.md | 11-Sep-2021 | 2.1 KiB | 81 | 64 |
| ast.go | 11-Sep-2021 | 49.6 KiB | 2,255 | 1,748 |
| benchmark_test.go | 11-Sep-2021 | 7.1 KiB | 345 | 289 |
| js_test.go | 11-Sep-2021 | 7.9 KiB | 227 | 126 |
| lex.go | 11-Sep-2021 | 20.1 KiB | 787 | 684 |
| lex_test.go | 11-Sep-2021 | 11.8 KiB | 296 | 257 |
| parse.go | 11-Sep-2021 | 56.9 KiB | 2,225 | 2,022 |
| parse_test.go | 11-Sep-2021 | 50.7 KiB | 1,032 | 963 |
| table.go | 11-Sep-2021 | 3.5 KiB | 143 | 129 |
| tokentype.go | 11-Sep-2021 | 8.3 KiB | 405 | 368 |
| util.go | 11-Sep-2021 | 757 B | 39 | 33 |
| util_test.go | 11-Sep-2021 | 757 B | 27 | 22 |
| walk.go | 11-Sep-2021 | 4.7 KiB | 290 | 260 |
| walk_test.go | 11-Sep-2021 | 1.7 KiB | 113 | 100 |

README.md

# JS [![API reference](https://img.shields.io/badge/godoc-reference-5272B4)](https://pkg.go.dev/github.com/tdewolff/parse/v2/js?tab=doc)

This package is a JS lexer and parser (ECMAScript 2020) written in [Go][1]. It follows the [ECMAScript 2020 Language Specification](https://tc39.es/ecma262/). The lexer takes an `io.Reader` and converts it into tokens until EOF.

## Installation
Run the following command:

	go get -u github.com/tdewolff/parse/v2/js

or add the following import and run the project with `go get`:

	import "github.com/tdewolff/parse/v2/js"

## Lexer
### Usage
The following initializes a new Lexer with an `io.Reader` `r`:
``` go
l := js.NewLexer(parse.NewInput(r))
```

To tokenize until EOF or an error occurs, use:
``` go
for {
	tt, text := l.Next()
	switch tt {
	case js.ErrorToken:
		// error or EOF set in l.Err()
		return
	// ...
	}
}
```

### Regular Expressions
In the ECMAScript specification, `PunctuatorToken` (which includes the `/` and `/=` symbols) and `RegExpToken` depend on parser state to differentiate between the two. The lexer always scans the token as the `/` or `/=` operator first, after which the parser can rescan that token as a regular expression using `RegExp()`, as in the sketch below.

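For illustration, here is a minimal sketch of that rescan, assuming the division operator token constants are named `js.DivToken` and `js.DivEqToken` (see tokentype.go for the actual names) and that `RegExp()` returns a token type and text in the same way as `Next()`:
``` go
tt, text := l.Next()
if tt == js.DivToken || tt == js.DivEqToken {
	// The parser determined that a regular expression is allowed at this
	// position (e.g. after `=`, `(` or `return`), so rescan the same token
	// as a regular expression literal instead of a division operator.
	tt, text = l.RegExp()
}
```
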
### Examples
``` go
package main

import (
	"fmt"
	"io"
	"os"

	"github.com/tdewolff/parse/v2"
	"github.com/tdewolff/parse/v2/js"
)

// Tokenize JS from stdin.
func main() {
	l := js.NewLexer(parse.NewInput(os.Stdin))
	for {
		tt, text := l.Next()
		switch tt {
		case js.ErrorToken:
			if l.Err() != io.EOF {
				fmt.Println("Error on line", l.Line(), ":", l.Err())
			}
			return
		case js.IdentifierToken:
			fmt.Println("Identifier", string(text))
		case js.NumericToken:
			fmt.Println("Numeric", string(text))
		// ...
		}
	}
}
```

## Parser
### Usage
The following parses JS source and returns its abstract syntax tree (AST):
``` go
ast, err := js.NewParser(parse.NewInputString("if (state == 5) { console.log('In state five'); }"))
```

See [ast.go](https://github.com/tdewolff/parse/blob/master/js/ast.go) for all available data structures that can represent the abstract syntax tree.

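For a fuller sketch, and assuming `NewParser` also accepts the `parse.NewInput` wrapper used by the lexer (an assumption; the snippet above only shows `parse.NewInputString`), a small program that parses JS from stdin and reports parse errors might look like this:
``` go
package main

import (
	"fmt"
	"os"

	"github.com/tdewolff/parse/v2"
	"github.com/tdewolff/parse/v2/js"
)

// Parse JS from stdin and report any parse error.
func main() {
	ast, err := js.NewParser(parse.NewInput(os.Stdin))
	if err != nil {
		fmt.Fprintln(os.Stderr, "parse error:", err)
		os.Exit(1)
	}
	_ = ast // inspect the AST from here; ast.go lists the node types and walk.go provides a tree walker
}
```
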
## License
Released under the [MIT license](https://github.com/tdewolff/parse/blob/master/LICENSE.md).

[1]: http://golang.org/ "Go Language"