Tokenize is a Julia package that serves a similar purpose, with a similar API, as the tokenize module in Python, but for Julia code: it takes a string or buffer containing Julia source, performs lexical analysis, and returns a stream of tokens.
Features
- Fast: it currently lexes all of the Julia source files in ~0.25 seconds (580 files, 2 million tokens)
- Round-trippable: the original string can be recovered exactly from the stream of tokens
- The function tokenize is the main entry point for generating tokens
- Each Token records where it starts and ends, the string it contains, and what kind of token it is
- Documentation available
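A minimal sketch of the workflow described above. `tokenize` and `untokenize` are the package's documented entry points; the `Tokenize.Tokens.kind` accessor and the exact shape of the token iterator are assumptions and may differ between versions:

```julia
using Tokenize

src = "foo = 2 * bar"

# Lexically analyze the string into a stream of tokens.
tokens = collect(tokenize(src))

for t in tokens
    # Each token carries its kind and the exact text it covers.
    # (`Tokenize.Tokens.kind` is an assumed accessor name.)
    println(Tokenize.Tokens.kind(t), " => ", repr(untokenize(t)))
end

# Round trip: concatenating the token texts reproduces the input exactly.
@assert join(untokenize(t) for t in tokens) == src
```

The round-trip property makes the package suitable for tooling such as syntax highlighters and formatters, where whitespace and comments must be preserved byte-for-byte.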
License
MIT License