Low Level API Reference
jsonpath.token.Token
A token, as returned from lex.Lexer.tokenize().
ATTRIBUTE | DESCRIPTION |
---|---|
`kind` | The token's type. It is always one of the constants defined in the `jsonpath.token` module. TYPE: `str` |
`value` | The path substring containing text for the token. TYPE: `str` |
`index` | The index at which `value` starts in `path`. TYPE: `int` |
`path` | A reference to the complete JSONPath string from which this token derives. TYPE: `str` |
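A minimal sketch of inspecting tokens. It assumes a `JSONPathEnvironment` exposes its configured lexer as `env.lexer`; the example path is arbitrary. Only the `kind`, `value` and `index` attributes used here come from the table above.

```python
from jsonpath import JSONPathEnvironment

# Assumption: the environment exposes its Lexer instance as `env.lexer`.
env = JSONPathEnvironment()

# Tokenize an arbitrary example path and inspect each Token.
for token in env.lexer.tokenize("$.users[0].name"):
    print(f"{token.kind:<20} value={token.value!r} index={token.index}")
```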
jsonpath.filter.FilterExpression
jsonpath.lex.Lexer
Tokenize a JSONPath string.
Some customization can be achieved by subclassing Lexer and setting class attributes, then setting lexer_class on a JSONPathEnvironment (see the sketch after the table below).
ATTRIBUTE | DESCRIPTION |
---|---|
`key_pattern` | The regular expression pattern used to match mapping keys/properties. |
`logical_not_pattern` | The regular expression pattern used to match logical negation tokens. By default, both `not` and `!` are accepted. |
`logical_and_pattern` | The regular expression pattern used to match logical _and_ tokens. By default, both `and` and `&&` are accepted. |
`logical_or_pattern` | The regular expression pattern used to match logical _or_ tokens. By default, both `or` and `\|\|` are accepted. |
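A minimal sketch of the customization described above: subclass Lexer, override a class attribute, then point `lexer_class` at the subclass. The replacement pattern and the sample data are hypothetical; only `Lexer`, `key_pattern`, `lexer_class` and `JSONPathEnvironment` come from this reference.

```python
from jsonpath import JSONPathEnvironment
from jsonpath.lex import Lexer


class UpperKeyLexer(Lexer):
    # Hypothetical override: only match upper case, unquoted keys.
    key_pattern = r"[A-Z_][A-Z0-9_]*"


class MyEnvironment(JSONPathEnvironment):
    # Tell the environment to build its lexer from our subclass.
    lexer_class = UpperKeyLexer


env = MyEnvironment()
print(env.findall("$.FOO.BAR", {"FOO": {"BAR": 42}}))  # -> [42]
```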
jsonpath.parse.Parser
TODO:
jsonpath.selectors.JSONPathSelector
TODO:
jsonpath.stream.TokenStream
TODO: