Low Level API Reference
jsonpath.token.Token
A token, as returned from lex.Lexer.tokenize().
| ATTRIBUTE | TYPE | DESCRIPTION |
|---|---|---|
| `kind` | `str` | The token's type. It is always one of the constants defined in `jsonpath.token`. |
| `value` | `str` | The path substring containing text for the token. |
| `index` | `int` | The index at which `value` starts in `path`. |
| `path` | `str` | A reference to the complete JSONPath string from which this token derives. |
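
For illustration, tokens can be inspected by tokenizing a path directly. This is only a sketch: it assumes `Lexer` accepts its environment as a keyword-only `env` argument, which is not documented above.

```python
from jsonpath import JSONPathEnvironment
from jsonpath.lex import Lexer

env = JSONPathEnvironment()

# Assumption: Lexer takes its environment as a keyword-only `env` argument.
lexer = Lexer(env=env)

for token in lexer.tokenize("$.users[?(@.age > 21)]"):
    # Each token records its kind, the matched text and the index at which
    # that text starts in the original path.
    print(f"{token.kind:<20} {token.value!r:<12} index={token.index}")
```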
jsonpath.filter.FilterExpression
Bases: ABC
Base class for all filter expression nodes.
children (abstractmethod)
Return a list of direct child expressions.
evaluate (abstractmethod)
Resolve the filter expression in the given context.

| PARAMETER | DESCRIPTION |
|---|---|
| `context` | Contextual information the expression might choose to use during evaluation. |

| RETURNS | DESCRIPTION |
|---|---|
| `object` | The result of evaluating the expression. |
evaluate_async (abstractmethod, async)
An async version of evaluate().
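
To show how these three members relate, here is a hypothetical leaf node that implements each of them. It is a sketch only: it covers just the members documented above (the real base class may define additional abstract members), and the type of `context` is deliberately left unannotated.

```python
from typing import List

from jsonpath.filter import FilterExpression


class AlwaysTrue(FilterExpression):
    """A hypothetical leaf expression that evaluates to True unconditionally."""

    def children(self) -> List[FilterExpression]:
        # A leaf node has no child expressions.
        return []

    def evaluate(self, context) -> object:
        # Ignore the context entirely and resolve to a constant.
        return True

    async def evaluate_async(self, context) -> object:
        # Nothing to await here, so defer to the synchronous implementation.
        return self.evaluate(context)
```

Whether such a node can be constructed directly depends on the rest of the FilterExpression interface; in practice, expression nodes are built by the parser while compiling a path.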
jsonpath.lex.Lexer
Tokenize a JSONPath string.
Some customization can be achieved by subclassing Lexer and setting class attributes, then setting lexer_class on a JSONPathEnvironment (see the sketch after the attribute table below).
| ATTRIBUTE | DESCRIPTION |
|---|---|
| `key_pattern` | The regular expression pattern used to match mapping keys/properties. |
| `logical_not_pattern` | The regular expression pattern used to match logical negation tokens. By default, both `!` and `not` are matched. |
| `logical_and_pattern` | The regular expression pattern used to match logical and tokens. By default, both `&&` and `and` are matched. |
| `logical_or_pattern` | The regular expression pattern used to match logical or tokens. By default, both `\|\|` and `or` are matched. |
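
A minimal sketch of that customization, assuming the class attributes above accept plain regular expression fragments. The overridden pattern below is illustrative only, not the library's default.

```python
from jsonpath import JSONPathEnvironment
from jsonpath.lex import Lexer


class StrictLexer(Lexer):
    # Illustrative override: recognize only `!` for logical negation.
    # This is not the library's default pattern.
    logical_not_pattern = r"!"


class StrictEnvironment(JSONPathEnvironment):
    # Have the environment build its lexer from the subclass above.
    lexer_class = StrictLexer


env = StrictEnvironment()
data = {"users": [{"name": "Sue", "active": False}, {"name": "Bob", "active": True}]}
print(env.findall("$.users[?(!@.active)].name", data))
```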
jsonpath.parse.Parser
TODO:
jsonpath.selectors.JSONPathSelector
TODO:
jsonpath.stream.TokenStream
TODO: