Low Level API Reference

jsonpath.token.Token

A token, as returned from lex.Lexer.tokenize().

ATTRIBUTE DESCRIPTION
kind

The token's type. It is always one of the constants defined in the jsonpath.token module.

TYPE: str

value

The path substring containing text for the token.

TYPE: str

index

The index at which value starts in path.

TYPE: int

path

A reference to the complete JSONPath string from which this token derives.

TYPE: str

position

position() -> Tuple[int, int]

Return the line and column number for the start of this token.
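
For illustration, here is a minimal sketch (not part of the reference) of inspecting tokens and their positions. It assumes a JSONPathEnvironment keeps the lexer built from lexer_class on an env.lexer attribute; if it does not, construct jsonpath.lex.Lexer directly.

```python
from jsonpath import JSONPathEnvironment

env = JSONPathEnvironment()

# Assumption: the environment builds a lexer from `lexer_class` and keeps
# it as `env.lexer`. If not, construct `jsonpath.lex.Lexer` directly.
for token in env.lexer.tokenize("$.store.book[?(@.price < 10)]"):
    line, col = token.position()  # line/column derived from index and path
    print(f"{token.kind:<24} {token.value!r} (line {line}, column {col})")
```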

jsonpath.filter.FilterExpression

Bases: ABC

Base class for all filter expression nodes.

children abstractmethod

children() -> List[FilterExpression]

Return a list of direct child expressions.

evaluate abstractmethod

evaluate(context: FilterContext) -> object

Resolve the filter expression in the given context.

PARAMETER DESCRIPTION
context

Contextual information the expression might choose to use during evaluation.

TYPE: FilterContext

RETURNS DESCRIPTION
object

The result of evaluating the expression.

evaluate_async abstractmethod async

evaluate_async(context: FilterContext) -> object

An async version of evaluate.

set_children abstractmethod

set_children(children: List[FilterExpression]) -> None

Update this expression's child expressions.

children is assumed to have the same number of items as is returned by self.children, and in the same order.
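
As a sketch of the interface, a minimal leaf node might look like the following. TrueLiteral is a hypothetical name, FilterContext is assumed to be importable from jsonpath.filter, and the real base class may impose requirements beyond the four abstract methods documented here.

```python
from typing import List

# Assumption: FilterContext lives in jsonpath.filter next to FilterExpression.
from jsonpath.filter import FilterContext, FilterExpression


class TrueLiteral(FilterExpression):
    """A hypothetical leaf expression that always evaluates to True."""

    def children(self) -> List[FilterExpression]:
        # Leaf nodes have no child expressions.
        return []

    def evaluate(self, context: FilterContext) -> object:
        # A literal's value is fixed, so the context is ignored.
        return True

    async def evaluate_async(self, context: FilterContext) -> object:
        # Nothing to await, so delegate to the synchronous version.
        return self.evaluate(context)

    def set_children(self, children: List[FilterExpression]) -> None:
        # children() returns an empty list, so there is nothing to update.
        pass
```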

jsonpath.lex.Lexer

Tokenize a JSONPath string.

Some customization can be achieved by subclassing Lexer and setting class attributes, then setting lexer_class on a JSONPathEnvironment. See the sketch following the attribute list below.

ATTRIBUTE DESCRIPTION
key_pattern

The regular expression pattern used to match mapping keys/properties.

logical_not_pattern

The regular expression pattern used to match logical negation tokens. By default, not and ! are equivalent.

logical_and_pattern

The regular expression pattern used to match logical and tokens. By default, and and && are equivalent.

logical_or_pattern

The regular expression pattern used to match logical or tokens. By default, or and || are equivalent.
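
For example, here is a sketch of restricting logical negation to the not keyword, assuming the *_pattern attributes are plain regular expression strings consumed by compile_rules(). The class names NotOnlyLexer and MyJSONPathEnvironment are illustrative, not part of the API.

```python
from jsonpath import JSONPathEnvironment
from jsonpath.lex import Lexer


class NotOnlyLexer(Lexer):
    # Hypothetical tweak: accept only the word `not` for logical negation,
    # dropping the default `!` alias.
    logical_not_pattern = r"\bnot\b"


class MyJSONPathEnvironment(JSONPathEnvironment):
    lexer_class = NotOnlyLexer


env = MyJSONPathEnvironment()
path = env.compile("$.users[?(not @.active)]")  # `!@.active` would now fail
```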

compile_rules

compile_rules() -> Pattern[str]

Prepare regular expression rules.

tokenize

tokenize(path: str) -> Iterator[Token]

Generate a sequence of tokens from a JSONPath string.

jsonpath.parse.Parser

TODO:

jsonpath.selectors.JSONPathSelector

TODO:

jsonpath.stream.TokenStream

TODO: