Low Level API Reference

jsonpath.token.Token

A token, as returned from lex.Lexer.tokenize().

ATTRIBUTE DESCRIPTION
kind

The token's type. It is always one of the constants defined in jsonpath.token.

TYPE: str

value

The path substring containing text for the token.

TYPE: str

index

The index at which value starts in path.

TYPE: int

path

A reference to the complete JSONPath string from which this token derives.

TYPE: str

position

position() -> Tuple[int, int]

Return the line and column number for the start of this token.
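
A minimal sketch of tokenizing a path and inspecting the resulting tokens. It assumes the Lexer is constructed with its owning JSONPathEnvironment as an env keyword argument, which is not documented in this reference; check your installed version if the signature differs.

from jsonpath import JSONPathEnvironment
from jsonpath.lex import Lexer

env = JSONPathEnvironment()
lexer = Lexer(env=env)  # assumption: the lexer is bound to an environment

for token in lexer.tokenize("$.store.book[0].title"):
    line, col = token.position()
    print(f"{token.kind:<15} {token.value!r:<12} index={token.index} ({line}:{col})")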

jsonpath.filter.FilterExpression

Bases: BaseExpression

An expression that evaluates to True or False.

cache_tree

cache_tree() -> FilterExpression

Return a copy of self.expression augmented with caching nodes.

cacheable_nodes

cacheable_nodes() -> bool

Return True if there are any cacheable nodes in this expression tree.
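
A minimal sketch of how these two methods fit together, assuming expr is a FilterExpression obtained from a parsed filter selector (obtaining one is parser-internal and not shown in this reference).

from jsonpath.filter import FilterExpression

def with_cache(expr: FilterExpression) -> FilterExpression:
    # Only build a caching copy when the tree actually contains cacheable nodes.
    if expr.cacheable_nodes():
        return expr.cache_tree()
    return expr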

jsonpath.lex.Lexer

Tokenize a JSONPath string.

Some customization can be achieved by subclassing Lexer and setting class attributes, then setting lexer_class on a JSONPathEnvironment (see the example after the attribute list below).

ATTRIBUTE DESCRIPTION
key_pattern

The regular expression pattern used to match mapping keys/properties.

logical_not_pattern

The regular expression pattern used to match logical negation tokens. By default, not and ! are equivalent.

logical_and_pattern

The regular expression pattern used to match logical and tokens. By default, and and && are equivalent.

logical_or_pattern

The regular expression pattern used to match logical or tokens. By default, or and || are equivalent.
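
For example, a sketch of swapping in a customized lexer. The pattern strings here are illustrative assumptions, not the library's defaults; they restrict the logical operators to their keyword forms only.

from jsonpath import JSONPathEnvironment
from jsonpath.lex import Lexer

class KeywordOnlyLexer(Lexer):
    # Assumed illustrative patterns: accept only not/and/or, not !, && or ||.
    logical_not_pattern = r"\bnot\b"
    logical_and_pattern = r"\band\b"
    logical_or_pattern = r"\bor\b"

class KeywordOnlyEnvironment(JSONPathEnvironment):
    lexer_class = KeywordOnlyLexer

env = KeywordOnlyEnvironment()
print(env.findall("$.users[?@.age > 18 and @.active]", {"users": [{"age": 21, "active": True}]}))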

compile_rules

compile_rules() -> Pattern[str]

Prepare regular expression rules.

compile_strict_rules

compile_strict_rules() -> Pattern[str]

Prepare regular expression rules in strict mode.

tokenize

tokenize(path: str) -> Iterator[Token]

Generate a sequence of tokens from a JSONPath string.

jsonpath.parse.Parser

TODO:

jsonpath.selectors.JSONPathSelector

TODO:

jsonpath.stream.TokenStream

TODO: