jinja2.lexer.Lexer Class Reference

Public Member Functions

None __init__ (self, "Environment" environment)
 
TokenStream tokenize (self, str source, t.Optional[str] name=None, t.Optional[str] filename=None, t.Optional[str] state=None)
 
t.Iterator[Token] wrap (self, t.Iterable[t.Tuple[int, str, str]] stream, t.Optional[str] name=None, t.Optional[str] filename=None)
 
t.Iterator[t.Tuple[int, str, str]] tokeniter (self, str source, t.Optional[str] name, t.Optional[str] filename=None, t.Optional[str] state=None)
 

Public Attributes

 lstrip_blocks
 
 newline_sequence
 
 keep_trailing_newline
 

Protected Member Functions

str _normalize_newlines (self, str value)
 

Detailed Description

Class that implements a lexer for a given environment. It is created
automatically by the environment class; you usually don't have to
instantiate it yourself.

Note that the lexer is not automatically bound to an environment.
Multiple environments can share the same lexer.

Member Function Documentation

◆ _normalize_newlines()

str jinja2.lexer.Lexer._normalize_newlines (self, str value)
protected
Replace all newlines with the configured sequence in strings
and template data.
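
The normalization is observable through jinja2's public `Environment` API: template data is rewritten to the environment's `newline_sequence` during lexing. A minimal sketch (the variable names are illustrative):

```python
from jinja2 import Environment

# With the default newline_sequence ("\n"), CRLF in template data
# is normalized by the lexer before rendering.
env = Environment()
print(repr(env.from_string("a\r\nb").render()))

# A Windows-style sequence can be configured instead.
crlf_env = Environment(newline_sequence="\r\n")
print(repr(crlf_env.from_string("a\nb").render()))
```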

◆ tokeniter()

t.Iterator[t.Tuple[int, str, str]] jinja2.lexer.Lexer.tokeniter (self, str source, t.Optional[str] name, t.Optional[str] filename=None, t.Optional[str] state=None)
This method tokenizes the text and returns the tokens in a
generator. Use this method if you just want to tokenize a template.

.. versionchanged:: 3.0
    Only ``\n``, ``\r\n`` and ``\r`` are treated as line
    breaks.
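
The raw generator output can be inspected through the lexer that `Environment` exposes as `env.lexer`; each item is a plain `(lineno, token_type, value)` tuple, including whitespace tokens that the higher-level `wrap()` later filters out. A small sketch:

```python
from jinja2 import Environment

env = Environment()
# tokeniter() yields raw (lineno, token_type, value_string) tuples.
raw = list(env.lexer.tokeniter("Hello {{ name }}!", None))
for lineno, token_type, value in raw:
    print(lineno, token_type, repr(value))
```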

◆ tokenize()

TokenStream jinja2.lexer.Lexer.tokenize (self, str source, t.Optional[str] name=None, t.Optional[str] filename=None, t.Optional[str] state=None)
Calls tokeniter, passes the result through wrap, and returns it as a
token stream.
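
A minimal usage sketch: the resulting `TokenStream` yields `Token` objects with `lineno`, `type`, and `value` attributes, with whitespace and comment tokens already filtered out.

```python
from jinja2 import Environment

env = Environment()
# tokenize() drives tokeniter() and wrap(), returning a TokenStream
# of Token(lineno, type, value) objects.
stream = env.lexer.tokenize("Hello {{ name }}!")
tokens = list(stream)
for tok in tokens:
    print(tok.lineno, tok.type, repr(tok.value))
```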

◆ wrap()

t.Iterator[Token] jinja2.lexer.Lexer.wrap (self, t.Iterable[t.Tuple[int, str, str]] stream, t.Optional[str] name=None, t.Optional[str] filename=None)
This is called with the raw stream as returned by `tokeniter`, wraps
every token in a :class:`Token`, and converts the value.
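
The conversion step is visible when wrapping the raw stream by hand: literal values are turned into Python objects (e.g. integer strings become ints) and generic "operator" tokens are resolved to specific types such as "add". A sketch, assuming the usual `Environment` entry point:

```python
from jinja2 import Environment

env = Environment()
lexer = env.lexer
# Feed tokeniter()'s raw tuple stream into wrap() to get Token objects;
# whitespace tokens are dropped and values are converted.
raw = lexer.tokeniter("{{ 1 + 2 }}", None)
tokens = list(lexer.wrap(raw))
for tok in tokens:
    print(tok.type, repr(tok.value))
```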
