Qucs-S S-parameter Viewer & RF Synthesis Tools
Classes

    class _inherit
    class _PseudoMatch
    class _This
    class combined
    class default
    class DelegatingLexer
    class ExtendedRegexLexer
    class include
    class Lexer
    class LexerContext
    class LexerMeta
    class ProfilingRegexLexer
    class ProfilingRegexLexerMeta
    class RegexLexer
    class RegexLexerMeta
    class words
Functions

    bygroups(*args)
    using(_other, **kwargs)
    do_insertions(insertions, tokens)
Variables

    line_re = re.compile('.*?\n')
    list _encoding_map
    _default_analyse = staticmethod(lambda x: 0.0)
    inherit = _inherit()
    this = _This()
pygments.lexer
~~~~~~~~~~~~~~
Base lexer classes.
:copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
pip._vendor.pygments.lexer.bygroups(*args)
Callback that yields multiple actions for each group in the match.
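As a minimal sketch of how `bygroups` is typically used (the `FuncLexer` class, its rule, and the sample input are hypothetical; the import path follows pip's vendored copy as documented here):

```python
from pip._vendor.pygments.lexer import RegexLexer, bygroups
from pip._vendor.pygments.token import Keyword, Name, Whitespace

# Hypothetical toy lexer: one rule with three regex groups, each
# mapped to its own token type by bygroups().
class FuncLexer(RegexLexer):
    tokens = {
        'root': [
            (r'(def)(\s+)(\w+)', bygroups(Keyword, Whitespace, Name.Function)),
        ],
    }

# get_tokens_unprocessed() yields (index, token_type, value) triples;
# bygroups() emits one triple per matched group.
toks = list(FuncLexer().get_tokens_unprocessed('def foo'))
```

Here the single match `def foo` produces three tokens, one per group, at their respective offsets.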
pip._vendor.pygments.lexer.do_insertions(insertions, tokens)
Helper for lexers which must combine the results of several sublexers. ``insertions`` is a list of ``(index, itokens)`` pairs. Each ``itokens`` iterable should be inserted at position ``index`` into the token stream given by the ``tokens`` argument. The result is a combined token stream. TODO: clean up the code here.
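A small sketch of the mechanism (the `WordLexer` class and the prompt insertion are hypothetical; the positions in each `itokens` triple are recomputed by `do_insertions`):

```python
from pip._vendor.pygments.lexer import RegexLexer, do_insertions
from pip._vendor.pygments.token import Generic, Text

# Hypothetical base lexer producing a plain token stream.
class WordLexer(RegexLexer):
    tokens = {
        'root': [
            (r'\w+', Text),
            (r'\s+', Text),
        ],
    }

stream = WordLexer().get_tokens_unprocessed('print hi')

# Each insertion is an (index, itokens) pair; itokens is an iterable
# of (index, token, value) triples spliced in at that stream position.
prompt = [(0, Generic.Prompt, '>>> ')]
combined = list(do_insertions([(0, prompt)], stream))
```

The combined stream starts with the inserted prompt token, followed by the base stream with all offsets shifted accordingly.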
pip._vendor.pygments.lexer.using(_other, **kwargs)
Callback that processes the match with a different lexer.
The keyword arguments are forwarded to the lexer, except `state` which
is handled separately.
`state` specifies the state that the new lexer will start in, and can
be an enumerable such as ('root', 'inline', 'string') or a simple
string which is assumed to be on top of the root state.
Note: For that to work, `_other` must not be an `ExtendedRegexLexer`.
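A sketch of the common `bygroups` + `using` combination (both lexer classes and the sample input are hypothetical):

```python
from pip._vendor.pygments.lexer import RegexLexer, bygroups, using
from pip._vendor.pygments.token import Keyword, Punctuation, Text

# Hypothetical sub-language lexer for the text between angle brackets.
class InnerLexer(RegexLexer):
    tokens = {
        'root': [
            (r'\w+', Keyword),
            (r'\s+', Text),
        ],
    }

# Outer lexer: the middle group is handed to InnerLexer via using(),
# while the brackets themselves stay Punctuation.
class OuterLexer(RegexLexer):
    tokens = {
        'root': [
            (r'(<)([^>]*)(>)',
             bygroups(Punctuation, using(InnerLexer), Punctuation)),
            (r'[^<]+', Text),
        ],
    }

toks = list(OuterLexer().get_tokens_unprocessed('<foo bar>'))
```

The inner tokens come back with offsets relative to the full input, so the combined stream stays position-consistent.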