Qucs-S S-parameter Viewer & RF Synthesis Tools

pip._vendor.pygments.lexer.DelegatingLexer Class Reference


Public Member Functions

- `__init__(self, _root_lexer, _language_lexer, _needle=Other, **options)`
- `get_tokens_unprocessed(self, text)`

Public Member Functions inherited from pip._vendor.pygments.lexer.Lexer

- `__repr__(self)`
- `add_filter(self, filter_, **options)`
- `analyse_text(text)`
- `get_tokens(self, text, unfiltered=False)`

Public Member Functions inherited from pip._vendor.pygments.lexer.LexerMeta

- `__new__(mcs, name, bases, d)`

Public Attributes

- `root_lexer`
- `language_lexer`
- `needle`

Public Attributes inherited from pip._vendor.pygments.lexer.Lexer

- `options`
- `stripnl`
- `stripall`
- `ensurenl`
- `tabsize`
- `encoding`
- `filters`

Additional Inherited Members

Static Public Attributes inherited from pip._vendor.pygments.lexer.Lexer

- `name = None`
- `list aliases = []`
- `list filenames = []`
- `list alias_filenames = []`
- `list mimetypes = []`
- `int priority = 0`
- `url = None`
This lexer takes two lexers as arguments: a root lexer and a language lexer. The input is first scanned with the language lexer; afterwards, all ``Other`` tokens are re-lexed with the root lexer. The lexers in the ``template`` lexer package use this base lexer.
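By way of illustration, a subclass combining HTML markup with embedded PHP can be declared the same way the template lexers do. The class name here is hypothetical, and the sketch is written against the standalone `pygments` package rather than the vendored `pip._vendor` copy; Pygments itself ships a real equivalent as `HtmlPhpLexer`.

```python
from pygments.lexer import DelegatingLexer
from pygments.lexers.html import HtmlLexer
from pygments.lexers.php import PhpLexer


class MyHtmlPhpLexer(DelegatingLexer):
    """Hypothetical delegating lexer: the text is scanned with PhpLexer
    first, and the stretches it reports as ``Other`` (the plain markup
    outside ``<?php ... ?>``) are then re-lexed with HtmlLexer."""

    def __init__(self, **options):
        # root lexer = HtmlLexer, language lexer = PhpLexer
        super().__init__(HtmlLexer, PhpLexer, **options)
```

Iterating `MyHtmlPhpLexer().get_tokens(...)` over a mixed snippet then yields HTML token types for the markup and PHP token types for the embedded code.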
pip._vendor.pygments.lexer.DelegatingLexer.__init__(self, _root_lexer, _language_lexer, _needle=Other, **options)
This constructor takes arbitrary options as keyword arguments.
Every subclass must first process its own options and then call
the `Lexer` constructor, since it processes the basic
options like `stripnl`.
An example looks like this:

.. sourcecode:: python

   def __init__(self, **options):
       self.compress = options.get('compress', '')
       Lexer.__init__(self, **options)
As these options must all be specifiable as strings (due to the
command line usage), there are various utility functions
available to help with that, see `Utilities`_.
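For instance, the option helpers in `pygments.util` normalize string-valued options of this kind (a minimal sketch, written against the standalone `pygments` package; the vendored copy lives under `pip._vendor.pygments.util`):

```python
from pygments.util import get_bool_opt, get_int_opt

# String forms like 'yes' and '4' are what arrive from the command
# line; the helpers convert them to real bools/ints, falling back to
# the given default when the option is absent.
opts = {'compress': 'yes', 'tabsize': '4'}

assert get_bool_opt(opts, 'compress', False) is True
assert get_int_opt(opts, 'tabsize', 8) == 4
assert get_bool_opt(opts, 'missing', False) is False
```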
Reimplemented from pip._vendor.pygments.lexer.Lexer.
Reimplemented in pip._vendor.pygments.lexers.python.PythonConsoleLexer.
pip._vendor.pygments.lexer.DelegatingLexer.get_tokens_unprocessed(self, text)
This method should process the text and return an iterable of ``(index, tokentype, value)`` tuples where ``index`` is the starting position of the token within the input text. It must be overridden by subclasses. It is recommended to implement it as a generator to maximize effectiveness.
Reimplemented from pip._vendor.pygments.lexer.Lexer.
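A minimal override can be sketched as follows. The `WordLexer` class is hypothetical (not part of Pygments), and the sketch again assumes the standalone `pygments` package rather than the vendored copy:

```python
import re

from pygments.lexer import Lexer
from pygments.token import Text, Whitespace


class WordLexer(Lexer):
    """Hypothetical lexer: one Text token per word, with Whitespace
    tokens for the gaps in between."""

    name = 'Words'

    def get_tokens_unprocessed(self, text):
        # Implemented as a generator, yielding (index, tokentype, value)
        # tuples; ``index`` is the token's start offset in ``text``.
        for m in re.finditer(r'\S+|\s+', text):
            ttype = Whitespace if m.group().isspace() else Text
            yield m.start(), ttype, m.group()
```

Because the offsets and values together reproduce the input exactly, the higher-level `get_tokens` wrapper inherited from `Lexer` can filter and post-process this stream unchanged.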