Server: Apache/2.4.52 (Ubuntu)
System: Linux spn-python 5.15.0-89-generic #99-Ubuntu SMP Mon Oct 30 20:42:41 UTC 2023 x86_64
User: arjun (1000)
PHP: 8.1.2-1ubuntu2.20
Disabled: NONE
File: /home/arjun/projects/env/lib/python3.10/site-packages/flake8/__pycache__/processor.cpython-310.pyc
[Binary content omitted: CPython 3.10 bytecode compiled from
/home/arjun/projects/env/lib/python3.10/site-packages/flake8/processor.py.
The raw dump is not human-readable; the recoverable string constants are:

- the module docstring: "Module containing our file processor that tokenizes
  a file for checks."
- the FileProcessor class docstring: the class processes a file by generating
  tokens, logical and physical lines, and AST trees, and exposes state to
  plugins through public attributes (blank_before, blank_lines, checker_state,
  indent_char, indent_level, line_number, logical_line, max_line_length,
  max_doc_length, multiline, noqa, previous_indent_level, previous_logical,
  previous_unindented_logical_line, tokens, file_tokens, total_lines, verbose)
- docstrings of the FileProcessor methods, e.g. "Tokenize the file and yield
  the tokens.", "Retrieve the line which will be used to determine noqa.",
  "Check if ``flake8: noqa`` is in the file to be ignored.", "Strip the UTF
  bom from the lines of the file."
- docstrings of the module-level helpers is_eol_token, is_multiline_string,
  token_is_newline, count_parentheses, expand_indent ("Tabs are expanded to
  the next multiple of 8.") and mutate_string ("Replace contents with 'xxx'
  to prevent syntax matching.")]
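
If the .pyc is copied off the host, the readable constants listed above can be
recovered with the standard library alone. A minimal sketch, assuming the dump
is saved locally as processor.cpython-310.pyc (hypothetical path) and read
under the same CPython 3.10 that produced it, since marshal data is
version-specific:

    import marshal
    import types

    # Hypothetical local copy of the dumped file; adjust the path as needed.
    PYC_PATH = "processor.cpython-310.pyc"

    def iter_docstrings(code: types.CodeType):
        # Heuristic: when a module, class body, or function has a docstring,
        # CPython stores it as the first entry of co_consts.
        if code.co_consts and isinstance(code.co_consts[0], str):
            yield code.co_name, code.co_consts[0]
        # Nested code objects (methods, comprehensions, ...) are constants too.
        for const in code.co_consts:
            if isinstance(const, types.CodeType):
                yield from iter_docstrings(const)

    with open(PYC_PATH, "rb") as fh:
        fh.seek(16)  # skip the 16-byte pyc header (magic, flags, mtime, size)
        module_code = marshal.loads(fh.read())

    for name, doc in iter_docstrings(module_code):
        first_line = doc.strip().splitlines()[0] if doc.strip() else ""
        print(f"{name}: {first_line}")

This avoids importing the module (and any flake8 dependencies); it only
deserializes the code objects and walks their constants.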