HEX
Server: Apache/2.4.52 (Ubuntu)
System: Linux spn-python 5.15.0-89-generic #99-Ubuntu SMP Mon Oct 30 20:42:41 UTC 2023 x86_64
User: arjun (1000)
PHP: 8.1.2-1ubuntu2.20
Disabled: NONE
File: //usr/local/lib/python3.10/dist-packages/langchain/chains/flare/__pycache__/base.cpython-310.pyc
[The raw bytecode dump is unreadable as text; below is the source reconstructed from the .pyc's code objects and string constants — langchain's chains/flare/base.py, the FLARE (Active Retrieval Augmented Generation) chain.]

from __future__ import annotations

import re
from typing import Any, Dict, List, Optional, Sequence, Tuple

import numpy as np
from langchain_core.callbacks import CallbackManagerForChainRun
from langchain_core.language_models import BaseLanguageModel
from langchain_core.messages import AIMessage
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import BasePromptTemplate
from langchain_core.retrievers import BaseRetriever
from langchain_core.runnables import Runnable
from pydantic import Field

from langchain.chains.base import Chain
from langchain.chains.flare.prompts import (
    PROMPT,
    QUESTION_GENERATOR_PROMPT,
    FinishedOutputParser,
)
from langchain.chains.llm import LLMChain


def _extract_tokens_and_log_probs(
    response: AIMessage,
) -> Tuple[List[str], List[float]]:
    """Extract tokens and log probabilities from chat model response."""
    tokens = []
    log_probs = []
    for token in response.response_metadata["logprobs"]["content"]:
        tokens.append(token["token"])
        log_probs.append(token["logprob"])
    return tokens, log_probs


class QuestionGeneratorChain(LLMChain):
    """Chain that generates questions from uncertain spans."""

    prompt: BasePromptTemplate = QUESTION_GENERATOR_PROMPT

    @classmethod
    def is_lc_serializable(cls) -> bool:
        return False

    @property
    def input_keys(self) -> List[str]:
        """Input keys for the chain."""
        return ["user_input", "context", "response"]


def _low_confidence_spans(
    tokens: Sequence[str],
    log_probs: Sequence[float],
    min_prob: float,
    min_token_gap: int,
    num_pad_tokens: int,
) -> List[str]:
    _low_idx = np.where(np.exp(log_probs) < min_prob)[0]
    # Only tokens containing at least one word character count as uncertain.
    low_idx = [i for i in _low_idx if re.search(r"\w", tokens[i])]
    if len(low_idx) == 0:
        return []
    spans = [[low_idx[0], low_idx[0] + num_pad_tokens + 1]]
    for i, idx in enumerate(low_idx[1:]):
        end = idx + num_pad_tokens + 1
        if idx - low_idx[i] < min_token_gap:
            spans[-1][1] = end
        else:
            spans.append([idx, end])
    return ["".join(tokens[start:end]) for start, end in spans]


class FlareChain(Chain):
    """Chain that combines a retriever, a question generator,
    and a response generator.

    See [Active Retrieval Augmented Generation](https://arxiv.org/abs/2305.06983) paper.
    """

    question_generator_chain: Runnable
    response_chain: Runnable
    output_parser: FinishedOutputParser = Field(default_factory=FinishedOutputParser)
    retriever: BaseRetriever
    min_prob: float = 0.2
    min_token_gap: int = 5
    num_pad_tokens: int = 2
    max_iter: int = 10
    start_with_retrieval: bool = True

    @property
    def input_keys(self) -> List[str]:
        """Input keys for the chain."""
        return ["user_input"]

    @property
    def output_keys(self) -> List[str]:
        """Output keys for the chain."""
        return ["response"]

    def _do_generation(
        self,
        questions: List[str],
        user_input: str,
        response: str,
        _run_manager: CallbackManagerForChainRun,
    ) -> Tuple[str, bool]:
        callbacks = _run_manager.get_child()
        docs = []
        for question in questions:
            docs.extend(self.retriever.invoke(question))
        context = "\n\n".join(d.page_content for d in docs)
        result = self.response_chain.invoke(
            {"user_input": user_input, "context": context, "response": response},
            {"callbacks": callbacks},
        )
        if isinstance(result, AIMessage):
            result = result.content
        marginal, finished = self.output_parser.parse(result)
        return marginal, finished

    def _do_retrieval(
        self,
        low_confidence_spans: List[str],
        _run_manager: CallbackManagerForChainRun,
        user_input: str,
        response: str,
        initial_response: str,
    ) -> Tuple[str, bool]:
        question_gen_inputs = [
            {
                "user_input": user_input,
                "current_response": initial_response,
                "uncertain_span": span,
            }
            for span in low_confidence_spans
        ]
        callbacks = _run_manager.get_child()
        if isinstance(self.question_generator_chain, LLMChain):
            question_gen_outputs = self.question_generator_chain.apply(
                question_gen_inputs, callbacks=callbacks
            )
            questions = [
                output[self.question_generator_chain.output_keys[0]]
                for output in question_gen_outputs
            ]
        else:
            questions = self.question_generator_chain.batch(
                question_gen_inputs, config={"callbacks": callbacks}
            )
        _run_manager.on_text(
            f"Generated Questions: {questions}", color="yellow", end="\n"
        )
        return self._do_generation(questions, user_input, response, _run_manager)

    def _call(
        self,
        inputs: Dict[str, Any],
        run_manager: Optional[CallbackManagerForChainRun] = None,
    ) -> Dict[str, Any]:
        _run_manager = run_manager or CallbackManagerForChainRun.get_noop_manager()
        user_input = inputs[self.input_keys[0]]

        response = ""

        for i in range(self.max_iter):
            _run_manager.on_text(
                f"Current Response: {response}", color="blue", end="\n"
            )
            _input = {"user_input": user_input, "context": "", "response": response}
            tokens, log_probs = _extract_tokens_and_log_probs(
                self.response_chain.invoke(
                    _input, {"callbacks": _run_manager.get_child()}
                )
            )
            low_confidence_spans = _low_confidence_spans(
                tokens,
                log_probs,
                self.min_prob,
                self.min_token_gap,
                self.num_pad_tokens,
            )
            initial_response = response.strip() + " " + "".join(tokens)
            if not low_confidence_spans:
                response = initial_response
                final_response, finished = self.output_parser.parse(response)
                if finished:
                    return {self.output_keys[0]: final_response}
                continue
            marginal, finished = self._do_retrieval(
                low_confidence_spans,
                _run_manager,
                user_input,
                response,
                initial_response,
            )
            response = response.strip() + " " + marginal
            if finished:
                break
        return {self.output_keys[0]: response}

    @classmethod
    def from_llm(
        cls, llm: BaseLanguageModel, max_generation_len: int = 32, **kwargs: Any
    ) -> FlareChain:
        """Creates a FlareChain from a language model.

        Args:
            llm: Language model to use.
            max_generation_len: Maximum length of the generated response.
            kwargs: Additional arguments to pass to the constructor.

        Returns:
            FlareChain class with the given language model.
        """
        try:
            from langchain_openai import ChatOpenAI
        except ImportError:
            raise ImportError(
                "OpenAI is required for FlareChain. "
                "Please install langchain-openai. "
                "pip install langchain-openai"
            )
        llm = ChatOpenAI(
            max_completion_tokens=max_generation_len, logprobs=True, temperature=0
        )
        response_chain = PROMPT | llm
        question_gen_chain = QUESTION_GENERATOR_PROMPT | llm | StrOutputParser()
        return cls(
            question_generator_chain=question_gen_chain,
            response_chain=response_chain,
            **kwargs,
        )
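The span-detection step in this module is the heart of FLARE's retrieval trigger: a token is "uncertain" when exp(logprob) falls below min_prob, neighbouring uncertain tokens closer than min_token_gap are merged into one span, and each span is padded by num_pad_tokens. A standalone sketch of that logic (the token strings and log-prob values below are made-up illustration data, not model output):

```python
import re

import numpy as np


def low_confidence_spans(tokens, log_probs, min_prob=0.2, min_token_gap=5, num_pad_tokens=2):
    # Indices whose probability exp(log_prob) is below the threshold.
    low = np.where(np.exp(log_probs) < min_prob)[0]
    # Skip pure-punctuation/whitespace tokens: require a word character.
    low = [i for i in low if re.search(r"\w", tokens[i])]
    if not low:
        return []
    # Each span is [start, end); pad end by num_pad_tokens extra tokens.
    spans = [[low[0], low[0] + num_pad_tokens + 1]]
    for i, idx in enumerate(low[1:]):
        end = idx + num_pad_tokens + 1
        if idx - low[i] < min_token_gap:
            spans[-1][1] = end  # close enough: merge into the previous span
        else:
            spans.append([idx, end])
    return ["".join(tokens[start:end]) for start, end in spans]


tokens = ["The", " capital", " of", " Freedonia", " is", " Fredville", "."]
log_probs = [-0.01, -0.02, -0.01, -2.5, -0.05, -3.0, -0.01]
# exp(-2.5) ≈ 0.08 and exp(-3.0) ≈ 0.05 are below 0.2, and the two low
# tokens are only 2 apart, so they merge into a single padded span.
print(low_confidence_spans(tokens, log_probs))
# → [' Freedonia is Fredville.']
```

Each returned span is then rewritten into a standalone question by the question generator, and the answers are retrieved as fresh context before regenerating the sentence.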