Server: Apache/2.4.52 (Ubuntu)
System: Linux spn-python 5.15.0-89-generic #99-Ubuntu SMP Mon Oct 30 20:42:41 UTC 2023 x86_64
User: arjun (1000)
PHP: 8.1.2-1ubuntu2.20
Disabled: NONE
File: //usr/local/lib/python3.10/dist-packages/langchain/chains/flare/__pycache__/prompts.cpython-310.pyc
from typing import Tuple

from langchain_core.output_parsers import BaseOutputParser
from langchain_core.prompts import PromptTemplate


class FinishedOutputParser(BaseOutputParser[Tuple[str, bool]]):
    """Output parser that checks if the output is finished."""

    finished_value: str = "FINISHED"

    def parse(self, text: str) -> Tuple[str, bool]:
        # Strip the sentinel token and report whether it was present.
        cleaned = text.strip()
        finished = self.finished_value in cleaned
        return cleaned.replace(self.finished_value, ""), finished


PROMPT_TEMPLATE = """\
Respond to the user message using any relevant context. \
If context is provided, you should ground your answer in that context. \
Once you're done responding return FINISHED.

>>> CONTEXT: {context}
>>> USER INPUT: {user_input}
>>> RESPONSE: {response}"""

PROMPT = PromptTemplate(
    template=PROMPT_TEMPLATE,
    input_variables=["user_input", "context", "response"],
)

QUESTION_GENERATOR_PROMPT_TEMPLATE = """\
Given a user input and an existing partial response as context, \
ask a question to which the answer is the given term/entity/phrase:

>>> USER INPUT: {user_input}
>>> EXISTING PARTIAL RESPONSE: {current_response}

The question to which the answer is the term/entity/phrase \
"{uncertain_span}" is:"""

QUESTION_GENERATOR_PROMPT = PromptTemplate(
    template=QUESTION_GENERATOR_PROMPT_TEMPLATE,
    input_variables=["user_input", "current_response", "uncertain_span"],
)
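For context, a minimal usage sketch of the module above, assuming langchain-core is installed; the example context, question, and model reply are hypothetical and are not part of the dumped file. It shows how the FLARE chain renders PROMPT and then uses FinishedOutputParser to detect the FINISHED sentinel.

# Hypothetical usage sketch; not part of the original dump.
from langchain.chains.flare.prompts import PROMPT, FinishedOutputParser

parser = FinishedOutputParser()

# Render the response prompt with example (made-up) values.
rendered = PROMPT.format(
    context="Paris is the capital of France.",
    user_input="What is the capital of France?",
    response="",
)

# A hypothetical model reply ending with the sentinel token.
reply = "The capital of France is Paris. FINISHED"
text, finished = parser.parse(reply)
print(finished)      # True
print(text.strip())  # The capital of France is Paris.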