File: /usr/local/lib/python3.10/dist-packages/langchain/chains/flare/__pycache__/base.cpython-310.pyc
# Reconstructed sketch of the original base.py, recovered from the strings
# embedded in the bytecode dump above. Import paths are approximated: only the
# symbol names (not their source modules) survive in the dump. Several objects
# beyond the first function (two class definitions subclassing LLMChain and
# Chain, plus one helper function) are present but too truncated to recover.
from __future__ import annotations

from typing import Any, Dict, List, Optional, Sequence, Tuple

import numpy as np

from langchain_core.callbacks import CallbackManagerForChainRun
from langchain_core.language_models import BaseLanguageModel
from langchain_core.messages import AIMessage
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import BasePromptTemplate
from langchain_core.pydantic_v1 import Field
from langchain_core.retrievers import BaseRetriever
from langchain_core.runnables import Runnable

from langchain.chains.base import Chain
from langchain.chains.flare.prompts import (
    PROMPT,
    QUESTION_GENERATOR_PROMPT,
    FinishedOutputParser,
)
from langchain.chains.llm import LLMChain


def _extract_tokens_and_log_probs(
    response: AIMessage,
) -> Tuple[List[str], List[float]]:
    """Extract tokens and log probabilities from chat model response."""
    tokens = []
    log_probs = []
    for token in response.response_metadata["logprobs"]["content"]:
        tokens.append(token["token"])
        log_probs.append(token["logprob"])
    return tokens, log_probs
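The extraction logic visible in the dump can be exercised standalone. The sketch below mocks the `response_metadata` shape implied by the recovered field names (`logprobs` → `content` → `token`/`logprob`) with a hypothetical `FakeAIMessage` stand-in, so it runs without langchain installed; the real function receives an `AIMessage` from a chat model called with logprobs enabled.

```python
from typing import List, Tuple


class FakeAIMessage:
    """Hypothetical stand-in for an AIMessage carrying logprob metadata."""

    def __init__(self, response_metadata: dict) -> None:
        self.response_metadata = response_metadata


def _extract_tokens_and_log_probs(
    response: FakeAIMessage,
) -> Tuple[List[str], List[float]]:
    """Extract tokens and log probabilities from a chat model response."""
    tokens: List[str] = []
    log_probs: List[float] = []
    # Each entry under logprobs/content pairs a generated token with its logprob.
    for token in response.response_metadata["logprobs"]["content"]:
        tokens.append(token["token"])
        log_probs.append(token["logprob"])
    return tokens, log_probs


# Example metadata shaped like an OpenAI-style logprobs payload (assumed).
msg = FakeAIMessage(
    {
        "logprobs": {
            "content": [
                {"token": "Hello", "logprob": -0.1},
                {"token": "!", "logprob": -1.2},
            ]
        }
    }
)
tokens, log_probs = _extract_tokens_and_log_probs(msg)
# tokens -> ["Hello", "!"], log_probs -> [-0.1, -1.2]
```

In the FLARE algorithm these per-token log probabilities are what the chain inspects to find low-confidence spans worth re-querying, which is why the function returns the two parallel lists rather than the raw metadata dict.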