I'm using LangChain with Flask for chatting with my data. In the console I get a streamed response from OpenAI directly, since I'm able to enable streaming with the flag streaming=True.
The problem is that I can't "return" the stream, or stream it, in my API call.
The OpenAI and chain handling code:
    def askQuestion(self, collection_id, question):
        collection_name = "collection-" + str(collection_id)
        self.llm = ChatOpenAI(
            model_name=self.model_name,
            temperature=self.temperature,
            openai_api_key=os.environ.get("OPENAI_API_KEY"),
            streaming=True,
            callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),
        )
        self.memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True, output_key="answer")
        chroma_Vectorstore = Chroma(collection_name=collection_name, embedding_function=self.embeddingsOpenAi, client=self.chroma_client)
        self.chain = ConversationalRetrievalChain.from_llm(
            self.llm,
            chroma_Vectorstore.as_retriever(similarity_search_with_score=True),
            return_source_documents=True,
            verbose=VERBOSE,
            memory=self.memory,
        )
        result = self.chain({"question": question})

        res_dict = {
            "answer": result["answer"],
        }
        res_dict["source_documents"] = []
        for source in result["source_documents"]:
            res_dict["source_documents"].append({
                "page_content": source.page_content,
                "metadata": source.metadata,
            })
        return res_dict
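For context, the token-streaming pattern I'm trying to reproduce is roughly the following. This is a standalone sketch with no LangChain dependency: TokenQueueHandler and fake_llm_call are my own illustrative names, with on_llm_new_token merely mimicking the hook of the same name on LangChain's BaseCallbackHandler.

```python
import queue
import threading

SENTINEL = object()  # marks the end of the token stream

class TokenQueueHandler:
    """Pushes each generated token onto a queue so another thread can
    consume tokens while generation is still in progress."""
    def __init__(self):
        self.tokens = queue.Queue()

    def on_llm_new_token(self, token, **kwargs):
        self.tokens.put(token)

    def on_llm_end(self, *args, **kwargs):
        self.tokens.put(SENTINEL)

def fake_llm_call(handler, text):
    # Stand-in for the chain: emits the answer token by token.
    for token in text.split():
        handler.on_llm_new_token(token + " ")
    handler.on_llm_end()

def stream_tokens(handler):
    # Generator that a streaming HTTP response could iterate over.
    while True:
        token = handler.tokens.get()
        if token is SENTINEL:
            break
        yield token

handler = TokenQueueHandler()
# Run the "LLM" in a background thread so tokens can be consumed as they arrive.
thread = threading.Thread(target=fake_llm_call, args=(handler, "hello streaming world"))
thread.start()
chunks = list(stream_tokens(handler))
thread.join()
print("".join(chunks))
```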
And the API route code:
    @app.route("/collection/<int:collection_id>/ask_question", methods=["POST"])
    def ask_question(collection_id):
        question = request.form["question"]
        # response_generator = document_thread.askQuestion(collection_id, question)
        # return jsonify(response_generator)

        def stream(question):
            completion = document_thread.askQuestion(collection_id, question)
            for line in completion["answer"]:
                yield line

        return app.response_class(stream_with_context(stream(question)))
I'm testing my endpoint with curl, and I pass the -N flag to curl, so I should get the streamed response if that's possible.
When I make the API request, the endpoint first waits until the data is fully processed (I can see the streamed answer in my VS Code terminal), and only once it finishes do I see everything displayed at once.
How can I achieve this?