
Getting a streaming response that takes conversation history into account, using LangChain

Posted at 2023-11-05

This article replaces the processing from "Getting a streaming response that takes conversation history into account with the Chat Completions API", which was implemented with raw API requests, with an equivalent implementation in LangChain.
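For comparison, here is a minimal sketch of the kind of direct-API code being replaced. It assumes the pre-1.0 openai Python SDK (openai.ChatCompletion) and an OPENAI_API_KEY environment variable; the conversation history has to be kept and resent by hand on every request:

import openai  # assumes openai<1.0 and OPENAI_API_KEY set in the environment

messages = []  # conversation history, managed manually

while True:
    user_message = input("You: ")
    messages.append({"role": "user", "content": user_message})

    # Request a streamed response that includes the whole history so far
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=messages,
        stream=True,
    )

    assistant_message = ""
    for chunk in response:
        delta = chunk["choices"][0]["delta"].get("content", "")
        print(delta, end="", flush=True)
        assistant_message += delta
    print()

    # Append the assistant's reply so the next request includes it
    messages.append({"role": "assistant", "content": assistant_message})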

Installing the libraries

pip install langchain openai

Source code

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.chat_models import ChatOpenAI

# Chat model that streams each token to stdout as it arrives
chat = ChatOpenAI(
    model="gpt-3.5-turbo",
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()]
)

# Chain that feeds the accumulated conversation history into every call
conversation = ConversationChain(
    llm=chat,
    memory=ConversationBufferMemory()
)

while True:
    user_message = input("You: ")
    conversation.run(input=user_message)
    print()  # newline after the streamed response

Explanation

  • To stream the output, set streaming=True and pass StreamingStdOutCallbackHandler() in callbacks
  • ConversationBufferMemory() is specified to hold the conversation history
    • There are several other memory classes that differ in how they keep the history (see the sketch after this list)
  • The conversation runs through ConversationChain, a Chain (LangChain's mechanism for linking processing steps), and the while True infinite loop keeps the exchange going turn by turn
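As one illustration of those other memory classes, the sketch below swaps ConversationBufferMemory for ConversationBufferWindowMemory, which keeps only the last k exchanges in the prompt instead of the full history; the k=3 value here is an arbitrary choice for the example:

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferWindowMemory
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.chat_models import ChatOpenAI

chat = ChatOpenAI(
    model="gpt-3.5-turbo",
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()]
)

# Keep only the most recent 3 exchanges in the prompt
conversation = ConversationChain(
    llm=chat,
    memory=ConversationBufferWindowMemory(k=3)
)

while True:
    user_message = input("You: ")
    conversation.run(input=user_message)
    print()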