Trying it out
Installation
$ pip install ollama
Experiment
Getting no feedback while waiting makes me nervous, so I set stream to True.
import asyncio
from ollama import AsyncClient

async def chat():
    message = {'role': 'user', 'content': 'Why is the sky blue?'}
    async for part in await AsyncClient().chat(model='llama3', messages=[message], stream=True):
        print(part['message']['content'], end='', flush=True)

asyncio.run(chat())
Running it, it returns a proper response (slowly, since it's running on CPU):
The sky appears blue because of a phenomenon called Rayleigh scattering, named after the British physicist Lord Rayleigh, who first described it in the late 19th century. Here's what happens: ...
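To illustrate what stream=True buys you: each streamed part arrives as a small dict, and printing its text as it comes means you see output immediately instead of waiting for the whole reply. A sketch with simulated chunks (the dict shape mirrors what the code above reads out of each part; the chunk contents here are made up):

```python
# Simulated streaming: print each chunk's text as it arrives,
# mimicking the part['message']['content'] shape used above.
fake_stream = [
    {'message': {'content': 'The sky '}},
    {'message': {'content': 'appears blue '}},
    {'message': {'content': 'because of Rayleigh scattering.'}},
]

pieces = []
for part in fake_stream:
    text = part['message']['content']
    print(text, end='', flush=True)  # incremental output, no waiting
    pieces.append(text)

full_reply = ''.join(pieces)
```

With stream=False you would instead receive the single, fully assembled reply at the end.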
An improved version you can actually chat with
That said, with messages hard-coded you can't really chat, so let's improve it.
import asyncio
from ollama import AsyncClient

async def chat():
    # Reusable message dictionary for the user's turn
    message = {'role': 'user', 'content': ''}
    # Initialize the client
    client = AsyncClient()
    while True:
        # Read user input from standard input
        content = input("\nEnter a message (type 'exit' to quit): ")
        if content.lower() == 'exit':
            break
        # Update the content of the message
        message['content'] = content
        # Chat with the model, streaming the reply
        async for part in await client.chat(model='llama3', messages=[message], stream=True):
            print(part['message']['content'], end='', flush=True)

# Execute the chat function
asyncio.run(chat())
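One caveat: the loop above sends only the latest user message each time, so the model has no memory of earlier turns. To get real multi-turn conversation, the full message history (user and assistant turns alike) should be passed as the messages list. A minimal sketch of that bookkeeping, with the actual ollama call omitted and the helper names being my own:

```python
# Sketch: accumulating conversation history so each request carries context.
# (Hypothetical helpers; in the chat loop you would pass messages=history
# and append the assembled assistant reply after each streamed response.)

def add_user_turn(history, user_input):
    """Append the new user message to the running history."""
    history.append({'role': 'user', 'content': user_input})
    return history

def add_assistant_turn(history, reply_text):
    """Store the assistant's reply so the next turn has context."""
    history.append({'role': 'assistant', 'content': reply_text})
    return history

history = []
add_user_turn(history, 'Why is the sky blue?')
add_assistant_turn(history, 'Because of Rayleigh scattering.')
add_user_turn(history, 'Explain that in one sentence.')
# history now holds three turns; the second question can refer back
# to "that" because the model sees the whole list.
print(len(history))
```

The trade-off is that the prompt grows with every turn, so long sessions may need the history trimmed to fit the model's context window.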