@hexos/runtime

Streams chat completions from Ollama with support for tool calling and agent iteration.
Orchestrates the full LLM interaction cycle with locally hosted Ollama models: sends messages to Ollama's chat API, streams text deltas, handles tool calls with approval workflows, executes tools, and returns results to the LLM. Implements an agentic loop that continues until the model produces a final response or reaches the maximum iteration limit.
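The agentic loop described above can be sketched as follows. This is an illustrative outline only, assuming hypothetical helpers (chatOnce, executeTool) and an assumed MAX_ITERATIONS constant; it is not the actual @hexos/runtime implementation.

```typescript
// Illustrative message and result shapes; field names are assumptions.
type Message = { role: "system" | "user" | "assistant" | "tool"; content: string };
type ToolCall = { name: string; args: Record<string, unknown> };
type ChatResult = { text: string; toolCalls: ToolCall[] };

const MAX_ITERATIONS = 10; // assumed iteration limit

// Repeatedly calls the model, runs any requested tools, and feeds the
// tool output back until the model answers with no tool calls.
async function runAgentLoop(
  messages: Message[],
  chatOnce: (msgs: Message[]) => Promise<ChatResult>,   // hypothetical: one chat round-trip
  executeTool: (call: ToolCall) => Promise<string>,     // hypothetical: tool executor
): Promise<string> {
  for (let i = 0; i < MAX_ITERATIONS; i++) {
    const result = await chatOnce(messages);
    if (result.toolCalls.length === 0) {
      return result.text; // final response: the loop ends here
    }
    messages.push({ role: "assistant", content: result.text });
    for (const call of result.toolCalls) {
      const output = await executeTool(call);
      messages.push({ role: "tool", content: output }); // return result to the LLM
    }
  }
  throw new Error("maximum iteration limit reached");
}
```

In the real function, each stage of this loop additionally yields RuntimeEvent objects to the caller rather than returning a single string.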
The function yields RuntimeEvent objects for each stage: text-delta for streaming content, tool-call-start/args/result/error for tool execution phases, approval-required for human-in-the-loop decisions, and text-complete when the conversation is finished. Tool call IDs are generated using crypto.randomUUID since Ollama does not provide them in the response.
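A consumer of the generator typically switches on the event type. The union below is inferred from the event names in the prose; the exact field names are assumptions, not the published RuntimeEvent definition.

```typescript
// Assumed shape of RuntimeEvent, inferred from the documented event names.
type RuntimeEvent =
  | { type: "text-delta"; delta: string }
  | { type: "tool-call-start"; id: string; name: string }   // id from crypto.randomUUID()
  | { type: "tool-call-args"; id: string; args: string }
  | { type: "tool-call-result"; id: string; result: string }
  | { type: "tool-call-error"; id: string; error: string }
  | { type: "approval-required"; id: string; name: string }
  | { type: "text-complete"; text: string };

// Drains the event stream, accumulating streamed text until completion.
async function collectText(events: AsyncGenerator<RuntimeEvent>): Promise<string> {
  let out = "";
  for await (const ev of events) {
    switch (ev.type) {
      case "text-delta":
        out += ev.delta; // streamed content
        break;
      case "text-complete":
        return ev.text;  // conversation finished
      default:
        break;           // tool and approval events handled elsewhere
    }
  }
  return out;
}
```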
function streamWithOllamaProvider(params: OllamaStreamParams): AsyncGenerator<RuntimeEvent>

Parameters