Introduce shared OpenAI/Ollama wire-format helpers (src/llm/openai-common.ts) and refactor the Ollama adapter to reuse them. Update the Ollama client's baseURL and API key handling, detect OLLAMA_BASE_URL in the example, and prefer Ollama for the analyst agent when it is set. Also extend the provider unions in the types to include 'ollama' and adjust readonly annotations. Minor formatting/packaging updates and small cleanup in the adapter and example files.
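The env-detection behavior described above could look roughly like the sketch below. The names `Provider`, `pickAnalystProvider`, and the `Env` shape are hypothetical illustrations, not the actual identifiers from the changeset.

```typescript
// Hypothetical sketch: prefer Ollama for the analyst agent when
// OLLAMA_BASE_URL is present in the environment (assumed names).
type Provider = 'openai' | 'anthropic' | 'ollama';

interface Env {
  OLLAMA_BASE_URL?: string;
}

function pickAnalystProvider(env: Env, fallback: Provider = 'openai'): Provider {
  // A set OLLAMA_BASE_URL signals a reachable local Ollama server,
  // so the example routes the analyst agent to it.
  return env.OLLAMA_BASE_URL ? 'ollama' : fallback;
}
```

The point of the design is that the example degrades gracefully: without the env var, agents keep their original hosted-model assignments.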
Introduce a new OllamaAdapter (src/llm/ollama.ts) implementing LLMAdapter with chat and streaming support, converting between the framework's ContentBlock types and Ollama's (OpenAI-compatible) chat completions. Wire the adapter into the factory (src/llm/adapter.ts) and extend the provider types (src/types.ts) to include 'ollama'. Update the example (examples/04-multi-model-team.ts) to allow selecting Ollama as a model/provider option. The adapter defaults its base URL to OLLAMA_BASE_URL, falling back to http://localhost:11434, and handles tool calls, tool results, images, and finish-reason normalization.
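The base-URL defaulting and finish-reason normalization could be sketched as follows. This is an illustrative approximation: the function names, the `StopReason` union, and the exact mapping are assumptions, not the adapter's real code.

```typescript
// Hypothetical sketch of two pieces of the adapter's behavior.

// 1. Base URL: use OLLAMA_BASE_URL if set, else the standard local endpoint.
function resolveOllamaBaseUrl(env: { OLLAMA_BASE_URL?: string }): string {
  return env.OLLAMA_BASE_URL ?? 'http://localhost:11434';
}

// 2. Finish-reason normalization: map OpenAI-compatible finish reasons
//    onto a framework-side stop-reason union (names assumed).
type StopReason = 'end_turn' | 'tool_use' | 'max_tokens';

function normalizeFinishReason(reason: string | null): StopReason {
  switch (reason) {
    case 'tool_calls':
      return 'tool_use'; // model requested one or more tool invocations
    case 'length':
      return 'max_tokens'; // generation hit the token limit
    default:
      return 'end_turn'; // 'stop', null, or anything unrecognized
  }
}
```

Normalizing at the adapter boundary keeps the rest of the framework provider-agnostic: callers only ever see the framework's own stop-reason values.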