Local models (Ollama, vLLM) sometimes return tool calls as text instead of using the native `tool_calls` wire format. This adds a safety-net extractor that parses tool calls from model text output when native `tool_calls` is empty.

- Add text-tool-extractor with support for bare JSON, code fences, and Hermes `<tool_call>` tags
- Wire fallback into OpenAI adapter `chat()` and `stream()` paths
- Add `onWarning` callback when model ignores configured tools
- Add `timeoutMs` on `AgentConfig` for per-run abort (local models can be slow)
- Add 26 tests for extractor and fallback behavior
- Document local model compatibility in README
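As a rough sketch of the extraction order described above (Hermes tags first, then fenced JSON, then a bare JSON object), assuming illustrative names like `extractToolCalls` and a `{ name, arguments }` call shape rather than the actual implementation in this PR:

```typescript
// Hypothetical sketch of the text-tool-call fallback; names are illustrative.
interface ExtractedToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

// Attempt to parse one candidate string as a tool call.
function tryParse(s: string): ExtractedToolCall | null {
  try {
    const obj = JSON.parse(s);
    if (obj && typeof obj.name === "string") {
      // Some models emit "arguments", others "parameters"; accept both here.
      return { name: obj.name, arguments: obj.arguments ?? obj.parameters ?? {} };
    }
  } catch {
    // Not valid JSON; fall through.
  }
  return null;
}

export function extractToolCalls(text: string): ExtractedToolCall[] {
  const calls: ExtractedToolCall[] = [];

  // 1. Hermes-style <tool_call>...</tool_call> tags.
  for (const m of text.matchAll(/<tool_call>([\s\S]*?)<\/tool_call>/g)) {
    const c = tryParse(m[1].trim());
    if (c) calls.push(c);
  }
  if (calls.length) return calls;

  // 2. Fenced code blocks, optionally tagged as json.
  for (const m of text.matchAll(/```(?:json)?\s*([\s\S]*?)```/g)) {
    const c = tryParse(m[1].trim());
    if (c) calls.push(c);
  }
  if (calls.length) return calls;

  // 3. A bare JSON object spanning the whole message.
  const c = tryParse(text.trim());
  if (c) calls.push(c);
  return calls;
}
```

The early returns give the more explicit formats priority: if a model uses Hermes tags, surrounding prose or code fences are never mis-parsed as a second call.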
Adapter files in this directory:

- adapter.ts
- anthropic.ts
- copilot.ts
- gemini.ts
- grok.ts
- openai-common.ts
- openai.ts