diff --git a/README.md b/README.md index 8497c15..d1849cd 100644 --- a/README.md +++ b/README.md @@ -17,7 +17,7 @@ CrewAI is Python. LangGraph makes you draw the graph by hand. `open-multi-agent` - **Goal to result in one call.** `runTeam(team, "Build a REST API")` kicks off a coordinator agent that decomposes the goal into a task DAG, resolves dependencies, runs independent tasks in parallel, and synthesizes the final output. No graph to draw, no tasks to wire up. - **TypeScript-native, three runtime dependencies.** `@anthropic-ai/sdk`, `openai`, `zod`. That is the whole runtime. Embed in Express, Next.js, serverless functions, or CI/CD pipelines. No Python runtime, no subprocess bridge, no cloud sidecar. -- **Multi-model teams.** Claude, GPT, Gemini, Grok, Copilot, or any OpenAI-compatible local model (Ollama, vLLM, LM Studio, llama.cpp) in the same team. Run the architect on Opus 4.6, the developer on GPT-5.4, the reviewer on local Gemma 4, all in one `runTeam()` call. Gemini ships as an optional peer dependency: `npm install @google/genai` to enable. +- **Multi-model teams.** Claude, GPT, Gemini, Grok, MiniMax, DeepSeek, Copilot, or any OpenAI-compatible local model (Ollama, vLLM, LM Studio, llama.cpp) in the same team. Run the architect on Opus 4.6, the developer on GPT-5.4, the reviewer on local Gemma 4, all in one `runTeam()` call. Gemini ships as an optional peer dependency: `npm install @google/genai` to enable. Other features (MCP integration, context strategies, structured output, task retry, human-in-the-loop, lifecycle hooks, loop detection, observability) live below the fold and in [`examples/`](./examples/). @@ -72,6 +72,9 @@ Set the API key for your provider. 
Local models via Ollama require no API key - `OPENAI_API_KEY` - `GEMINI_API_KEY` - `XAI_API_KEY` (for Grok) +- `MINIMAX_API_KEY` (for MiniMax) +- `MINIMAX_BASE_URL` (for MiniMax — optional, selects endpoint) +- `DEEPSEEK_API_KEY` (for DeepSeek) - `GITHUB_TOKEN` (for Copilot) **CLI (`oma`).** For shell and CI, the package exposes a JSON-first binary. See [docs/cli.md](./docs/cli.md) for `oma run`, `oma task`, `oma provider`, exit codes, and file formats. @@ -139,14 +142,17 @@ For MapReduce-style fan-out without task dependencies, use `AgentPool.runParalle ## Examples -16 runnable scripts in [`examples/`](./examples/). Start with these four: +18 runnable scripts and 1 full-stack demo in [`examples/`](./examples/). Start with these: - [02 — Team Collaboration](examples/02-team-collaboration.ts): `runTeam()` coordinator pattern. - [06 — Local Model](examples/06-local-model.ts): Ollama and Claude in one pipeline via `baseURL`. - [09 — Structured Output](examples/09-structured-output.ts): any agent returns Zod-validated JSON. - [11 — Trace Observability](examples/11-trace-observability.ts): `onTrace` spans for LLM calls, tools, and tasks. +- [17 — MiniMax](examples/17-minimax.ts): three-agent team using MiniMax M2.7. +- [18 — DeepSeek](examples/18-deepseek.ts): three-agent team using DeepSeek Chat. +- [with-vercel-ai-sdk](examples/with-vercel-ai-sdk/): Next.js app — OMA `runTeam()` + AI SDK `useChat` streaming. -Run any with `npx tsx examples/02-team-collaboration.ts`. +Run scripts with `npx tsx examples/02-team-collaboration.ts`. ## Architecture @@ -182,6 +188,8 @@ Run any with `npx tsx examples/02-team-collaboration.ts`. 
│ │ - CopilotAdapter │ │ │ - GeminiAdapter │ │ │ - GrokAdapter │ + │ │ - MiniMaxAdapter │ + │ │ - DeepSeekAdapter │ │ └──────────────────────┘ ┌────────▼──────────┐ │ AgentRunner │ ┌──────────────────────┐ @@ -281,6 +289,9 @@ Notes: | Anthropic (Claude) | `provider: 'anthropic'` | `ANTHROPIC_API_KEY` | Verified | | OpenAI (GPT) | `provider: 'openai'` | `OPENAI_API_KEY` | Verified | | Grok (xAI) | `provider: 'grok'` | `XAI_API_KEY` | Verified | +| MiniMax (global) | `provider: 'minimax'` | `MINIMAX_API_KEY` | Verified | +| MiniMax (China) | `provider: 'minimax'` + `MINIMAX_BASE_URL` | `MINIMAX_API_KEY` | Verified | +| DeepSeek | `provider: 'deepseek'` | `DEEPSEEK_API_KEY` | Verified | | GitHub Copilot | `provider: 'copilot'` | `GITHUB_TOKEN` | Verified | | Gemini | `provider: 'gemini'` | `GEMINI_API_KEY` | Verified | | Ollama / vLLM / LM Studio | `provider: 'openai'` + `baseURL` | — | Verified | @@ -290,7 +301,7 @@ Gemini requires `npm install @google/genai` (optional peer dependency). Verified local models with tool-calling: **Gemma 4** (see [example 08](examples/08-gemma4-local.ts)). -Any OpenAI-compatible API should work via `provider: 'openai'` + `baseURL` (DeepSeek, Groq, Mistral, Qwen, MiniMax, etc.). **Grok now has first-class support** via `provider: 'grok'`. +Any OpenAI-compatible API should work via `provider: 'openai'` + `baseURL` (Groq, Mistral, Qwen, etc.). **Grok, MiniMax, and DeepSeek now have first-class support** via `provider: 'grok'`, `provider: 'minimax'`, and `provider: 'deepseek'`. ### Local Model Tool-Calling @@ -330,7 +341,34 @@ const grokAgent: AgentConfig = { } ``` -(Set your `XAI_API_KEY` environment variable — no `baseURL` needed anymore.) +(Set your `XAI_API_KEY` environment variable — no `baseURL` needed.) + +```typescript +const minimaxAgent: AgentConfig = { + name: 'minimax-agent', + provider: 'minimax', + model: 'MiniMax-M2.7', + systemPrompt: 'You are a helpful assistant.', +} +``` + +Set `MINIMAX_API_KEY`. 
The adapter selects the endpoint via `MINIMAX_BASE_URL`: + +- `https://api.minimax.io/v1` Global, default +- `https://api.minimaxi.com/v1` China mainland endpoint + +You can also pass `baseURL` directly in `AgentConfig` to override the env var. + +```typescript +const deepseekAgent: AgentConfig = { + name: 'deepseek-agent', + provider: 'deepseek', + model: 'deepseek-chat', + systemPrompt: 'You are a helpful assistant.', +} +``` + +Set `DEEPSEEK_API_KEY`. Available models: `deepseek-chat` (DeepSeek-V3, recommended for coding) and `deepseek-reasoner` (thinking mode). ## Contributing diff --git a/README_zh.md b/README_zh.md index bb49589..aa9c83e 100644 --- a/README_zh.md +++ b/README_zh.md @@ -17,7 +17,7 @@ CrewAI 是 Python。LangGraph 需要你自己画图。`open-multi-agent` 是你 - **一次调用从目标到结果。** `runTeam(team, "构建一个 REST API")` 启动一个协调者 agent,把目标拆成任务 DAG,解析依赖,独立任务并行执行,最终合成输出。不需要画图,不需要手动连任务。 - **TypeScript 原生,3 个运行时依赖。** `@anthropic-ai/sdk`、`openai`、`zod`。这就是全部运行时。可嵌入 Express、Next.js、Serverless 函数或 CI/CD 流水线。没有 Python 运行时,没有子进程桥接,没有云端 sidecar。 -- **多模型团队。** Claude、GPT、Gemini、Grok、Copilot,或任何 OpenAI 兼容的本地模型(Ollama、vLLM、LM Studio、llama.cpp)可以在同一个团队中使用。让架构师用 Opus 4.6,开发者用 GPT-5.4,评审用本地的 Gemma 4,一次 `runTeam()` 调用全部搞定。Gemini 作为 optional peer dependency 提供:使用前需 `npm install @google/genai`。 +- **多模型团队。** Claude、GPT、Gemini、Grok、MiniMax、DeepSeek、Copilot,或任何 OpenAI 兼容的本地模型(Ollama、vLLM、LM Studio、llama.cpp)可以在同一个团队中使用。让架构师用 Opus 4.6,开发者用 GPT-5.4,评审用本地的 Gemma 4,一次 `runTeam()` 调用全部搞定。Gemini 作为 optional peer dependency 提供:使用前需 `npm install @google/genai`。 其他能力(MCP 集成、上下文策略、结构化输出、任务重试、人机协同、生命周期钩子、循环检测、可观测性)在下方章节和 [`examples/`](./examples/) 里。 @@ -72,6 +72,9 @@ npm install @jackchen_me/open-multi-agent - `OPENAI_API_KEY` - `GEMINI_API_KEY` - `XAI_API_KEY`(Grok) +- `MINIMAX_API_KEY`(MiniMax) +- `MINIMAX_BASE_URL`(MiniMax — 可选,用于选择接入端点) +- `DEEPSEEK_API_KEY`(DeepSeek) - `GITHUB_TOKEN`(Copilot) 三个智能体,一个目标——框架处理剩下的一切: @@ -137,14 +140,17 @@ Tokens: 12847 output tokens ## 示例 -[`examples/`](./examples/) 里有 
15 个可运行脚本。推荐从这 4 个开始: +[`examples/`](./examples/) 里有 18 个可运行脚本和 1 个完整项目。推荐从这几个开始: - [02 — 团队协作](examples/02-team-collaboration.ts):`runTeam()` 协调者模式。 - [06 — 本地模型](examples/06-local-model.ts):通过 `baseURL` 把 Ollama 和 Claude 放在同一条管线。 - [09 — 结构化输出](examples/09-structured-output.ts):任意 agent 产出 Zod 校验过的 JSON。 - [11 — 可观测性](examples/11-trace-observability.ts):`onTrace` 回调,为 LLM 调用、工具、任务发出结构化 span。 +- [17 — MiniMax](examples/17-minimax.ts):使用 MiniMax M2.7 的三智能体团队。 +- [18 — DeepSeek](examples/18-deepseek.ts):使用 DeepSeek Chat 的三智能体团队。 +- [with-vercel-ai-sdk](examples/with-vercel-ai-sdk/):Next.js 应用 — OMA `runTeam()` + AI SDK `useChat` 流式输出。 -用 `npx tsx examples/02-team-collaboration.ts` 运行任意一个。 +用 `npx tsx examples/02-team-collaboration.ts` 运行脚本示例。 ## 架构 @@ -180,6 +186,8 @@ Tokens: 12847 output tokens │ │ - CopilotAdapter │ │ │ - GeminiAdapter │ │ │ - GrokAdapter │ + │ │ - MiniMaxAdapter │ + │ │ - DeepSeekAdapter │ │ └──────────────────────┘ ┌────────▼──────────┐ │ AgentRunner │ ┌──────────────────────┐ @@ -255,6 +263,9 @@ const customAgent: AgentConfig = { | Anthropic (Claude) | `provider: 'anthropic'` | `ANTHROPIC_API_KEY` | 已验证 | | OpenAI (GPT) | `provider: 'openai'` | `OPENAI_API_KEY` | 已验证 | | Grok (xAI) | `provider: 'grok'` | `XAI_API_KEY` | 已验证 | +| MiniMax(全球) | `provider: 'minimax'` | `MINIMAX_API_KEY` | 已验证 | +| MiniMax(国内) | `provider: 'minimax'` + `MINIMAX_BASE_URL` | `MINIMAX_API_KEY` | 已验证 | +| DeepSeek | `provider: 'deepseek'` | `DEEPSEEK_API_KEY` | 已验证 | | GitHub Copilot | `provider: 'copilot'` | `GITHUB_TOKEN` | 已验证 | | Gemini | `provider: 'gemini'` | `GEMINI_API_KEY` | 已验证 | | Ollama / vLLM / LM Studio | `provider: 'openai'` + `baseURL` | — | 已验证 | @@ -264,7 +275,7 @@ Gemini 需要 `npm install @google/genai`(optional peer dependency)。 已验证支持 tool-calling 的本地模型:**Gemma 4**(见[示例 08](examples/08-gemma4-local.ts))。 -任何 OpenAI 兼容 API 均可通过 `provider: 'openai'` + `baseURL` 接入(DeepSeek、Groq、Mistral、Qwen、MiniMax 等)。**Grok 现已原生支持**,使用 `provider: 'grok'`。 +任何 OpenAI 兼容 
API 均可通过 `provider: 'openai'` + `baseURL` 接入(Groq、Mistral、Qwen 等)。**Grok、MiniMax 和 DeepSeek 现已原生支持**,分别使用 `provider: 'grok'`、`provider: 'minimax'` 和 `provider: 'deepseek'`。 ### 本地模型 Tool-Calling @@ -306,6 +317,33 @@ const grokAgent: AgentConfig = { (设置 `XAI_API_KEY` 环境变量即可,无需 `baseURL`。) +```typescript +const minimaxAgent: AgentConfig = { + name: 'minimax-agent', + provider: 'minimax', + model: 'MiniMax-M2.7', + systemPrompt: 'You are a helpful assistant.', +} +``` + +设置 `MINIMAX_API_KEY`。适配器通过 `MINIMAX_BASE_URL` 选择接入端点: + +- `https://api.minimax.io/v1` 全球端点,默认 +- `https://api.minimaxi.com/v1` 中国大陆端点 + +也可在 `AgentConfig` 中直接传入 `baseURL` 覆盖环境变量。 + +```typescript +const deepseekAgent: AgentConfig = { + name: 'deepseek-agent', + provider: 'deepseek', + model: 'deepseek-chat', + systemPrompt: '你是一个有用的助手。', +} +``` + +设置 `DEEPSEEK_API_KEY`。可用模型:`deepseek-chat`(DeepSeek-V3,推荐用于编码任务)和 `deepseek-reasoner`(思考模式)。 + ## 参与贡献 欢迎提 Issue、功能需求和 PR。以下方向的贡献尤其有价值: diff --git a/docs/cli.md b/docs/cli.md index b9c4d65..38a6522 100644 --- a/docs/cli.md +++ b/docs/cli.md @@ -20,7 +20,7 @@ npm run build node dist/cli/oma.js help ``` -Set the usual provider API keys in the environment (see [README](../README.md#quick-start)); the CLI does not read secrets from flags. +Set the usual provider API keys in the environment (see [README](../README.md#quick-start)); the CLI does not read secrets from flags. MiniMax additionally reads `MINIMAX_BASE_URL` to select the global (`https://api.minimax.io/v1`) or China (`https://api.minimaxi.com/v1`) endpoint. --- @@ -55,7 +55,7 @@ Global flags: [`--pretty`](#output-flags), [`--include-messages`](#output-flags) Read-only helper for wiring JSON configs and env vars. - **`oma provider`** or **`oma provider list`** — Prints JSON: built-in provider ids, API key environment variable names, whether `baseURL` is supported, and short notes (e.g. OpenAI-compatible servers, Copilot in CI). 
-- **`oma provider template <provider>`** — Prints a JSON object with example `orchestrator` and `agent` fields plus placeholder `env` entries. `<provider>` is one of: `anthropic`, `openai`, `gemini`, `grok`, `copilot`.
+- **`oma provider template <provider>`** — Prints a JSON object with example `orchestrator` and `agent` fields plus placeholder `env` entries. `<provider>` is one of: `anthropic`, `openai`, `gemini`, `grok`, `minimax`, `deepseek`, `copilot`.

 Supports `--pretty`.
diff --git a/examples/17-minimax.ts b/examples/17-minimax.ts
new file mode 100644
index 0000000..882293f
--- /dev/null
+++ b/examples/17-minimax.ts
@@ -0,0 +1,159 @@
+/**
+ * Example 17 — Multi-Agent Team Collaboration with MiniMax
+ *
+ * Three specialized agents (architect, developer, reviewer) collaborate via `runTeam()`
+ * to build a minimal Express.js REST API. Every agent uses MiniMax's flagship model.
+ *
+ * Run:
+ *   npx tsx examples/17-minimax.ts
+ *
+ * Prerequisites:
+ *   MINIMAX_API_KEY environment variable must be set.
+ *   MINIMAX_BASE_URL can optionally be set to switch to the China mainland endpoint.
+ *
+ * Endpoints:
+ *   Global (default): https://api.minimax.io/v1
+ *   China mainland:   https://api.minimaxi.com/v1 (set MINIMAX_BASE_URL)
+ */
+
+import { OpenMultiAgent } from '../src/index.js'
+import type { AgentConfig, OrchestratorEvent } from '../src/types.js'
+
+// ---------------------------------------------------------------------------
+// Agent definitions (all using MiniMax-M2.7)
+// ---------------------------------------------------------------------------
+const architect: AgentConfig = {
+  name: 'architect',
+  model: 'MiniMax-M2.7',
+  provider: 'minimax',
+  systemPrompt: `You are a software architect with deep experience in Node.js and REST API design.
+Your job is to design clear, production-quality API contracts and file/directory structures.
+Output concise plans in markdown — no unnecessary prose.`, + tools: ['bash', 'file_write'], + maxTurns: 5, + temperature: 0.2, +} + +const developer: AgentConfig = { + name: 'developer', + model: 'MiniMax-M2.7', + provider: 'minimax', + systemPrompt: `You are a TypeScript/Node.js developer. You implement what the architect specifies. +Write clean, runnable code with proper error handling. Use the tools to write files and run tests.`, + tools: ['bash', 'file_read', 'file_write', 'file_edit'], + maxTurns: 12, + temperature: 0.1, +} + +const reviewer: AgentConfig = { + name: 'reviewer', + model: 'MiniMax-M2.7', + provider: 'minimax', + systemPrompt: `You are a senior code reviewer. Review code for correctness, security, and clarity. +Provide a structured review with: LGTM items, suggestions, and any blocking issues. +Read files using the tools before reviewing.`, + tools: ['bash', 'file_read', 'grep'], + maxTurns: 5, + temperature: 0.3, +} + +// --------------------------------------------------------------------------- +// Progress tracking +// --------------------------------------------------------------------------- +const startTimes = new Map() + +function handleProgress(event: OrchestratorEvent): void { + const ts = new Date().toISOString().slice(11, 23) // HH:MM:SS.mmm + switch (event.type) { + case 'agent_start': + startTimes.set(event.agent ?? '', Date.now()) + console.log(`[${ts}] AGENT START → ${event.agent}`) + break + case 'agent_complete': { + const elapsed = Date.now() - (startTimes.get(event.agent ?? '') ?? 
Date.now()) + console.log(`[${ts}] AGENT DONE ← ${event.agent} (${elapsed}ms)`) + break + } + case 'task_start': + console.log(`[${ts}] TASK START ↓ ${event.task}`) + break + case 'task_complete': + console.log(`[${ts}] TASK DONE ↑ ${event.task}`) + break + case 'message': + console.log(`[${ts}] MESSAGE • ${event.agent} → (team)`) + break + case 'error': + console.error(`[${ts}] ERROR ✗ agent=${event.agent} task=${event.task}`) + if (event.data instanceof Error) console.error(` ${event.data.message}`) + break + } +} + +// --------------------------------------------------------------------------- +// Orchestrate +// --------------------------------------------------------------------------- +const orchestrator = new OpenMultiAgent({ + defaultModel: 'MiniMax-M2.7', + defaultProvider: 'minimax', + maxConcurrency: 1, // sequential for readable output + onProgress: handleProgress, +}) + +const team = orchestrator.createTeam('api-team', { + name: 'api-team', + agents: [architect, developer, reviewer], + sharedMemory: true, + maxConcurrency: 1, +}) + +console.log(`Team "${team.name}" created with agents: ${team.getAgents().map(a => a.name).join(', ')}`) +console.log('\nStarting team run...\n') +console.log('='.repeat(60)) + +const goal = `Create a minimal Express.js REST API in /tmp/express-api/ with: +- GET /health → { status: "ok" } +- GET /users → returns a hardcoded array of 2 user objects +- POST /users → accepts { name, email } body, logs it, returns 201 +- Proper error handling middleware +- The server should listen on port 3001 +- Include a package.json with the required dependencies` + +const result = await orchestrator.runTeam(team, goal) + +console.log('\n' + '='.repeat(60)) + +// --------------------------------------------------------------------------- +// Results +// --------------------------------------------------------------------------- +console.log('\nTeam run complete.') +console.log(`Success: ${result.success}`) +console.log(`Total tokens — input: 
${result.totalTokenUsage.input_tokens}, output: ${result.totalTokenUsage.output_tokens}`) + +console.log('\nPer-agent results:') +for (const [agentName, agentResult] of result.agentResults) { + const status = agentResult.success ? 'OK' : 'FAILED' + const tools = agentResult.toolCalls.length + console.log(` ${agentName.padEnd(12)} [${status}] tool_calls=${tools}`) + if (!agentResult.success) { + console.log(` Error: ${agentResult.output.slice(0, 120)}`) + } +} + +// Sample outputs +const developerResult = result.agentResults.get('developer') +if (developerResult?.success) { + console.log('\nDeveloper output (last 600 chars):') + console.log('─'.repeat(60)) + const out = developerResult.output + console.log(out.length > 600 ? '...' + out.slice(-600) : out) + console.log('─'.repeat(60)) +} + +const reviewerResult = result.agentResults.get('reviewer') +if (reviewerResult?.success) { + console.log('\nReviewer output:') + console.log('─'.repeat(60)) + console.log(reviewerResult.output) + console.log('─'.repeat(60)) +} diff --git a/examples/18-deepseek.ts b/examples/18-deepseek.ts new file mode 100644 index 0000000..937b2ef --- /dev/null +++ b/examples/18-deepseek.ts @@ -0,0 +1,158 @@ +/** + * Example 18 — Multi-Agent Team Collaboration with DeepSeek + * + * Three specialized agents (architect, developer, reviewer) collaborate via `runTeam()` + * to build a minimal Express.js REST API. Every agent uses DeepSeek's flagship model. + * + * Run: + * npx tsx examples/18-deepseek.ts + * + * Prerequisites: + * DEEPSEEK_API_KEY environment variable must be set. 
+ *
+ * Available models:
+ *   deepseek-chat     — DeepSeek-V3 (non-thinking mode, recommended for coding tasks)
+ *   deepseek-reasoner — DeepSeek-V3 (thinking mode, for complex reasoning)
+ */
+
+import { OpenMultiAgent } from '../src/index.js'
+import type { AgentConfig, OrchestratorEvent } from '../src/types.js'
+
+// ---------------------------------------------------------------------------
+// Agent definitions (architect on deepseek-reasoner, developer and reviewer on deepseek-chat)
+// ---------------------------------------------------------------------------
+const architect: AgentConfig = {
+  name: 'architect',
+  model: 'deepseek-reasoner',
+  provider: 'deepseek',
+  systemPrompt: `You are a software architect with deep experience in Node.js and REST API design.
+Your job is to design clear, production-quality API contracts and file/directory structures.
+Output concise plans in markdown — no unnecessary prose.`,
+  tools: ['bash', 'file_write'],
+  maxTurns: 5,
+  temperature: 0.2,
+}
+
+const developer: AgentConfig = {
+  name: 'developer',
+  model: 'deepseek-chat',
+  provider: 'deepseek',
+  systemPrompt: `You are a TypeScript/Node.js developer. You implement what the architect specifies.
+Write clean, runnable code with proper error handling. Use the tools to write files and run tests.`,
+  tools: ['bash', 'file_read', 'file_write', 'file_edit'],
+  maxTurns: 12,
+  temperature: 0.1,
+}
+
+const reviewer: AgentConfig = {
+  name: 'reviewer',
+  model: 'deepseek-chat',
+  provider: 'deepseek',
+  systemPrompt: `You are a senior code reviewer. Review code for correctness, security, and clarity.
+Provide a structured review with: LGTM items, suggestions, and any blocking issues.
+Read files using the tools before reviewing.`, + tools: ['bash', 'file_read', 'grep'], + maxTurns: 5, + temperature: 0.3, +} + +// --------------------------------------------------------------------------- +// Progress tracking +// --------------------------------------------------------------------------- +const startTimes = new Map() + +function handleProgress(event: OrchestratorEvent): void { + const ts = new Date().toISOString().slice(11, 23) // HH:MM:SS.mmm + switch (event.type) { + case 'agent_start': + startTimes.set(event.agent ?? '', Date.now()) + console.log(`[${ts}] AGENT START → ${event.agent}`) + break + case 'agent_complete': { + const elapsed = Date.now() - (startTimes.get(event.agent ?? '') ?? Date.now()) + console.log(`[${ts}] AGENT DONE ← ${event.agent} (${elapsed}ms)`) + break + } + case 'task_start': + console.log(`[${ts}] TASK START ↓ ${event.task}`) + break + case 'task_complete': + console.log(`[${ts}] TASK DONE ↑ ${event.task}`) + break + case 'message': + console.log(`[${ts}] MESSAGE • ${event.agent} → (team)`) + break + case 'error': + console.error(`[${ts}] ERROR ✗ agent=${event.agent} task=${event.task}`) + if (event.data instanceof Error) console.error(` ${event.data.message}`) + break + } +} + +// --------------------------------------------------------------------------- +// Orchestrate +// --------------------------------------------------------------------------- +const orchestrator = new OpenMultiAgent({ + defaultModel: 'deepseek-chat', + defaultProvider: 'deepseek', + maxConcurrency: 1, // sequential for readable output + onProgress: handleProgress, +}) + +const team = orchestrator.createTeam('api-team', { + name: 'api-team', + agents: [architect, developer, reviewer], + sharedMemory: true, + maxConcurrency: 1, +}) + +console.log(`Team "${team.name}" created with agents: ${team.getAgents().map(a => a.name).join(', ')}`) +console.log('\nStarting team run...\n') +console.log('='.repeat(60)) + +const goal = `Create a minimal 
Express.js REST API in /tmp/express-api/ with: +- GET /health → { status: "ok" } +- GET /users → returns a hardcoded array of 2 user objects +- POST /users → accepts { name, email } body, logs it, returns 201 +- Proper error handling middleware +- The server should listen on port 3001 +- Include a package.json with the required dependencies` + +const result = await orchestrator.runTeam(team, goal) + +console.log('\n' + '='.repeat(60)) + +// --------------------------------------------------------------------------- +// Results +// --------------------------------------------------------------------------- +console.log('\nTeam run complete.') +console.log(`Success: ${result.success}`) +console.log(`Total tokens — input: ${result.totalTokenUsage.input_tokens}, output: ${result.totalTokenUsage.output_tokens}`) + +console.log('\nPer-agent results:') +for (const [agentName, agentResult] of result.agentResults) { + const status = agentResult.success ? 'OK' : 'FAILED' + const tools = agentResult.toolCalls.length + console.log(` ${agentName.padEnd(12)} [${status}] tool_calls=${tools}`) + if (!agentResult.success) { + console.log(` Error: ${agentResult.output.slice(0, 120)}`) + } +} + +// Sample outputs +const developerResult = result.agentResults.get('developer') +if (developerResult?.success) { + console.log('\nDeveloper output (last 600 chars):') + console.log('─'.repeat(60)) + const out = developerResult.output + console.log(out.length > 600 ? '...' 
+ out.slice(-600) : out)
+  console.log('─'.repeat(60))
+}
+
+const reviewerResult = result.agentResults.get('reviewer')
+if (reviewerResult?.success) {
+  console.log('\nReviewer output:')
+  console.log('─'.repeat(60))
+  console.log(reviewerResult.output)
+  console.log('─'.repeat(60))
+}
diff --git a/examples/with-vercel-ai-sdk/.gitignore b/examples/with-vercel-ai-sdk/.gitignore
new file mode 100644
index 0000000..757b2d5
--- /dev/null
+++ b/examples/with-vercel-ai-sdk/.gitignore
@@ -0,0 +1,5 @@
+node_modules/
+.next/
+.env
+.env.local
+*.tsbuildinfo
diff --git a/examples/with-vercel-ai-sdk/README.md b/examples/with-vercel-ai-sdk/README.md
new file mode 100644
index 0000000..dc41fb8
--- /dev/null
+++ b/examples/with-vercel-ai-sdk/README.md
@@ -0,0 +1,59 @@
+# with-vercel-ai-sdk
+
+A Next.js demo showing **open-multi-agent** (OMA) and **Vercel AI SDK** working together:
+
+- **OMA** orchestrates a research team (researcher agent + writer agent) via `runTeam()`
+- **AI SDK** streams the result to a chat UI via `useChat` + `streamText`
+
+## How it works
+
+```
+User message
+  │
+  ▼
+API route (app/api/chat/route.ts)
+  │
+  ├─ Phase 1: OMA runTeam()
+  │    coordinator decomposes goal → researcher gathers info → writer drafts article
+  │
+  └─ Phase 2: AI SDK streamText()
+       streams the team's output to the browser
+  │
+  ▼
+Chat UI (app/page.tsx) — useChat hook renders streamed response
+```
+
+## Setup
+
+```bash
+# 1. From repo root, install OMA dependencies
+cd ../..
+npm install
+
+# 2. Back to this example
+cd examples/with-vercel-ai-sdk
+npm install
+
+# 3. Set your API key
+export DEEPSEEK_API_KEY=sk-...
+
+# 4. Run
+npm run dev
+```
+
+`npm run dev` automatically builds OMA before starting Next.js (via the `predev` script).
+
+Open [http://localhost:3000](http://localhost:3000), type a topic, and watch the research team work.
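The handoff described above is strictly sequential: the route awaits the full `runTeam()` result before anything is streamed. As a rough sketch of that shape, where `fakeRunTeam` and `streamChunks` are hypothetical stand-ins for `runTeam()` and `streamText()`, not part of OMA or the AI SDK:

```typescript
// Sketch of the route's two-phase handoff. Both helpers are illustrative
// stand-ins, not real library APIs.

type TeamResult = { success: boolean; output: string }

// Phase 1 stand-in: agents collaborate and return one finished document.
async function fakeRunTeam(goal: string): Promise<TeamResult> {
  return { success: true, output: `# Article\n\nFindings about: ${goal}` }
}

// Phase 2 stand-in: relay the finished document to the client in small chunks.
async function* streamChunks(text: string, size = 16): AsyncGenerator<string> {
  for (let i = 0; i < text.length; i += size) {
    yield text.slice(i, i + size)
  }
}

// Phase 1 is awaited in full before phase 2 starts; the user sees nothing
// until the team run finishes, then receives the result incrementally.
async function handle(topic: string): Promise<string> {
  const team = await fakeRunTeam(topic)
  let relayed = ''
  for await (const chunk of streamChunks(team.output)) {
    relayed += chunk
  }
  return relayed
}
```

Because phase 1 blocks, total latency is dominated by the team run; the streaming in phase 2 only smooths delivery of an already-complete result.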
+
+## Prerequisites
+
+- Node.js >= 18
+- `DEEPSEEK_API_KEY` environment variable (used by both OMA and AI SDK)
+
+## Key files
+
+| File | Role |
+|------|------|
+| `app/api/chat/route.ts` | Backend — OMA orchestration + AI SDK streaming |
+| `app/page.tsx` | Frontend — chat UI with `useChat` hook |
+| `package.json` | References OMA via `file:../../` (local link) |
diff --git a/examples/with-vercel-ai-sdk/app/api/chat/route.ts b/examples/with-vercel-ai-sdk/app/api/chat/route.ts
new file mode 100644
index 0000000..6eefe79
--- /dev/null
+++ b/examples/with-vercel-ai-sdk/app/api/chat/route.ts
@@ -0,0 +1,91 @@
+import { streamText, convertToModelMessages, type UIMessage } from 'ai'
+import { createOpenAICompatible } from '@ai-sdk/openai-compatible'
+import { OpenMultiAgent } from '@jackchen_me/open-multi-agent'
+import type { AgentConfig } from '@jackchen_me/open-multi-agent'
+
+export const maxDuration = 120
+
+// --- DeepSeek via OpenAI-compatible API ---
+const DEEPSEEK_BASE_URL = 'https://api.deepseek.com'
+const DEEPSEEK_MODEL = 'deepseek-chat'
+
+const deepseek = createOpenAICompatible({
+  name: 'deepseek',
+  baseURL: `${DEEPSEEK_BASE_URL}/v1`,
+  apiKey: process.env.DEEPSEEK_API_KEY,
+})
+
+const researcher: AgentConfig = {
+  name: 'researcher',
+  model: DEEPSEEK_MODEL,
+  provider: 'openai',
+  baseURL: DEEPSEEK_BASE_URL,
+  apiKey: process.env.DEEPSEEK_API_KEY,
+  systemPrompt: `You are a research specialist. Given a topic, provide thorough, factual research
+with key findings, relevant data points, and important context.
+Be concise but comprehensive. Output structured notes, not prose.`,
+  maxTurns: 3,
+  temperature: 0.2,
+}
+
+const writer: AgentConfig = {
+  name: 'writer',
+  model: DEEPSEEK_MODEL,
+  provider: 'openai',
+  baseURL: DEEPSEEK_BASE_URL,
+  apiKey: process.env.DEEPSEEK_API_KEY,
+  systemPrompt: `You are an expert writer.
Using research from team members (available in shared memory), +write a well-structured, engaging article with clear headings and concise paragraphs. +Do not repeat raw research — synthesize it into readable prose.`, + maxTurns: 3, + temperature: 0.4, +} + +function extractText(message: UIMessage): string { + return message.parts + .filter((p): p is { type: 'text'; text: string } => p.type === 'text') + .map((p) => p.text) + .join('') +} + +export async function POST(req: Request) { + const { messages }: { messages: UIMessage[] } = await req.json() + const lastText = extractText(messages.at(-1)!) + + // --- Phase 1: OMA multi-agent orchestration --- + const orchestrator = new OpenMultiAgent({ + defaultModel: DEEPSEEK_MODEL, + defaultProvider: 'openai', + defaultBaseURL: DEEPSEEK_BASE_URL, + defaultApiKey: process.env.DEEPSEEK_API_KEY, + }) + + const team = orchestrator.createTeam('research-writing', { + name: 'research-writing', + agents: [researcher, writer], + sharedMemory: true, + }) + + const teamResult = await orchestrator.runTeam( + team, + `Research and write an article about: ${lastText}`, + ) + + const teamOutput = teamResult.agentResults.get('coordinator')?.output ?? '' + + // --- Phase 2: Stream result via Vercel AI SDK --- + const result = streamText({ + model: deepseek(DEEPSEEK_MODEL), + system: `You are presenting research from a multi-agent team (researcher + writer). +The team has already done the work. Your only job is to relay their output to the user +in a well-formatted way. Keep the content faithful to the team output below. +At the very end, add a one-line note that this was produced by a researcher agent +and a writer agent collaborating via open-multi-agent. 
+
+## Team Output
+${teamOutput}`,
+    messages: await convertToModelMessages(messages),
+  })
+
+  return result.toUIMessageStreamResponse()
+}
diff --git a/examples/with-vercel-ai-sdk/app/layout.tsx b/examples/with-vercel-ai-sdk/app/layout.tsx
new file mode 100644
index 0000000..32ea20e
--- /dev/null
+++ b/examples/with-vercel-ai-sdk/app/layout.tsx
@@ -0,0 +1,14 @@
+import type { Metadata } from 'next'
+
+export const metadata: Metadata = {
+  title: 'OMA + Vercel AI SDK',
+  description: 'Multi-agent research team powered by open-multi-agent, streamed via Vercel AI SDK',
+}
+
+export default function RootLayout({ children }: { children: React.ReactNode }) {
+  return (
+    <html lang="en">
+      <body>{children}</body>
+    </html>
+  )
+}
diff --git a/examples/with-vercel-ai-sdk/app/page.tsx b/examples/with-vercel-ai-sdk/app/page.tsx
new file mode 100644
index 0000000..f51ab6f
--- /dev/null
+++ b/examples/with-vercel-ai-sdk/app/page.tsx
@@ -0,0 +1,97 @@
+'use client'
+
+import { useState } from 'react'
+import { useChat } from '@ai-sdk/react'
+
+export default function Home() {
+  const { messages, sendMessage, status, error } = useChat()
+  const [input, setInput] = useState('')
+
+  const isLoading = status === 'submitted' || status === 'streaming'
+
+  const handleSubmit = async (e: React.FormEvent) => {
+    e.preventDefault()
+    if (!input.trim() || isLoading) return
+    const text = input
+    setInput('')
+    await sendMessage({ text })
+  }
+
+  return (
+    <main style={{ maxWidth: 720, margin: '0 auto', padding: 24 }}>
+      <h1>Research Team</h1>
+
+      <p>
+        Enter a topic. A <strong>researcher</strong> agent gathers information, a{' '}
+        <strong>writer</strong> agent composes an article — orchestrated by
+        open-multi-agent, streamed via Vercel AI SDK.
+      </p>
+
+      <div>
+        {messages.map((m) => (
+          <div key={m.id}>
+            <div>
+              {m.role === 'user' ? 'You' : 'Research Team'}
+            </div>
+            <div style={{ whiteSpace: 'pre-wrap' }}>
+              {m.parts
+                .filter((part): part is { type: 'text'; text: string } => part.type === 'text')
+                .map((part) => part.text)
+                .join('')}
+            </div>
+          </div>
+        ))}
+
+        {isLoading && status === 'submitted' && (
+          <div>
+            Agents are collaborating — this may take a minute...
+          </div>
+        )}
+
+        {error && (
+          <div>
+            Error: {error.message}
+          </div>
+        )}
+      </div>
+
+      <form onSubmit={handleSubmit} style={{ display: 'flex', gap: 8 }}>
+        <input
+          value={input}
+          onChange={(e) => setInput(e.target.value)}
+          placeholder="Enter a topic to research..."
+          disabled={isLoading}
+          style={{
+            flex: 1,
+            padding: '10px 14px',
+            borderRadius: 8,
+            border: '1px solid #ddd',
+            fontSize: 15,
+            outline: 'none',
+          }}
+        />
+        <button type="submit" disabled={isLoading}>
+          Send
+        </button>
+      </form>
+    </main>
+  )
+}
diff --git a/examples/with-vercel-ai-sdk/next-env.d.ts b/examples/with-vercel-ai-sdk/next-env.d.ts
new file mode 100644
index 0000000..c4b7818
--- /dev/null
+++ b/examples/with-vercel-ai-sdk/next-env.d.ts
@@ -0,0 +1,6 @@
+/// <reference types="next" />
+/// <reference types="next/image-types/global" />
+import "./.next/dev/types/routes.d.ts";
+
+// NOTE: This file should not be edited
+// see https://nextjs.org/docs/app/api-reference/config/typescript for more information.
diff --git a/examples/with-vercel-ai-sdk/next.config.ts b/examples/with-vercel-ai-sdk/next.config.ts
new file mode 100644
index 0000000..200ecc8
--- /dev/null
+++ b/examples/with-vercel-ai-sdk/next.config.ts
@@ -0,0 +1,7 @@
+import type { NextConfig } from 'next'
+
+const nextConfig: NextConfig = {
+  serverExternalPackages: ['@jackchen_me/open-multi-agent'],
+}
+
+export default nextConfig
diff --git a/examples/with-vercel-ai-sdk/package-lock.json b/examples/with-vercel-ai-sdk/package-lock.json
new file mode 100644
index 0000000..0468f2b
--- /dev/null
+++ b/examples/with-vercel-ai-sdk/package-lock.json
@@ -0,0 +1,1209 @@
+{
+  "name": "with-vercel-ai-sdk",
+  "lockfileVersion": 3,
+  "requires": true,
+  "packages": {
+    "": {
+      "name": "with-vercel-ai-sdk",
+      "dependencies": {
+        "@ai-sdk/openai-compatible": "^2.0.41",
+        "@ai-sdk/react": "^3.0.0",
+        "@jackchen_me/open-multi-agent": "file:../../",
+        "ai": "^6.0.0",
+        "next": "^16.0.0",
+        "react": "^19.0.0",
+        "react-dom": "^19.0.0"
+      },
+      "devDependencies": {
+        "@types/node": "^22.0.0",
+        "@types/react": "^19.0.0",
+        "@types/react-dom": "^19.0.0",
+        "typescript": "^5.6.0"
+      }
+    },
+    "../..": {
+      "name": "@jackchen_me/open-multi-agent",
+      "version": "1.1.0",
+      "license": "MIT",
+      "dependencies": {
+        "@anthropic-ai/sdk": "^0.52.0",
+        "openai": "^4.73.0",
+        "zod": "^3.23.0"
+      },
+      "bin": {
+        "oma": "dist/cli/oma.js"
+      },
+      "devDependencies": {
+        "@google/genai": "^1.48.0",
+        "@modelcontextprotocol/sdk": "^1.18.0",
+        "@types/node": "^22.0.0",
+        "@vitest/coverage-v8": "^2.1.9",
+        "tsx": "^4.21.0",
+        "typescript":
"^5.6.0", + "vitest": "^2.1.0" + }, + "engines": { + "node": ">=18.0.0" + }, + "peerDependencies": { + "@google/genai": "^1.48.0", + "@modelcontextprotocol/sdk": "^1.18.0" + }, + "peerDependenciesMeta": { + "@google/genai": { + "optional": true + }, + "@modelcontextprotocol/sdk": { + "optional": true + } + } + }, + "node_modules/@ai-sdk/gateway": { + "version": "3.0.98", + "resolved": "https://registry.npmjs.org/@ai-sdk/gateway/-/gateway-3.0.98.tgz", + "integrity": "sha512-Ol+nP8PIlj8FjN8qKlxhE89N0woqAaGi9CUBGp1boe3RafpphJ7WMuq/RErSvxtwTqje03TP+zIdzP113krxRg==", + "license": "Apache-2.0", + "dependencies": { + "@ai-sdk/provider": "3.0.8", + "@ai-sdk/provider-utils": "4.0.23", + "@vercel/oidc": "3.1.0" + }, + "engines": { + "node": ">=18" + }, + "peerDependencies": { + "zod": "^3.25.76 || ^4.1.8" + } + }, + "node_modules/@ai-sdk/openai-compatible": { + "version": "2.0.41", + "resolved": "https://registry.npmjs.org/@ai-sdk/openai-compatible/-/openai-compatible-2.0.41.tgz", + "integrity": "sha512-kNAGINk71AlOXx10Dq/PXw4t/9XjdK8uxfpVElRwtSFMdeSiLVt58p9TPx4/FJD+hxZuVhvxYj9r42osxWq79g==", + "license": "Apache-2.0", + "dependencies": { + "@ai-sdk/provider": "3.0.8", + "@ai-sdk/provider-utils": "4.0.23" + }, + "engines": { + "node": ">=18" + }, + "peerDependencies": { + "zod": "^3.25.76 || ^4.1.8" + } + }, + "node_modules/@ai-sdk/provider": { + "version": "3.0.8", + "resolved": "https://registry.npmjs.org/@ai-sdk/provider/-/provider-3.0.8.tgz", + "integrity": "sha512-oGMAgGoQdBXbZqNG0Ze56CHjDZ1IDYOwGYxYjO5KLSlz5HiNQ9udIXsPZ61VWaHGZ5XW/jyjmr6t2xz2jGVwbQ==", + "license": "Apache-2.0", + "dependencies": { + "json-schema": "^0.4.0" + }, + "engines": { + "node": ">=18" + } + }, + "node_modules/@ai-sdk/provider-utils": { + "version": "4.0.23", + "resolved": "https://registry.npmjs.org/@ai-sdk/provider-utils/-/provider-utils-4.0.23.tgz", + "integrity": "sha512-z8GlDaCmRSDlqkMF2f4/RFgWxdarvIbyuk+m6WXT1LYgsnGiXRJGTD2Z1+SDl3LqtFuRtGX1aghYvQLoHL/9pg==", + "license": "Apache-2.0", + 
"dependencies": { + "@ai-sdk/provider": "3.0.8", + "@standard-schema/spec": "^1.1.0", + "eventsource-parser": "^3.0.6" + }, + "engines": { + "node": ">=18" + }, + "peerDependencies": { + "zod": "^3.25.76 || ^4.1.8" + } + }, + "node_modules/@ai-sdk/react": { + "version": "3.0.163", + "resolved": "https://registry.npmjs.org/@ai-sdk/react/-/react-3.0.163.tgz", + "integrity": "sha512-UM8BwNx4YFcG1XIBSTepIGx48RXk974qVSplVZc2JPiY86tC4Qpb8trquh5MdtSKzlS6yrUX46n8gS2WZaUIXQ==", + "license": "Apache-2.0", + "dependencies": { + "@ai-sdk/provider-utils": "4.0.23", + "ai": "6.0.161", + "swr": "^2.2.5", + "throttleit": "2.1.0" + }, + "engines": { + "node": ">=18" + }, + "peerDependencies": { + "react": "^18 || ~19.0.1 || ~19.1.2 || ^19.2.1" + } + }, + "node_modules/@emnapi/runtime": { + "version": "1.9.2", + "resolved": "https://registry.npmjs.org/@emnapi/runtime/-/runtime-1.9.2.tgz", + "integrity": "sha512-3U4+MIWHImeyu1wnmVygh5WlgfYDtyf0k8AbLhMFxOipihf6nrWC4syIm/SwEeec0mNSafiiNnMJwbza/Is6Lw==", + "license": "MIT", + "optional": true, + "dependencies": { + "tslib": "^2.4.0" + } + }, + "node_modules/@img/colour": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@img/colour/-/colour-1.1.0.tgz", + "integrity": "sha512-Td76q7j57o/tLVdgS746cYARfSyxk8iEfRxewL9h4OMzYhbW4TAcppl0mT4eyqXddh6L/jwoM75mo7ixa/pCeQ==", + "license": "MIT", + "optional": true, + "engines": { + "node": ">=18" + } + }, + "node_modules/@img/sharp-darwin-arm64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.34.5.tgz", + "integrity": "sha512-imtQ3WMJXbMY4fxb/Ndp6HBTNVtWCUI0WdobyheGf5+ad6xX8VIDO8u2xE4qc/fr08CKG/7dDseFtn6M6g/r3w==", + "cpu": [ + "arm64" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-darwin-arm64": "1.2.4" + } + 
}, + "node_modules/@img/sharp-darwin-x64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.34.5.tgz", + "integrity": "sha512-YNEFAF/4KQ/PeW0N+r+aVVsoIY0/qxxikF2SWdp+NRkmMB7y9LBZAVqQ4yhGCm/H3H270OSykqmQMKLBhBJDEw==", + "cpu": [ + "x64" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-darwin-x64": "1.2.4" + } + }, + "node_modules/@img/sharp-libvips-darwin-arm64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.2.4.tgz", + "integrity": "sha512-zqjjo7RatFfFoP0MkQ51jfuFZBnVE2pRiaydKJ1G/rHZvnsrHAOcQALIi9sA5co5xenQdTugCvtb1cuf78Vf4g==", + "cpu": [ + "arm64" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "darwin" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-darwin-x64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.2.4.tgz", + "integrity": "sha512-1IOd5xfVhlGwX+zXv2N93k0yMONvUlANylbJw1eTah8K/Jtpi15KC+WSiaX/nBmbm2HxRM1gZ0nSdjSsrZbGKg==", + "cpu": [ + "x64" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "darwin" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-linux-arm": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.2.4.tgz", + "integrity": "sha512-bFI7xcKFELdiNCVov8e44Ia4u2byA+l3XtsAj+Q8tfCwO6BQ8iDojYdvoPMqsKDkuoOo+X6HZA0s0q11ANMQ8A==", + "cpu": [ + "arm" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + 
}, + "node_modules/@img/sharp-libvips-linux-arm64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.2.4.tgz", + "integrity": "sha512-excjX8DfsIcJ10x1Kzr4RcWe1edC9PquDRRPx3YVCvQv+U5p7Yin2s32ftzikXojb1PIFc/9Mt28/y+iRklkrw==", + "cpu": [ + "arm64" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-linux-ppc64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-ppc64/-/sharp-libvips-linux-ppc64-1.2.4.tgz", + "integrity": "sha512-FMuvGijLDYG6lW+b/UvyilUWu5Ayu+3r2d1S8notiGCIyYU/76eig1UfMmkZ7vwgOrzKzlQbFSuQfgm7GYUPpA==", + "cpu": [ + "ppc64" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-linux-riscv64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-riscv64/-/sharp-libvips-linux-riscv64-1.2.4.tgz", + "integrity": "sha512-oVDbcR4zUC0ce82teubSm+x6ETixtKZBh/qbREIOcI3cULzDyb18Sr/Wcyx7NRQeQzOiHTNbZFF1UwPS2scyGA==", + "cpu": [ + "riscv64" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-linux-s390x": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-s390x/-/sharp-libvips-linux-s390x-1.2.4.tgz", + "integrity": "sha512-qmp9VrzgPgMoGZyPvrQHqk02uyjA0/QrTO26Tqk6l4ZV0MPWIW6LTkqOIov+J1yEu7MbFQaDpwdwJKhbJvuRxQ==", + "cpu": [ + "s390x" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-linux-x64": { + "version": "1.2.4", + "resolved": 
"https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.2.4.tgz", + "integrity": "sha512-tJxiiLsmHc9Ax1bz3oaOYBURTXGIRDODBqhveVHonrHJ9/+k89qbLl0bcJns+e4t4rvaNBxaEZsFtSfAdquPrw==", + "cpu": [ + "x64" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-linuxmusl-arm64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-arm64/-/sharp-libvips-linuxmusl-arm64-1.2.4.tgz", + "integrity": "sha512-FVQHuwx1IIuNow9QAbYUzJ+En8KcVm9Lk5+uGUQJHaZmMECZmOlix9HnH7n1TRkXMS0pGxIJokIVB9SuqZGGXw==", + "cpu": [ + "arm64" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-linuxmusl-x64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-x64/-/sharp-libvips-linuxmusl-x64-1.2.4.tgz", + "integrity": "sha512-+LpyBk7L44ZIXwz/VYfglaX/okxezESc6UxDSoyo2Ks6Jxc4Y7sGjpgU9s4PMgqgjj1gZCylTieNamqA1MF7Dg==", + "cpu": [ + "x64" + ], + "license": "LGPL-3.0-or-later", + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-linux-arm": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.34.5.tgz", + "integrity": "sha512-9dLqsvwtg1uuXBGZKsxem9595+ujv0sJ6Vi8wcTANSFpwV/GONat5eCkzQo/1O6zRIkh0m/8+5BjrRr7jDUSZw==", + "cpu": [ + "arm" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-linux-arm": "1.2.4" + } + }, + "node_modules/@img/sharp-linux-arm64": { + "version": "0.34.5", + 
"resolved": "https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.34.5.tgz", + "integrity": "sha512-bKQzaJRY/bkPOXyKx5EVup7qkaojECG6NLYswgktOZjaXecSAeCWiZwwiFf3/Y+O1HrauiE3FVsGxFg8c24rZg==", + "cpu": [ + "arm64" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-linux-arm64": "1.2.4" + } + }, + "node_modules/@img/sharp-linux-ppc64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-ppc64/-/sharp-linux-ppc64-0.34.5.tgz", + "integrity": "sha512-7zznwNaqW6YtsfrGGDA6BRkISKAAE1Jo0QdpNYXNMHu2+0dTrPflTLNkpc8l7MUP5M16ZJcUvysVWWrMefZquA==", + "cpu": [ + "ppc64" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-linux-ppc64": "1.2.4" + } + }, + "node_modules/@img/sharp-linux-riscv64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-riscv64/-/sharp-linux-riscv64-0.34.5.tgz", + "integrity": "sha512-51gJuLPTKa7piYPaVs8GmByo7/U7/7TZOq+cnXJIHZKavIRHAP77e3N2HEl3dgiqdD/w0yUfiJnII77PuDDFdw==", + "cpu": [ + "riscv64" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-linux-riscv64": "1.2.4" + } + }, + "node_modules/@img/sharp-linux-s390x": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-s390x/-/sharp-linux-s390x-0.34.5.tgz", + "integrity": "sha512-nQtCk0PdKfho3eC5MrbQoigJ2gd1CgddUMkabUj+rBevs8tZ2cULOx46E7oyX+04WGfABgIwmMC0VqieTiR4jg==", + "cpu": [ + 
"s390x" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-linux-s390x": "1.2.4" + } + }, + "node_modules/@img/sharp-linux-x64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.34.5.tgz", + "integrity": "sha512-MEzd8HPKxVxVenwAa+JRPwEC7QFjoPWuS5NZnBt6B3pu7EG2Ge0id1oLHZpPJdn3OQK+BQDiw9zStiHBTJQQQQ==", + "cpu": [ + "x64" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-linux-x64": "1.2.4" + } + }, + "node_modules/@img/sharp-linuxmusl-arm64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-arm64/-/sharp-linuxmusl-arm64-0.34.5.tgz", + "integrity": "sha512-fprJR6GtRsMt6Kyfq44IsChVZeGN97gTD331weR1ex1c1rypDEABN6Tm2xa1wE6lYb5DdEnk03NZPqA7Id21yg==", + "cpu": [ + "arm64" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-linuxmusl-arm64": "1.2.4" + } + }, + "node_modules/@img/sharp-linuxmusl-x64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-x64/-/sharp-linuxmusl-x64-0.34.5.tgz", + "integrity": "sha512-Jg8wNT1MUzIvhBFxViqrEhWDGzqymo3sV7z7ZsaWbZNDLXRJZoRGrjulp60YYtV4wfY8VIKcWidjojlLcWrd8Q==", + "cpu": [ + "x64" + ], + "license": "Apache-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + 
"optionalDependencies": { + "@img/sharp-libvips-linuxmusl-x64": "1.2.4" + } + }, + "node_modules/@img/sharp-wasm32": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-wasm32/-/sharp-wasm32-0.34.5.tgz", + "integrity": "sha512-OdWTEiVkY2PHwqkbBI8frFxQQFekHaSSkUIJkwzclWZe64O1X4UlUjqqqLaPbUpMOQk6FBu/HtlGXNblIs0huw==", + "cpu": [ + "wasm32" + ], + "license": "Apache-2.0 AND LGPL-3.0-or-later AND MIT", + "optional": true, + "dependencies": { + "@emnapi/runtime": "^1.7.0" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-win32-arm64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-arm64/-/sharp-win32-arm64-0.34.5.tgz", + "integrity": "sha512-WQ3AgWCWYSb2yt+IG8mnC6Jdk9Whs7O0gxphblsLvdhSpSTtmu69ZG1Gkb6NuvxsNACwiPV6cNSZNzt0KPsw7g==", + "cpu": [ + "arm64" + ], + "license": "Apache-2.0 AND LGPL-3.0-or-later", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-win32-ia32": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-ia32/-/sharp-win32-ia32-0.34.5.tgz", + "integrity": "sha512-FV9m/7NmeCmSHDD5j4+4pNI8Cp3aW+JvLoXcTUo0IqyjSfAZJ8dIUmijx1qaJsIiU+Hosw6xM5KijAWRJCSgNg==", + "cpu": [ + "ia32" + ], + "license": "Apache-2.0 AND LGPL-3.0-or-later", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-win32-x64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.34.5.tgz", + "integrity": "sha512-+29YMsqY2/9eFEiW93eqWnuLcWcufowXewwSNIT6UwZdUUCrM3oFjMWH/Z6/TMmb4hlFenmfAVbpWeup2jryCw==", + "cpu": [ + "x64" + ], + 
"license": "Apache-2.0 AND LGPL-3.0-or-later", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@jackchen_me/open-multi-agent": { + "resolved": "../..", + "link": true + }, + "node_modules/@next/env": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/@next/env/-/env-16.2.3.tgz", + "integrity": "sha512-ZWXyj4uNu4GCWQw9cjRxWlbD+33mcDszIo9iQxFnBX3Wmgq9ulaSJcl6VhuWx5pCWqqD+9W6Wfz7N0lM5lYPMA==", + "license": "MIT" + }, + "node_modules/@next/swc-darwin-arm64": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/@next/swc-darwin-arm64/-/swc-darwin-arm64-16.2.3.tgz", + "integrity": "sha512-u37KDKTKQ+OQLvY+z7SNXixwo4Q2/IAJFDzU1fYe66IbCE51aDSAzkNDkWmLN0yjTUh4BKBd+hb69jYn6qqqSg==", + "cpu": [ + "arm64" + ], + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@next/swc-darwin-x64": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/@next/swc-darwin-x64/-/swc-darwin-x64-16.2.3.tgz", + "integrity": "sha512-gHjL/qy6Q6CG3176FWbAKyKh9IfntKZTB3RY/YOJdDFpHGsUDXVH38U4mMNpHVGXmeYW4wj22dMp1lTfmu/bTQ==", + "cpu": [ + "x64" + ], + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@next/swc-linux-arm64-gnu": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-gnu/-/swc-linux-arm64-gnu-16.2.3.tgz", + "integrity": "sha512-U6vtblPtU/P14Y/b/n9ZY0GOxbbIhTFuaFR7F4/uMBidCi2nSdaOFhA0Go81L61Zd6527+yvuX44T4ksnf8T+Q==", + "cpu": [ + "arm64" + ], + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@next/swc-linux-arm64-musl": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-musl/-/swc-linux-arm64-musl-16.2.3.tgz", + 
"integrity": "sha512-/YV0LgjHUmfhQpn9bVoGc4x4nan64pkhWR5wyEV8yCOfwwrH630KpvRg86olQHTwHIn1z59uh6JwKvHq1h4QEw==", + "cpu": [ + "arm64" + ], + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@next/swc-linux-x64-gnu": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-gnu/-/swc-linux-x64-gnu-16.2.3.tgz", + "integrity": "sha512-/HiWEcp+WMZ7VajuiMEFGZ6cg0+aYZPqCJD3YJEfpVWQsKYSjXQG06vJP6F1rdA03COD9Fef4aODs3YxKx+RDQ==", + "cpu": [ + "x64" + ], + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@next/swc-linux-x64-musl": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-musl/-/swc-linux-x64-musl-16.2.3.tgz", + "integrity": "sha512-Kt44hGJfZSefebhk/7nIdivoDr3Ugp5+oNz9VvF3GUtfxutucUIHfIO0ZYO8QlOPDQloUVQn4NVC/9JvHRk9hw==", + "cpu": [ + "x64" + ], + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@next/swc-win32-arm64-msvc": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/@next/swc-win32-arm64-msvc/-/swc-win32-arm64-msvc-16.2.3.tgz", + "integrity": "sha512-O2NZ9ie3Tq6xj5Z5CSwBT3+aWAMW2PIZ4egUi9MaWLkwaehgtB7YZjPm+UpcNpKOme0IQuqDcor7BsW6QBiQBw==", + "cpu": [ + "arm64" + ], + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@next/swc-win32-x64-msvc": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/@next/swc-win32-x64-msvc/-/swc-win32-x64-msvc-16.2.3.tgz", + "integrity": "sha512-Ibm29/GgB/ab5n7XKqlStkm54qqZE8v2FnijUPBgrd67FWrac45o/RsNlaOWjme/B5UqeWt/8KM4aWBwA1D2Kw==", + "cpu": [ + "x64" + ], + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@opentelemetry/api": { + "version": "1.9.0", + "resolved": 
"https://registry.npmjs.org/@opentelemetry/api/-/api-1.9.0.tgz", + "integrity": "sha512-3giAOQvZiH5F9bMlMiv8+GSPMeqg0dbaeo58/0SlA9sxSqZhnUtxzX9/2FzyhS9sWQf5S0GJE0AKBrFqjpeYcg==", + "license": "Apache-2.0", + "engines": { + "node": ">=8.0.0" + } + }, + "node_modules/@standard-schema/spec": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@standard-schema/spec/-/spec-1.1.0.tgz", + "integrity": "sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w==", + "license": "MIT" + }, + "node_modules/@swc/helpers": { + "version": "0.5.15", + "resolved": "https://registry.npmjs.org/@swc/helpers/-/helpers-0.5.15.tgz", + "integrity": "sha512-JQ5TuMi45Owi4/BIMAJBoSQoOJu12oOk/gADqlcUL9JEdHB8vyjUSsxqeNXnmXHjYKMi2WcYtezGEEhqUI/E2g==", + "license": "Apache-2.0", + "dependencies": { + "tslib": "^2.8.0" + } + }, + "node_modules/@types/node": { + "version": "22.19.17", + "resolved": "https://registry.npmjs.org/@types/node/-/node-22.19.17.tgz", + "integrity": "sha512-wGdMcf+vPYM6jikpS/qhg6WiqSV/OhG+jeeHT/KlVqxYfD40iYJf9/AE1uQxVWFvU7MipKRkRv8NSHiCGgPr8Q==", + "dev": true, + "license": "MIT", + "dependencies": { + "undici-types": "~6.21.0" + } + }, + "node_modules/@types/react": { + "version": "19.2.14", + "resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.14.tgz", + "integrity": "sha512-ilcTH/UniCkMdtexkoCN0bI7pMcJDvmQFPvuPvmEaYA/NSfFTAgdUSLAoVjaRJm7+6PvcM+q1zYOwS4wTYMF9w==", + "dev": true, + "license": "MIT", + "dependencies": { + "csstype": "^3.2.2" + } + }, + "node_modules/@types/react-dom": { + "version": "19.2.3", + "resolved": "https://registry.npmjs.org/@types/react-dom/-/react-dom-19.2.3.tgz", + "integrity": "sha512-jp2L/eY6fn+KgVVQAOqYItbF0VY/YApe5Mz2F0aykSO8gx31bYCZyvSeYxCHKvzHG5eZjc+zyaS5BrBWya2+kQ==", + "dev": true, + "license": "MIT", + "peerDependencies": { + "@types/react": "^19.2.0" + } + }, + "node_modules/@vercel/oidc": { + "version": "3.1.0", + "resolved": 
"https://registry.npmjs.org/@vercel/oidc/-/oidc-3.1.0.tgz", + "integrity": "sha512-Fw28YZpRnA3cAHHDlkt7xQHiJ0fcL+NRcIqsocZQUSmbzeIKRpwttJjik5ZGanXP+vlA4SbTg+AbA3bP363l+w==", + "license": "Apache-2.0", + "engines": { + "node": ">= 20" + } + }, + "node_modules/ai": { + "version": "6.0.161", + "resolved": "https://registry.npmjs.org/ai/-/ai-6.0.161.tgz", + "integrity": "sha512-ufhmijmx2YyWTPAicGgtpLOB/xD7mG8zKs1pT1Trj+JL/3r1rS8fkMi/cHZoChSAQSGB4pgmcWVxDrVTUvK2IQ==", + "license": "Apache-2.0", + "dependencies": { + "@ai-sdk/gateway": "3.0.98", + "@ai-sdk/provider": "3.0.8", + "@ai-sdk/provider-utils": "4.0.23", + "@opentelemetry/api": "1.9.0" + }, + "engines": { + "node": ">=18" + }, + "peerDependencies": { + "zod": "^3.25.76 || ^4.1.8" + } + }, + "node_modules/baseline-browser-mapping": { + "version": "2.10.19", + "resolved": "https://registry.npmjs.org/baseline-browser-mapping/-/baseline-browser-mapping-2.10.19.tgz", + "integrity": "sha512-qCkNLi2sfBOn8XhZQ0FXsT1Ki/Yo5P90hrkRamVFRS7/KV9hpfA4HkoWNU152+8w0zPjnxo5psx5NL3PSGgv5g==", + "license": "Apache-2.0", + "bin": { + "baseline-browser-mapping": "dist/cli.cjs" + }, + "engines": { + "node": ">=6.0.0" + } + }, + "node_modules/caniuse-lite": { + "version": "1.0.30001788", + "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001788.tgz", + "integrity": "sha512-6q8HFp+lOQtcf7wBK+uEenxymVWkGKkjFpCvw5W25cmMwEDU45p1xQFBQv8JDlMMry7eNxyBaR+qxgmTUZkIRQ==", + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/browserslist" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/caniuse-lite" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "CC-BY-4.0" + }, + "node_modules/client-only": { + "version": "0.0.1", + "resolved": "https://registry.npmjs.org/client-only/-/client-only-0.0.1.tgz", + "integrity": "sha512-IV3Ou0jSMzZrd3pZ48nLkT9DA7Ag1pnPzaiQhpW7c3RbcqqzvzzVu+L8gfqMp/8IM2MQtSiqaCxrrcfu8I8rMA==", + 
"license": "MIT" + }, + "node_modules/csstype": { + "version": "3.2.3", + "resolved": "https://registry.npmjs.org/csstype/-/csstype-3.2.3.tgz", + "integrity": "sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ==", + "dev": true, + "license": "MIT" + }, + "node_modules/dequal": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/dequal/-/dequal-2.0.3.tgz", + "integrity": "sha512-0je+qPKHEMohvfRTCEo3CrPG6cAzAYgmzKyxRiYSSDkS6eGJdyVJm7WaYA5ECaAD9wLB2T4EEeymA5aFVcYXCA==", + "license": "MIT", + "engines": { + "node": ">=6" + } + }, + "node_modules/detect-libc": { + "version": "2.1.2", + "resolved": "https://registry.npmjs.org/detect-libc/-/detect-libc-2.1.2.tgz", + "integrity": "sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ==", + "license": "Apache-2.0", + "optional": true, + "engines": { + "node": ">=8" + } + }, + "node_modules/eventsource-parser": { + "version": "3.0.6", + "resolved": "https://registry.npmjs.org/eventsource-parser/-/eventsource-parser-3.0.6.tgz", + "integrity": "sha512-Vo1ab+QXPzZ4tCa8SwIHJFaSzy4R6SHf7BY79rFBDf0idraZWAkYrDjDj8uWaSm3S2TK+hJ7/t1CEmZ7jXw+pg==", + "license": "MIT", + "engines": { + "node": ">=18.0.0" + } + }, + "node_modules/json-schema": { + "version": "0.4.0", + "resolved": "https://registry.npmjs.org/json-schema/-/json-schema-0.4.0.tgz", + "integrity": "sha512-es94M3nTIfsEPisRafak+HDLfHXnKBhV3vU5eqPcS3flIWqcxJWgXHXiey3YrpaNsanY5ei1VoYEbOzijuq9BA==", + "license": "(AFL-2.1 OR BSD-3-Clause)" + }, + "node_modules/nanoid": { + "version": "3.3.11", + "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz", + "integrity": "sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==", + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "bin": { + "nanoid": "bin/nanoid.cjs" + }, + "engines": { + "node": "^10 || ^12 || ^13.7 || ^14 || 
>=15.0.1" + } + }, + "node_modules/next": { + "version": "16.2.3", + "resolved": "https://registry.npmjs.org/next/-/next-16.2.3.tgz", + "integrity": "sha512-9V3zV4oZFza3PVev5/poB9g0dEafVcgNyQ8eTRop8GvxZjV2G15FC5ARuG1eFD42QgeYkzJBJzHghNP8Ad9xtA==", + "license": "MIT", + "dependencies": { + "@next/env": "16.2.3", + "@swc/helpers": "0.5.15", + "baseline-browser-mapping": "^2.9.19", + "caniuse-lite": "^1.0.30001579", + "postcss": "8.4.31", + "styled-jsx": "5.1.6" + }, + "bin": { + "next": "dist/bin/next" + }, + "engines": { + "node": ">=20.9.0" + }, + "optionalDependencies": { + "@next/swc-darwin-arm64": "16.2.3", + "@next/swc-darwin-x64": "16.2.3", + "@next/swc-linux-arm64-gnu": "16.2.3", + "@next/swc-linux-arm64-musl": "16.2.3", + "@next/swc-linux-x64-gnu": "16.2.3", + "@next/swc-linux-x64-musl": "16.2.3", + "@next/swc-win32-arm64-msvc": "16.2.3", + "@next/swc-win32-x64-msvc": "16.2.3", + "sharp": "^0.34.5" + }, + "peerDependencies": { + "@opentelemetry/api": "^1.1.0", + "@playwright/test": "^1.51.1", + "babel-plugin-react-compiler": "*", + "react": "^18.2.0 || 19.0.0-rc-de68d2f4-20241204 || ^19.0.0", + "react-dom": "^18.2.0 || 19.0.0-rc-de68d2f4-20241204 || ^19.0.0", + "sass": "^1.3.0" + }, + "peerDependenciesMeta": { + "@opentelemetry/api": { + "optional": true + }, + "@playwright/test": { + "optional": true + }, + "babel-plugin-react-compiler": { + "optional": true + }, + "sass": { + "optional": true + } + } + }, + "node_modules/picocolors": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz", + "integrity": "sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==", + "license": "ISC" + }, + "node_modules/postcss": { + "version": "8.4.31", + "resolved": "https://registry.npmjs.org/postcss/-/postcss-8.4.31.tgz", + "integrity": "sha512-PS08Iboia9mts/2ygV3eLpY5ghnUcfLV/EXTOW1E2qYxJKGGBUtNjN76FYHnMs36RmARn41bC0AZmn+rR0OVpQ==", + "funding": [ + { + "type": "opencollective", + 
"url": "https://opencollective.com/postcss/" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/postcss" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "nanoid": "^3.3.6", + "picocolors": "^1.0.0", + "source-map-js": "^1.0.2" + }, + "engines": { + "node": "^10 || ^12 || >=14" + } + }, + "node_modules/react": { + "version": "19.2.5", + "resolved": "https://registry.npmjs.org/react/-/react-19.2.5.tgz", + "integrity": "sha512-llUJLzz1zTUBrskt2pwZgLq59AemifIftw4aB7JxOqf1HY2FDaGDxgwpAPVzHU1kdWabH7FauP4i1oEeer2WCA==", + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/react-dom": { + "version": "19.2.5", + "resolved": "https://registry.npmjs.org/react-dom/-/react-dom-19.2.5.tgz", + "integrity": "sha512-J5bAZz+DXMMwW/wV3xzKke59Af6CHY7G4uYLN1OvBcKEsWOs4pQExj86BBKamxl/Ik5bx9whOrvBlSDfWzgSag==", + "license": "MIT", + "dependencies": { + "scheduler": "^0.27.0" + }, + "peerDependencies": { + "react": "^19.2.5" + } + }, + "node_modules/scheduler": { + "version": "0.27.0", + "resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.27.0.tgz", + "integrity": "sha512-eNv+WrVbKu1f3vbYJT/xtiF5syA5HPIMtf9IgY/nKg0sWqzAUEvqY/xm7OcZc/qafLx/iO9FgOmeSAp4v5ti/Q==", + "license": "MIT" + }, + "node_modules/semver": { + "version": "7.7.4", + "resolved": "https://registry.npmjs.org/semver/-/semver-7.7.4.tgz", + "integrity": "sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA==", + "license": "ISC", + "optional": true, + "bin": { + "semver": "bin/semver.js" + }, + "engines": { + "node": ">=10" + } + }, + "node_modules/sharp": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/sharp/-/sharp-0.34.5.tgz", + "integrity": "sha512-Ou9I5Ft9WNcCbXrU9cMgPBcCK8LiwLqcbywW3t4oDV37n1pzpuNLsYiAV8eODnjbtQlSDwZ2cUEeQz4E54Hltg==", + "hasInstallScript": true, + "license": "Apache-2.0", + "optional": true, + 
"dependencies": { + "@img/colour": "^1.0.0", + "detect-libc": "^2.1.2", + "semver": "^7.7.3" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-darwin-arm64": "0.34.5", + "@img/sharp-darwin-x64": "0.34.5", + "@img/sharp-libvips-darwin-arm64": "1.2.4", + "@img/sharp-libvips-darwin-x64": "1.2.4", + "@img/sharp-libvips-linux-arm": "1.2.4", + "@img/sharp-libvips-linux-arm64": "1.2.4", + "@img/sharp-libvips-linux-ppc64": "1.2.4", + "@img/sharp-libvips-linux-riscv64": "1.2.4", + "@img/sharp-libvips-linux-s390x": "1.2.4", + "@img/sharp-libvips-linux-x64": "1.2.4", + "@img/sharp-libvips-linuxmusl-arm64": "1.2.4", + "@img/sharp-libvips-linuxmusl-x64": "1.2.4", + "@img/sharp-linux-arm": "0.34.5", + "@img/sharp-linux-arm64": "0.34.5", + "@img/sharp-linux-ppc64": "0.34.5", + "@img/sharp-linux-riscv64": "0.34.5", + "@img/sharp-linux-s390x": "0.34.5", + "@img/sharp-linux-x64": "0.34.5", + "@img/sharp-linuxmusl-arm64": "0.34.5", + "@img/sharp-linuxmusl-x64": "0.34.5", + "@img/sharp-wasm32": "0.34.5", + "@img/sharp-win32-arm64": "0.34.5", + "@img/sharp-win32-ia32": "0.34.5", + "@img/sharp-win32-x64": "0.34.5" + } + }, + "node_modules/source-map-js": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz", + "integrity": "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==", + "license": "BSD-3-Clause", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/styled-jsx": { + "version": "5.1.6", + "resolved": "https://registry.npmjs.org/styled-jsx/-/styled-jsx-5.1.6.tgz", + "integrity": "sha512-qSVyDTeMotdvQYoHWLNGwRFJHC+i+ZvdBRYosOFgC+Wg1vx4frN2/RG/NA7SYqqvKNLf39P2LSRA2pu6n0XYZA==", + "license": "MIT", + "dependencies": { + "client-only": "0.0.1" + }, + "engines": { + "node": ">= 12.0.0" + }, + "peerDependencies": { + "react": ">= 16.8.0 || 17.x.x || ^18.0.0-0 
|| ^19.0.0-0" + }, + "peerDependenciesMeta": { + "@babel/core": { + "optional": true + }, + "babel-plugin-macros": { + "optional": true + } + } + }, + "node_modules/swr": { + "version": "2.4.1", + "resolved": "https://registry.npmjs.org/swr/-/swr-2.4.1.tgz", + "integrity": "sha512-2CC6CiKQtEwaEeNiqWTAw9PGykW8SR5zZX8MZk6TeAvEAnVS7Visz8WzphqgtQ8v2xz/4Q5K+j+SeMaKXeeQIA==", + "license": "MIT", + "dependencies": { + "dequal": "^2.0.3", + "use-sync-external-store": "^1.6.0" + }, + "peerDependencies": { + "react": "^16.11.0 || ^17.0.0 || ^18.0.0 || ^19.0.0" + } + }, + "node_modules/throttleit": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/throttleit/-/throttleit-2.1.0.tgz", + "integrity": "sha512-nt6AMGKW1p/70DF/hGBdJB57B8Tspmbp5gfJ8ilhLnt7kkr2ye7hzD6NVG8GGErk2HWF34igrL2CXmNIkzKqKw==", + "license": "MIT", + "engines": { + "node": ">=18" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/tslib": { + "version": "2.8.1", + "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz", + "integrity": "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w==", + "license": "0BSD" + }, + "node_modules/typescript": { + "version": "5.9.3", + "resolved": "https://registry.npmjs.org/typescript/-/typescript-5.9.3.tgz", + "integrity": "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==", + "dev": true, + "license": "Apache-2.0", + "bin": { + "tsc": "bin/tsc", + "tsserver": "bin/tsserver" + }, + "engines": { + "node": ">=14.17" + } + }, + "node_modules/undici-types": { + "version": "6.21.0", + "resolved": "https://registry.npmjs.org/undici-types/-/undici-types-6.21.0.tgz", + "integrity": "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ==", + "dev": true, + "license": "MIT" + }, + "node_modules/use-sync-external-store": { + "version": "1.6.0", + "resolved": 
"https://registry.npmjs.org/use-sync-external-store/-/use-sync-external-store-1.6.0.tgz", + "integrity": "sha512-Pp6GSwGP/NrPIrxVFAIkOQeyw8lFenOHijQWkUTrDvrF4ALqylP2C/KCkeS9dpUM3KvYRQhna5vt7IL95+ZQ9w==", + "license": "MIT", + "peerDependencies": { + "react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0" + } + }, + "node_modules/zod": { + "version": "4.3.6", + "resolved": "https://registry.npmjs.org/zod/-/zod-4.3.6.tgz", + "integrity": "sha512-rftlrkhHZOcjDwkGlnUtZZkvaPHCsDATp4pGpuOOMDaTdDDXF91wuVDJoWoPsKX/3YPQ5fHuF3STjcYyKr+Qhg==", + "license": "MIT", + "peer": true, + "funding": { + "url": "https://github.com/sponsors/colinhacks" + } + } + } +} diff --git a/examples/with-vercel-ai-sdk/package.json b/examples/with-vercel-ai-sdk/package.json new file mode 100644 index 0000000..90180cd --- /dev/null +++ b/examples/with-vercel-ai-sdk/package.json @@ -0,0 +1,25 @@ +{ + "name": "with-vercel-ai-sdk", + "private": true, + "scripts": { + "predev": "cd ../.. && npm run build", + "dev": "next dev", + "build": "next build", + "start": "next start" + }, + "dependencies": { + "@ai-sdk/openai-compatible": "^2.0.41", + "@ai-sdk/react": "^3.0.0", + "@jackchen_me/open-multi-agent": "file:../../", + "ai": "^6.0.0", + "next": "^16.0.0", + "react": "^19.0.0", + "react-dom": "^19.0.0" + }, + "devDependencies": { + "@types/node": "^22.0.0", + "@types/react": "^19.0.0", + "@types/react-dom": "^19.0.0", + "typescript": "^5.6.0" + } +} diff --git a/examples/with-vercel-ai-sdk/tsconfig.json b/examples/with-vercel-ai-sdk/tsconfig.json new file mode 100644 index 0000000..4a0480c --- /dev/null +++ b/examples/with-vercel-ai-sdk/tsconfig.json @@ -0,0 +1,41 @@ +{ + "compilerOptions": { + "target": "ES2022", + "lib": [ + "dom", + "dom.iterable", + "ES2022" + ], + "allowJs": true, + "skipLibCheck": true, + "strict": true, + "noEmit": true, + "esModuleInterop": true, + "module": "ESNext", + "moduleResolution": "bundler", + "resolveJsonModule": true, + "isolatedModules": true, + "jsx": "react-jsx", + 
"incremental": true, + "plugins": [ + { + "name": "next" + } + ], + "paths": { + "@/*": [ + "./*" + ] + } + }, + "include": [ + "next-env.d.ts", + "**/*.ts", + "**/*.tsx", + ".next/types/**/*.ts", + ".next/dev/types/**/*.ts" + ], + "exclude": [ + "node_modules" + ] +} diff --git a/src/agent/runner.ts b/src/agent/runner.ts index d1a1ebb..df1cbc0 100644 --- a/src/agent/runner.ts +++ b/src/agent/runner.ts @@ -448,8 +448,10 @@ export class AgentRunner { } // 3. Apply denylist filter if set - if (this.options.disallowedTools) { - const denied = new Set(this.options.disallowedTools) + const denied = this.options.disallowedTools + ? new Set(this.options.disallowedTools) + : undefined + if (denied) { filteredTools = filteredTools.filter(t => !denied.has(t.name)) } @@ -457,8 +459,11 @@ export class AgentRunner { const frameworkDenied = new Set(AGENT_FRAMEWORK_DISALLOWED) filteredTools = filteredTools.filter(t => !frameworkDenied.has(t.name)) - // Runtime-added custom tools stay available regardless of filtering rules. - return [...filteredTools, ...runtimeCustomTools] + // Runtime-added custom tools bypass preset / allowlist but respect denylist. + const finalRuntime = denied + ? runtimeCustomTools.filter(t => !denied.has(t.name)) + : runtimeCustomTools + return [...filteredTools, ...finalRuntime] } // ------------------------------------------------------------------------- diff --git a/src/cli/oma.ts b/src/cli/oma.ts index d56304a..859d9d5 100644 --- a/src/cli/oma.ts +++ b/src/cli/oma.ts @@ -50,6 +50,8 @@ const PROVIDER_REFERENCE: ReadonlyArray<{ { id: 'openai', apiKeyEnv: ['OPENAI_API_KEY'], baseUrlSupported: true, notes: 'Set baseURL for Ollama / vLLM / LM Studio; apiKey may be a placeholder.' 
}, { id: 'gemini', apiKeyEnv: ['GEMINI_API_KEY', 'GOOGLE_API_KEY'], baseUrlSupported: false }, { id: 'grok', apiKeyEnv: ['XAI_API_KEY'], baseUrlSupported: true }, + { id: 'minimax', apiKeyEnv: ['MINIMAX_API_KEY'], baseUrlSupported: true, notes: 'Global endpoint: https://api.minimax.io/v1 (default). China endpoint: https://api.minimaxi.com/v1. Set MINIMAX_BASE_URL to choose, or pass baseURL in agent config.' }, + { id: 'deepseek', apiKeyEnv: ['DEEPSEEK_API_KEY'], baseUrlSupported: true, notes: 'OpenAI-compatible endpoint at https://api.deepseek.com/v1. Models: deepseek-chat (V3), deepseek-reasoner (thinking).' }, { id: 'copilot', apiKeyEnv: ['GITHUB_COPILOT_TOKEN', 'GITHUB_TOKEN'], @@ -259,6 +261,8 @@ const DEFAULT_MODEL_HINT: Record<SupportedProvider, string> = { gemini: 'gemini-2.0-flash', grok: 'grok-2-latest', copilot: 'gpt-4o', + minimax: 'MiniMax-M2.7', + deepseek: 'deepseek-chat', } async function cmdProvider(sub: string | undefined, arg: string | undefined, pretty: boolean): Promise<void> { diff --git a/src/llm/adapter.ts b/src/llm/adapter.ts index dc4fe82..75426ef 100644 --- a/src/llm/adapter.ts +++ b/src/llm/adapter.ts @@ -38,7 +38,7 @@ import type { LLMAdapter } from '../types.js' * Additional providers can be integrated by implementing {@link LLMAdapter} * directly and bypassing this factory. */ -export type SupportedProvider = 'anthropic' | 'copilot' | 'grok' | 'openai' | 'gemini' +export type SupportedProvider = 'anthropic' | 'copilot' | 'deepseek' | 'grok' | 'minimax' | 'openai' | 'gemini' /** * Instantiate the appropriate {@link LLMAdapter} for the given provider.
@@ -49,6 +49,8 @@ export type SupportedProvider = 'anthropic' | 'copilot' | 'grok' | 'openai' | 'g * - `openai` → `OPENAI_API_KEY` * - `gemini` → `GEMINI_API_KEY` / `GOOGLE_API_KEY` * - `grok` → `XAI_API_KEY` + * - `minimax` → `MINIMAX_API_KEY` + * - `deepseek` → `DEEPSEEK_API_KEY` * - `copilot` → `GITHUB_COPILOT_TOKEN` / `GITHUB_TOKEN`, or interactive * OAuth2 device flow if neither is set * @@ -89,6 +91,14 @@ export async function createAdapter( const { GrokAdapter } = await import('./grok.js') return new GrokAdapter(apiKey, baseURL) } + case 'minimax': { + const { MiniMaxAdapter } = await import('./minimax.js') + return new MiniMaxAdapter(apiKey, baseURL) + } + case 'deepseek': { + const { DeepSeekAdapter } = await import('./deepseek.js') + return new DeepSeekAdapter(apiKey, baseURL) + } default: { // The `never` cast here makes TypeScript enforce exhaustiveness. const _exhaustive: never = provider diff --git a/src/llm/deepseek.ts b/src/llm/deepseek.ts new file mode 100644 index 0000000..cb52a24 --- /dev/null +++ b/src/llm/deepseek.ts @@ -0,0 +1,29 @@ +/** + * @fileoverview DeepSeek adapter. + * + * Thin wrapper around OpenAIAdapter that hard-codes the official DeepSeek + * OpenAI-compatible endpoint and DEEPSEEK_API_KEY environment variable fallback. + */ + +import { OpenAIAdapter } from './openai.js' + +/** + * LLM adapter for DeepSeek models (deepseek-chat, deepseek-reasoner, and future models). + * + * Thread-safe. Can be shared across agents. + * + * Usage: + * provider: 'deepseek' + * model: 'deepseek-chat' (or 'deepseek-reasoner' for the thinking model) + */ +export class DeepSeekAdapter extends OpenAIAdapter { + readonly name = 'deepseek' + + constructor(apiKey?: string, baseURL?: string) { + // Allow override of baseURL (for proxies or future changes) but default to official DeepSeek endpoint. + super( + apiKey ?? process.env['DEEPSEEK_API_KEY'], + baseURL ?? 
'https://api.deepseek.com/v1' + ) + } +} diff --git a/src/llm/minimax.ts b/src/llm/minimax.ts new file mode 100644 index 0000000..f912c1c --- /dev/null +++ b/src/llm/minimax.ts @@ -0,0 +1,29 @@ +/** + * @fileoverview MiniMax adapter. + * + * Thin wrapper around OpenAIAdapter that hard-codes the official MiniMax + * OpenAI-compatible endpoint and MINIMAX_API_KEY environment variable fallback. + */ + +import { OpenAIAdapter } from './openai.js' + +/** + * LLM adapter for MiniMax models (MiniMax-M2.7 series and future models). + * + * Thread-safe. Can be shared across agents. + * + * Usage: + * provider: 'minimax' + * model: 'MiniMax-M2.7' (or any current MiniMax model name) + */ +export class MiniMaxAdapter extends OpenAIAdapter { + readonly name = 'minimax' + + constructor(apiKey?: string, baseURL?: string) { + // Allow override of baseURL (for proxies or future changes) but default to official MiniMax endpoint. + super( + apiKey ?? process.env['MINIMAX_API_KEY'], + baseURL ?? process.env['MINIMAX_BASE_URL'] ?? 'https://api.minimax.io/v1' + ) + } +} diff --git a/src/orchestrator/orchestrator.ts b/src/orchestrator/orchestrator.ts index 71d9f29..12bcf2e 100644 --- a/src/orchestrator/orchestrator.ts +++ b/src/orchestrator/orchestrator.ts @@ -212,6 +212,11 @@ function resolveTokenBudget(primary?: number, fallback?: number): number | undef function buildAgent(config: AgentConfig): Agent { const registry = new ToolRegistry() registerBuiltInTools(registry) + if (config.customTools) { + for (const tool of config.customTools) { + registry.register(tool, { runtimeAdded: true }) + } + } const executor = new ToolExecutor(registry, { ...(config.maxToolOutputChars !== undefined ? { maxToolOutputChars: config.maxToolOutputChars } diff --git a/src/types.ts b/src/types.ts index 1ef7e92..2ebed59 100644 --- a/src/types.ts +++ b/src/types.ts @@ -229,6 +229,16 @@ export interface AgentConfig { /** API key override; falls back to the provider's standard env var. 
*/ readonly apiKey?: string readonly systemPrompt?: string + /** + * Custom tool definitions to register alongside built-in tools. + * Created via `defineTool()`. Custom tools bypass `tools` (allowlist) + * and `toolPreset` filtering, but can still be blocked by `disallowedTools`. + * + * Tool names must not collide with built-in tool names; a duplicate name + * will throw at registration time. + */ + // eslint-disable-next-line @typescript-eslint/no-explicit-any + readonly customTools?: readonly ToolDefinition<any>[] /** Names of tools (from the tool registry) available to this agent. */ readonly tools?: readonly string[] /** Names of tools explicitly disallowed for this agent. */ diff --git a/tests/deepseek-adapter.test.ts b/tests/deepseek-adapter.test.ts new file mode 100644 index 0000000..2d68427 --- /dev/null +++ b/tests/deepseek-adapter.test.ts @@ -0,0 +1,74 @@ +import { describe, it, expect, vi, beforeEach } from 'vitest' + +// --------------------------------------------------------------------------- +// Mock OpenAI constructor (must be hoisted for Vitest) +// --------------------------------------------------------------------------- +const OpenAIMock = vi.hoisted(() => vi.fn()) + +vi.mock('openai', () => ({ + default: OpenAIMock, +})) + +import { DeepSeekAdapter } from '../src/llm/deepseek.js' + import { createAdapter } from '../src/llm/adapter.js' + +// --------------------------------------------------------------------------- +// DeepSeekAdapter tests +// --------------------------------------------------------------------------- + +describe('DeepSeekAdapter', () => { + beforeEach(() => { + OpenAIMock.mockClear() + }) + + it('has name "deepseek"', () => { + const adapter = new DeepSeekAdapter() + expect(adapter.name).toBe('deepseek') + }) + + it('uses DEEPSEEK_API_KEY by default', () => { + const original = process.env['DEEPSEEK_API_KEY'] + process.env['DEEPSEEK_API_KEY'] = 'deepseek-test-key-123' + + try { + new DeepSeekAdapter() +
expect(OpenAIMock).toHaveBeenCalledWith( + expect.objectContaining({ + apiKey: 'deepseek-test-key-123', + baseURL: 'https://api.deepseek.com/v1', + }) + ) + } finally { + if (original === undefined) { + delete process.env['DEEPSEEK_API_KEY'] + } else { + process.env['DEEPSEEK_API_KEY'] = original + } + } + }) + + it('uses official DeepSeek baseURL by default', () => { + new DeepSeekAdapter('some-key') + expect(OpenAIMock).toHaveBeenCalledWith( + expect.objectContaining({ + apiKey: 'some-key', + baseURL: 'https://api.deepseek.com/v1', + }) + ) + }) + + it('allows overriding apiKey and baseURL', () => { + new DeepSeekAdapter('custom-key', 'https://custom.endpoint/v1') + expect(OpenAIMock).toHaveBeenCalledWith( + expect.objectContaining({ + apiKey: 'custom-key', + baseURL: 'https://custom.endpoint/v1', + }) + ) + }) + + it('createAdapter("deepseek") returns DeepSeekAdapter instance', async () => { + const adapter = await createAdapter('deepseek') + expect(adapter).toBeInstanceOf(DeepSeekAdapter) + }) +}) diff --git a/tests/minimax-adapter.test.ts b/tests/minimax-adapter.test.ts new file mode 100644 index 0000000..3773e1f --- /dev/null +++ b/tests/minimax-adapter.test.ts @@ -0,0 +1,95 @@ +import { describe, it, expect, vi, beforeEach } from 'vitest' + +// --------------------------------------------------------------------------- +// Mock OpenAI constructor (must be hoisted for Vitest) +// --------------------------------------------------------------------------- +const OpenAIMock = vi.hoisted(() => vi.fn()) + +vi.mock('openai', () => ({ + default: OpenAIMock, +})) + +import { MiniMaxAdapter } from '../src/llm/minimax.js' +import { createAdapter } from '../src/llm/adapter.js' + +// --------------------------------------------------------------------------- +// MiniMaxAdapter tests +// --------------------------------------------------------------------------- + +describe('MiniMaxAdapter', () => { + beforeEach(() => { + OpenAIMock.mockClear() + }) + + it('has name 
"minimax"', () => { + const adapter = new MiniMaxAdapter() + expect(adapter.name).toBe('minimax') + }) + + it('uses MINIMAX_API_KEY by default', () => { + const original = process.env['MINIMAX_API_KEY'] + process.env['MINIMAX_API_KEY'] = 'minimax-test-key-123' + + try { + new MiniMaxAdapter() + expect(OpenAIMock).toHaveBeenCalledWith( + expect.objectContaining({ + apiKey: 'minimax-test-key-123', + baseURL: 'https://api.minimax.io/v1', + }) + ) + } finally { + if (original === undefined) { + delete process.env['MINIMAX_API_KEY'] + } else { + process.env['MINIMAX_API_KEY'] = original + } + } + }) + + it('uses official MiniMax global baseURL by default', () => { + new MiniMaxAdapter('some-key') + expect(OpenAIMock).toHaveBeenCalledWith( + expect.objectContaining({ + apiKey: 'some-key', + baseURL: 'https://api.minimax.io/v1', + }) + ) + }) + + it('uses MINIMAX_BASE_URL env var when set', () => { + const original = process.env['MINIMAX_BASE_URL'] + process.env['MINIMAX_BASE_URL'] = 'https://api.minimaxi.com/v1' + + try { + new MiniMaxAdapter('some-key') + expect(OpenAIMock).toHaveBeenCalledWith( + expect.objectContaining({ + apiKey: 'some-key', + baseURL: 'https://api.minimaxi.com/v1', + }) + ) + } finally { + if (original === undefined) { + delete process.env['MINIMAX_BASE_URL'] + } else { + process.env['MINIMAX_BASE_URL'] = original + } + } + }) + + it('allows overriding apiKey and baseURL', () => { + new MiniMaxAdapter('custom-key', 'https://custom.endpoint/v1') + expect(OpenAIMock).toHaveBeenCalledWith( + expect.objectContaining({ + apiKey: 'custom-key', + baseURL: 'https://custom.endpoint/v1', + }) + ) + }) + + it('createAdapter("minimax") returns MiniMaxAdapter instance', async () => { + const adapter = await createAdapter('minimax') + expect(adapter).toBeInstanceOf(MiniMaxAdapter) + }) +}) diff --git a/tests/orchestrator.test.ts b/tests/orchestrator.test.ts index 5ef0052..0210647 100644 --- a/tests/orchestrator.test.ts +++ b/tests/orchestrator.test.ts @@ -155,6 
+155,80 @@ describe('OpenMultiAgent', () => { expect(oma.getStatus().completedTasks).toBe(1) }) + it('registers customTools so they are available to the LLM', async () => { + mockAdapterResponses = ['used custom tool'] + + const { z } = await import('zod') + const { defineTool } = await import('../src/tool/framework.js') + + const myTool = defineTool({ + name: 'my_custom_tool', + description: 'A custom tool for testing', + inputSchema: z.object({ query: z.string() }), + execute: async ({ query }) => ({ data: query }), + }) + + const oma = new OpenMultiAgent({ defaultModel: 'mock-model' }) + await oma.runAgent( + { ...agentConfig('solo'), customTools: [myTool] }, + 'Use the custom tool', + ) + + const toolNames = capturedChatOptions[0]?.tools?.map(t => t.name) ?? [] + expect(toolNames).toContain('my_custom_tool') + }) + + it('customTools bypass tools allowlist and toolPreset filtering', async () => { + mockAdapterResponses = ['done'] + + const { z } = await import('zod') + const { defineTool } = await import('../src/tool/framework.js') + + const myTool = defineTool({ + name: 'my_custom_tool', + description: 'A custom tool for testing', + inputSchema: z.object({ query: z.string() }), + execute: async ({ query }) => ({ data: query }), + }) + + const oma = new OpenMultiAgent({ defaultModel: 'mock-model' }) + + // toolPreset 'readonly' only allows file_read, grep, glob — custom tool should still appear + await oma.runAgent( + { ...agentConfig('solo'), customTools: [myTool], toolPreset: 'readonly' }, + 'test', + ) + + const toolNames = capturedChatOptions[0]?.tools?.map(t => t.name) ?? 
[] + expect(toolNames).toContain('my_custom_tool') + // built-in tools outside the preset should be filtered + expect(toolNames).not.toContain('bash') + }) + + it('customTools can be blocked by disallowedTools', async () => { + mockAdapterResponses = ['done'] + + const { z } = await import('zod') + const { defineTool } = await import('../src/tool/framework.js') + + const myTool = defineTool({ + name: 'my_custom_tool', + description: 'A custom tool for testing', + inputSchema: z.object({ query: z.string() }), + execute: async ({ query }) => ({ data: query }), + }) + + const oma = new OpenMultiAgent({ defaultModel: 'mock-model' }) + + await oma.runAgent( + { ...agentConfig('solo'), customTools: [myTool], disallowedTools: ['my_custom_tool'] }, + 'test', + ) + + const toolNames = capturedChatOptions[0]?.tools?.map(t => t.name) ?? [] + expect(toolNames).not.toContain('my_custom_tool') + }) + it('fires onProgress events', async () => { mockAdapterResponses = ['done'] diff --git a/tests/tool-filtering.test.ts b/tests/tool-filtering.test.ts index f42b8e1..418e702 100644 --- a/tests/tool-filtering.test.ts +++ b/tests/tool-filtering.test.ts @@ -216,8 +216,8 @@ describe('Tool filtering', () => { const tools = (runner as any).resolveTools() as LLMToolDef[] const toolNames = tools.map((t: LLMToolDef) => t.name).sort() + // custom_tool is runtime-added but disallowedTools still blocks it expect(toolNames).toEqual([ - 'custom_tool', 'file_edit', 'file_read', 'file_write', @@ -286,7 +286,7 @@ describe('Tool filtering', () => { expect(toolNames).toEqual(['custom_tool']) }) - it('runtime-added tools bypass filtering regardless of tool name', () => { + it('runtime-added tools are blocked by disallowedTools', () => { const runtimeBuiltinNamedRegistry = new ToolRegistry() runtimeBuiltinNamedRegistry.register(defineTool({ name: 'file_read', @@ -306,7 +306,7 @@ describe('Tool filtering', () => { ) const tools = (runtimeBuiltinNamedRunner as any).resolveTools() as LLMToolDef[] - 
expect(tools.map(t => t.name)).toEqual(['file_read']) + expect(tools.map(t => t.name)).toEqual([]) }) })
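The runner.ts hunk in this diff changes one rule, and the updated tool-filtering tests pin it down: runtime-added custom tools still bypass the `tools` allowlist and `toolPreset` filters, but `disallowedTools` now blocks them too. A minimal standalone sketch of that resolution order — the `Tool` interface and `resolveTools` signature here are illustrative, not the package's exported API:

```typescript
// Illustrative sketch of the filtering rule changed in src/agent/runner.ts:
// runtime-added custom tools skip the allowlist/preset filters, but the
// denylist (disallowedTools) now applies to them as well.
interface Tool {
  name: string
}

function resolveTools(
  builtins: Tool[],
  runtimeCustomTools: Tool[],
  allowlist?: string[], // like AgentConfig.tools — applies to built-ins only
  denylist?: string[], // like AgentConfig.disallowedTools — applies to everything
): Tool[] {
  let filtered = builtins
  if (allowlist) {
    const allowed = new Set(allowlist)
    filtered = filtered.filter(t => allowed.has(t.name))
  }
  // Build the denied set once so it can be reused for runtime tools below,
  // mirroring the hoisted `denied` set in the diff.
  const denied = denylist ? new Set(denylist) : undefined
  if (denied) {
    filtered = filtered.filter(t => !denied.has(t.name))
  }
  // Runtime custom tools bypass preset/allowlist but respect the denylist.
  const finalRuntime = denied
    ? runtimeCustomTools.filter(t => !denied.has(t.name))
    : runtimeCustomTools
  return [...filtered, ...finalRuntime]
}

// The allowlist does not touch the custom tool...
const bypass = resolveTools(
  [{ name: 'file_read' }, { name: 'bash' }],
  [{ name: 'my_custom_tool' }],
  ['file_read'],
).map(t => t.name)
// → ['file_read', 'my_custom_tool']

// ...but an explicit denylist entry blocks it.
const blocked = resolveTools(
  [{ name: 'file_read' }, { name: 'bash' }],
  [{ name: 'my_custom_tool' }],
  ['file_read'],
  ['my_custom_tool'],
).map(t => t.name)
// → ['file_read']
```

This matches the behavior change asserted in tests/tool-filtering.test.ts above, where a runtime-registered `file_read` that lands on the denylist now resolves to an empty tool list instead of surviving all filters.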