Support DeepSeek

This commit is contained in:
hkalex 2026-04-16 11:36:03 +10:00
parent 5305cb2321
commit fa4533e8d0
8 changed files with 306 additions and 8 deletions


@ -17,7 +17,7 @@ CrewAI is Python. LangGraph makes you draw the graph by hand. `open-multi-agent`
- **Goal to result in one call.** `runTeam(team, "Build a REST API")` kicks off a coordinator agent that decomposes the goal into a task DAG, resolves dependencies, runs independent tasks in parallel, and synthesizes the final output. No graph to draw, no tasks to wire up.
- **TypeScript-native, three runtime dependencies.** `@anthropic-ai/sdk`, `openai`, `zod`. That is the whole runtime. Embed in Express, Next.js, serverless functions, or CI/CD pipelines. No Python runtime, no subprocess bridge, no cloud sidecar.
- **Multi-model teams.** Claude, GPT, Gemini, Grok, MiniMax, Copilot, or any OpenAI-compatible local model (Ollama, vLLM, LM Studio, llama.cpp) in the same team. Run the architect on Opus 4.6, the developer on GPT-5.4, the reviewer on local Gemma 4, all in one `runTeam()` call. Gemini ships as an optional peer dependency: `npm install @google/genai` to enable.
- **Multi-model teams.** Claude, GPT, Gemini, Grok, MiniMax, DeepSeek, Copilot, or any OpenAI-compatible local model (Ollama, vLLM, LM Studio, llama.cpp) in the same team. Run the architect on Opus 4.6, the developer on GPT-5.4, the reviewer on local Gemma 4, all in one `runTeam()` call. Gemini ships as an optional peer dependency: `npm install @google/genai` to enable.
Other features (MCP integration, context strategies, structured output, task retry, human-in-the-loop, lifecycle hooks, loop detection, observability) live below the fold and in [`examples/`](./examples/).
@ -74,6 +74,7 @@ Set the API key for your provider. Local models via Ollama require no API key
- `XAI_API_KEY` (for Grok)
- `MINIMAX_API_KEY` (for MiniMax)
- `MINIMAX_BASE_URL` (for MiniMax — optional, selects endpoint)
- `DEEPSEEK_API_KEY` (for DeepSeek)
- `GITHUB_TOKEN` (for Copilot)
**CLI (`oma`).** For shell and CI, the package exposes a JSON-first binary. See [docs/cli.md](./docs/cli.md) for `oma run`, `oma task`, `oma provider`, exit codes, and file formats.
@ -141,13 +142,14 @@ For MapReduce-style fan-out without task dependencies, use `AgentPool.runParalle
## Examples
17 runnable scripts and 1 full-stack demo in [`examples/`](./examples/). Start with these:
18 runnable scripts and 1 full-stack demo in [`examples/`](./examples/). Start with these:
- [02 — Team Collaboration](examples/02-team-collaboration.ts): `runTeam()` coordinator pattern.
- [06 — Local Model](examples/06-local-model.ts): Ollama and Claude in one pipeline via `baseURL`.
- [09 — Structured Output](examples/09-structured-output.ts): any agent returns Zod-validated JSON.
- [11 — Trace Observability](examples/11-trace-observability.ts): `onTrace` spans for LLM calls, tools, and tasks.
- [17 — MiniMax](examples/17-minimax.ts): three-agent team using MiniMax M2.7.
- [18 — DeepSeek](examples/18-deepseek.ts): three-agent team using DeepSeek Chat.
- [with-vercel-ai-sdk](examples/with-vercel-ai-sdk/): Next.js app — OMA `runTeam()` + AI SDK `useChat` streaming.
Run scripts with `npx tsx examples/02-team-collaboration.ts`.
@ -187,6 +189,7 @@ Run scripts with `npx tsx examples/02-team-collaboration.ts`.
│ │ - GeminiAdapter │
│ │ - GrokAdapter │
│ │ - MiniMaxAdapter │
│ │ - DeepSeekAdapter │
│ └──────────────────────┘
┌────────▼──────────┐
│ AgentRunner │ ┌──────────────────────┐
@ -288,6 +291,7 @@ Notes:
| Grok (xAI) | `provider: 'grok'` | `XAI_API_KEY` | Verified |
| MiniMax (global) | `provider: 'minimax'` | `MINIMAX_API_KEY` | Verified |
| MiniMax (China) | `provider: 'minimax'` + `MINIMAX_BASE_URL` | `MINIMAX_API_KEY` | Verified |
| DeepSeek | `provider: 'deepseek'` | `DEEPSEEK_API_KEY` | Verified |
| GitHub Copilot | `provider: 'copilot'` | `GITHUB_TOKEN` | Verified |
| Gemini | `provider: 'gemini'` | `GEMINI_API_KEY` | Verified |
| Ollama / vLLM / LM Studio | `provider: 'openai'` + `baseURL` | — | Verified |
@ -297,7 +301,7 @@ Gemini requires `npm install @google/genai` (optional peer dependency).
Verified local models with tool-calling: **Gemma 4** (see [example 08](examples/08-gemma4-local.ts)).
Any OpenAI-compatible API should work via `provider: 'openai'` + `baseURL` (DeepSeek, Groq, Mistral, Qwen, etc.). **Grok and MiniMax now have first-class support** via `provider: 'grok'` and `provider: 'minimax'`.
Any OpenAI-compatible API should work via `provider: 'openai'` + `baseURL` (Groq, Mistral, Qwen, etc.). **Grok, MiniMax, and DeepSeek now have first-class support** via `provider: 'grok'`, `provider: 'minimax'`, and `provider: 'deepseek'`.
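The two routes can be sketched side by side. This is a minimal sketch, not library output: the field names follow `AgentConfig` as used elsewhere in this README, while the local `baseURL` and model name are illustrative assumptions (a typical Ollama setup).

```typescript
// Sketch only: two ways to reach an OpenAI-compatible model.

// First-class DeepSeek provider: the API key is read from DEEPSEEK_API_KEY
// and the endpoint defaults to the official one.
const viaProvider = {
  name: 'coder',
  provider: 'deepseek',
  model: 'deepseek-chat',
  systemPrompt: 'You write TypeScript.',
}

// Generic route: point provider 'openai' at any compatible server.
// The baseURL and model below are placeholders for a local Ollama instance.
const viaBaseURL = {
  name: 'coder-local',
  provider: 'openai',
  baseURL: 'http://localhost:11434/v1',
  model: 'qwen2.5-coder',
  systemPrompt: 'You write TypeScript.',
}

console.log(viaProvider.provider, viaBaseURL.baseURL)
```

Prefer the first-class provider when one exists; the generic route stays available for proxies and self-hosted servers.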
### Local Model Tool-Calling
@ -355,6 +359,17 @@ Set `MINIMAX_API_KEY`. The adapter selects the endpoint via `MINIMAX_BASE_URL`:
You can also pass `baseURL` directly in `AgentConfig` to override the env var.
```typescript
const deepseekAgent: AgentConfig = {
name: 'deepseek-agent',
provider: 'deepseek',
model: 'deepseek-chat',
systemPrompt: 'You are a helpful assistant.',
}
```
Set `DEEPSEEK_API_KEY`. Available models: `deepseek-chat` (DeepSeek-V3, recommended for coding) and `deepseek-reasoner` (thinking mode).
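As a rule of thumb, that model split can be expressed as a tiny routing helper. `pickDeepSeekModel` below is hypothetical (not part of the library); it only encodes the guidance above.

```typescript
// Hypothetical helper (not part of open-multi-agent): choose a DeepSeek
// model per task kind, following the recommendation above.
type TaskKind = 'coding' | 'reasoning'

function pickDeepSeekModel(kind: TaskKind): 'deepseek-chat' | 'deepseek-reasoner' {
  // deepseek-chat (V3) for coding tasks; deepseek-reasoner for thinking mode.
  return kind === 'reasoning' ? 'deepseek-reasoner' : 'deepseek-chat'
}

console.log(pickDeepSeekModel('coding'))    // deepseek-chat
console.log(pickDeepSeekModel('reasoning')) // deepseek-reasoner
```

The return value drops straight into the `model` field of an `AgentConfig`.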
## Contributing
Issues, feature requests, and PRs are welcome. Some areas where contributions would be especially valuable:


@ -17,7 +17,7 @@ CrewAI is Python. LangGraph makes you draw the graph by hand. `open-multi-agent`
- **Goal to result in one call.** `runTeam(team, "Build a REST API")` kicks off a coordinator agent that decomposes the goal into a task DAG, resolves dependencies, runs independent tasks in parallel, and synthesizes the final output. No graph to draw, no tasks to wire up.
- **TypeScript-native, three runtime dependencies.** `@anthropic-ai/sdk`, `openai`, `zod`. That is the whole runtime. Embed in Express, Next.js, serverless functions, or CI/CD pipelines. No Python runtime, no subprocess bridge, no cloud sidecar.
- **Multi-model teams.** Claude, GPT, Gemini, Grok, MiniMax, Copilot, or any OpenAI-compatible local model (Ollama, vLLM, LM Studio, llama.cpp) in the same team. Run the architect on Opus 4.6, the developer on GPT-5.4, the reviewer on local Gemma 4, all in one `runTeam()` call. Gemini ships as an optional peer dependency: `npm install @google/genai` to enable.
- **Multi-model teams.** Claude, GPT, Gemini, Grok, MiniMax, DeepSeek, Copilot, or any OpenAI-compatible local model (Ollama, vLLM, LM Studio, llama.cpp) in the same team. Run the architect on Opus 4.6, the developer on GPT-5.4, the reviewer on local Gemma 4, all in one `runTeam()` call. Gemini ships as an optional peer dependency: `npm install @google/genai` to enable.
Other features (MCP integration, context strategies, structured output, task retry, human-in-the-loop, lifecycle hooks, loop detection, observability) live below the fold and in [`examples/`](./examples/).
@ -74,6 +74,7 @@ npm install @jackchen_me/open-multi-agent
- `XAI_API_KEY` (for Grok)
- `MINIMAX_API_KEY` (for MiniMax)
- `MINIMAX_BASE_URL` (for MiniMax — optional, selects endpoint)
- `DEEPSEEK_API_KEY` (for DeepSeek)
- `GITHUB_TOKEN` (for Copilot)
Three agents, one goal: the framework handles everything else:
@ -139,13 +140,14 @@ Tokens: 12847 output tokens
## Examples
17 runnable scripts and 1 full-stack demo in [`examples/`](./examples/). Start with these:
18 runnable scripts and 1 full-stack demo in [`examples/`](./examples/). Start with these:
- [02 — Team Collaboration](examples/02-team-collaboration.ts): `runTeam()` coordinator pattern.
- [06 — Local Model](examples/06-local-model.ts): Ollama and Claude in one pipeline via `baseURL`.
- [09 — Structured Output](examples/09-structured-output.ts): any agent returns Zod-validated JSON.
- [11 — Trace Observability](examples/11-trace-observability.ts): `onTrace` callback emitting structured spans for LLM calls, tools, and tasks.
- [17 — MiniMax](examples/17-minimax.ts): three-agent team using MiniMax M2.7.
- [18 — DeepSeek](examples/18-deepseek.ts): three-agent team using DeepSeek Chat.
- [with-vercel-ai-sdk](examples/with-vercel-ai-sdk/): Next.js app — OMA `runTeam()` + AI SDK `useChat` streaming.
Run scripts with `npx tsx examples/02-team-collaboration.ts`.
@ -185,6 +187,7 @@ Tokens: 12847 output tokens
│ │ - GeminiAdapter │
│ │ - GrokAdapter │
│ │ - MiniMaxAdapter │
│ │ - DeepSeekAdapter │
│ └──────────────────────┘
┌────────▼──────────┐
│ AgentRunner │ ┌──────────────────────┐
@ -262,6 +265,7 @@ const customAgent: AgentConfig = {
| Grok (xAI) | `provider: 'grok'` | `XAI_API_KEY` | Verified |
| MiniMax (global) | `provider: 'minimax'` | `MINIMAX_API_KEY` | Verified |
| MiniMax (China) | `provider: 'minimax'` + `MINIMAX_BASE_URL` | `MINIMAX_API_KEY` | Verified |
| DeepSeek | `provider: 'deepseek'` | `DEEPSEEK_API_KEY` | Verified |
| GitHub Copilot | `provider: 'copilot'` | `GITHUB_TOKEN` | Verified |
| Gemini | `provider: 'gemini'` | `GEMINI_API_KEY` | Verified |
| Ollama / vLLM / LM Studio | `provider: 'openai'` + `baseURL` | — | Verified |
@ -271,7 +275,7 @@ Gemini requires `npm install @google/genai` (optional peer dependency).
Verified local models with tool-calling: **Gemma 4** (see [example 08](examples/08-gemma4-local.ts)).
Any OpenAI-compatible API should work via `provider: 'openai'` + `baseURL` (DeepSeek, Groq, Mistral, Qwen, etc.). **Grok and MiniMax now have first-class support** via `provider: 'grok'` and `provider: 'minimax'`.
Any OpenAI-compatible API should work via `provider: 'openai'` + `baseURL` (Groq, Mistral, Qwen, etc.). **Grok, MiniMax, and DeepSeek now have first-class support** via `provider: 'grok'`, `provider: 'minimax'`, and `provider: 'deepseek'`.
### Local Model Tool-Calling
@ -329,6 +333,17 @@ const minimaxAgent: AgentConfig = {
You can also pass `baseURL` directly in `AgentConfig` to override the env var.
```typescript
const deepseekAgent: AgentConfig = {
name: 'deepseek-agent',
provider: 'deepseek',
model: 'deepseek-chat',
systemPrompt: 'You are a helpful assistant.',
}
```
Set `DEEPSEEK_API_KEY`. Available models: `deepseek-chat` (DeepSeek-V3, recommended for coding tasks) and `deepseek-reasoner` (thinking mode).
## Contributing
Issues, feature requests, and PRs are welcome. Some areas where contributions would be especially valuable:


@ -55,7 +55,7 @@ Global flags: [`--pretty`](#output-flags), [`--include-messages`](#output-flags)
Read-only helper for wiring JSON configs and env vars.
- **`oma provider`** or **`oma provider list`** — Prints JSON: built-in provider ids, API key environment variable names, whether `baseURL` is supported, and short notes (e.g. OpenAI-compatible servers, Copilot in CI).
- **`oma provider template <provider>`** — Prints a JSON object with example `orchestrator` and `agent` fields plus placeholder `env` entries. `<provider>` is one of: `anthropic`, `openai`, `gemini`, `grok`, `minimax`, `copilot`.
- **`oma provider template <provider>`** — Prints a JSON object with example `orchestrator` and `agent` fields plus placeholder `env` entries. `<provider>` is one of: `anthropic`, `openai`, `gemini`, `grok`, `minimax`, `deepseek`, `copilot`.
Supports `--pretty`.

examples/18-deepseek.ts Normal file

@ -0,0 +1,158 @@
/**
* Example 18: Multi-Agent Team Collaboration with DeepSeek
*
* Three specialized agents (architect, developer, reviewer) collaborate via `runTeam()`
* to build a minimal Express.js REST API. Every agent uses DeepSeek's flagship model.
*
* Run:
* npx tsx examples/18-deepseek.ts
*
* Prerequisites:
* DEEPSEEK_API_KEY environment variable must be set.
*
* Available models:
* deepseek-chat: DeepSeek-V3 (non-thinking mode, recommended for coding tasks)
* deepseek-reasoner: DeepSeek-V3 (thinking mode, for complex reasoning)
*/
import { OpenMultiAgent } from '../src/index.js'
import type { AgentConfig, OrchestratorEvent } from '../src/types.js'
// ---------------------------------------------------------------------------
// Agent definitions (all using deepseek-chat)
// ---------------------------------------------------------------------------
const architect: AgentConfig = {
name: 'architect',
model: 'deepseek-chat',
provider: 'deepseek',
systemPrompt: `You are a software architect with deep experience in Node.js and REST API design.
Your job is to design clear, production-quality API contracts and file/directory structures.
Output concise plans in markdown; no unnecessary prose.`,
tools: ['bash', 'file_write'],
maxTurns: 5,
temperature: 0.2,
}
const developer: AgentConfig = {
name: 'developer',
model: 'deepseek-chat',
provider: 'deepseek',
systemPrompt: `You are a TypeScript/Node.js developer. You implement what the architect specifies.
Write clean, runnable code with proper error handling. Use the tools to write files and run tests.`,
tools: ['bash', 'file_read', 'file_write', 'file_edit'],
maxTurns: 12,
temperature: 0.1,
}
const reviewer: AgentConfig = {
name: 'reviewer',
model: 'deepseek-chat',
provider: 'deepseek',
systemPrompt: `You are a senior code reviewer. Review code for correctness, security, and clarity.
Provide a structured review with: LGTM items, suggestions, and any blocking issues.
Read files using the tools before reviewing.`,
tools: ['bash', 'file_read', 'grep'],
maxTurns: 5,
temperature: 0.3,
}
// ---------------------------------------------------------------------------
// Progress tracking
// ---------------------------------------------------------------------------
const startTimes = new Map<string, number>()
function handleProgress(event: OrchestratorEvent): void {
const ts = new Date().toISOString().slice(11, 23) // HH:MM:SS.mmm
switch (event.type) {
case 'agent_start':
startTimes.set(event.agent ?? '', Date.now())
console.log(`[${ts}] AGENT START → ${event.agent}`)
break
case 'agent_complete': {
const elapsed = Date.now() - (startTimes.get(event.agent ?? '') ?? Date.now())
console.log(`[${ts}] AGENT DONE ← ${event.agent} (${elapsed}ms)`)
break
}
case 'task_start':
console.log(`[${ts}] TASK START ↓ ${event.task}`)
break
case 'task_complete':
console.log(`[${ts}] TASK DONE ↑ ${event.task}`)
break
case 'message':
console.log(`[${ts}] MESSAGE • ${event.agent} → (team)`)
break
case 'error':
console.error(`[${ts}] ERROR ✗ agent=${event.agent} task=${event.task}`)
if (event.data instanceof Error) console.error(` ${event.data.message}`)
break
}
}
// ---------------------------------------------------------------------------
// Orchestrate
// ---------------------------------------------------------------------------
const orchestrator = new OpenMultiAgent({
defaultModel: 'deepseek-chat',
defaultProvider: 'deepseek',
maxConcurrency: 1, // sequential for readable output
onProgress: handleProgress,
})
const team = orchestrator.createTeam('api-team', {
name: 'api-team',
agents: [architect, developer, reviewer],
sharedMemory: true,
maxConcurrency: 1,
})
console.log(`Team "${team.name}" created with agents: ${team.getAgents().map(a => a.name).join(', ')}`)
console.log('\nStarting team run...\n')
console.log('='.repeat(60))
const goal = `Create a minimal Express.js REST API in /tmp/express-api/ with:
- GET /health returns { status: "ok" }
- GET /users returns a hardcoded array of 2 user objects
- POST /users accepts { name, email } body, logs it, returns 201
- Proper error handling middleware
- The server should listen on port 3001
- Include a package.json with the required dependencies`
const result = await orchestrator.runTeam(team, goal)
console.log('\n' + '='.repeat(60))
// ---------------------------------------------------------------------------
// Results
// ---------------------------------------------------------------------------
console.log('\nTeam run complete.')
console.log(`Success: ${result.success}`)
console.log(`Total tokens — input: ${result.totalTokenUsage.input_tokens}, output: ${result.totalTokenUsage.output_tokens}`)
console.log('\nPer-agent results:')
for (const [agentName, agentResult] of result.agentResults) {
const status = agentResult.success ? 'OK' : 'FAILED'
const tools = agentResult.toolCalls.length
console.log(` ${agentName.padEnd(12)} [${status}] tool_calls=${tools}`)
if (!agentResult.success) {
console.log(` Error: ${agentResult.output.slice(0, 120)}`)
}
}
// Sample outputs
const developerResult = result.agentResults.get('developer')
if (developerResult?.success) {
console.log('\nDeveloper output (last 600 chars):')
console.log('─'.repeat(60))
const out = developerResult.output
console.log(out.length > 600 ? '...' + out.slice(-600) : out)
console.log('─'.repeat(60))
}
const reviewerResult = result.agentResults.get('reviewer')
if (reviewerResult?.success) {
console.log('\nReviewer output:')
console.log('─'.repeat(60))
console.log(reviewerResult.output)
console.log('─'.repeat(60))
}


@ -51,6 +51,7 @@ const PROVIDER_REFERENCE: ReadonlyArray<{
{ id: 'gemini', apiKeyEnv: ['GEMINI_API_KEY', 'GOOGLE_API_KEY'], baseUrlSupported: false },
{ id: 'grok', apiKeyEnv: ['XAI_API_KEY'], baseUrlSupported: true },
{ id: 'minimax', apiKeyEnv: ['MINIMAX_API_KEY'], baseUrlSupported: true, notes: 'Global endpoint: https://api.minimax.io/v1 (default). China endpoint: https://api.minimaxi.com/v1. Set MINIMAX_BASE_URL to choose, or pass baseURL in agent config.' },
{ id: 'deepseek', apiKeyEnv: ['DEEPSEEK_API_KEY'], baseUrlSupported: true, notes: 'OpenAI-compatible endpoint at https://api.deepseek.com. Models: deepseek-chat (V3), deepseek-reasoner (thinking).' },
{
id: 'copilot',
apiKeyEnv: ['GITHUB_COPILOT_TOKEN', 'GITHUB_TOKEN'],
@ -261,6 +262,7 @@ const DEFAULT_MODEL_HINT: Record<SupportedProvider, string> = {
grok: 'grok-2-latest',
copilot: 'gpt-4o',
minimax: 'MiniMax-M2.7',
deepseek: 'deepseek-chat',
}
async function cmdProvider(sub: string | undefined, arg: string | undefined, pretty: boolean): Promise<number> {


@ -38,7 +38,7 @@ import type { LLMAdapter } from '../types.js'
* Additional providers can be integrated by implementing {@link LLMAdapter}
* directly and bypassing this factory.
*/
export type SupportedProvider = 'anthropic' | 'copilot' | 'grok' | 'minimax' | 'openai' | 'gemini'
export type SupportedProvider = 'anthropic' | 'copilot' | 'deepseek' | 'grok' | 'minimax' | 'openai' | 'gemini'
/**
* Instantiate the appropriate {@link LLMAdapter} for the given provider.
@ -50,6 +50,7 @@ export type SupportedProvider = 'anthropic' | 'copilot' | 'grok' | 'minimax' | '
* - `gemini`: `GEMINI_API_KEY` / `GOOGLE_API_KEY`
* - `grok`: `XAI_API_KEY`
* - `minimax`: `MINIMAX_API_KEY`
* - `deepseek`: `DEEPSEEK_API_KEY`
* - `copilot`: `GITHUB_COPILOT_TOKEN` / `GITHUB_TOKEN`, or interactive
* OAuth2 device flow if neither is set
*
@ -94,6 +95,10 @@ export async function createAdapter(
const { MiniMaxAdapter } = await import('./minimax.js')
return new MiniMaxAdapter(apiKey, baseURL)
}
case 'deepseek': {
const { DeepSeekAdapter } = await import('./deepseek.js')
return new DeepSeekAdapter(apiKey, baseURL)
}
default: {
// The `never` cast here makes TypeScript enforce exhaustiveness.
const _exhaustive: never = provider

src/llm/deepseek.ts Normal file

@ -0,0 +1,29 @@
/**
* @fileoverview DeepSeek adapter.
*
* Thin wrapper around OpenAIAdapter that hard-codes the official DeepSeek
* OpenAI-compatible endpoint and DEEPSEEK_API_KEY environment variable fallback.
*/
import { OpenAIAdapter } from './openai.js'
/**
* LLM adapter for DeepSeek models (deepseek-chat, deepseek-reasoner, and future models).
*
* Thread-safe. Can be shared across agents.
*
* Usage:
* provider: 'deepseek'
* model: 'deepseek-chat' (or 'deepseek-reasoner' for the thinking model)
*/
export class DeepSeekAdapter extends OpenAIAdapter {
readonly name = 'deepseek'
constructor(apiKey?: string, baseURL?: string) {
// Allow override of baseURL (for proxies or future changes) but default to official DeepSeek endpoint.
super(
apiKey ?? process.env['DEEPSEEK_API_KEY'],
baseURL ?? 'https://api.deepseek.com'
)
}
}


@ -0,0 +1,74 @@
import { describe, it, expect, vi, beforeEach } from 'vitest'
// ---------------------------------------------------------------------------
// Mock OpenAI constructor (must be hoisted for Vitest)
// ---------------------------------------------------------------------------
const OpenAIMock = vi.hoisted(() => vi.fn())
vi.mock('openai', () => ({
default: OpenAIMock,
}))
import { DeepSeekAdapter } from '../src/llm/deepseek.js'
import { createAdapter } from '../src/llm/adapter.js'
// ---------------------------------------------------------------------------
// DeepSeekAdapter tests
// ---------------------------------------------------------------------------
describe('DeepSeekAdapter', () => {
beforeEach(() => {
OpenAIMock.mockClear()
})
it('has name "deepseek"', () => {
const adapter = new DeepSeekAdapter()
expect(adapter.name).toBe('deepseek')
})
it('uses DEEPSEEK_API_KEY by default', () => {
const original = process.env['DEEPSEEK_API_KEY']
process.env['DEEPSEEK_API_KEY'] = 'deepseek-test-key-123'
try {
new DeepSeekAdapter()
expect(OpenAIMock).toHaveBeenCalledWith(
expect.objectContaining({
apiKey: 'deepseek-test-key-123',
baseURL: 'https://api.deepseek.com',
})
)
} finally {
if (original === undefined) {
delete process.env['DEEPSEEK_API_KEY']
} else {
process.env['DEEPSEEK_API_KEY'] = original
}
}
})
it('uses official DeepSeek baseURL by default', () => {
new DeepSeekAdapter('some-key')
expect(OpenAIMock).toHaveBeenCalledWith(
expect.objectContaining({
apiKey: 'some-key',
baseURL: 'https://api.deepseek.com',
})
)
})
it('allows overriding apiKey and baseURL', () => {
new DeepSeekAdapter('custom-key', 'https://custom.endpoint/v1')
expect(OpenAIMock).toHaveBeenCalledWith(
expect.objectContaining({
apiKey: 'custom-key',
baseURL: 'https://custom.endpoint/v1',
})
)
})
it('createAdapter("deepseek") returns DeepSeekAdapter instance', async () => {
const adapter = await createAdapter('deepseek')
expect(adapter).toBeInstanceOf(DeepSeekAdapter)
})
})