Examples grew to 19 flat files mixing basics, provider demos, orchestration patterns, and integrations, with two files colliding on the number 16. Reorganized into category folders so the structure scales as new providers and patterns get added.

Layout:

- `examples/basics/`: core execution modes (4 files)
- `examples/providers/`: one example per supported model provider (8 files)
- `examples/patterns/`: reusable orchestration patterns (6 files)
- `examples/integrations/`: MCP, observability, AI SDK (3 entries)
- `examples/production/`: placeholder for end-to-end use cases

Notable changes:

- Dropped numeric prefixes; folder + filename now signal category and intent.
- Rewrote former smoke-test scripts (copilot, gemini) into proper three-agent team examples matching the deepseek/grok/minimax/groq template. Adapter unit tests in `tests/` already cover correctness, so this only improves documentation quality.
- Added `examples/README.md` as the categorized index plus maintenance rules for new submissions.
- Added `examples/production/README.md` with acceptance criteria for the new production category.
- Updated all internal `npx tsx` paths and import paths (`../src/` to `../../src/`).
- Updated `README.md` and `README_zh.md` links.
- Fixed stale `cd` paths inside `examples/integrations/with-vercel-ai-sdk/README.md`.
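The `../src/` to `../../src/` import rewrite can be sketched as a one-shot bulk edit. This is a hypothetical reconstruction, not the command actually used in the PR; it builds a throwaway example file first so the sketch is self-contained.

```shell
# Hypothetical sketch (not from the PR): bulk-rewrite relative imports
# after examples moved one directory deeper (../src/ -> ../../src/).
mkdir -p /tmp/oma-demo/examples/basics
printf "import { runTeam } from '../src/index';\n" \
  > /tmp/oma-demo/examples/basics/demo.ts

# Find every .ts file still importing from ../src/ and rewrite the path
grep -rl --include='*.ts' "\.\./src/" /tmp/oma-demo/examples \
  | xargs sed -i "s|\.\./src/|../../src/|"
```

After running, the demo file imports from `../../src/index`, matching the new one-level-deeper folder layout.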
# with-vercel-ai-sdk

A Next.js demo showing open-multi-agent (OMA) and the Vercel AI SDK working together:

- OMA orchestrates a research team (researcher agent + writer agent) via `runTeam()`
- the AI SDK streams the result to a chat UI via `useChat` + `streamText`
## How it works

```
User message
    │
    ▼
API route (app/api/chat/route.ts)
    │
    ├─ Phase 1: OMA runTeam()
    │    coordinator decomposes goal → researcher gathers info → writer drafts article
    │
    └─ Phase 2: AI SDK streamText()
         streams the team's output to the browser
    │
    ▼
Chat UI (app/page.tsx) — useChat hook renders streamed response
```
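The two-phase flow above can be sketched in TypeScript. This is a self-contained illustration, not the example's actual code: `runTeam` and the streaming step are stand-ins here, whereas the real route imports `runTeam` from OMA and `streamText` from the AI SDK.

```typescript
// Sketch of the two-phase pattern behind app/api/chat/route.ts.
// Both helpers below are stand-ins for illustration only.

type Agent = { name: string; role: string };

// Stand-in for OMA's runTeam(): each agent contributes to the goal in turn
async function runTeam(goal: string, agents: Agent[]): Promise<string> {
  const steps = agents.map((a) => `${a.name} (${a.role}) worked on: ${goal}`);
  return steps.join("\n");
}

// Stand-in for AI SDK streaming: yields the draft to the client in chunks
async function* streamDraft(draft: string): AsyncGenerator<string> {
  for (const word of draft.split(" ")) yield word + " ";
}

export async function handleChat(topic: string): Promise<string> {
  // Phase 1: OMA orchestrates the research team
  const draft = await runTeam(topic, [
    { name: "researcher", role: "gathers info" },
    { name: "writer", role: "drafts article" },
  ]);

  // Phase 2: stream the team's output to the client
  let out = "";
  for await (const chunk of streamDraft(draft)) out += chunk;
  return out;
}
```

The key design point is the hand-off: the team runs to completion first, and only the finished draft is streamed, so the UI sees one coherent response rather than interleaved agent chatter.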
## Setup

```shell
# 1. From repo root, install OMA dependencies
cd ../../..
npm install

# 2. Back to this example
cd examples/integrations/with-vercel-ai-sdk
npm install

# 3. Set your API key
export ANTHROPIC_API_KEY=sk-ant-...

# 4. Run
npm run dev
```

`npm run dev` automatically builds OMA before starting Next.js (via the `predev` script).

Open http://localhost:3000, type a topic, and watch the research team work.
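The `predev` hook mentioned above can be wired up in this example's `package.json` roughly like this (a sketch; the actual script names and build command in the repo may differ):

```json
{
  "scripts": {
    "predev": "npm run build --prefix ../../..",
    "dev": "next dev"
  }
}
```

npm runs `predev` automatically before `dev`, which is why a plain `npm run dev` picks up a fresh OMA build.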
## Prerequisites

- Node.js >= 18
- `ANTHROPIC_API_KEY` environment variable (used by both OMA and the AI SDK)
## Key files

| File | Role |
|---|---|
| `app/api/chat/route.ts` | Backend — OMA orchestration + AI SDK streaming |
| `app/page.tsx` | Frontend — chat UI with `useChat` hook |
| `package.json` | References OMA via `file:../../` (local link) |