A family of TypeScript and Python packages for building Copass-backed agents in your framework of choice
The Copass SDK is an open-source family of packages — one for each major agent framework — plus a standalone MCP server, a create-copass-agent scaffold, and a shared core client. Every package gives the LLM window-aware retrieval over your knowledge graph with a couple lines of wiring.
copass-harness on GitHub
Source, READMEs, and release notes for every package below.
Every integration is thin glue over the same HTTP API. Pick based on the stack you already use.
Want a hosted agent with OAuth’d integrations? Use the Agent Router — one import, one call to router.integrations.connect('github', …), one call to router.run({ provider, model, … }). Swap between Anthropic and Google on a per-call flag.
Building with an LLM framework and running the loop yourself? Use an adapter — the model picks between discover / interpret / search on each turn, and retrieval is window-aware automatically.
1
Install the CLI and create a sandbox

```shell
npm install -g @copass/cli
copass login                          # email OTP
copass setup                          # creates a sandbox, writes .olane/refs.json
copass apikey create --name my-app    # prints an olk_... key — shown once, save it
copass ingest README.md               # something real to retrieve
```
2
Scaffold the agent
```shell
npx create-copass-agent my-app
cd my-app
# .env has COPASS_SANDBOX_ID + COPASS_PROJECT_ID auto-filled from
# ../.olane/refs.json. Paste the two secrets by hand:
echo "COPASS_API_KEY=olk_your_key_from_step_1" >> .env
echo "ANTHROPIC_API_KEY=sk-ant-your-key" >> .env
pnpm install
pnpm dev
```
3
Chat
Open http://localhost:3000. You get an embedded chat UI wired to a Hono server running the Claude Agent SDK with @copass/mcp exposing the full Copass retrieval surface as tools. Every turn is window-aware.
The scaffold is ~150 lines across four files. Everything is editable — swap the model, tweak the system prompt, wire a different UI (Assistant UI is a drop-in upgrade). See the package README for the production-limitations checklist.
The Vercel AI SDK pattern below lives inside a normal generateText call. Two commented blocks show exactly which four lines are Copass-specific — the rest is vanilla Vercel AI SDK that any LLM-app developer already writes.
```typescript
import { CopassClient } from '@copass/core';
import { copassTools } from '@copass/ai-sdk';
import { generateText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

// ── Copass (these four lines are the entire integration) ──
const copass = new CopassClient({
  auth: { type: 'api-key', key: process.env.COPASS_API_KEY! },
});
const window = await copass.contextWindow.create({
  sandbox_id: process.env.COPASS_SANDBOX_ID!,
});

// ── Standard Vercel AI SDK call. Only `tools: copassTools(...)` is new ──
const { text } = await generateText({
  model: anthropic('claude-opus-4-7'),
  tools: copassTools({ client: copass, sandbox_id: window.sandboxId, window }),
  maxSteps: 5,
  prompt: 'why is checkout flaky?',
});

console.log(text);
```
What Copass is actually doing here:
new CopassClient({ auth }) — authenticates against the Copass REST API.
contextWindow.create(...) — opens an ephemeral data source for this conversation. Retrieval will know about turns inside this window.
copassTools({ ... }) — returns three tools (discover, interpret, search) that Claude can decide when to call on its own. Each one hits Copass’s knowledge graph with the window attached.
Everything else — generateText, model:, maxSteps:, prompt: — is vanilla Vercel AI SDK. If you already build with the AI SDK, you’re adding four Copass lines and one tools entry.
For multi-turn chat where retrieval should know what prior turns surfaced, wrap with createWindowTracker — it auto-mirrors the conversation into the window so turn 2 excludes items turn 1 already delivered. See the @copass/ai-sdk README for the ~3-line upgrade.
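The behavior the tracker automates can be illustrated in a few lines of plain TypeScript. This is a conceptual sketch of window-aware deduplication, not the `@copass/ai-sdk` API; the names `Item` and `dedupeAgainstWindow` are hypothetical:

```typescript
// Conceptual sketch only — not the @copass/ai-sdk API. The window records
// which items earlier turns already surfaced, so later turns receive only
// fresh results.
type Item = { id: string; text: string };

function dedupeAgainstWindow(results: Item[], delivered: Set<string>): Item[] {
  const fresh = results.filter((r) => !delivered.has(r.id));
  for (const r of fresh) delivered.add(r.id); // record this turn's deliveries
  return fresh;
}

const delivered = new Set<string>();
const turn1 = dedupeAgainstWindow(
  [{ id: 'a', text: 'checkout retries on timeout' }, { id: 'b', text: 'flaky e2e test' }],
  delivered,
);
const turn2 = dedupeAgainstWindow(
  [{ id: 'a', text: 'checkout retries on timeout' }, { id: 'c', text: 'new finding' }],
  delivered,
);
// turn 1 delivers both items; turn 2 delivers only "c", because "a" was
// already surfaced in turn 1
```

The real tracker does this bookkeeping inside the Context Window itself, so the model's tools see the deduplicated view without any manual set management.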
Drop this into your MCP client’s config (Claude Code .mcp.json, Claude Desktop’s claude_desktop_config.json, Cursor’s ~/.cursor/mcp.json, or your own Claude Agent SDK app):
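A minimal sketch of what that entry can look like, assuming the server is launched via npx and reads the same environment variables as the SDK packages (check the @copass/mcp README for the exact command and variable names):

```json
{
  "mcpServers": {
    "copass": {
      "command": "npx",
      "args": ["-y", "@copass/mcp"],
      "env": {
        "COPASS_API_KEY": "olk_your_key",
        "COPASS_SANDBOX_ID": "your_sandbox_id"
      }
    }
  }
}
```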
Restart the client. Claude now has discover, interpret, search, context_window_*, and ingest as first-class tools — automatically window-aware when a Context Window is active.

Detailed setup for each client: @copass/mcp README.
Context Window — an agent conversation wrapped as an ephemeral data source. Retrieval is automatically window-aware across turns. See Window-Aware Retrieval.
Each package ships its own README with install, quickstart, the full tool surface, and a production-limitations section. The fastest way in is the repo's root README: it doubles as a directory, and every typescript/packages/*/ and python/*/ subdir has its own deep-dive.
Agent Router — hosted agent runtime with one-call OAuth integrations
Portable Context — why the runtime is swappable in the first place
CLI — the copass command (published as @copass/cli)
Claude Code — install the MCP server in Claude Code specifically