A data source is the unit of attribution inside a sandbox. Anything feeding content in — an OAuth’d integration, a folder of files, a custom firehose — is registered as a data source first, and every byte ingested through it is tagged with the data_source_id.
Sources are the things you list, pause, resume, and disconnect when you want to control what’s flowing into your sandbox.
What you can do
- Register a new data source (Pipedream-backed integration, custom firehose, or custom MCP).
- Pause a source temporarily (stops ingestion; data stays).
- Resume a paused source.
- Disconnect a source (revokes provider credentials; keeps the ingested data in your sandbox).
- Permanently delete a source (removes the source row and all its ingested data).
- Inspect a source’s status, last sync, and adapter config.
Via the Concierge (recommended)
“What data sources do I have?” “Pause my GitHub source.” “Disconnect the Slack source — keep the data.”

For destructive operations (destroy, delete), the Concierge surfaces the equivalent CLI command rather than running it. See Concierge → What it won’t do.
Via the CLI
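A sketch of the lifecycle operations, assuming a `copass` CLI with a `sources` subcommand — the command and flag names here are illustrative and may differ from your installed version:

```shell
# List registered sources with status and last sync
copass sources list

# Pause / resume ingestion (data stays in the sandbox)
copass sources pause src_abc123
copass sources resume src_abc123

# Disconnect: revoke provider credentials, keep ingested data
copass sources disconnect src_abc123

# Destroy: remove the source row and all its ingested data
copass sources destroy src_abc123 --yes
```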
Via the SDK
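A minimal sketch of the source lifecycle as an SDK might expose it. The client, class, and method names below are assumptions, modeled as an in-memory stand-in rather than the real SDK surface:

```python
# Illustrative only: the real SDK's client and method names may differ.
from dataclasses import dataclass, field


@dataclass
class DataSource:
    id: str
    kind: str            # e.g. "pipedream", "custom", "mcp"
    ingestion_mode: str  # "manual" | "polling" | "realtime" | "batch"
    status: str = "active"


@dataclass
class SandboxClient:
    """In-memory stand-in for a sandbox SDK client."""
    sources: dict = field(default_factory=dict)

    def register_source(self, id: str, kind: str, ingestion_mode: str) -> DataSource:
        src = DataSource(id, kind, ingestion_mode)
        self.sources[id] = src
        return src

    def pause_source(self, id: str) -> None:
        self.sources[id].status = "paused"        # ingestion stops; data stays

    def resume_source(self, id: str) -> None:
        self.sources[id].status = "active"

    def disconnect_source(self, id: str) -> None:
        self.sources[id].status = "disconnected"  # credentials revoked; data kept


client = SandboxClient()
client.register_source("src_gh", kind="pipedream", ingestion_mode="realtime")
client.pause_source("src_gh")
print(client.sources["src_gh"].status)  # paused
```

Note the distinction the stand-in preserves: disconnect changes credentials, not data — only a destroy/delete removes ingested content.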
Ingestion modes
A data source declares how content flows in:

| Mode | Driver | When to use |
|---|---|---|
| manual | Your code calls ingest when you have new content | Scripted ingestion, batch jobs, ad-hoc pushes |
| polling | Your workers call ingest on a schedule | Scheduled refresh of an external system |
| realtime | Webhook handlers call ingest on provider events | OAuth’d integrations with native webhooks (default for Pipedream) |
| batch | One-shot bulk backfill | Initial loads |
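For manual mode, your own code decides when ingest runs. A sketch under assumptions — `ingest` here is a hypothetical stand-in that shows the attribution contract (every item tagged with its data_source_id), not the real API:

```python
# Hypothetical sketch of manual-mode ingestion: your code decides when to
# call ingest, and every item is tagged with the source's data_source_id.
from datetime import datetime, timezone


def ingest(data_source_id: str, items: list[dict]) -> list[dict]:
    """Stand-in for the real ingest call: tag each item with its source."""
    now = datetime.now(timezone.utc).isoformat()
    return [
        {**item, "data_source_id": data_source_id, "ingested_at": now}
        for item in items
    ]


# A batch job pushing new content through a custom source
records = ingest("src_firehose", [{"path": "report.md", "body": "Q3 numbers"}])
print(records[0]["data_source_id"])  # src_firehose
```

Polling and realtime modes call the same ingest path; only the driver (a scheduler or a webhook handler) differs.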
Common patterns
OAuth integration
Slack, GitHub, Notion, etc. — provisioned with realtime mode out of the box. See Integrations.
Folder mirror
Mirror a local directory into the sandbox with the filesystem driver. See Filesystem driver.
Custom firehose
Register a custom source and push to it from your own code or CI pipeline. Use ingestion_mode: manual.

Custom MCP server
Plug your own MCP server in as a tool source. See Custom MCP.
Next steps
- Ingestion — push content through a data source into the knowledge graph.
- Integrations — OAuth’d third-party apps land as sources.
- Custom MCP — register your own MCP server.
- Sandboxes — the tenancy data sources live inside.

