SDK: Add ensureServer() for automatic server recovery (#260)

* SDK sandbox provisioning: built-in providers, docs restructure, and quickstart overhaul

- Add built-in sandbox providers (local, docker, e2b, daytona, vercel, cloudflare) to the TypeScript SDK so users import directly instead of passing client instances
- Restructure docs: rename architecture to orchestration-architecture, add new architecture page for server overview, improve getting started flow
- Rewrite quickstart to be TypeScript-first with provider CodeGroup and custom provider accordion
- Update all examples to use new provider APIs
- Update persist drivers and foundry for new SDK surface
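The shift described above, from passing client instances to importing built-in providers, can be sketched as follows. All names here (`e2b`, `SandboxProvider`, `start`) are illustrative stand-ins, not the SDK's actual exports:

```typescript
// Hypothetical sketch of the provider-based API shape. Instead of constructing
// a vendor client yourself and handing it to the SDK, you import a ready-made
// factory per sandbox type (e.g. from a subpath like `sandbox-agent/e2b`).
interface SandboxProvider {
  name: string;
  start(): Promise<{ baseUrl: string }>;
}

// What a built-in provider factory might look like (stand-in implementation).
function e2b(_opts: { apiKey: string }): SandboxProvider {
  return {
    name: "e2b",
    // Would provision the sandbox and return the server URL in the real SDK.
    start: async () => ({ baseUrl: "http://sandbox.local:2468" }),
  };
}

// The SDK then consumes the provider directly.
async function connectWithProvider(provider: SandboxProvider) {
  const { baseUrl } = await provider.start();
  return { provider: provider.name, baseUrl };
}

connectWithProvider(e2b({ apiKey: "e2b_test_key" })).then((info) => {
  console.log(info.provider, info.baseUrl);
});
```

The point of the design is that users no longer wire up the vendor SDK themselves; the factory owns provisioning and the main SDK only sees the `SandboxProvider` surface.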

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Fix SDK typecheck errors and update persist drivers for insertEvent signature

- Fix insertEvent call in client.ts to pass sessionId as first argument
- Update Daytona provider create options to use Partial type (image has default)
- Update StrictUniqueSessionPersistDriver in tests to match new insertEvent signature
- Sync persist packages, openapi spec, and docs with upstream changes
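The signature change above can be illustrated with a minimal driver. The types below are assumptions for illustration (the real `SessionPersistDriver` lives in the SDK); the relevant point is that `insertEvent` now takes the session id as its first argument rather than reading it off the event:

```typescript
// Illustrative shapes only; not the SDK's actual types.
interface PersistedEvent {
  type: string;
  payload: unknown;
}

interface SessionPersistDriver {
  // New signature: sessionId is the first positional argument.
  insertEvent(sessionId: string, event: PersistedEvent): Promise<void>;
}

// Minimal in-memory driver matching the new signature.
class InMemoryDriver implements SessionPersistDriver {
  events = new Map<string, PersistedEvent[]>();

  async insertEvent(sessionId: string, event: PersistedEvent): Promise<void> {
    const list = this.events.get(sessionId) ?? [];
    list.push(event);
    this.events.set(sessionId, list);
  }
}

const driver = new InMemoryDriver();
driver.insertEvent("sess_1", { type: "user_message", payload: { text: "hi" } });
```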

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Add Modal and ComputeSDK built-in providers, update examples and docs

- Add `sandbox-agent/modal` provider using Modal SDK with node:22-slim image
- Add `sandbox-agent/computesdk` provider using ComputeSDK's unified sandbox API
- Update Modal and ComputeSDK examples to use new SDK providers
- Update Modal and ComputeSDK deploy docs with provider-based examples
- Add Modal to quickstart CodeGroup and docs.json navigation
- Add provider test entries for Modal and ComputeSDK
- Remove old standalone example files (modal.ts, computesdk.ts)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Fix Modal provider: pre-install agents in image, fire-and-forget exec for server

- Pre-install agents in Dockerfile commands so they are cached across creates
- Use fire-and-forget exec (no wait) to keep server alive in Modal sandbox
- Add memoryMiB option (default 2GB) to avoid OOM during agent install
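The fire-and-forget exec pattern above can be sketched like this. `execInSandbox` is a stand-in for the Modal SDK call, which is not shown here; the point is that awaiting the exec would block provider setup until the server process exits, so the provider deliberately starts it without awaiting:

```typescript
// Stand-in for running a long-lived command inside the sandbox. The real
// provider would call the Modal SDK here; this promise stays pending for
// as long as the server process runs.
async function execInSandbox(_command: string): Promise<void> {
  return new Promise<void>(() => {}); // never resolves while the server runs
}

function startServer(): void {
  // Intentionally not awaited: the exec is left running in the background so
  // the provider can return while the sandbox-agent server stays alive.
  void execInSandbox("sandbox-agent server").catch((err) => {
    console.error("server exited unexpectedly", err);
  });
}

startServer(); // returns immediately
```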

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Sync upstream changes: multiplayer docs, logos, openapi spec, foundry config

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* SDK: Add ensureServer() for automatic server recovery

Add ensureServer() to SandboxProvider interface to handle cases where the
sandbox-agent server stops or goes to sleep. The SDK now calls this method
after 3 consecutive health-check failures, allowing providers to restart the
server if needed. Most built-in providers (E2B, Daytona, Vercel, Modal,
ComputeSDK) implement it. Docker and Cloudflare manage the server lifecycle
differently, and the Local provider already supervises the server as a managed
child process, so those providers do not implement it.

Also update docs for quickstart, architecture, multiplayer, and session
persistence; mark persist-* packages as deprecated; and add ensureServer
implementations to all applicable providers.
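The recovery behavior described above can be sketched as follows. The types are stand-ins (the real `SandboxProvider` interface lives in the SDK); the sketch shows the rule stated in the commit message: after three consecutive failed health checks, the SDK calls the provider's optional `ensureServer()`:

```typescript
// Stand-in for the SDK's provider interface; only the relevant member shown.
interface SandboxProvider {
  ensureServer?(): Promise<void>;
}

async function monitorHealth(
  check: () => Promise<boolean>,
  provider: SandboxProvider,
  maxFailures = 3,
): Promise<"healthy" | "recovered" | "unrecoverable"> {
  let failures = 0;
  while (true) {
    if (await check()) return "healthy";
    failures++;
    if (failures >= maxFailures) {
      // Providers that don't manage the server lifecycle omit ensureServer.
      if (!provider.ensureServer) return "unrecoverable";
      await provider.ensureServer(); // provider restarts the server if needed
      return "recovered";
    }
  }
}

// Usage: a provider whose server comes back after ensureServer() is called.
let serverUp = false;
const provider: SandboxProvider = {
  ensureServer: async () => {
    serverUp = true;
  },
};
monitorHealth(async () => serverUp, provider).then((outcome) => {
  console.log(outcome);
});
```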

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>

* wip

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Nathan Flurry, 2026-03-15 20:29:28 -07:00, committed by GitHub
commit cf7e2a92c6 (parent 3426cbc6ec)
GPG key ID: B5690EEEBB952194 (no known key found for this signature in database)
112 changed files with 3739 additions and 3537 deletions

@@ -43,9 +43,42 @@
 - Regenerate `docs/openapi.json` when HTTP contracts change.
 - Keep `docs/inspector.mdx` and `docs/sdks/typescript.mdx` aligned with implementation.
 - Append blockers/decisions to `research/acp/friction.md` during ACP work.
-- `docs/agent-capabilities.mdx` lists models/modes/thought levels per agent. Update it when adding a new agent or changing `fallback_config_options`. If its "Last updated" date is >2 weeks old, re-run `cd scripts/agent-configs && npx tsx dump.ts` and update the doc to match. Source data: `scripts/agent-configs/resources/*.json` and hardcoded entries in `server/packages/sandbox-agent/src/router/support.rs` (`fallback_config_options`).
+- Each agent has its own doc page at `docs/agents/<name>.mdx` listing models, modes, and thought levels. Update the relevant page when changing `fallback_config_options`. To regenerate capability data, run `cd scripts/agent-configs && npx tsx dump.ts`. Source data: `scripts/agent-configs/resources/*.json` and hardcoded entries in `server/packages/sandbox-agent/src/router/support.rs` (`fallback_config_options`).
 - Some agent models are gated by subscription (e.g. Claude `opus`). The live report only shows models available to the current credentials. The static doc and JSON resource files should list all known models regardless of subscription tier.
+## Adding Providers
+When adding a new sandbox provider, update all of the following:
+- `sdks/typescript/src/providers/<name>.ts` — provider implementation
+- `sdks/typescript/package.json` — add `./<name>` export, peerDependencies, peerDependenciesMeta, devDependencies
+- `sdks/typescript/tsup.config.ts` — add entry point and external
+- `sdks/typescript/tests/providers.test.ts` — add test entry
+- `examples/<name>/` — create example with `src/index.ts` and `tests/<name>.test.ts`
+- `docs/deploy/<name>.mdx` — create deploy guide
+- `docs/docs.json` — add to Deploy pages navigation
+- `docs/quickstart.mdx` — add tab in "Start the sandbox" step, add credentials entry in "Passing LLM credentials" accordion
+## Adding Agents
+When adding a new agent, update all of the following:
+- `docs/agents/<name>.mdx` — create agent page with usage snippet and capabilities table
+- `docs/docs.json` — add to the Agents group under Agent
+- `docs/quickstart.mdx` — add tab in the "Create a session and send a prompt" CodeGroup
+## Persist Packages (Deprecated)
+- The `@sandbox-agent/persist-*` npm packages (`persist-sqlite`, `persist-postgres`, `persist-indexeddb`, `persist-rivet`) are deprecated stubs. They still publish to npm but throw a deprecation error at import time.
+- Driver implementations now live inline in examples and consuming packages:
+  - SQLite: `examples/persist-sqlite/src/persist.ts`
+  - Postgres: `examples/persist-postgres/src/persist.ts`
+  - IndexedDB: `frontend/packages/inspector/src/persist-indexeddb.ts`
+  - Rivet: inlined in `docs/multiplayer.mdx`
+  - In-memory: built into the main `sandbox-agent` SDK (`InMemorySessionPersistDriver`)
+- Docs (`docs/session-persistence.mdx`) link to the example implementations on GitHub instead of referencing the packages.
+- Do not re-add `@sandbox-agent/persist-*` as dependencies anywhere. New persist drivers should be copied into the consuming project directly.
 ## Install Version References
 - Channel policy:
@@ -74,6 +107,7 @@
 - `examples/docker/src/index.ts`
 - `examples/e2b/src/index.ts`
 - `examples/vercel/src/index.ts`
+- `sdks/typescript/src/providers/shared.ts`
 - `scripts/release/main.ts`
 - `scripts/release/promote-artifacts.ts`
 - `scripts/release/sdk.ts`
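The "Adding Providers" checklist above pairs with an implementation skeleton for `sdks/typescript/src/providers/<name>.ts`. The method names below are assumptions based on this PR's description (provisioning plus an optional `ensureServer`), not the SDK's actual interface; check an existing provider for the real shape:

```typescript
// Hypothetical provider skeleton; shapes are illustrative.
interface SandboxInfo {
  baseUrl: string;
}

interface SandboxProvider {
  name: string;
  create(): Promise<SandboxInfo>;
  ensureServer?(): Promise<void>;
}

function myProvider(_opts: { apiKey: string }): SandboxProvider {
  return {
    name: "my-provider",
    async create() {
      // 1. Provision the sandbox with the vendor SDK (elided).
      // 2. Start the sandbox-agent server inside it (elided).
      // 3. Return the URL the SDK should connect to.
      return { baseUrl: "http://127.0.0.1:2468" };
    },
    async ensureServer() {
      // Restart the server if the sandbox went to sleep (elided).
    },
  };
}
```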

Same change across five Dockerfiles:

@@ -10,7 +10,6 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./
 COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/
 COPY sdks/cli-shared/package.json ./sdks/cli-shared/
 COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/
-COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/
 COPY sdks/react/package.json ./sdks/react/
 COPY sdks/typescript/package.json ./sdks/typescript/
@@ -21,15 +20,13 @@ RUN pnpm install --filter @sandbox-agent/inspector...
 COPY docs/openapi.json ./docs/
 COPY sdks/cli-shared ./sdks/cli-shared
 COPY sdks/acp-http-client ./sdks/acp-http-client
-COPY sdks/persist-indexeddb ./sdks/persist-indexeddb
 COPY sdks/react ./sdks/react
 COPY sdks/typescript ./sdks/typescript
-# Build cli-shared, acp-http-client, SDK, then persist-indexeddb and react (depends on SDK)
+# Build cli-shared, acp-http-client, SDK, then react (depends on SDK)
 RUN cd sdks/cli-shared && pnpm exec tsup
 RUN cd sdks/acp-http-client && pnpm exec tsup
 RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup
-RUN cd sdks/persist-indexeddb && pnpm exec tsup
 RUN cd sdks/react && pnpm exec tsup
 # Copy inspector source and build

Two more Dockerfiles carry the same removal at different offsets:

@@ -12,7 +12,6 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./
 COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/
 COPY sdks/cli-shared/package.json ./sdks/cli-shared/
 COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/
-COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/
 COPY sdks/react/package.json ./sdks/react/
 COPY sdks/typescript/package.json ./sdks/typescript/
@@ -23,7 +22,6 @@ RUN pnpm install --filter @sandbox-agent/inspector...
 COPY docs/openapi.json ./docs/
 COPY sdks/cli-shared ./sdks/cli-shared
 COPY sdks/acp-http-client ./sdks/acp-http-client
-COPY sdks/persist-indexeddb ./sdks/persist-indexeddb
 COPY sdks/react ./sdks/react
 COPY sdks/typescript ./sdks/typescript
@@ -31,7 +29,6 @@ COPY sdks/typescript ./sdks/typescript
 RUN cd sdks/cli-shared && pnpm exec tsup
 RUN cd sdks/acp-http-client && pnpm exec tsup
 RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup
-RUN cd sdks/persist-indexeddb && pnpm exec tsup
 RUN cd sdks/react && pnpm exec tsup
 # Copy inspector source and build

@@ -11,7 +11,6 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./
 COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/
 COPY sdks/cli-shared/package.json ./sdks/cli-shared/
 COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/
-COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/
 COPY sdks/react/package.json ./sdks/react/
 COPY sdks/typescript/package.json ./sdks/typescript/
@@ -20,14 +19,12 @@ RUN pnpm install --filter @sandbox-agent/inspector...
 COPY docs/openapi.json ./docs/
 COPY sdks/cli-shared ./sdks/cli-shared
 COPY sdks/acp-http-client ./sdks/acp-http-client
-COPY sdks/persist-indexeddb ./sdks/persist-indexeddb
 COPY sdks/react ./sdks/react
 COPY sdks/typescript ./sdks/typescript
 RUN cd sdks/cli-shared && pnpm exec tsup
 RUN cd sdks/acp-http-client && pnpm exec tsup
 RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup
-RUN cd sdks/persist-indexeddb && pnpm exec tsup
 RUN cd sdks/react && pnpm exec tsup
 COPY frontend/packages/inspector ./frontend/packages/inspector

docs/agent-capabilities.mdx (deleted, 127 lines)

@@ -1,127 +0,0 @@
---
title: "Agent Capabilities"
description: "Models, modes, and thought levels supported by each agent."
---
Capabilities are subject to change as the agents are updated. See [Agent Sessions](/agent-sessions) for full session configuration API details.
<Info>
_Last updated: March 5th, 2026. See [Generating a live report](#generating-a-live-report) for up-to-date reference._
</Info>
## Claude
| Category | Values |
|----------|--------|
| **Models** | `default`, `sonnet`, `opus`, `haiku` |
| **Modes** | `default`, `acceptEdits`, `plan`, `dontAsk`, `bypassPermissions` |
| **Thought levels** | Unsupported |
### Configuring Effort Level For Claude
Claude does not natively support changing effort level after a session starts, so configure it in the filesystem before creating the session.
```ts
import { mkdir, writeFile } from "node:fs/promises";
import path from "node:path";
import { SandboxAgent } from "sandbox-agent";
const cwd = "/path/to/workspace";
await mkdir(path.join(cwd, ".claude"), { recursive: true });
await writeFile(
  path.join(cwd, ".claude", "settings.json"),
  JSON.stringify({ effortLevel: "high" }, null, 2),
);
const sdk = await SandboxAgent.connect({ baseUrl: "http://127.0.0.1:2468" });
await sdk.createSession({
  agent: "claude",
  sessionInit: { cwd, mcpServers: [] },
});
```
<Accordion title="Supported file locations (highest precedence last)">
1. `~/.claude/settings.json`
2. `<session cwd>/.claude/settings.json`
3. `<session cwd>/.claude/settings.local.json`
</Accordion>
## Codex
| Category | Values |
|----------|--------|
| **Models** | `gpt-5.3-codex` (default), `gpt-5.3-codex-spark`, `gpt-5.2-codex`, `gpt-5.1-codex-max`, `gpt-5.2`, `gpt-5.1-codex-mini` |
| **Modes** | `read-only` (default), `auto`, `full-access` |
| **Thought levels** | `low`, `medium`, `high` (default), `xhigh` |
## OpenCode
| Category | Values |
|----------|--------|
| **Models** | See below |
| **Modes** | `build` (default), `plan` |
| **Thought levels** | Unsupported |
<Accordion title="See all models">
| Provider | Models |
|----------|--------|
| **Anthropic** | `anthropic/claude-3-5-haiku-20241022`, `anthropic/claude-3-5-haiku-latest`, `anthropic/claude-3-5-sonnet-20240620`, `anthropic/claude-3-5-sonnet-20241022`, `anthropic/claude-3-7-sonnet-20250219`, `anthropic/claude-3-7-sonnet-latest`, `anthropic/claude-3-haiku-20240307`, `anthropic/claude-3-opus-20240229`, `anthropic/claude-3-sonnet-20240229`, `anthropic/claude-haiku-4-5`, `anthropic/claude-haiku-4-5-20251001`, `anthropic/claude-opus-4-0`, `anthropic/claude-opus-4-1`, `anthropic/claude-opus-4-1-20250805`, `anthropic/claude-opus-4-20250514`, `anthropic/claude-opus-4-5`, `anthropic/claude-opus-4-5-20251101`, `anthropic/claude-opus-4-6`, `anthropic/claude-sonnet-4-0`, `anthropic/claude-sonnet-4-20250514`, `anthropic/claude-sonnet-4-5`, `anthropic/claude-sonnet-4-5-20250929` |
| **OpenAI** | `openai/gpt-5.1-codex`, `openai/gpt-5.1-codex-max`, `openai/gpt-5.1-codex-mini`, `openai/gpt-5.2`, `openai/gpt-5.2-codex`, `openai/gpt-5.3-codex` |
| **Cerebras** | `cerebras/gpt-oss-120b`, `cerebras/qwen-3-235b-a22b-instruct-2507`, `cerebras/zai-glm-4.7` |
| **OpenCode Zen** | `opencode/big-pickle`, `opencode/claude-3-5-haiku`, `opencode/claude-haiku-4-5`, `opencode/claude-opus-4-1`, `opencode/claude-opus-4-5`, `opencode/claude-opus-4-6`, `opencode/claude-sonnet-4`, `opencode/claude-sonnet-4-5`, `opencode/gemini-3-flash`, `opencode/gemini-3-pro` (default), `opencode/glm-4.6`, `opencode/glm-4.7`, `opencode/gpt-5`, `opencode/gpt-5-codex`, `opencode/gpt-5-nano`, `opencode/gpt-5.1`, `opencode/gpt-5.1-codex`, `opencode/gpt-5.1-codex-max`, `opencode/gpt-5.1-codex-mini`, `opencode/gpt-5.2`, `opencode/gpt-5.2-codex`, `opencode/kimi-k2`, `opencode/kimi-k2-thinking`, `opencode/kimi-k2.5`, `opencode/kimi-k2.5-free`, `opencode/minimax-m2.1`, `opencode/minimax-m2.1-free`, `opencode/trinity-large-preview-free` |
</Accordion>
## Cursor
| Category | Values |
|----------|--------|
| **Models** | See below |
| **Modes** | Unsupported |
| **Thought levels** | Unsupported |
<Accordion title="See all models">
| Group | Models |
|-------|--------|
| **Auto** | `auto` |
| **Composer** | `composer-1.5`, `composer-1` |
| **GPT-5.3 Codex** | `gpt-5.3-codex`, `gpt-5.3-codex-low`, `gpt-5.3-codex-high`, `gpt-5.3-codex-xhigh`, `gpt-5.3-codex-fast`, `gpt-5.3-codex-low-fast`, `gpt-5.3-codex-high-fast`, `gpt-5.3-codex-xhigh-fast` |
| **GPT-5.2** | `gpt-5.2`, `gpt-5.2-high`, `gpt-5.2-codex`, `gpt-5.2-codex-low`, `gpt-5.2-codex-high`, `gpt-5.2-codex-xhigh`, `gpt-5.2-codex-fast`, `gpt-5.2-codex-low-fast`, `gpt-5.2-codex-high-fast`, `gpt-5.2-codex-xhigh-fast` |
| **GPT-5.1** | `gpt-5.1-high`, `gpt-5.1-codex-max`, `gpt-5.1-codex-max-high` |
| **Claude** | `opus-4.6-thinking` (default), `opus-4.6`, `opus-4.5`, `opus-4.5-thinking`, `sonnet-4.5`, `sonnet-4.5-thinking` |
| **Other** | `gemini-3-pro`, `gemini-3-flash`, `grok` |
</Accordion>
## Amp
| Category | Values |
|----------|--------|
| **Models** | `amp-default` |
| **Modes** | `default`, `bypass` |
| **Thought levels** | Unsupported |
## Pi
| Category | Values |
|----------|--------|
| **Models** | `default` |
| **Modes** | Unsupported |
| **Thought levels** | Unsupported |
## Generating a live report
Requires a running Sandbox Agent server. `--endpoint` defaults to `http://127.0.0.1:2468`.
```bash
sandbox-agent api agents report
```
<Note>
The live report reflects what the agent adapter returns for the current credentials. Some models may be gated by subscription (e.g. Claude's `opus` requires a paid plan) and will not appear in the report if the credentials don't have access.
</Note>

@@ -21,10 +21,7 @@ const sdk = await SandboxAgent.connect({
 const session = await sdk.createSession({
   agent: "codex",
-  sessionInit: {
-    cwd: "/",
-    mcpServers: [],
-  },
+  cwd: "/",
 });
 console.log(session.id, session.agentSessionId);

docs/agents/amp.mdx (new file, 20 lines)

@@ -0,0 +1,20 @@
---
title: "Amp"
description: "Use Amp as a sandbox agent."
---
## Usage
```typescript
const session = await client.createSession({
  agent: "amp",
});
```
## Capabilities
| Category | Values |
|----------|--------|
| **Models** | `amp-default` |
| **Modes** | `default`, `bypass` |
| **Thought levels** | Unsupported |

docs/agents/claude.mdx (new file, 49 lines)

@@ -0,0 +1,49 @@
---
title: "Claude"
description: "Use Claude Code as a sandbox agent."
---
## Usage
```typescript
const session = await client.createSession({
  agent: "claude",
});
```
## Capabilities
| Category | Values |
|----------|--------|
| **Models** | `default`, `sonnet`, `opus`, `haiku` |
| **Modes** | `default`, `acceptEdits`, `plan`, `dontAsk`, `bypassPermissions` |
| **Thought levels** | Unsupported |
## Configuring effort level
Claude does not support changing effort level after a session starts. Configure it in the filesystem before creating the session.
```ts
import { mkdir, writeFile } from "node:fs/promises";
import path from "node:path";
const cwd = "/path/to/workspace";
await mkdir(path.join(cwd, ".claude"), { recursive: true });
await writeFile(
  path.join(cwd, ".claude", "settings.json"),
  JSON.stringify({ effortLevel: "high" }, null, 2),
);
const session = await client.createSession({
  agent: "claude",
  cwd,
});
```
<Accordion title="Supported settings file locations (highest precedence last)">
1. `~/.claude/settings.json`
2. `<session cwd>/.claude/settings.json`
3. `<session cwd>/.claude/settings.local.json`
</Accordion>

docs/agents/codex.mdx (new file, 20 lines)

@@ -0,0 +1,20 @@
---
title: "Codex"
description: "Use OpenAI Codex as a sandbox agent."
---
## Usage
```typescript
const session = await client.createSession({
  agent: "codex",
});
```
## Capabilities
| Category | Values |
|----------|--------|
| **Models** | `gpt-5.3-codex` (default), `gpt-5.3-codex-spark`, `gpt-5.2-codex`, `gpt-5.1-codex-max`, `gpt-5.2`, `gpt-5.1-codex-mini` |
| **Modes** | `read-only` (default), `auto`, `full-access` |
| **Thought levels** | `low`, `medium`, `high` (default), `xhigh` |

docs/agents/cursor.mdx (new file, 34 lines)

@@ -0,0 +1,34 @@
---
title: "Cursor"
description: "Use Cursor as a sandbox agent."
---
## Usage
```typescript
const session = await client.createSession({
  agent: "cursor",
});
```
## Capabilities
| Category | Values |
|----------|--------|
| **Models** | See below |
| **Modes** | Unsupported |
| **Thought levels** | Unsupported |
<Accordion title="All models">
| Group | Models |
|-------|--------|
| **Auto** | `auto` |
| **Composer** | `composer-1.5`, `composer-1` |
| **GPT-5.3 Codex** | `gpt-5.3-codex`, `gpt-5.3-codex-low`, `gpt-5.3-codex-high`, `gpt-5.3-codex-xhigh`, `gpt-5.3-codex-fast`, `gpt-5.3-codex-low-fast`, `gpt-5.3-codex-high-fast`, `gpt-5.3-codex-xhigh-fast` |
| **GPT-5.2** | `gpt-5.2`, `gpt-5.2-high`, `gpt-5.2-codex`, `gpt-5.2-codex-low`, `gpt-5.2-codex-high`, `gpt-5.2-codex-xhigh`, `gpt-5.2-codex-fast`, `gpt-5.2-codex-low-fast`, `gpt-5.2-codex-high-fast`, `gpt-5.2-codex-xhigh-fast` |
| **GPT-5.1** | `gpt-5.1-high`, `gpt-5.1-codex-max`, `gpt-5.1-codex-max-high` |
| **Claude** | `opus-4.6-thinking` (default), `opus-4.6`, `opus-4.5`, `opus-4.5-thinking`, `sonnet-4.5`, `sonnet-4.5-thinking` |
| **Other** | `gemini-3-pro`, `gemini-3-flash`, `grok` |
</Accordion>

docs/agents/opencode.mdx (new file, 31 lines)

@@ -0,0 +1,31 @@
---
title: "OpenCode"
description: "Use OpenCode as a sandbox agent."
---
## Usage
```typescript
const session = await client.createSession({
  agent: "opencode",
});
```
## Capabilities
| Category | Values |
|----------|--------|
| **Models** | See below |
| **Modes** | `build` (default), `plan` |
| **Thought levels** | Unsupported |
<Accordion title="All models">
| Provider | Models |
|----------|--------|
| **Anthropic** | `anthropic/claude-3-5-haiku-20241022`, `anthropic/claude-3-5-haiku-latest`, `anthropic/claude-3-5-sonnet-20240620`, `anthropic/claude-3-5-sonnet-20241022`, `anthropic/claude-3-7-sonnet-20250219`, `anthropic/claude-3-7-sonnet-latest`, `anthropic/claude-3-haiku-20240307`, `anthropic/claude-3-opus-20240229`, `anthropic/claude-3-sonnet-20240229`, `anthropic/claude-haiku-4-5`, `anthropic/claude-haiku-4-5-20251001`, `anthropic/claude-opus-4-0`, `anthropic/claude-opus-4-1`, `anthropic/claude-opus-4-1-20250805`, `anthropic/claude-opus-4-20250514`, `anthropic/claude-opus-4-5`, `anthropic/claude-opus-4-5-20251101`, `anthropic/claude-opus-4-6`, `anthropic/claude-sonnet-4-0`, `anthropic/claude-sonnet-4-20250514`, `anthropic/claude-sonnet-4-5`, `anthropic/claude-sonnet-4-5-20250929` |
| **OpenAI** | `openai/gpt-5.1-codex`, `openai/gpt-5.1-codex-max`, `openai/gpt-5.1-codex-mini`, `openai/gpt-5.2`, `openai/gpt-5.2-codex`, `openai/gpt-5.3-codex` |
| **Cerebras** | `cerebras/gpt-oss-120b`, `cerebras/qwen-3-235b-a22b-instruct-2507`, `cerebras/zai-glm-4.7` |
| **OpenCode Zen** | `opencode/big-pickle`, `opencode/claude-3-5-haiku`, `opencode/claude-haiku-4-5`, `opencode/claude-opus-4-1`, `opencode/claude-opus-4-5`, `opencode/claude-opus-4-6`, `opencode/claude-sonnet-4`, `opencode/claude-sonnet-4-5`, `opencode/gemini-3-flash`, `opencode/gemini-3-pro` (default), `opencode/glm-4.6`, `opencode/glm-4.7`, `opencode/gpt-5`, `opencode/gpt-5-codex`, `opencode/gpt-5-nano`, `opencode/gpt-5.1`, `opencode/gpt-5.1-codex`, `opencode/gpt-5.1-codex-max`, `opencode/gpt-5.1-codex-mini`, `opencode/gpt-5.2`, `opencode/gpt-5.2-codex`, `opencode/kimi-k2`, `opencode/kimi-k2-thinking`, `opencode/kimi-k2.5`, `opencode/kimi-k2.5-free`, `opencode/minimax-m2.1`, `opencode/minimax-m2.1-free`, `opencode/trinity-large-preview-free` |
</Accordion>

docs/agents/pi.mdx (new file)
@@ -0,0 +1,20 @@
---
title: "Pi"
description: "Use Pi as a sandbox agent."
---
## Usage
```typescript
const session = await client.createSession({
agent: "pi",
});
```
## Capabilities
| Category | Values |
|----------|--------|
| **Models** | `default` |
| **Modes** | Unsupported |
| **Thought levels** | Unsupported |

@@ -1,64 +1,63 @@
---
title: "Architecture"
description: "How the Sandbox Agent server, SDK, and agent processes fit together."
icon: "microchip"
---

Sandbox Agent is a lightweight HTTP server that runs **inside** a sandbox. It:
- **Agent management**: Installs, spawns, and stops coding agent processes
- **Sessions**: Routes prompts to agents and streams events back in real time
- **Sandbox APIs**: Filesystem, process, and terminal access for the sandbox environment
## Components

```mermaid
flowchart LR
  CLIENT["Your App"]

  subgraph SANDBOX["Sandbox"]
    direction TB
    SERVER["Sandbox Agent Server"]
    AGENT["Agent Process<br/>(Claude, Codex, etc.)"]
    SERVER --> AGENT
  end

  CLIENT -->|"SDK (HTTP)"| SERVER
```

- **Your app**: Uses the `sandbox-agent` TypeScript SDK to talk to the server over HTTP.
- **Sandbox**: An isolated runtime (local process, Docker, E2B, Daytona, Vercel, Cloudflare).
- **Sandbox Agent server**: A single binary inside the sandbox that manages agent lifecycles, routes prompts, streams events, and exposes filesystem/process/terminal APIs.
- **Agent process**: A coding agent (Claude Code, Codex, etc.) spawned by the server. Each session maps to one agent process.
## What `SandboxAgent.start()` does

1. **Provision**: The provider creates a sandbox (starts a container, creates a VM, etc.)
2. **Install**: The Sandbox Agent binary is installed inside the sandbox
3. **Boot**: The server starts listening on an HTTP port
4. **Health check**: The SDK waits for `/v1/health` to respond
5. **Ready**: The SDK returns a connected client

For the `local` provider, provisioning is a no-op and the server runs as a local subprocess.
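The step-4 health check can be sketched as a plain polling loop (adapted from the manual examples these provider pages previously showed; the `/v1/health` endpoint and its `{ "status": "ok" }` body match the server's health route, but the helper itself is illustrative, not the SDK's internal code):

```typescript
// Poll the server's health route until it reports ready or the deadline passes.
export async function waitForHealth(
  baseUrl: string,
  timeoutMs = 120_000,
  intervalMs = 500,
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      const res = await fetch(`${baseUrl}/v1/health`);
      if (res.ok) {
        const body = (await res.json()) as { status?: string };
        if (body.status === "ok") return; // server is ready
      }
    } catch {
      // Server not listening yet; fall through and retry.
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Server did not become healthy within ${timeoutMs}ms`);
}
```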
### Server recovery
If the server process stops, the SDK automatically calls the provider's `ensureServer()` after 3 consecutive health-check failures. Most built-in providers implement this. Custom providers can add `ensureServer(sandboxId)` to their `SandboxProvider` object.
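The recovery rule above can be sketched as a small check-then-restart helper. Everything here is illustrative: `checkHealth` and `ensureServer` stand in for the SDK's internal health probe and the provider's recovery hook.

```typescript
// Run up to maxFailures consecutive health checks; if all fail, call the
// provider's recovery hook. Returns true when recovery was triggered.
export async function recoverIfUnhealthy(
  checkHealth: () => Promise<boolean>,
  ensureServer: () => Promise<void>,
  maxFailures = 3,
): Promise<boolean> {
  for (let failures = 0; failures < maxFailures; failures++) {
    if (await checkHealth()) return false; // healthy again, no recovery needed
  }
  // maxFailures consecutive checks failed: trigger the provider's recovery hook.
  await ensureServer();
  return true;
}
```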
## Server HTTP API
See the [HTTP API reference](/api-reference) for the full list of server endpoints.
## Agent installation
Agents are installed lazily on first use. To avoid the cold-start delay, pre-install them:
```bash
sandbox-agent install-agent --all
```

The `rivetdev/sandbox-agent:0.3.2-full` Docker image ships with all agents pre-installed.

## Production-ready agent orchestration

For production deployments, see [Orchestration Architecture](/orchestration-architecture) for recommended topology, backend requirements, and session persistence patterns.

@@ -259,7 +259,7 @@ Example output:
}
```

See individual agent pages (e.g. [Claude](/agents/claude), [Codex](/agents/codex)) for supported models, modes, and thought levels.

#### api agents install

@@ -80,9 +80,7 @@ await sdk.setMcpConfig(
const session = await sdk.createSession({
  agent: "claude",
  cwd: "/workspace",
});

await session.prompt([
@@ -145,9 +143,7 @@ await sdk.writeFsFile({ path: "/opt/skills/random-number/SKILL.md" }, skill);
```ts
const session = await sdk.createSession({
  agent: "claude",
  cwd: "/workspace",
});

await session.prompt([

@@ -31,7 +31,38 @@ RUN sandbox-agent install-agent claude && sandbox-agent install-agent codex
EXPOSE 8000
```

## TypeScript example (with provider)
For standalone scripts, use the `cloudflare` provider:
```bash
npm install sandbox-agent@0.3.x @cloudflare/sandbox
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { cloudflare } from "sandbox-agent/cloudflare";
const sdk = await SandboxAgent.start({
sandbox: cloudflare(),
});
try {
const session = await sdk.createSession({ agent: "codex" });
const response = await session.prompt([
{ type: "text", text: "Summarize this repository" },
]);
console.log(response.stopReason);
} finally {
await sdk.destroySandbox();
}
```
The `cloudflare` provider uses `containerFetch` under the hood, automatically stripping `AbortSignal` to avoid dropped streaming updates.
## TypeScript example (Durable Objects)
For Workers with Durable Objects, use `SandboxAgent.connect(...)` with a custom `fetch` backed by `sandbox.containerFetch(...)`:
```typescript
import { getSandbox, type Sandbox } from "@cloudflare/sandbox";
@@ -109,7 +140,6 @@ app.all("*", (c) => c.env.ASSETS.fetch(c.req.raw));
export default app;
```

This keeps all Sandbox Agent calls inside the Cloudflare sandbox routing path and does not require a `baseUrl`.
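The signal-stripping technique can be sketched independently of Cloudflare: wrap any fetch-compatible function so the `AbortSignal` never reaches the underlying transport. `innerFetch` stands in for a `containerFetch`-backed fetch; this illustrates the technique, not the provider's exact implementation.

```typescript
type FetchLike = (input: string | URL | Request, init?: RequestInit) => Promise<Response>;

// Wrap a fetch-like function so the AbortSignal is dropped before delegating,
// preventing long-lived SSE streams from being cancelled by an upstream abort.
export function stripAbortSignal(innerFetch: FetchLike): FetchLike {
  return (input, init) => {
    // Drop only the signal; forward method, headers, body, etc. unchanged.
    const { signal: _dropped, ...rest } = init ?? {};
    return innerFetch(input, rest);
  };
}
```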
## Troubleshooting streaming updates

@@ -1,160 +1,61 @@
---
title: "ComputeSDK"
description: "Deploy Sandbox Agent using ComputeSDK's provider-agnostic sandbox API."
---

[ComputeSDK](https://computesdk.com) provides a unified interface for managing sandboxes across multiple providers. Write once, deploy anywhere by changing environment variables.

## Prerequisites

- `COMPUTESDK_API_KEY` from [console.computesdk.com](https://console.computesdk.com)
- Provider API key (one of: `E2B_API_KEY`, `DAYTONA_API_KEY`, `VERCEL_TOKEN`, `MODAL_TOKEN_ID` + `MODAL_TOKEN_SECRET`, `BLAXEL_API_KEY`, `CSB_API_KEY`)
- `ANTHROPIC_API_KEY` or `OPENAI_API_KEY`

## TypeScript example
```bash
npm install sandbox-agent@0.3.x computesdk
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { computesdk } from "sandbox-agent/computesdk";

const envs: Record<string, string> = {};
if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;

const sdk = await SandboxAgent.start({
  sandbox: computesdk({
    create: { envs },
  }),
});

try {
  const session = await sdk.createSession({ agent: "claude" });
  const response = await session.prompt([
    { type: "text", text: "Summarize this repository" },
  ]);
  console.log(response.stopReason);
} finally {
  await sdk.destroySandbox();
}
```

The `computesdk` provider handles sandbox creation, Sandbox Agent installation, agent setup, and server startup automatically. ComputeSDK routes to your configured provider behind the scenes.
Before calling `SandboxAgent.start()`, configure ComputeSDK with your provider:
```typescript
import { compute } from "computesdk";
compute.setConfig({
provider: "e2b", // or auto-detect via detectProvider()
computesdkApiKey: process.env.COMPUTESDK_API_KEY,
});
```
## Supported providers
ComputeSDK auto-detects your provider from environment variables:

@@ -169,46 +70,7 @@ ComputeSDK auto-detects your provider from environment variables:
## Notes

- **Provider resolution**: Set `COMPUTESDK_PROVIDER` to force a specific provider, or let ComputeSDK auto-detect from API keys.
- `sandbox.runCommand(..., { background: true })` keeps the server running while your app continues.
- `sandbox.getUrl({ port })` returns a public URL for the sandbox port.
- Always destroy the sandbox when done to avoid leaking resources.

@@ -15,40 +15,37 @@ See [Daytona network limits](https://www.daytona.io/docs/en/network-limits/).
## TypeScript example

```bash
npm install sandbox-agent@0.3.x @daytonaio/sdk
```

```typescript
import { SandboxAgent } from "sandbox-agent";
import { daytona } from "sandbox-agent/daytona";

const envVars: Record<string, string> = {};
if (process.env.ANTHROPIC_API_KEY) envVars.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
if (process.env.OPENAI_API_KEY) envVars.OPENAI_API_KEY = process.env.OPENAI_API_KEY;

const sdk = await SandboxAgent.start({
  sandbox: daytona({
    create: { envVars },
  }),
});

try {
  const session = await sdk.createSession({ agent: "claude" });
  const response = await session.prompt([
    { type: "text", text: "Summarize this repository" },
  ]);
  console.log(response.stopReason);
} finally {
  await sdk.destroySandbox();
}
```
The `daytona` provider uses the `rivetdev/sandbox-agent:0.3.2-full` image by default and starts the server automatically.
## Using snapshots for faster startup

```typescript

@@ -15,43 +15,43 @@ Run the published full image with all supported agents pre-installed:
docker run --rm -p 3000:3000 \
  -e ANTHROPIC_API_KEY="$ANTHROPIC_API_KEY" \
  -e OPENAI_API_KEY="$OPENAI_API_KEY" \
  rivetdev/sandbox-agent:0.3.2-full \
  server --no-token --host 0.0.0.0 --port 3000
```

The `0.3.2-full` tag pins the exact version. The moving `full` tag is also published for contributors who want the latest full image.

## TypeScript with the Docker provider
```bash
npm install sandbox-agent@0.3.x dockerode get-port
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { docker } from "sandbox-agent/docker";

// Build env entries only for keys that are actually set; a bare template
// literal would produce a literal "KEY=undefined" entry that
// `.filter(Boolean)` cannot remove.
const env = [
  process.env.ANTHROPIC_API_KEY && `ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY}`,
  process.env.OPENAI_API_KEY && `OPENAI_API_KEY=${process.env.OPENAI_API_KEY}`,
].filter(Boolean) as string[];

const sdk = await SandboxAgent.start({
  sandbox: docker({ env }),
});

try {
  const session = await sdk.createSession({ agent: "codex" });
  await session.prompt([{ type: "text", text: "Summarize this repository." }]);
} finally {
  await sdk.destroySandbox();
}
```

The `docker` provider uses the `rivetdev/sandbox-agent:0.3.2-full` image by default. Override with `image`:

```typescript
docker({ image: "my-custom-image:latest" })
```
## Building a custom image with everything preinstalled

@@ -10,42 +10,37 @@ description: "Deploy Sandbox Agent inside an E2B sandbox."
## TypeScript example
```bash
npm install sandbox-agent@0.3.x @e2b/code-interpreter
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { e2b } from "sandbox-agent/e2b";

const envs: Record<string, string> = {};
if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;

const sdk = await SandboxAgent.start({
  sandbox: e2b({
    create: { envs },
  }),
});

try {
  const session = await sdk.createSession({ agent: "claude" });
  const response = await session.prompt([
    { type: "text", text: "Summarize this repository" },
  ]);
  console.log(response.stopReason);
} finally {
  await sdk.destroySandbox();
}
```
The `e2b` provider handles sandbox creation, Sandbox Agent installation, agent setup, and server startup automatically.
## Faster cold starts

For faster startup, create a custom E2B template with Sandbox Agent and target agents pre-installed.

@@ -32,12 +32,15 @@ Or with npm/Bun:
## With the TypeScript SDK

The SDK can spawn and manage the server as a subprocess using the `local` provider:

```typescript
import { SandboxAgent } from "sandbox-agent";
import { local } from "sandbox-agent/local";

const sdk = await SandboxAgent.start({
  sandbox: local(),
});

const session = await sdk.createSession({
  agent: "claude",
@@ -47,7 +50,21 @@ await session.prompt([
  { type: "text", text: "Summarize this repository." },
]);

await sdk.destroySandbox();
```

This starts the server on an available local port and connects automatically.
Pass options to customize the local provider:
```typescript
const sdk = await SandboxAgent.start({
sandbox: local({
port: 3000,
log: "inherit",
env: {
ANTHROPIC_API_KEY: process.env.MY_ANTHROPIC_KEY,
},
}),
});
```

@@ -10,88 +10,43 @@ description: "Deploy Sandbox Agent inside a Modal sandbox."
## TypeScript example

```bash
npm install sandbox-agent@0.3.x modal
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { modal } from "sandbox-agent/modal";
const secrets: Record<string, string> = {};
if (process.env.ANTHROPIC_API_KEY) secrets.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
if (process.env.OPENAI_API_KEY) secrets.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const sdk = await SandboxAgent.start({
sandbox: modal({
create: { secrets },
}),
});
try {
const session = await sdk.createSession({ agent: "claude" });
const response = await session.prompt([
{ type: "text", text: "Summarize this repository" },
]);
console.log(response.stopReason);
} finally {
await sdk.destroySandbox();
}
```
The `modal` provider handles app creation, image building, sandbox provisioning, agent installation, server startup, and tunnel networking automatically.
## Faster cold starts
Modal caches image layers, so the Dockerfile commands that install `curl` and `sandbox-agent` only run on the first build. Subsequent sandbox creates reuse the cached image.
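Concretely, the cached image steps look roughly like this (adapted from the manual example this page previously showed; the provider's exact image definition and install URL may differ):

```typescript
// Dockerfile commands baked into the Modal image so installs happen once at
// image-build time and are cached across sandbox creates.
export const dockerfileCommands = [
  "RUN apt-get update && apt-get install -y curl ca-certificates",
  "RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/latest/install.sh | sh",
  // Pre-install agents at image-build time instead of per-sandbox.
  "RUN sandbox-agent install-agent claude && sandbox-agent install-agent codex",
];
```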
## Running the test
The example includes a health-check test. First, build the SDK:
```bash
pnpm --filter sandbox-agent build
```
Then run the test with your Modal credentials:
```bash
MODAL_TOKEN_ID=<your-token-id> MODAL_TOKEN_SECRET=<your-token-secret> npx vitest run
```
Run from `examples/modal/`. The test will skip if credentials are not set.
## Notes

- Modal sandboxes use [gVisor](https://gvisor.dev/) for strong isolation.
- Ports are exposed via encrypted tunnels (`encryptedPorts`). The provider uses `sb.tunnels()` to get the public HTTPS URL.
- Environment variables (API keys) are passed as Modal [Secrets](https://modal.com/docs/guide/secrets) for security.

@@ -10,52 +10,40 @@ description: "Deploy Sandbox Agent inside a Vercel Sandbox."
## TypeScript example

```bash
npm install sandbox-agent@0.3.x @vercel/sandbox
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { vercel } from "sandbox-agent/vercel";
const env: Record<string, string> = {};
if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const sdk = await SandboxAgent.start({
sandbox: vercel({
create: {
runtime: "node24",
env,
},
}),
});
try {
const session = await sdk.createSession({ agent: "claude" });
const response = await session.prompt([
{ type: "text", text: "Summarize this repository" },
]);
console.log(response.stopReason);
} finally {
await sdk.destroySandbox();
}
```
The `vercel` provider handles sandbox creation, Sandbox Agent installation, agent setup, and server startup automatically.
## Authentication

Vercel Sandboxes support OIDC token auth (recommended) and access-token auth.

@@ -58,20 +58,32 @@
        "icon": "server",
        "pages": [
          "deploy/local",
          "deploy/e2b",
          "deploy/daytona",
          "deploy/vercel",
          "deploy/cloudflare",
          "deploy/docker",
          "deploy/modal",
          "deploy/boxlite",
          "deploy/computesdk"
        ]
      }
    ]
  },
  {
    "group": "Agent",
    "pages": [
      "agent-sessions",
      {
        "group": "Agents",
        "icon": "robot",
        "pages": ["agents/claude", "agents/codex", "agents/opencode", "agents/cursor", "agents/amp", "agents/pi"]
      },
      "attachments",
      "skills-config",
      "mcp-config",
      "custom-tools"
    ]
  },
  {
    "group": "System",
@@ -79,12 +91,12 @@
  },
  {
    "group": "Orchestration",
    "pages": ["orchestration-architecture", "session-persistence", "observability", "multiplayer", "security"]
  },
  {
    "group": "Reference",
    "pages": [
      "architecture",
      "cli",
      "inspector",
      "opencode-compatibility",


@@ -27,9 +27,7 @@ await sdk.setMcpConfig(
// Create a session using the configured MCP servers
const session = await sdk.createSession({
  agent: "claude",
  cwd: "/workspace",
});
await session.prompt([


@@ -20,8 +20,40 @@ Use [actor keys](https://rivet.dev/docs/actors/keys) to map each workspace to on
```ts Actor (server)
import { actor, setup } from "rivetkit";
import { SandboxAgent, type SessionPersistDriver, type SessionRecord, type SessionEvent, type ListPageRequest, type ListPage, type ListEventsRequest } from "sandbox-agent";

interface RivetPersistData { sessions: Record<string, SessionRecord>; events: Record<string, SessionEvent[]>; }
type RivetPersistState = { _sandboxAgentPersist: RivetPersistData };

class RivetSessionPersistDriver implements SessionPersistDriver {
  private readonly stateKey: string;
  private readonly ctx: { state: Record<string, unknown> };

  constructor(ctx: { state: Record<string, unknown> }, options: { stateKey?: string } = {}) {
    this.ctx = ctx;
    this.stateKey = options.stateKey ?? "_sandboxAgentPersist";
    if (!this.ctx.state[this.stateKey]) {
      this.ctx.state[this.stateKey] = { sessions: {}, events: {} };
    }
  }

  private get data(): RivetPersistData { return this.ctx.state[this.stateKey] as RivetPersistData; }

  async getSession(id: string) { const s = this.data.sessions[id]; return s ? { ...s } : undefined; }

  async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
    const sorted = Object.values(this.data.sessions).sort((a, b) => a.createdAt - b.createdAt || a.id.localeCompare(b.id));
    const offset = Number(request.cursor ?? 0);
    const limit = request.limit ?? 100;
    const slice = sorted.slice(offset, offset + limit);
    return { items: slice, nextCursor: offset + slice.length < sorted.length ? String(offset + slice.length) : undefined };
  }

  async updateSession(session: SessionRecord) { this.data.sessions[session.id] = { ...session }; if (!this.data.events[session.id]) this.data.events[session.id] = []; }

  async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
    const all = [...(this.data.events[request.sessionId] ?? [])].sort((a, b) => a.eventIndex - b.eventIndex || a.id.localeCompare(b.id));
    const offset = Number(request.cursor ?? 0);
    const limit = request.limit ?? 100;
    const slice = all.slice(offset, offset + limit);
    return { items: slice, nextCursor: offset + slice.length < all.length ? String(offset + slice.length) : undefined };
  }

  async insertEvent(sessionId: string, event: SessionEvent) { const events = this.data.events[sessionId] ?? []; events.push({ ...event, payload: JSON.parse(JSON.stringify(event.payload)) }); this.data.events[sessionId] = events; }
}

type WorkspaceState = RivetPersistState & {
  sandboxId: string;
@@ -111,5 +143,5 @@ await conn.prompt({
## Notes
- Keep sandbox calls actor-only. Browser clients should not call Sandbox Agent directly.
- Copy the Rivet persist driver from the example above into your project so session history persists in actor state.
- For client connection patterns, see [Rivet JavaScript client](https://rivet.dev/docs/clients/javascript).
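The driver above uses a simple offset-as-string cursor scheme for both `listSessions` and `listEvents`. Factored out on its own, the pattern looks like this (`ListPage` and `paginate` are local illustrations here, not SDK exports):

```typescript
// Offset-based cursor pagination, as used by the persist driver above.
interface ListPage<T> {
  items: T[];
  nextCursor?: string;
}

function paginate<T>(sorted: T[], cursor?: string, limit = 100): ListPage<T> {
  const offset = Number(cursor ?? 0);
  const slice = sorted.slice(offset, offset + limit);
  const end = offset + slice.length;
  return {
    items: slice,
    // Only emit a cursor when there are more items to fetch.
    nextCursor: end < sorted.length ? String(end) : undefined,
  };
}

// Walk all pages with a small page size.
const data = Array.from({ length: 5 }, (_, i) => i);
let cursor: string | undefined;
const seen: number[] = [];
do {
  const page = paginate(data, cursor, 2);
  seen.push(...page.items);
  cursor = page.nextCursor;
} while (cursor);
console.log(seen); // [0, 1, 2, 3, 4]
```

Because the cursor is just a stringified offset into a stably sorted array, a client can resume iteration at any point without the driver holding per-client state.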


@@ -0,0 +1,43 @@
---
title: "Orchestration Architecture"
description: "Production topology, backend requirements, and session persistence."
icon: "sitemap"
---
This page covers production topology and backend requirements. Read [Architecture](/architecture) first for an overview of how the server, SDK, and agent processes fit together.
## Suggested Topology
Run the SDK on your backend, then call it from your frontend.
This extra hop is recommended because it keeps auth/token logic on the backend and makes persistence simpler.
```mermaid placement="top-right"
flowchart LR
BROWSER["Browser"]
subgraph BACKEND["Your backend"]
direction TB
SDK["Sandbox Agent SDK"]
end
subgraph SANDBOX_SIMPLE["Sandbox"]
SERVER_SIMPLE["Sandbox Agent server"]
end
BROWSER --> BACKEND
BACKEND --> SDK --> SERVER_SIMPLE
```
### Backend requirements
Your backend layer needs to handle:
- **Long-running connections**: prompts can take minutes.
- **Session affinity**: follow-up messages must reach the same session.
- **State between requests**: session metadata and event history must persist across requests.
- **Graceful recovery**: sessions should resume after backend restarts.
We recommend [Rivet](https://rivet.dev) over serverless because actors natively support the long-lived connections, session routing, and state persistence that agent workloads require.
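To make the session-affinity requirement concrete, here is a dependency-free sketch of deterministic workspace-to-backend routing. All names here are illustrative, not SDK APIs; real deployments would use Rivet actor keys or a sticky load balancer instead:

```typescript
// Minimal session-affinity sketch: route each workspace to a stable backend
// so follow-up messages always reach the same session.

function fnv1a(key: string): number {
  // FNV-1a hash: deterministic, so the same workspace always maps the same way.
  let hash = 0x811c9dc5;
  for (let i = 0; i < key.length; i++) {
    hash ^= key.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash;
}

function backendFor(workspaceId: string, backends: string[]): string {
  return backends[fnv1a(workspaceId) % backends.length];
}

const backends = ["backend-a", "backend-b", "backend-c"];
// Repeated lookups for the same workspace land on the same backend.
console.log(backendFor("workspace-42", backends) === backendFor("workspace-42", backends)); // true
```

Hash-based routing covers affinity only; the other requirements (persistence, graceful recovery) still need durable state behind each backend.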
## Session persistence
For storage driver options and replay behavior, see [Persisting Sessions](/session-persistence).


@@ -1,281 +1,370 @@
---
title: "Quickstart"
description: "Get a coding agent running in a sandbox in under a minute."
icon: "rocket"
---
<Steps>
<Step title="Install">
<Tabs>
<Tab title="npm">
```bash
npm install sandbox-agent@0.3.x
```
</Tab>
<Tab title="bun">
```bash
bun add sandbox-agent@0.3.x
# Allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()).
bun pm trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64
```
</Tab>
</Tabs>
</Step>
<Step title="Start the sandbox">
`SandboxAgent.start()` provisions a sandbox, starts a lightweight [Sandbox Agent server](/architecture) inside it, and connects your SDK client.
<Tabs>
<Tab title="Local">
```bash
npm install sandbox-agent@0.3.x
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { local } from "sandbox-agent/local";

// Runs on your machine. Inherits process.env automatically.
const client = await SandboxAgent.start({
  sandbox: local(),
});
```
See [Local deploy guide](/deploy/local)
</Tab>
<Tab title="E2B">
```bash
npm install sandbox-agent@0.3.x @e2b/code-interpreter
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { e2b } from "sandbox-agent/e2b";

// Provisions a cloud sandbox on E2B, installs the server, and connects.
const client = await SandboxAgent.start({
  sandbox: e2b(),
});
```
See [E2B deploy guide](/deploy/e2b)
</Tab>
<Tab title="Daytona">
```bash
npm install sandbox-agent@0.3.x @daytonaio/sdk
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { daytona } from "sandbox-agent/daytona";
// Provisions a Daytona workspace with the server pre-installed.
const client = await SandboxAgent.start({
sandbox: daytona(),
});
```
See [Daytona deploy guide](/deploy/daytona)
</Tab>
<Tab title="Vercel">
```bash
npm install sandbox-agent@0.3.x @vercel/sandbox
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { vercel } from "sandbox-agent/vercel";
// Provisions a Vercel sandbox with the server installed on boot.
const client = await SandboxAgent.start({
sandbox: vercel(),
});
```
See [Vercel deploy guide](/deploy/vercel)
</Tab>
<Tab title="Modal">
```bash
npm install sandbox-agent@0.3.x modal
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { modal } from "sandbox-agent/modal";
// Builds a container image with agents pre-installed (cached after first run),
// starts a Modal sandbox from that image, and connects.
const client = await SandboxAgent.start({
sandbox: modal(),
});
```
See [Modal deploy guide](/deploy/modal)
</Tab>
<Tab title="Cloudflare">
```bash
npm install sandbox-agent@0.3.x @cloudflare/sandbox
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { cloudflare } from "sandbox-agent/cloudflare";
import { SandboxClient } from "@cloudflare/sandbox";
// Uses the Cloudflare Sandbox SDK to provision and connect.
// The Cloudflare SDK handles server lifecycle internally.
const cfSandboxClient = new SandboxClient();
const client = await SandboxAgent.start({
sandbox: cloudflare({ sdk: cfSandboxClient }),
});
```
See [Cloudflare deploy guide](/deploy/cloudflare)
</Tab>
<Tab title="Docker">
```bash
npm install sandbox-agent@0.3.x dockerode get-port
```
```typescript
import { SandboxAgent } from "sandbox-agent";
import { docker } from "sandbox-agent/docker";
// Runs a Docker container locally. Good for testing.
const client = await SandboxAgent.start({
sandbox: docker(),
});
```
See [Docker deploy guide](/deploy/docker)
</Tab>
</Tabs>
<div style={{ height: "1rem" }} />
<AccordionGroup>
<Accordion title="Passing LLM credentials">
Agents need API keys for their LLM provider. Each provider passes credentials differently:
```typescript
// Local — inherits process.env automatically

// E2B
e2b({ create: { envs: { ANTHROPIC_API_KEY: "..." } } })
// Daytona
daytona({ create: { envVars: { ANTHROPIC_API_KEY: "..." } } })
// Vercel
vercel({ create: { env: { ANTHROPIC_API_KEY: "..." } } })
// Modal
modal({ create: { secrets: { ANTHROPIC_API_KEY: "..." } } })
// Docker
docker({ env: ["ANTHROPIC_API_KEY=..."] })
```
For multi-tenant billing, per-user keys, and gateway options, see [LLM Credentials](/llm-credentials).
</Accordion>
<Accordion title="Implementing a custom provider">
Implement the `SandboxProvider` interface to use any sandbox platform:
```typescript
import { SandboxAgent, type SandboxProvider } from "sandbox-agent";
const myProvider: SandboxProvider = {
name: "my-provider",
async create() {
// Provision a sandbox, install & start the server, return an ID
return "sandbox-123";
},
async destroy(sandboxId) {
// Tear down the sandbox
},
async getUrl(sandboxId) {
// Return the Sandbox Agent server URL
return `https://${sandboxId}.my-platform.dev:3000`;
},
};
const client = await SandboxAgent.start({
sandbox: myProvider,
});
```
</Accordion>
<Accordion title="Connecting to an existing server">
If you already have a Sandbox Agent server running, connect directly:
```typescript
const client = await SandboxAgent.connect({
baseUrl: "http://127.0.0.1:2468",
});
```
</Accordion>
<Accordion title="Starting the server manually">
<Tabs>
<Tab title="curl">
```bash
curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh
sandbox-agent server --no-token --host 0.0.0.0 --port 2468
```
</Tab>
<Tab title="npx">
```bash
npx @sandbox-agent/cli@0.3.x server --no-token --host 0.0.0.0 --port 2468
```
</Tab>
<Tab title="Docker">
```bash
docker run -p 2468:2468 \
  -e ANTHROPIC_API_KEY="sk-ant-..." \
  -e OPENAI_API_KEY="sk-..." \
  rivetdev/sandbox-agent:0.3.2-full \
  server --no-token --host 0.0.0.0 --port 2468
```
</Tab>
</Tabs>
</Accordion>
</AccordionGroup>
</Step>
<Step title="Create a session and send a prompt">
<CodeGroup>
```typescript Claude
const session = await client.createSession({
  agent: "claude",
});

session.onEvent((event) => {
  console.log(event.sender, event.payload);
});

const result = await session.prompt([
  { type: "text", text: "Summarize the repository and suggest next steps." },
]);
console.log(result.stopReason);
```
```typescript Codex
const session = await client.createSession({
agent: "codex",
});
session.onEvent((event) => {
console.log(event.sender, event.payload);
});
const result = await session.prompt([
{ type: "text", text: "Summarize the repository and suggest next steps." },
]);
console.log(result.stopReason);
```
```typescript OpenCode
const session = await client.createSession({
agent: "opencode",
});
session.onEvent((event) => {
console.log(event.sender, event.payload);
});
const result = await session.prompt([
{ type: "text", text: "Summarize the repository and suggest next steps." },
]);
console.log(result.stopReason);
```
```typescript Cursor
const session = await client.createSession({
agent: "cursor",
});
session.onEvent((event) => {
console.log(event.sender, event.payload);
});
const result = await session.prompt([
{ type: "text", text: "Summarize the repository and suggest next steps." },
]);
console.log(result.stopReason);
```
```typescript Amp
const session = await client.createSession({
agent: "amp",
});
session.onEvent((event) => {
console.log(event.sender, event.payload);
});
const result = await session.prompt([
{ type: "text", text: "Summarize the repository and suggest next steps." },
]);
console.log(result.stopReason);
```
```typescript Pi
const session = await client.createSession({
agent: "pi",
});
session.onEvent((event) => {
console.log(event.sender, event.payload);
});
const result = await session.prompt([
{ type: "text", text: "Summarize the repository and suggest next steps." },
]);
console.log(result.stopReason);
```
</CodeGroup>
See [Agent Sessions](/agent-sessions) for the full sessions API.
</Step>
<Step title="Clean up">
```typescript
await client.destroySandbox(); // tears down the sandbox and disconnects
```
Use `client.dispose()` instead to disconnect without destroying the sandbox (for reconnecting later).
</Step>
<Step title="Inspect with the UI">
Open the Inspector at `/ui/` on your server (e.g. `http://localhost:2468/ui/`) to view sessions and events in a GUI.
<Frame>
<img src="/images/inspector.png" alt="Sandbox Agent Inspector" />
@@ -283,16 +372,44 @@ icon: "rocket"
</Step>
</Steps>
## Full example
```typescript
import { SandboxAgent } from "sandbox-agent";
import { e2b } from "sandbox-agent/e2b";
const client = await SandboxAgent.start({
sandbox: e2b({
create: {
envs: { ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY },
},
}),
});
try {
const session = await client.createSession({ agent: "claude" });
session.onEvent((event) => {
console.log(`[${event.sender}]`, JSON.stringify(event.payload));
});
const result = await session.prompt([
{ type: "text", text: "Write a function that checks if a number is prime." },
]);
console.log("Done:", result.stopReason);
} finally {
await client.destroySandbox();
}
```
## Next steps
<CardGroup cols={2}>
<Card title="SDK Overview" icon="compass" href="/sdk-overview">
Full TypeScript SDK API surface.
</Card>
<Card title="Deploy to a Sandbox" icon="box" href="/deploy/local">
Deploy to E2B, Daytona, Docker, Vercel, or Cloudflare.
</Card>
</CardGroup>


@@ -23,12 +23,6 @@ The TypeScript SDK is centered on `sandbox-agent` and its `SandboxAgent` class.
</Tab>
</Tabs>
## Optional React components
```bash
@@ -68,15 +62,12 @@ const sdk = await SandboxAgent.connect({
controller.abort();
```
With persistence (see [Persisting Sessions](/session-persistence) for driver options):
```ts
import { SandboxAgent, InMemorySessionPersistDriver } from "sandbox-agent";

const persist = new InMemorySessionPersistDriver();

const sdk = await SandboxAgent.connect({
  baseUrl: "http://127.0.0.1:2468",
@@ -84,25 +75,40 @@ const sdk = await SandboxAgent.connect({
});
```
Local spawn with a sandbox provider:
```ts
import { SandboxAgent } from "sandbox-agent";
import { local } from "sandbox-agent/local";

const sdk = await SandboxAgent.start({
  sandbox: local(),
});

// sdk.sandboxId — prefixed provider ID (e.g. "local/127.0.0.1:2468")
await sdk.destroySandbox(); // tears down sandbox + disposes client
```
`SandboxAgent.start(...)` requires a `sandbox` provider. Built-in providers:
| Import | Provider |
|--------|----------|
| `sandbox-agent/local` | Local subprocess |
| `sandbox-agent/docker` | Docker container |
| `sandbox-agent/e2b` | E2B sandbox |
| `sandbox-agent/daytona` | Daytona workspace |
| `sandbox-agent/vercel` | Vercel Sandbox |
| `sandbox-agent/cloudflare` | Cloudflare Sandbox |
Use `sdk.dispose()` to disconnect without destroying the sandbox, or `sdk.destroySandbox()` to tear down both.
## Session flow
```ts
const session = await sdk.createSession({
  agent: "mock",
  cwd: "/",
});

const prompt = await session.prompt([
@@ -223,6 +229,7 @@ Parameters:
- `token` (optional): Bearer token for authenticated servers
- `headers` (optional): Additional request headers
- `fetch` (optional): Custom fetch implementation used by SDK HTTP and session calls
- `skipHealthCheck` (optional): set `true` to skip the startup `/v1/health` wait
- `waitForHealth` (optional, defaults to enabled): waits for `/v1/health` before HTTP helpers and session setup proceed; pass `false` to disable or `{ timeoutMs }` to bound the wait
- `signal` (optional): aborts the startup `/v1/health` wait used by `connect()`


@@ -4,7 +4,7 @@ description: "Backend-first auth and access control patterns."
icon: "shield"
---
As covered in [Orchestration Architecture](/orchestration-architecture), run the Sandbox Agent client on your backend, not in the browser.
This keeps sandbox credentials private and gives you one place for authz, rate limiting, and audit logging.
@@ -92,7 +92,7 @@ export const workspace = actor({
const session = await sdk.createSession({
  agent: "claude",
  cwd: "/workspace",
});
session.onEvent((event) => {


@@ -10,14 +10,22 @@ With persistence enabled, sessions can be restored after runtime/session loss. S
Each driver stores:
- `SessionRecord` (`id`, `agent`, `agentSessionId`, `lastConnectionId`, `createdAt`, optional `destroyedAt`, optional `sandboxId`, optional `sessionInit`, optional `configOptions`, optional `modes`)
- `SessionEvent` (`id`, `eventIndex`, `sessionId`, `connectionId`, `sender`, `payload`, `createdAt`)
## Persistence drivers
### Rivet
Recommended for sandbox orchestration with actor state. See [Multiplayer](/multiplayer) for a full Rivet actor example with persistence in actor state.
### IndexedDB (browser)
Best for browser apps that should survive reloads. See the [Inspector source](https://github.com/rivet-dev/sandbox-agent/tree/main/frontend/packages/inspector/src/persist-indexeddb.ts) for a complete IndexedDB driver you can copy into your project.
### In-memory (built-in)
Best for local dev and ephemeral workloads. No extra dependencies required.
```ts
import { InMemorySessionPersistDriver, SandboxAgent } from "sandbox-agent";
@@ -33,91 +41,17 @@ const sdk = await SandboxAgent.connect({
});
```
### Rivet
Recommended for sandbox orchestration with actor state.
```bash
npm install @sandbox-agent/persist-rivet@0.3.x
```
```ts
import { actor } from "rivetkit";
import { SandboxAgent } from "sandbox-agent";
import { RivetSessionPersistDriver, type RivetPersistState } from "@sandbox-agent/persist-rivet";
type PersistedState = RivetPersistState & {
sandboxId: string;
baseUrl: string;
};
export default actor({
createState: async () => {
return {
sandboxId: "sbx_123",
baseUrl: "http://127.0.0.1:2468",
} satisfies Partial<PersistedState>;
},
createVars: async (c) => {
const persist = new RivetSessionPersistDriver(c);
const sdk = await SandboxAgent.connect({
baseUrl: c.state.baseUrl,
persist,
});
const session = await sdk.resumeOrCreateSession({ id: "default", agent: "codex" });
const unsubscribe = session.onEvent((event) => {
c.broadcast("session.event", event);
});
return { sdk, session, unsubscribe };
},
actions: {
sendMessage: async (c, message: string) => {
await c.vars.session.prompt([{ type: "text", text: message }]);
},
},
onSleep: async (c) => {
c.vars.unsubscribe?.();
await c.vars.sdk.dispose();
},
});
```
### IndexedDB
Best for browser apps that should survive reloads.
```bash
npm install @sandbox-agent/persist-indexeddb@0.3.x
```
```ts
import { SandboxAgent } from "sandbox-agent";
import { IndexedDbSessionPersistDriver } from "@sandbox-agent/persist-indexeddb";
const persist = new IndexedDbSessionPersistDriver({
databaseName: "sandbox-agent-session-store",
});
const sdk = await SandboxAgent.connect({
baseUrl: "http://127.0.0.1:2468",
persist,
});
```
### SQLite
Best for local/server Node apps that need durable storage without a DB server.
```bash
npm install better-sqlite3
```
```ts
import { SandboxAgent } from "sandbox-agent";
import { SQLiteSessionPersistDriver } from "./persist.ts";

const persist = new SQLiteSessionPersistDriver({
  filename: "./sandbox-agent.db",
@@ -129,17 +63,19 @@ const sdk = await SandboxAgent.connect({
});
```
See the [full SQLite example](https://github.com/rivet-dev/sandbox-agent/tree/main/examples/persist-sqlite) for the complete driver implementation you can copy into your project.
### Postgres
Use when you already run Postgres and want shared relational storage.
```bash
npm install pg
```
```ts
import { SandboxAgent } from "sandbox-agent";
import { PostgresSessionPersistDriver } from "./persist.ts";

const persist = new PostgresSessionPersistDriver({
  connectionString: process.env.DATABASE_URL,
@@ -152,6 +88,8 @@ const sdk = await SandboxAgent.connect({
});
```
See the [full Postgres example](https://github.com/rivet-dev/sandbox-agent/tree/main/examples/persist-postgres) for the complete driver implementation you can copy into your project.
### Custom driver ### Custom driver
Implement `SessionPersistDriver` for custom backends. Implement `SessionPersistDriver` for custom backends.
@ -160,11 +98,11 @@ Implement `SessionPersistDriver` for custom backends.
import type { SessionPersistDriver } from "sandbox-agent"; import type { SessionPersistDriver } from "sandbox-agent";
class MyDriver implements SessionPersistDriver { class MyDriver implements SessionPersistDriver {
async getSession(id) { return null; } async getSession(id) { return undefined; }
async listSessions(request) { return { items: [] }; } async listSessions(request) { return { items: [] }; }
async updateSession(session) {} async updateSession(session) {}
async listEvents(request) { return { items: [] }; } async listEvents(request) { return { items: [] }; }
async insertEvent(event) {} async insertEvent(sessionId, event) {}
} }
``` ```
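The updated `insertEvent(sessionId, event)` signature in the stub above can be smoke-tested with a minimal in-memory implementation of the same method set. This is a sketch only: `SessionRecord` and `EventRecord` are hypothetical placeholder types, not the SDK's real types, and the `{ items }` result shape is inferred from the stub, not from the published interface.

```typescript
// Hypothetical placeholder types; the real shapes come from "sandbox-agent".
type SessionRecord = { id: string; [key: string]: unknown };
type EventRecord = { payload: unknown };

// Minimal in-memory driver mirroring the SessionPersistDriver stub above,
// including the updated insertEvent(sessionId, event) signature.
class InMemoryDriver {
  private sessions = new Map<string, SessionRecord>();
  private events = new Map<string, EventRecord[]>();

  async getSession(id: string): Promise<SessionRecord | undefined> {
    // Missing sessions resolve to undefined (not null).
    return this.sessions.get(id);
  }
  async listSessions(): Promise<{ items: SessionRecord[] }> {
    return { items: [...this.sessions.values()] };
  }
  async updateSession(session: SessionRecord): Promise<void> {
    this.sessions.set(session.id, session);
  }
  async listEvents(request: { sessionId: string }): Promise<{ items: EventRecord[] }> {
    return { items: this.events.get(request.sessionId) ?? [] };
  }
  async insertEvent(sessionId: string, event: EventRecord): Promise<void> {
    // sessionId is now the first argument, matching the new SDK call site.
    const list = this.events.get(sessionId) ?? [];
    list.push(event);
    this.events.set(sessionId, list);
  }
}
```

Keying events by session id up front is what the signature change enables: the driver no longer has to dig the session id out of the event payload.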

---

@@ -35,9 +35,7 @@ await sdk.setSkillsConfig(
 // Create a session using the configured skills
 const session = await sdk.createSession({
   agent: "claude",
-  sessionInit: {
-    cwd: "/workspace",
-  },
+  cwd: "/workspace",
 });
 await session.prompt([

---

@@ -25,7 +25,7 @@ const baseUrl = "http://localhost:3000";
 console.log("Connecting to server...");
 const client = await SandboxAgent.connect({ baseUrl });
-const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/root", mcpServers: [] } });
+const session = await client.createSession({ agent: detectAgent(), cwd: "/root" });
 const sessionId = session.id;
 console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);

---

@@ -0,0 +1,154 @@
import { describe, it, expect } from "vitest";
import { spawn, type ChildProcess } from "node:child_process";
import { resolve, dirname } from "node:path";
import { fileURLToPath } from "node:url";
import { execSync } from "node:child_process";
const __dirname = dirname(fileURLToPath(import.meta.url));
const PROJECT_DIR = resolve(__dirname, "..");
/**
* Cloudflare Workers integration test.
*
* Set RUN_CLOUDFLARE_EXAMPLES=1 to enable. Requires wrangler and Docker.
*
* This starts `wrangler dev` which:
* 1. Builds the Dockerfile (cloudflare/sandbox base + sandbox-agent)
* 2. Starts a local Workers runtime with Durable Objects and containers
* 3. Exposes the app on a local port
*
* We then test through the proxy endpoint which forwards to sandbox-agent
* running inside the container.
*/
const shouldRun = process.env.RUN_CLOUDFLARE_EXAMPLES === "1";
const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10) || 600_000;
const testFn = shouldRun ? it : it.skip;
interface WranglerDev {
baseUrl: string;
cleanup: () => void;
}
async function startWranglerDev(): Promise<WranglerDev> {
// Build frontend assets first (wrangler expects dist/ to exist)
execSync("npx vite build", { cwd: PROJECT_DIR, stdio: "pipe" });
return new Promise<WranglerDev>((resolve, reject) => {
const child: ChildProcess = spawn("npx", ["wrangler", "dev", "--port", "0"], {
cwd: PROJECT_DIR,
stdio: ["ignore", "pipe", "pipe"],
detached: true,
env: {
...process.env,
// Ensure wrangler picks up API keys to pass to the container
NODE_ENV: "development",
},
});
let stdout = "";
let stderr = "";
let resolved = false;
const cleanup = () => {
if (child.pid) {
// Kill process group to ensure wrangler and its children are cleaned up
try {
process.kill(-child.pid, "SIGTERM");
} catch {
try {
child.kill("SIGTERM");
} catch {}
}
}
};
const timer = setTimeout(() => {
if (!resolved) {
resolved = true;
cleanup();
reject(new Error(`wrangler dev did not start within 120s.\nstdout: ${stdout}\nstderr: ${stderr}`));
}
}, 120_000);
const onData = (chunk: Buffer) => {
const text = chunk.toString();
stdout += text;
// wrangler dev prints "Ready on http://localhost:XXXX" when ready
const match = stdout.match(/Ready on (https?:\/\/[^\s]+)/i) ?? stdout.match(/(https?:\/\/(?:localhost|127\.0\.0\.1):\d+)/);
if (match && !resolved) {
resolved = true;
clearTimeout(timer);
resolve({ baseUrl: match[1], cleanup });
}
};
child.stdout?.on("data", onData);
child.stderr?.on("data", (chunk: Buffer) => {
const text = chunk.toString();
stderr += text;
// Some wrangler versions print ready message to stderr
const match = text.match(/Ready on (https?:\/\/[^\s]+)/i) ?? text.match(/(https?:\/\/(?:localhost|127\.0\.0\.1):\d+)/);
if (match && !resolved) {
resolved = true;
clearTimeout(timer);
resolve({ baseUrl: match[1], cleanup });
}
});
child.on("error", (err) => {
if (!resolved) {
resolved = true;
clearTimeout(timer);
reject(new Error(`wrangler dev failed to start: ${err.message}`));
}
});
child.on("exit", (code) => {
if (!resolved) {
resolved = true;
clearTimeout(timer);
reject(new Error(`wrangler dev exited with code ${code}.\nstdout: ${stdout}\nstderr: ${stderr}`));
}
});
});
}
describe("cloudflare example", () => {
testFn(
"starts wrangler dev and sandbox-agent responds via proxy",
async () => {
const { baseUrl, cleanup } = await startWranglerDev();
try {
// The Cloudflare example proxies requests through /sandbox/:name/proxy/*
// Wait for the container inside the Durable Object to start sandbox-agent
const healthUrl = `${baseUrl}/sandbox/test/proxy/v1/health`;
let healthy = false;
for (let i = 0; i < 120; i++) {
try {
const res = await fetch(healthUrl);
if (res.ok) {
const data = await res.json();
// The proxied health endpoint returns {name: "Sandbox Agent", ...}
if (data.status === "ok" || data.name === "Sandbox Agent") {
healthy = true;
break;
}
}
} catch {}
await new Promise((r) => setTimeout(r, 2000));
}
expect(healthy).toBe(true);
// Confirm a second request also works
const response = await fetch(healthUrl);
expect(response.ok).toBe(true);
} finally {
cleanup();
}
},
timeoutMs,
);
});
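The retry-until-healthy loop in the test above (and the near-identical one in the Docker test later in this diff) can be factored into a small helper. `waitUntilHealthy` and its parameters are illustrative names, not part of the SDK; the check callback is injected so the sketch is testable without a running server.

```typescript
// Generic readiness poller: retries a check until it reports healthy or
// attempts are exhausted. Mirrors the for-loop pattern in the tests above.
async function waitUntilHealthy(
  check: () => Promise<boolean>,
  { attempts = 60, intervalMs = 1000 }: { attempts?: number; intervalMs?: number } = {},
): Promise<boolean> {
  for (let i = 0; i < attempts; i++) {
    try {
      if (await check()) return true;
    } catch {
      // Treat network errors as "not ready yet" and keep polling.
    }
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  return false;
}
```

A test would pass `async () => (await fetch(healthUrl)).ok` as the check; returning a boolean instead of throwing keeps the `expect(healthy).toBe(true)` assertion style used in the suites.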

---

@@ -2,7 +2,7 @@ import { defineConfig } from "vitest/config";
 export default defineConfig({
   test: {
-    root: ".",
     include: ["tests/**/*.test.ts"],
+    testTimeout: 60000,
   },
 });

---

@@ -3,7 +3,7 @@
   "private": true,
   "type": "module",
   "scripts": {
-    "start": "tsx src/computesdk.ts",
+    "start": "tsx src/index.ts",
     "typecheck": "tsc --noEmit"
   },
   "dependencies": {

---

@@ -1,151 +0,0 @@
import {
compute,
detectProvider,
getMissingEnvVars,
getProviderConfigFromEnv,
isProviderAuthComplete,
isValidProvider,
PROVIDER_NAMES,
type ExplicitComputeConfig,
type ProviderName,
} from "computesdk";
import { SandboxAgent } from "sandbox-agent";
import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared";
import { fileURLToPath } from "node:url";
import { resolve } from "node:path";
const PORT = 3000;
const REQUEST_TIMEOUT_MS = Number.parseInt(process.env.COMPUTESDK_TIMEOUT_MS || "", 10) || 120_000;
/**
* Detects and validates the provider to use.
* Priority: COMPUTESDK_PROVIDER env var > auto-detection from API keys
*/
function resolveProvider(): ProviderName {
const providerOverride = process.env.COMPUTESDK_PROVIDER;
if (providerOverride) {
if (!isValidProvider(providerOverride)) {
throw new Error(`Unsupported ComputeSDK provider "${providerOverride}". Supported providers: ${PROVIDER_NAMES.join(", ")}`);
}
if (!isProviderAuthComplete(providerOverride)) {
const missing = getMissingEnvVars(providerOverride);
throw new Error(`Missing credentials for provider "${providerOverride}". Set: ${missing.join(", ")}`);
}
console.log(`Using ComputeSDK provider: ${providerOverride} (explicit)`);
return providerOverride as ProviderName;
}
const detected = detectProvider();
if (!detected) {
throw new Error(`No provider credentials found. Set one of: ${PROVIDER_NAMES.map((p) => getMissingEnvVars(p).join(", ")).join(" | ")}`);
}
console.log(`Using ComputeSDK provider: ${detected} (auto-detected)`);
return detected as ProviderName;
}
function configureComputeSDK(): void {
const provider = resolveProvider();
const config: ExplicitComputeConfig = {
provider,
computesdkApiKey: process.env.COMPUTESDK_API_KEY,
requestTimeoutMs: REQUEST_TIMEOUT_MS,
};
const providerConfig = getProviderConfigFromEnv(provider);
if (Object.keys(providerConfig).length > 0) {
const configWithProvider = config as ExplicitComputeConfig & Record<ProviderName, Record<string, string>>;
configWithProvider[provider] = providerConfig;
}
compute.setConfig(config);
}
configureComputeSDK();
const buildEnv = (): Record<string, string> => {
const env: Record<string, string> = {};
if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
return env;
};
export async function setupComputeSdkSandboxAgent(): Promise<{
baseUrl: string;
cleanup: () => Promise<void>;
}> {
const env = buildEnv();
console.log("Creating ComputeSDK sandbox...");
const sandbox = await compute.sandbox.create({
envs: Object.keys(env).length > 0 ? env : undefined,
});
const run = async (cmd: string, options?: { background?: boolean }) => {
const result = await sandbox.runCommand(cmd, options);
if (typeof result?.exitCode === "number" && result.exitCode !== 0) {
throw new Error(`Command failed: ${cmd} (exit ${result.exitCode})\n${result.stderr || ""}`);
}
return result;
};
console.log("Installing sandbox-agent...");
await run("curl -fsSL https://releases.rivet.dev/sandbox-agent/latest/install.sh | sh");
if (env.ANTHROPIC_API_KEY) {
console.log("Installing Claude agent...");
await run("sandbox-agent install-agent claude");
}
if (env.OPENAI_API_KEY) {
console.log("Installing Codex agent...");
await run("sandbox-agent install-agent codex");
}
console.log("Starting server...");
await run(`sandbox-agent server --no-token --host 0.0.0.0 --port ${PORT}`, { background: true });
const baseUrl = await sandbox.getUrl({ port: PORT });
const cleanup = async () => {
try {
await sandbox.destroy();
} catch (error) {
console.warn("Cleanup failed:", error instanceof Error ? error.message : error);
}
};
return { baseUrl, cleanup };
}
export async function runComputeSdkExample(): Promise<void> {
const { baseUrl, cleanup } = await setupComputeSdkSandboxAgent();
const handleExit = async () => {
await cleanup();
process.exit(0);
};
process.once("SIGINT", handleExit);
process.once("SIGTERM", handleExit);
const client = await SandboxAgent.connect({ baseUrl });
const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/home", mcpServers: [] } });
const sessionId = session.id;
console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
console.log(" Press Ctrl+C to stop.");
// Keep alive until SIGINT/SIGTERM triggers cleanup above
await new Promise(() => {});
}
const isDirectRun = Boolean(process.argv[1] && resolve(process.argv[1]) === fileURLToPath(import.meta.url));
if (isDirectRun) {
runComputeSdkExample().catch((error) => {
console.error(error instanceof Error ? error.message : error);
process.exit(1);
});
}

---

@@ -0,0 +1,30 @@
import { SandboxAgent } from "sandbox-agent";
import { computesdk } from "sandbox-agent/computesdk";
import { detectAgent } from "@sandbox-agent/example-shared";
const envs: Record<string, string> = {};
if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const client = await SandboxAgent.start({
sandbox: computesdk({
create: { envs },
}),
});
console.log(`UI: ${client.inspectorUrl}`);
const session = await client.createSession({
agent: detectAgent(),
});
session.onEvent((event) => {
console.log(`[${event.sender}]`, JSON.stringify(event.payload));
});
session.prompt([{ type: "text", text: "Say hello from ComputeSDK in one sentence." }]);
process.once("SIGINT", async () => {
await client.destroySandbox();
process.exit(0);
});

---

@@ -1,6 +1,6 @@
 import { describe, it, expect } from "vitest";
-import { buildHeaders } from "@sandbox-agent/example-shared";
-import { setupComputeSdkSandboxAgent } from "../src/computesdk.ts";
+import { SandboxAgent } from "sandbox-agent";
+import { computesdk } from "sandbox-agent/computesdk";
 const hasModal = Boolean(process.env.MODAL_TOKEN_ID && process.env.MODAL_TOKEN_SECRET);
 const hasVercel = Boolean(process.env.VERCEL_TOKEN || process.env.VERCEL_OIDC_TOKEN);
@@ -13,20 +13,23 @@ const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10)
 const testFn = shouldRun ? it : it.skip;
-describe("computesdk example", () => {
+describe("computesdk provider", () => {
   testFn(
     "starts sandbox-agent and responds to /v1/health",
     async () => {
-      const { baseUrl, cleanup } = await setupComputeSdkSandboxAgent();
+      const envs: Record<string, string> = {};
+      if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+      if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+      const sdk = await SandboxAgent.start({
+        sandbox: computesdk({ create: { envs } }),
+      });
       try {
-        const response = await fetch(`${baseUrl}/v1/health`, {
-          headers: buildHeaders({}),
-        });
-        expect(response.ok).toBe(true);
-        const data = await response.json();
-        expect(data.status).toBe("ok");
+        const health = await sdk.getHealth();
+        expect(health.status).toBe("ok");
       } finally {
-        await cleanup();
+        await sdk.destroySandbox();
       }
     },
     timeoutMs,

---

@@ -1,42 +1,31 @@
-import { Daytona } from "@daytonaio/sdk";
 import { SandboxAgent } from "sandbox-agent";
-import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared";
+import { daytona } from "sandbox-agent/daytona";
+import { detectAgent } from "@sandbox-agent/example-shared";
-const daytona = new Daytona();
 const envVars: Record<string, string> = {};
 if (process.env.ANTHROPIC_API_KEY) envVars.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
 if (process.env.OPENAI_API_KEY) envVars.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
-// Use default image and install sandbox-agent at runtime (faster startup, no snapshot build)
-console.log("Creating Daytona sandbox...");
-const sandbox = await daytona.create({ envVars, autoStopInterval: 0 });
-// Install sandbox-agent and start server
-console.log("Installing sandbox-agent...");
-await sandbox.process.executeCommand("curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh");
-console.log("Installing agents...");
-await sandbox.process.executeCommand("sandbox-agent install-agent claude");
-await sandbox.process.executeCommand("sandbox-agent install-agent codex");
-await sandbox.process.executeCommand("nohup sandbox-agent server --no-token --host 0.0.0.0 --port 3000 >/tmp/sandbox-agent.log 2>&1 &");
-const baseUrl = (await sandbox.getSignedPreviewUrl(3000, 4 * 60 * 60)).url;
-console.log("Connecting to server...");
-const client = await SandboxAgent.connect({ baseUrl });
-const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/home/daytona", mcpServers: [] } });
-const sessionId = session.id;
-console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
-console.log(" Press Ctrl+C to stop.");
-const keepAlive = setInterval(() => {}, 60_000);
-const cleanup = async () => {
-  clearInterval(keepAlive);
-  await sandbox.delete(60);
-  process.exit(0);
-};
-process.once("SIGINT", cleanup);
-process.once("SIGTERM", cleanup);
+const client = await SandboxAgent.start({
+  sandbox: daytona({
+    create: { envVars },
+  }),
+});
+console.log(`UI: ${client.inspectorUrl}`);
+const session = await client.createSession({
+  agent: detectAgent(),
+  cwd: "/home/daytona",
+});
+session.onEvent((event) => {
+  console.log(`[${event.sender}]`, JSON.stringify(event.payload));
+});
+session.prompt([{ type: "text", text: "Say hello from Daytona in one sentence." }]);
+process.once("SIGINT", async () => {
+  await client.destroySandbox();
+  process.exit(0);
+});

---

@@ -9,10 +9,10 @@
   "dependencies": {
     "@sandbox-agent/example-shared": "workspace:*",
     "dockerode": "latest",
+    "get-port": "latest",
     "sandbox-agent": "workspace:*"
   },
   "devDependencies": {
-    "@types/dockerode": "latest",
     "@types/node": "latest",
     "tsx": "latest",
     "typescript": "latest",

---

@@ -1,68 +1,40 @@
-import Docker from "dockerode";
 import fs from "node:fs";
 import path from "node:path";
 import { SandboxAgent } from "sandbox-agent";
-import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared";
+import { docker } from "sandbox-agent/docker";
+import { detectAgent } from "@sandbox-agent/example-shared";
 import { FULL_IMAGE } from "@sandbox-agent/example-shared/docker";
-const IMAGE = FULL_IMAGE;
-const PORT = 3000;
-const agent = detectAgent();
 const codexAuthPath = process.env.HOME ? path.join(process.env.HOME, ".codex", "auth.json") : null;
 const bindMounts = codexAuthPath && fs.existsSync(codexAuthPath) ? [`${codexAuthPath}:/home/sandbox/.codex/auth.json:ro`] : [];
+const env = [
+  process.env.ANTHROPIC_API_KEY ? `ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY}` : "",
+  process.env.OPENAI_API_KEY ? `OPENAI_API_KEY=${process.env.OPENAI_API_KEY}` : "",
+  process.env.CODEX_API_KEY ? `CODEX_API_KEY=${process.env.CODEX_API_KEY}` : "",
+].filter(Boolean);
-const docker = new Docker({ socketPath: "/var/run/docker.sock" });
-// Pull image if needed
-try {
-  await docker.getImage(IMAGE).inspect();
-} catch {
-  console.log(`Pulling ${IMAGE}...`);
-  await new Promise<void>((resolve, reject) => {
-    docker.pull(IMAGE, (err: Error | null, stream: NodeJS.ReadableStream) => {
-      if (err) return reject(err);
-      docker.modem.followProgress(stream, (err: Error | null) => (err ? reject(err) : resolve()));
-    });
-  });
-}
-console.log("Starting container...");
-const container = await docker.createContainer({
-  Image: IMAGE,
-  Cmd: ["server", "--no-token", "--host", "0.0.0.0", "--port", `${PORT}`],
-  Env: [
-    process.env.ANTHROPIC_API_KEY ? `ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY}` : "",
-    process.env.OPENAI_API_KEY ? `OPENAI_API_KEY=${process.env.OPENAI_API_KEY}` : "",
-    process.env.CODEX_API_KEY ? `CODEX_API_KEY=${process.env.CODEX_API_KEY}` : "",
-  ].filter(Boolean),
-  ExposedPorts: { [`${PORT}/tcp`]: {} },
-  HostConfig: {
-    AutoRemove: true,
-    PortBindings: { [`${PORT}/tcp`]: [{ HostPort: `${PORT}` }] },
-    Binds: bindMounts,
-  },
-});
-await container.start();
-const baseUrl = `http://127.0.0.1:${PORT}`;
-const client = await SandboxAgent.connect({ baseUrl });
-const session = await client.createSession({ agent, sessionInit: { cwd: "/home/sandbox", mcpServers: [] } });
-const sessionId = session.id;
-console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
-console.log(" Press Ctrl+C to stop.");
-const keepAlive = setInterval(() => {}, 60_000);
-const cleanup = async () => {
-  clearInterval(keepAlive);
-  try {
-    await container.stop({ t: 5 });
-  } catch {}
-  try {
-    await container.remove({ force: true });
-  } catch {}
-  process.exit(0);
-};
-process.once("SIGINT", cleanup);
-process.once("SIGTERM", cleanup);
+const client = await SandboxAgent.start({
+  sandbox: docker({
+    image: FULL_IMAGE,
+    env,
+    binds: bindMounts,
+  }),
+});
+console.log(`UI: ${client.inspectorUrl}`);
+const session = await client.createSession({
+  agent: detectAgent(),
+  cwd: "/home/sandbox",
+});
+session.onEvent((event) => {
+  console.log(`[${event.sender}]`, JSON.stringify(event.payload));
+});
+session.prompt([{ type: "text", text: "Say hello from Docker in one sentence." }]);
+process.once("SIGINT", async () => {
+  await client.destroySandbox();
+  process.exit(0);
+});

---

@@ -1,8 +1,15 @@
 import { describe, it, expect } from "vitest";
-import { buildHeaders } from "@sandbox-agent/example-shared";
-import { setupDockerSandboxAgent } from "../src/docker.ts";
+import { startDockerSandbox } from "@sandbox-agent/example-shared/docker";
-const shouldRun = process.env.RUN_DOCKER_EXAMPLES === "1";
+/**
+ * Docker integration test.
+ *
+ * Set SANDBOX_AGENT_DOCKER_IMAGE to the image tag to test (e.g. a locally-built
+ * full image). The test starts a container from that image, waits for
+ * sandbox-agent to become healthy, and validates the /v1/health endpoint.
+ */
+const image = process.env.SANDBOX_AGENT_DOCKER_IMAGE;
+const shouldRun = Boolean(image);
 const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10) || 300_000;
 const testFn = shouldRun ? it : it.skip;
@@ -11,11 +18,29 @@ describe("docker example", () => {
   testFn(
     "starts sandbox-agent and responds to /v1/health",
     async () => {
-      const { baseUrl, token, cleanup } = await setupDockerSandboxAgent();
+      const { baseUrl, cleanup } = await startDockerSandbox({
+        port: 2468,
+        image: image!,
+      });
       try {
-        const response = await fetch(`${baseUrl}/v1/health`, {
-          headers: buildHeaders({ token }),
-        });
+        // Wait for health check
+        let healthy = false;
+        for (let i = 0; i < 60; i++) {
+          try {
+            const res = await fetch(`${baseUrl}/v1/health`);
+            if (res.ok) {
+              const data = await res.json();
+              if (data.status === "ok") {
+                healthy = true;
+                break;
+              }
+            }
+          } catch {}
+          await new Promise((r) => setTimeout(r, 1000));
+        }
+        expect(healthy).toBe(true);
+        const response = await fetch(`${baseUrl}/v1/health`);
         expect(response.ok).toBe(true);
         const data = await response.json();
         expect(data.status).toBe("ok");

---

@@ -1,45 +1,28 @@
-import { Sandbox } from "@e2b/code-interpreter";
 import { SandboxAgent } from "sandbox-agent";
-import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared";
+import { e2b } from "sandbox-agent/e2b";
+import { detectAgent } from "@sandbox-agent/example-shared";
 const envs: Record<string, string> = {};
 if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
 if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
-console.log("Creating E2B sandbox...");
-const sandbox = await Sandbox.create({ allowInternetAccess: true, envs });
-const run = async (cmd: string) => {
-  const result = await sandbox.commands.run(cmd);
-  if (result.exitCode !== 0) throw new Error(`Command failed: ${cmd}\n${result.stderr}`);
-  return result;
-};
-console.log("Installing sandbox-agent...");
-await run("curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh");
-console.log("Installing agents...");
-await run("sandbox-agent install-agent claude");
-await run("sandbox-agent install-agent codex");
-console.log("Starting server...");
-await sandbox.commands.run("sandbox-agent server --no-token --host 0.0.0.0 --port 3000", { background: true, timeoutMs: 0 });
-const baseUrl = `https://${sandbox.getHost(3000)}`;
-console.log("Connecting to server...");
-const client = await SandboxAgent.connect({ baseUrl });
-const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/home/user", mcpServers: [] } });
-const sessionId = session.id;
-console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
-console.log(" Press Ctrl+C to stop.");
-const keepAlive = setInterval(() => {}, 60_000);
-const cleanup = async () => {
-  clearInterval(keepAlive);
-  await sandbox.kill();
-  process.exit(0);
-};
-process.once("SIGINT", cleanup);
-process.once("SIGTERM", cleanup);
+const client = await SandboxAgent.start({
+  sandbox: e2b({ create: { envs } }),
+});
+const session = await client.createSession({
+  agent: detectAgent(),
+  cwd: "/home/user",
+});
+session.onEvent((event) => {
+  console.log(`[${event.sender}]`, JSON.stringify(event.payload));
+});
+session.prompt([{ type: "text", text: "Say hello from E2B in one sentence." }]);
+process.once("SIGINT", async () => {
+  await client.destroySandbox();
+  process.exit(0);
+});

---

@@ -44,7 +44,7 @@ const readmeText = new TextDecoder().decode(readmeBytes);
 console.log(` README.md content: ${readmeText.trim()}`);
 console.log("Creating session...");
-const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/opt/my-project", mcpServers: [] } });
+const session = await client.createSession({ agent: detectAgent(), cwd: "/opt/my-project" });
 const sessionId = session.id;
 console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
 console.log(' Try: "read the README in /opt/my-project"');

---

@@ -3,7 +3,7 @@
   "private": true,
   "type": "module",
   "scripts": {
-    "start": "tsx src/modal.ts",
+    "start": "tsx src/index.ts",
     "typecheck": "tsc --noEmit"
   },
   "dependencies": {

---

@@ -0,0 +1,30 @@
import { SandboxAgent } from "sandbox-agent";
import { modal } from "sandbox-agent/modal";
import { detectAgent } from "@sandbox-agent/example-shared";
const secrets: Record<string, string> = {};
if (process.env.ANTHROPIC_API_KEY) secrets.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
if (process.env.OPENAI_API_KEY) secrets.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const client = await SandboxAgent.start({
sandbox: modal({
create: { secrets },
}),
});
console.log(`UI: ${client.inspectorUrl}`);
const session = await client.createSession({
agent: detectAgent(),
});
session.onEvent((event) => {
console.log(`[${event.sender}]`, JSON.stringify(event.payload));
});
session.prompt([{ type: "text", text: "Say hello from Modal in one sentence." }]);
process.once("SIGINT", async () => {
await client.destroySandbox();
process.exit(0);
});

---

@@ -1,123 +0,0 @@
import { ModalClient } from "modal";
import { SandboxAgent } from "sandbox-agent";
import { detectAgent, buildInspectorUrl, waitForHealth } from "@sandbox-agent/example-shared";
import { fileURLToPath } from "node:url";
import { resolve } from "node:path";
import { run } from "node:test";
const PORT = 3000;
const APP_NAME = "sandbox-agent";
async function buildSecrets(modal: ModalClient) {
const envVars: Record<string, string> = {};
if (process.env.ANTHROPIC_API_KEY)
envVars.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
if (process.env.OPENAI_API_KEY)
envVars.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
if (Object.keys(envVars).length === 0) return [];
return [await modal.secrets.fromObject(envVars)];
}
export async function setupModalSandboxAgent(): Promise<{
baseUrl: string;
cleanup: () => Promise<void>;
}> {
const modal = new ModalClient();
const app = await modal.apps.fromName(APP_NAME, { createIfMissing: true });
const image = modal.images
.fromRegistry("ubuntu:22.04")
.dockerfileCommands([
"RUN apt-get update && apt-get install -y curl ca-certificates",
"RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.2.x/install.sh | sh",
]);
const secrets = await buildSecrets(modal);
console.log("Creating Modal sandbox!");
const sb = await modal.sandboxes.create(app, image, {
secrets: secrets,
encryptedPorts: [PORT],
});
console.log(`Sandbox created: ${sb.sandboxId}`);
const exec = async (cmd: string) => {
const p = await sb.exec(["bash", "-c", cmd], {
stdout: "pipe",
stderr: "pipe",
});
const exitCode = await p.wait();
if (exitCode !== 0) {
const stderr = await p.stderr.readText();
throw new Error(`Command failed (exit ${exitCode}): ${cmd}\n${stderr}`);
}
};
if (process.env.ANTHROPIC_API_KEY) {
console.log("Installing Claude agent...");
await exec("sandbox-agent install-agent claude");
}
if (process.env.OPENAI_API_KEY) {
console.log("Installing Codex agent...");
await exec("sandbox-agent install-agent codex");
}
console.log("Starting server...");
await sb.exec(
["bash", "-c", `sandbox-agent server --no-token --host 0.0.0.0 --port ${PORT} &`],
);
const tunnels = await sb.tunnels();
const tunnel = tunnels[PORT];
if (!tunnel) {
throw new Error(`No tunnel found for port ${PORT}`);
}
const baseUrl = tunnel.url;
console.log("Waiting for server...");
await waitForHealth({ baseUrl });
const cleanup = async () => {
try {
await sb.terminate();
} catch (error) {
console.warn("Cleanup failed:", error instanceof Error ? error.message : error);
}
};
return { baseUrl, cleanup };
}
export async function runModalExample(): Promise<void> {
const { baseUrl, cleanup } = await setupModalSandboxAgent();
const handleExit = async () => {
await cleanup();
process.exit(0);
};
process.once("SIGINT", handleExit);
process.once("SIGTERM", handleExit);
const client = await SandboxAgent.connect({ baseUrl });
const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/root", mcpServers: [] } });
const sessionId = session.id;
console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
console.log(" Press Ctrl+C to stop.");
await new Promise(() => {});
}
const isDirectRun = Boolean(
process.argv[1] && resolve(process.argv[1]) === fileURLToPath(import.meta.url),
);
if (isDirectRun) {
runModalExample().catch((error) => {
console.error(error instanceof Error ? error.message : error);
process.exit(1);
});
}

---

@@ -1,26 +1,29 @@
 import { describe, it, expect } from "vitest";
-import { buildHeaders } from "@sandbox-agent/example-shared";
-import { setupModalSandboxAgent } from "../src/modal.ts";
+import { SandboxAgent } from "sandbox-agent";
+import { modal } from "sandbox-agent/modal";
 const shouldRun = Boolean(process.env.MODAL_TOKEN_ID && process.env.MODAL_TOKEN_SECRET);
 const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10) || 300_000;
 const testFn = shouldRun ? it : it.skip;
-describe("modal example", () => {
+describe("modal provider", () => {
   testFn(
     "starts sandbox-agent and responds to /v1/health",
     async () => {
-      const { baseUrl, cleanup } = await setupModalSandboxAgent();
+      const secrets: Record<string, string> = {};
+      if (process.env.ANTHROPIC_API_KEY) secrets.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+      if (process.env.OPENAI_API_KEY) secrets.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+      const sdk = await SandboxAgent.start({
+        sandbox: modal({ create: { secrets } }),
+      });
       try {
-        const response = await fetch(`${baseUrl}/v1/health`, {
-          headers: buildHeaders({}),
-        });
-        expect(response.ok).toBe(true);
-        const data = await response.json();
-        expect(data.status).toBe("ok");
+        const health = await sdk.getHealth();
+        expect(health.status).toBe("ok");
       } finally {
-        await cleanup();
+        await sdk.destroySandbox();
       }
     },
     timeoutMs,

View file

@@ -2,6 +2,7 @@ import { createInterface } from "node:readline/promises";
 import { stdin as input, stdout as output } from "node:process";
 import { Command } from "commander";
 import { SandboxAgent, type PermissionReply, type SessionPermissionRequest } from "sandbox-agent";
+import { local } from "sandbox-agent/local";
 const options = parseOptions();
 const agent = options.agent.trim().toLowerCase();
@@ -9,10 +10,7 @@ const autoReply = parsePermissionReply(options.reply);
 const promptText = options.prompt?.trim() || `Create ./permission-example.txt with the text 'hello from the ${agent} permissions example'.`;
 const sdk = await SandboxAgent.start({
-  spawn: {
-    enabled: true,
-    log: "inherit",
-  },
+  sandbox: local({ log: "inherit" }),
 });
 try {
@@ -43,10 +41,7 @@ try {
   const session = await sdk.createSession({
     agent,
     ...(mode ? { mode } : {}),
-    sessionInit: {
-      cwd: process.cwd(),
-      mcpServers: [],
-    },
+    cwd: process.cwd(),
   });
   const rl = autoReply

View file

@@ -8,7 +8,6 @@
   },
   "dependencies": {
     "@sandbox-agent/example-shared": "workspace:*",
-    "@sandbox-agent/persist-postgres": "workspace:*",
     "pg": "latest",
     "sandbox-agent": "workspace:*"
   },

View file

@@ -3,7 +3,7 @@ import { randomUUID } from "node:crypto";
 import { Client } from "pg";
 import { setTimeout as delay } from "node:timers/promises";
 import { SandboxAgent } from "sandbox-agent";
-import { PostgresSessionPersistDriver } from "@sandbox-agent/persist-postgres";
+import { PostgresSessionPersistDriver } from "./persist.ts";
 import { startDockerSandbox } from "@sandbox-agent/example-shared/docker";
 import { detectAgent } from "@sandbox-agent/example-shared";

View file

@@ -0,0 +1,336 @@
import { Pool, type PoolConfig } from "pg";
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_LIST_LIMIT = 100;
export interface PostgresSessionPersistDriverOptions {
connectionString?: string;
pool?: Pool;
poolConfig?: PoolConfig;
schema?: string;
}
export class PostgresSessionPersistDriver implements SessionPersistDriver {
private readonly pool: Pool;
private readonly ownsPool: boolean;
private readonly schema: string;
private readonly initialized: Promise<void>;
constructor(options: PostgresSessionPersistDriverOptions = {}) {
this.schema = normalizeSchema(options.schema ?? "public");
if (options.pool) {
this.pool = options.pool;
this.ownsPool = false;
} else {
this.pool = new Pool({
connectionString: options.connectionString,
...options.poolConfig,
});
this.ownsPool = true;
}
this.initialized = this.initialize();
}
async getSession(id: string): Promise<SessionRecord | undefined> {
await this.ready();
const result = await this.pool.query<SessionRow>(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json
FROM ${this.table("sessions")}
WHERE id = $1`,
[id],
);
if (result.rows.length === 0) {
return undefined;
}
return decodeSessionRow(result.rows[0]);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
await this.ready();
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rowsResult = await this.pool.query<SessionRow>(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json
FROM ${this.table("sessions")}
ORDER BY created_at ASC, id ASC
LIMIT $1 OFFSET $2`,
[limit, offset],
);
const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("sessions")}`);
const total = parseInteger(countResult.rows[0]?.count ?? "0");
const nextOffset = offset + rowsResult.rows.length;
return {
items: rowsResult.rows.map(decodeSessionRow),
nextCursor: nextOffset < total ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
await this.ready();
await this.pool.query(
`INSERT INTO ${this.table("sessions")} (
id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10)
ON CONFLICT(id) DO UPDATE SET
agent = EXCLUDED.agent,
agent_session_id = EXCLUDED.agent_session_id,
last_connection_id = EXCLUDED.last_connection_id,
created_at = EXCLUDED.created_at,
destroyed_at = EXCLUDED.destroyed_at,
sandbox_id = EXCLUDED.sandbox_id,
session_init_json = EXCLUDED.session_init_json,
config_options_json = EXCLUDED.config_options_json,
modes_json = EXCLUDED.modes_json`,
[
session.id,
session.agent,
session.agentSessionId,
session.lastConnectionId,
session.createdAt,
session.destroyedAt ?? null,
session.sandboxId ?? null,
session.sessionInit ? JSON.stringify(session.sessionInit) : null,
session.configOptions ? JSON.stringify(session.configOptions) : null,
session.modes !== undefined ? JSON.stringify(session.modes) : null,
],
);
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
await this.ready();
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rowsResult = await this.pool.query<EventRow>(
`SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json
FROM ${this.table("events")}
WHERE session_id = $1
ORDER BY event_index ASC, id ASC
LIMIT $2 OFFSET $3`,
[request.sessionId, limit, offset],
);
const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("events")} WHERE session_id = $1`, [
request.sessionId,
]);
const total = parseInteger(countResult.rows[0]?.count ?? "0");
const nextOffset = offset + rowsResult.rows.length;
return {
items: rowsResult.rows.map(decodeEventRow),
nextCursor: nextOffset < total ? String(nextOffset) : undefined,
};
}
async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
await this.ready();
await this.pool.query(
`INSERT INTO ${this.table("events")} (
id, event_index, session_id, created_at, connection_id, sender, payload_json
) VALUES ($1, $2, $3, $4, $5, $6, $7)
ON CONFLICT(id) DO UPDATE SET
event_index = EXCLUDED.event_index,
session_id = EXCLUDED.session_id,
created_at = EXCLUDED.created_at,
connection_id = EXCLUDED.connection_id,
sender = EXCLUDED.sender,
payload_json = EXCLUDED.payload_json`,
[event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, event.payload],
);
}
async close(): Promise<void> {
if (!this.ownsPool) {
return;
}
await this.pool.end();
}
private async ready(): Promise<void> {
await this.initialized;
}
private table(name: "sessions" | "events"): string {
return `"${this.schema}"."${name}"`;
}
private async initialize(): Promise<void> {
await this.pool.query(`CREATE SCHEMA IF NOT EXISTS "${this.schema}"`);
await this.pool.query(`
CREATE TABLE IF NOT EXISTS ${this.table("sessions")} (
id TEXT PRIMARY KEY,
agent TEXT NOT NULL,
agent_session_id TEXT NOT NULL,
last_connection_id TEXT NOT NULL,
created_at BIGINT NOT NULL,
destroyed_at BIGINT,
sandbox_id TEXT,
session_init_json JSONB,
config_options_json JSONB,
modes_json JSONB
)
`);
await this.pool.query(`
ALTER TABLE ${this.table("sessions")}
ADD COLUMN IF NOT EXISTS sandbox_id TEXT
`);
await this.pool.query(`
ALTER TABLE ${this.table("sessions")}
ADD COLUMN IF NOT EXISTS config_options_json JSONB
`);
await this.pool.query(`
ALTER TABLE ${this.table("sessions")}
ADD COLUMN IF NOT EXISTS modes_json JSONB
`);
await this.pool.query(`
CREATE TABLE IF NOT EXISTS ${this.table("events")} (
id TEXT PRIMARY KEY,
event_index BIGINT NOT NULL,
session_id TEXT NOT NULL,
created_at BIGINT NOT NULL,
connection_id TEXT NOT NULL,
sender TEXT NOT NULL,
payload_json JSONB NOT NULL
)
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ALTER COLUMN id TYPE TEXT USING id::TEXT
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ADD COLUMN IF NOT EXISTS event_index BIGINT
`);
await this.pool.query(`
WITH ranked AS (
SELECT id, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC) AS ranked_index
FROM ${this.table("events")}
)
UPDATE ${this.table("events")} AS current_events
SET event_index = ranked.ranked_index
FROM ranked
WHERE current_events.id = ranked.id
AND current_events.event_index IS NULL
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ALTER COLUMN event_index SET NOT NULL
`);
await this.pool.query(`
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON ${this.table("events")}(session_id, event_index, id)
`);
}
}
type SessionRow = {
id: string;
agent: string;
agent_session_id: string;
last_connection_id: string;
created_at: string | number;
destroyed_at: string | number | null;
sandbox_id: string | null;
session_init_json: unknown | null;
config_options_json: unknown | null;
modes_json: unknown | null;
};
type EventRow = {
id: string | number;
event_index: string | number;
session_id: string;
created_at: string | number;
connection_id: string;
sender: string;
payload_json: unknown;
};
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agent_session_id,
lastConnectionId: row.last_connection_id,
createdAt: parseInteger(row.created_at),
destroyedAt: row.destroyed_at === null ? undefined : parseInteger(row.destroyed_at),
sandboxId: row.sandbox_id ?? undefined,
sessionInit: row.session_init_json ? (row.session_init_json as SessionRecord["sessionInit"]) : undefined,
configOptions: row.config_options_json ? (row.config_options_json as SessionRecord["configOptions"]) : undefined,
modes: row.modes_json ? (row.modes_json as SessionRecord["modes"]) : undefined,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: String(row.id),
eventIndex: parseInteger(row.event_index),
sessionId: row.session_id,
createdAt: parseInteger(row.created_at),
connectionId: row.connection_id,
sender: parseSender(row.sender),
payload: row.payload_json as SessionEvent["payload"],
};
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function parseInteger(value: string | number): number {
const parsed = typeof value === "number" ? value : Number.parseInt(value, 10);
if (!Number.isFinite(parsed)) {
throw new Error(`Invalid integer value returned by postgres: ${String(value)}`);
}
return parsed;
}
function parseSender(value: string): SessionEvent["sender"] {
if (value === "agent" || value === "client") {
return value;
}
throw new Error(`Invalid sender value returned by postgres: ${value}`);
}
function normalizeSchema(schema: string): string {
if (!/^[A-Za-z_][A-Za-z0-9_]*$/.test(schema)) {
throw new Error(`Invalid schema name '${schema}'. Use letters, numbers, and underscores only.`);
}
return schema;
}
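The Postgres driver above pages with an offset encoded as a string cursor (`nextCursor = String(offset + rows)` while rows remain). A minimal standalone model of that cursor contract, using the same `parseCursor`/limit rules as the driver but with an in-memory array instead of SQL; the `page` helper is hypothetical, not part of the SDK:

```typescript
// Standalone model of the offset-cursor pagination used by listSessions/listEvents.
type ListPage<T> = { items: T[]; nextCursor?: string };

const DEFAULT_LIST_LIMIT = 100;

function parseCursor(cursor: string | undefined): number {
  if (!cursor) return 0;
  const parsed = Number.parseInt(cursor, 10);
  return Number.isFinite(parsed) && parsed >= 0 ? parsed : 0;
}

function page<T>(all: T[], cursor?: string, limit?: number): ListPage<T> {
  const offset = parseCursor(cursor);
  const max = limit !== undefined && Number.isFinite(limit) && limit >= 1 ? Math.floor(limit) : DEFAULT_LIST_LIMIT;
  const items = all.slice(offset, offset + max);
  const nextOffset = offset + items.length;
  // A cursor is only returned while rows remain, mirroring `nextOffset < total`.
  return { items, nextCursor: nextOffset < all.length ? String(nextOffset) : undefined };
}

const rows = Array.from({ length: 5 }, (_, i) => i);
const first = page(rows, undefined, 2); // items [0, 1], nextCursor "2"
const second = page(rows, first.nextCursor, 2); // items [2, 3], nextCursor "4"
const last = page(rows, second.nextCursor, 2); // items [4], no nextCursor
```

Because the cursor is a plain row offset, it is cheap but not stable across concurrent inserts; the driver accepts that trade-off in exchange for avoiding keyset bookkeeping.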

View file

@@ -8,10 +8,11 @@
   },
   "dependencies": {
     "@sandbox-agent/example-shared": "workspace:*",
-    "@sandbox-agent/persist-sqlite": "workspace:*",
+    "better-sqlite3": "^11.0.0",
     "sandbox-agent": "workspace:*"
   },
   "devDependencies": {
+    "@types/better-sqlite3": "^7.0.0",
     "@types/node": "latest",
     "tsx": "latest",
     "typescript": "latest"

View file

@@ -1,5 +1,5 @@
 import { SandboxAgent } from "sandbox-agent";
-import { SQLiteSessionPersistDriver } from "@sandbox-agent/persist-sqlite";
+import { SQLiteSessionPersistDriver } from "./persist.ts";
 import { startDockerSandbox } from "@sandbox-agent/example-shared/docker";
 import { detectAgent } from "@sandbox-agent/example-shared";

View file

@@ -0,0 +1,310 @@
import Database from "better-sqlite3";
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_LIST_LIMIT = 100;
export interface SQLiteSessionPersistDriverOptions {
filename?: string;
}
export class SQLiteSessionPersistDriver implements SessionPersistDriver {
private readonly db: Database.Database;
constructor(options: SQLiteSessionPersistDriverOptions = {}) {
this.db = new Database(options.filename ?? ":memory:");
this.initialize();
}
async getSession(id: string): Promise<SessionRecord | undefined> {
const row = this.db
.prepare(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json
FROM sessions WHERE id = ?`,
)
.get(id) as SessionRow | undefined;
if (!row) {
return undefined;
}
return decodeSessionRow(row);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rows = this.db
.prepare(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json
FROM sessions
ORDER BY created_at ASC, id ASC
LIMIT ? OFFSET ?`,
)
.all(limit, offset) as SessionRow[];
const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM sessions`).get() as { count: number };
const nextOffset = offset + rows.length;
return {
items: rows.map(decodeSessionRow),
nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
this.db
.prepare(
`INSERT INTO sessions (
id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(id) DO UPDATE SET
agent = excluded.agent,
agent_session_id = excluded.agent_session_id,
last_connection_id = excluded.last_connection_id,
created_at = excluded.created_at,
destroyed_at = excluded.destroyed_at,
sandbox_id = excluded.sandbox_id,
session_init_json = excluded.session_init_json,
config_options_json = excluded.config_options_json,
modes_json = excluded.modes_json`,
)
.run(
session.id,
session.agent,
session.agentSessionId,
session.lastConnectionId,
session.createdAt,
session.destroyedAt ?? null,
session.sandboxId ?? null,
session.sessionInit ? JSON.stringify(session.sessionInit) : null,
session.configOptions ? JSON.stringify(session.configOptions) : null,
session.modes !== undefined ? JSON.stringify(session.modes) : null,
);
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rows = this.db
.prepare(
`SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json
FROM events
WHERE session_id = ?
ORDER BY event_index ASC, id ASC
LIMIT ? OFFSET ?`,
)
.all(request.sessionId, limit, offset) as EventRow[];
const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM events WHERE session_id = ?`).get(request.sessionId) as { count: number };
const nextOffset = offset + rows.length;
return {
items: rows.map(decodeEventRow),
nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined,
};
}
async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
this.db
.prepare(
`INSERT INTO events (
id, event_index, session_id, created_at, connection_id, sender, payload_json
) VALUES (?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(id) DO UPDATE SET
event_index = excluded.event_index,
session_id = excluded.session_id,
created_at = excluded.created_at,
connection_id = excluded.connection_id,
sender = excluded.sender,
payload_json = excluded.payload_json`,
)
.run(event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, JSON.stringify(event.payload));
}
close(): void {
this.db.close();
}
private initialize(): void {
this.db.exec(`
CREATE TABLE IF NOT EXISTS sessions (
id TEXT PRIMARY KEY,
agent TEXT NOT NULL,
agent_session_id TEXT NOT NULL,
last_connection_id TEXT NOT NULL,
created_at INTEGER NOT NULL,
destroyed_at INTEGER,
sandbox_id TEXT,
session_init_json TEXT,
config_options_json TEXT,
modes_json TEXT
)
`);
const sessionColumns = this.db.prepare(`PRAGMA table_info(sessions)`).all() as TableInfoRow[];
if (!sessionColumns.some((column) => column.name === "sandbox_id")) {
this.db.exec(`ALTER TABLE sessions ADD COLUMN sandbox_id TEXT`);
}
if (!sessionColumns.some((column) => column.name === "config_options_json")) {
this.db.exec(`ALTER TABLE sessions ADD COLUMN config_options_json TEXT`);
}
if (!sessionColumns.some((column) => column.name === "modes_json")) {
this.db.exec(`ALTER TABLE sessions ADD COLUMN modes_json TEXT`);
}
this.ensureEventsTable();
}
private ensureEventsTable(): void {
const tableInfo = this.db.prepare(`PRAGMA table_info(events)`).all() as TableInfoRow[];
if (tableInfo.length === 0) {
this.createEventsTable();
return;
}
const idColumn = tableInfo.find((column) => column.name === "id");
const hasEventIndex = tableInfo.some((column) => column.name === "event_index");
const idType = (idColumn?.type ?? "").trim().toUpperCase();
const idIsText = idType === "TEXT";
if (!idIsText || !hasEventIndex) {
this.rebuildEventsTable(hasEventIndex);
}
this.db.exec(`
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON events(session_id, event_index, id)
`);
}
private createEventsTable(): void {
this.db.exec(`
CREATE TABLE IF NOT EXISTS events (
id TEXT PRIMARY KEY,
event_index INTEGER NOT NULL,
session_id TEXT NOT NULL,
created_at INTEGER NOT NULL,
connection_id TEXT NOT NULL,
sender TEXT NOT NULL,
payload_json TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON events(session_id, event_index, id)
`);
}
private rebuildEventsTable(hasEventIndex: boolean): void {
this.db.exec(`
ALTER TABLE events RENAME TO events_legacy;
`);
this.createEventsTable();
if (hasEventIndex) {
this.db.exec(`
INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json)
SELECT
CAST(id AS TEXT),
COALESCE(event_index, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC)),
session_id,
created_at,
connection_id,
sender,
payload_json
FROM events_legacy
`);
} else {
this.db.exec(`
INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json)
SELECT
CAST(id AS TEXT),
ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC),
session_id,
created_at,
connection_id,
sender,
payload_json
FROM events_legacy
`);
}
this.db.exec(`DROP TABLE events_legacy`);
}
}
type SessionRow = {
id: string;
agent: string;
agent_session_id: string;
last_connection_id: string;
created_at: number;
destroyed_at: number | null;
sandbox_id: string | null;
session_init_json: string | null;
config_options_json: string | null;
modes_json: string | null;
};
type EventRow = {
id: string;
event_index: number;
session_id: string;
created_at: number;
connection_id: string;
sender: "client" | "agent";
payload_json: string;
};
type TableInfoRow = {
name: string;
type: string;
};
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agent_session_id,
lastConnectionId: row.last_connection_id,
createdAt: row.created_at,
destroyedAt: row.destroyed_at ?? undefined,
sandboxId: row.sandbox_id ?? undefined,
sessionInit: row.session_init_json ? (JSON.parse(row.session_init_json) as SessionRecord["sessionInit"]) : undefined,
configOptions: row.config_options_json ? (JSON.parse(row.config_options_json) as SessionRecord["configOptions"]) : undefined,
modes: row.modes_json ? (JSON.parse(row.modes_json) as SessionRecord["modes"]) : undefined,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: row.id,
eventIndex: row.event_index,
sessionId: row.session_id,
createdAt: row.created_at,
connectionId: row.connection_id,
sender: row.sender,
payload: JSON.parse(row.payload_json),
};
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
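The legacy-table rebuild above backfills `event_index` with `ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC)`. The same numbering rule, sketched without SQLite so the ordering is easy to verify; `backfillEventIndex` is a hypothetical helper, not part of the driver:

```typescript
// Pure model of the event_index backfill: number events per session,
// ordered by created_at then id (the ORDER BY used in rebuildEventsTable).
type LegacyEvent = { id: string; session_id: string; created_at: number };

function backfillEventIndex(events: LegacyEvent[]): Map<string, number> {
  const bySession = new Map<string, LegacyEvent[]>();
  for (const event of events) {
    const group = bySession.get(event.session_id) ?? [];
    group.push(event);
    bySession.set(event.session_id, group);
  }
  const indexes = new Map<string, number>();
  for (const group of bySession.values()) {
    group.sort((a, b) => a.created_at - b.created_at || a.id.localeCompare(b.id));
    group.forEach((event, i) => indexes.set(event.id, i + 1)); // ROW_NUMBER() is 1-based
  }
  return indexes;
}

const indexes = backfillEventIndex([
  { id: "b", session_id: "s1", created_at: 2 },
  { id: "a", session_id: "s1", created_at: 1 },
  { id: "c", session_id: "s2", created_at: 5 },
]);
```

Partitioning by session means each session's event stream starts at index 1, matching the per-session `event_index` ordering both drivers rely on when listing events.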

View file

@@ -78,11 +78,11 @@ function readClaudeCredentialFiles(): ClaudeCredentialFile[] {
   const candidates: Array<{ hostPath: string; containerPath: string }> = [
     {
       hostPath: path.join(homeDir, ".claude", ".credentials.json"),
-      containerPath: "/root/.claude/.credentials.json",
+      containerPath: ".claude/.credentials.json",
     },
     {
       hostPath: path.join(homeDir, ".claude-oauth-credentials.json"),
-      containerPath: "/root/.claude-oauth-credentials.json",
+      containerPath: ".claude-oauth-credentials.json",
     },
   ];
@@ -180,10 +180,9 @@ export async function startDockerSandbox(opts: DockerSandboxOptions): Promise<Do
   const credentialBootstrapCommands = claudeCredentialFiles.flatMap((file, index) => {
     const envKey = `SANDBOX_AGENT_CLAUDE_CREDENTIAL_${index}_B64`;
     bootstrapEnv[envKey] = file.base64Content;
-    return [
-      `mkdir -p ${shellSingleQuotedLiteral(path.posix.dirname(file.containerPath))}`,
-      `printf %s "$${envKey}" | base64 -d > ${shellSingleQuotedLiteral(file.containerPath)}`,
-    ];
+    // Use $HOME-relative paths so credentials work regardless of container user
+    const containerDir = path.posix.dirname(file.containerPath);
+    return [`mkdir -p "$HOME/${containerDir}"`, `printf %s "$${envKey}" | base64 -d > "$HOME/${file.containerPath}"`];
   });
   setupCommands.unshift(...credentialBootstrapCommands);
 }
@@ -200,8 +199,9 @@ export async function startDockerSandbox(opts: DockerSandboxOptions): Promise<Do
   const container = await docker.createContainer({
     Image: image,
+    Entrypoint: ["/bin/sh", "-c"],
     WorkingDir: "/home/sandbox",
-    Cmd: ["sh", "-c", bootCommands.join(" && ")],
+    Cmd: [bootCommands.join(" && ")],
     Env: [...Object.entries(credentialEnv).map(([key, value]) => `${key}=${value}`), ...Object.entries(bootstrapEnv).map(([key, value]) => `${key}=${value}`)],
     ExposedPorts: { [`${port}/tcp`]: {} },
     HostConfig: {
@@ -253,10 +253,13 @@ export async function startDockerSandbox(opts: DockerSandboxOptions): Promise<Do
   try {
     await container.remove({ force: true });
   } catch {}
+  };
+  const signalCleanup = async () => {
+    await cleanup();
     process.exit(0);
   };
-  process.once("SIGINT", cleanup);
-  process.once("SIGTERM", cleanup);
+  process.once("SIGINT", signalCleanup);
+  process.once("SIGTERM", signalCleanup);
   return { baseUrl, cleanup };
 }
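The credential bootstrap change above switches from quoted absolute paths under `/root` to `$HOME`-relative paths decoded from a base64 env var. A minimal sketch of that command generation, with `bootstrapCommands` as a hypothetical extraction of the inline `flatMap` callback:

```typescript
// Hypothetical standalone model of the bootstrap commands: the credential is
// shipped base64-encoded in an env var and decoded into a $HOME-relative path
// so it lands correctly for any container user.
import path from "node:path";

function bootstrapCommands(containerPath: string, index: number): string[] {
  const envKey = `SANDBOX_AGENT_CLAUDE_CREDENTIAL_${index}_B64`;
  const containerDir = path.posix.dirname(containerPath);
  return [
    `mkdir -p "$HOME/${containerDir}"`,
    `printf %s "$${envKey}" | base64 -d > "$HOME/${containerPath}"`,
  ];
}

const cmds = bootstrapCommands(".claude/.credentials.json", 0);
console.log(cmds.join("\n"));
```

Double quotes (rather than the previous single-quoted literals) are what allow `$HOME` to expand inside the container shell.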

View file

@@ -36,7 +36,7 @@ await client.setSkillsConfig({ directory: "/", skillName: "random-number" }, { s
 // Create a session.
 console.log("Creating session with custom skill...");
-const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/root", mcpServers: [] } });
+const session = await client.createSession({ agent: detectAgent(), cwd: "/root" });
 const sessionId = session.id;
 console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
 console.log(' Try: "generate a random number between 1 and 100"');

View file

@@ -15,7 +15,7 @@ await client.setSkillsConfig(
 );
 console.log("Creating session...");
-const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/root", mcpServers: [] } });
+const session = await client.createSession({ agent: detectAgent(), cwd: "/root" });
 const sessionId = session.id;
 console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
 console.log(' Try: "How do I start sandbox-agent?"');

View file

@@ -1,56 +1,34 @@
-import { Sandbox } from "@vercel/sandbox";
 import { SandboxAgent } from "sandbox-agent";
-import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared";
+import { vercel } from "sandbox-agent/vercel";
+import { detectAgent } from "@sandbox-agent/example-shared";
-const envs: Record<string, string> = {};
-if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
-if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+const env: Record<string, string> = {};
+if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
-console.log("Creating Vercel sandbox...");
-const sandbox = await Sandbox.create({
-  runtime: "node24",
-  ports: [3000],
+const client = await SandboxAgent.start({
+  sandbox: vercel({
+    create: {
+      runtime: "node24",
+      env,
+    },
+  }),
 });
-const run = async (cmd: string, args: string[] = []) => {
-  const result = await sandbox.runCommand({ cmd, args, env: envs });
-  if (result.exitCode !== 0) {
-    const stderr = await result.stderr();
-    throw new Error(`Command failed: ${cmd} ${args.join(" ")}\n${stderr}`);
-  }
-  return result;
-};
-console.log("Installing sandbox-agent...");
-await run("sh", ["-c", "curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh"]);
-console.log("Installing agents...");
-await run("sandbox-agent", ["install-agent", "claude"]);
-await run("sandbox-agent", ["install-agent", "codex"]);
-console.log("Starting server...");
-await sandbox.runCommand({
-  cmd: "sandbox-agent",
-  args: ["server", "--no-token", "--host", "0.0.0.0", "--port", "3000"],
-  env: envs,
-  detached: true,
-});
-const baseUrl = sandbox.domain(3000);
-console.log("Connecting to server...");
-const client = await SandboxAgent.connect({ baseUrl });
-const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/home/vercel-sandbox", mcpServers: [] } });
-const sessionId = session.id;
-console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
-console.log(" Press Ctrl+C to stop.");
-const keepAlive = setInterval(() => {}, 60_000);
-const cleanup = async () => {
-  clearInterval(keepAlive);
-  await sandbox.stop();
+console.log(`UI: ${client.inspectorUrl}`);
+const session = await client.createSession({
+  agent: detectAgent(),
+  cwd: "/home/vercel-sandbox",
+});
+session.onEvent((event) => {
+  console.log(`[${event.sender}]`, JSON.stringify(event.payload));
+});
+session.prompt([{ type: "text", text: "Say hello from Vercel in one sentence." }]);
+process.once("SIGINT", async () => {
+  await client.destroySandbox();
   process.exit(0);
-};
-process.once("SIGINT", cleanup);
-process.once("SIGTERM", cleanup);
+});

View file

@@ -65,7 +65,6 @@ services:
       - "foundry_backend_root_node_modules:/app/node_modules"
       - "foundry_backend_backend_node_modules:/app/foundry/packages/backend/node_modules"
       - "foundry_backend_shared_node_modules:/app/foundry/packages/shared/node_modules"
-      - "foundry_backend_persist_rivet_node_modules:/app/sdks/persist-rivet/node_modules"
       - "foundry_backend_typescript_node_modules:/app/sdks/typescript/node_modules"
       - "foundry_backend_pnpm_store:/root/.local/share/pnpm/store"
       # Persist RivetKit local storage across container restarts.
@@ -120,7 +119,6 @@ volumes:
   foundry_backend_root_node_modules: {}
   foundry_backend_backend_node_modules: {}
   foundry_backend_shared_node_modules: {}
-  foundry_backend_persist_rivet_node_modules: {}
   foundry_backend_typescript_node_modules: {}
   foundry_backend_pnpm_store: {}
   foundry_rivetkit_storage: {}

View file

@@ -13,7 +13,6 @@ RUN pnpm --filter @sandbox-agent/foundry-shared build
 RUN pnpm --filter acp-http-client build
 RUN pnpm --filter @sandbox-agent/cli-shared build
 RUN SKIP_OPENAPI_GEN=1 pnpm --filter sandbox-agent build
-RUN pnpm --filter @sandbox-agent/persist-rivet build
 RUN pnpm --filter @sandbox-agent/foundry-backend build
 RUN pnpm --filter @sandbox-agent/foundry-backend deploy --prod /out

View file

@@ -18,7 +18,6 @@
     "@hono/node-ws": "^1.3.0",
     "@iarna/toml": "^2.2.5",
     "@sandbox-agent/foundry-shared": "workspace:*",
-    "@sandbox-agent/persist-rivet": "workspace:*",
     "better-auth": "^1.5.5",
     "dockerode": "^4.0.9",
     "drizzle-kit": "^0.31.8",

View file

@@ -6,10 +6,10 @@
   "type": "module",
   "scripts": {
     "dev": "vite",
-    "build": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/persist-indexeddb build && pnpm --filter @sandbox-agent/react build && vite build",
+    "build": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/react build && vite build",
     "preview": "vite preview",
-    "typecheck": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/persist-indexeddb build && pnpm --filter @sandbox-agent/react build && tsc --noEmit",
-    "test": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/persist-indexeddb build && pnpm --filter @sandbox-agent/react build && vitest run"
+    "typecheck": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/react build && tsc --noEmit",
+    "test": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/react build && vitest run"
   },
   "devDependencies": {
     "@sandbox-agent/react": "workspace:*",
@@ -23,7 +23,6 @@
     "vitest": "^3.0.0"
   },
   "dependencies": {
-    "@sandbox-agent/persist-indexeddb": "workspace:*",
     "lucide-react": "^0.469.0",
     "react": "^18.3.1",
     "react-dom": "^18.3.1"


@ -24,7 +24,7 @@ type ConfigOption = {
};
type AgentModeInfo = { id: string; name: string; description: string };
type AgentModelInfo = { id: string; name?: string };
import { IndexedDbSessionPersistDriver } from "./persist-indexeddb";
import ChatPanel from "./components/chat/ChatPanel";
import ConnectScreen from "./components/ConnectScreen";
import DebugPanel, { type DebugTab } from "./components/debug/DebugPanel";


@ -0,0 +1,320 @@
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_DB_NAME = "sandbox-agent-session-store";
const DEFAULT_DB_VERSION = 2;
const SESSIONS_STORE = "sessions";
const EVENTS_STORE = "events";
const EVENTS_BY_SESSION_INDEX = "by_session_index";
const DEFAULT_LIST_LIMIT = 100;
export interface IndexedDbSessionPersistDriverOptions {
databaseName?: string;
databaseVersion?: number;
indexedDb?: IDBFactory;
}
export class IndexedDbSessionPersistDriver implements SessionPersistDriver {
private readonly indexedDb: IDBFactory;
private readonly dbName: string;
private readonly dbVersion: number;
private readonly dbPromise: Promise<IDBDatabase>;
constructor(options: IndexedDbSessionPersistDriverOptions = {}) {
const indexedDb = options.indexedDb ?? globalThis.indexedDB;
if (!indexedDb) {
throw new Error("IndexedDB is not available in this runtime.");
}
this.indexedDb = indexedDb;
this.dbName = options.databaseName ?? DEFAULT_DB_NAME;
this.dbVersion = options.databaseVersion ?? DEFAULT_DB_VERSION;
this.dbPromise = this.openDatabase();
}
async getSession(id: string): Promise<SessionRecord | undefined> {
const db = await this.dbPromise;
const row = await requestToPromise<IDBValidKey | SessionRow | undefined>(db.transaction(SESSIONS_STORE, "readonly").objectStore(SESSIONS_STORE).get(id));
if (!row || typeof row !== "object") {
return undefined;
}
return decodeSessionRow(row as SessionRow);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const db = await this.dbPromise;
const rows = await getAllRows<SessionRow>(db, SESSIONS_STORE);
rows.sort((a, b) => {
if (a.createdAt !== b.createdAt) {
return a.createdAt - b.createdAt;
}
return a.id.localeCompare(b.id);
});
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const slice = rows.slice(offset, offset + limit).map(decodeSessionRow);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
const db = await this.dbPromise;
await transactionPromise(db, [SESSIONS_STORE], "readwrite", (tx) => {
tx.objectStore(SESSIONS_STORE).put(encodeSessionRow(session));
});
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const db = await this.dbPromise;
const rows = (await getAllRows<EventRow>(db, EVENTS_STORE)).filter((row) => row.sessionId === request.sessionId).sort(compareEventRowsByOrder);
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const slice = rows.slice(offset, offset + limit).map(decodeEventRow);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined,
};
}
async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
const db = await this.dbPromise;
await transactionPromise(db, [EVENTS_STORE], "readwrite", (tx) => {
tx.objectStore(EVENTS_STORE).put(encodeEventRow(event));
});
}
async close(): Promise<void> {
const db = await this.dbPromise;
db.close();
}
private openDatabase(): Promise<IDBDatabase> {
return new Promise((resolve, reject) => {
const request = this.indexedDb.open(this.dbName, this.dbVersion);
request.onupgradeneeded = () => {
const db = request.result;
if (!db.objectStoreNames.contains(SESSIONS_STORE)) {
db.createObjectStore(SESSIONS_STORE, { keyPath: "id" });
}
if (!db.objectStoreNames.contains(EVENTS_STORE)) {
const events = db.createObjectStore(EVENTS_STORE, { keyPath: "id" });
events.createIndex(EVENTS_BY_SESSION_INDEX, ["sessionId", "eventIndex", "id"], {
unique: false,
});
} else {
const tx = request.transaction;
if (!tx) {
return;
}
const events = tx.objectStore(EVENTS_STORE);
if (!events.indexNames.contains(EVENTS_BY_SESSION_INDEX)) {
events.createIndex(EVENTS_BY_SESSION_INDEX, ["sessionId", "eventIndex", "id"], {
unique: false,
});
}
}
};
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error ?? new Error("Unable to open IndexedDB"));
});
}
}
type SessionRow = {
id: string;
agent: string;
agentSessionId: string;
lastConnectionId: string;
createdAt: number;
destroyedAt?: number;
sandboxId?: string;
sessionInit?: SessionRecord["sessionInit"];
configOptions?: SessionRecord["configOptions"];
modes?: SessionRecord["modes"];
};
type EventRow = {
id: number | string;
eventIndex?: number;
sessionId: string;
createdAt: number;
connectionId: string;
sender: "client" | "agent";
payload: unknown;
};
function encodeSessionRow(session: SessionRecord): SessionRow {
return {
id: session.id,
agent: session.agent,
agentSessionId: session.agentSessionId,
lastConnectionId: session.lastConnectionId,
createdAt: session.createdAt,
destroyedAt: session.destroyedAt,
sandboxId: session.sandboxId,
sessionInit: session.sessionInit,
configOptions: session.configOptions,
modes: session.modes,
};
}
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agentSessionId,
lastConnectionId: row.lastConnectionId,
createdAt: row.createdAt,
destroyedAt: row.destroyedAt,
sandboxId: row.sandboxId,
sessionInit: row.sessionInit,
configOptions: row.configOptions,
modes: row.modes,
};
}
function encodeEventRow(event: SessionEvent): EventRow {
return {
id: event.id,
eventIndex: event.eventIndex,
sessionId: event.sessionId,
createdAt: event.createdAt,
connectionId: event.connectionId,
sender: event.sender,
payload: event.payload,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: String(row.id),
eventIndex: parseEventIndex(row.eventIndex, row.id),
sessionId: row.sessionId,
createdAt: row.createdAt,
connectionId: row.connectionId,
sender: row.sender,
payload: row.payload as SessionEvent["payload"],
};
}
async function getAllRows<T>(db: IDBDatabase, storeName: string): Promise<T[]> {
return await transactionPromise<T[]>(db, [storeName], "readonly", async (tx) => {
const request = tx.objectStore(storeName).getAll();
return (await requestToPromise(request)) as T[];
});
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function compareEventRowsByOrder(a: EventRow, b: EventRow): number {
const indexA = parseEventIndex(a.eventIndex, a.id);
const indexB = parseEventIndex(b.eventIndex, b.id);
if (indexA !== indexB) {
return indexA - indexB;
}
return String(a.id).localeCompare(String(b.id));
}
function parseEventIndex(value: number | undefined, fallback: number | string): number {
if (typeof value === "number" && Number.isFinite(value)) {
return Math.max(0, Math.floor(value));
}
const parsed = Number.parseInt(String(fallback), 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function requestToPromise<T>(request: IDBRequest<T>): Promise<T> {
return new Promise((resolve, reject) => {
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error ?? new Error("IndexedDB request failed"));
});
}
function transactionPromise<T>(db: IDBDatabase, stores: string[], mode: IDBTransactionMode, run: (tx: IDBTransaction) => T | Promise<T>): Promise<T> {
return new Promise((resolve, reject) => {
const tx = db.transaction(stores, mode);
let settled = false;
let resultValue: T | undefined;
let runCompleted = false;
let txCompleted = false;
function tryResolve() {
if (settled || !runCompleted || !txCompleted) {
return;
}
settled = true;
resolve(resultValue as T);
}
tx.oncomplete = () => {
txCompleted = true;
tryResolve();
};
tx.onerror = () => {
if (settled) {
return;
}
settled = true;
reject(tx.error ?? new Error("IndexedDB transaction failed"));
};
tx.onabort = () => {
if (settled) {
return;
}
settled = true;
reject(tx.error ?? new Error("IndexedDB transaction aborted"));
};
Promise.resolve(run(tx))
.then((value) => {
resultValue = value;
runCompleted = true;
tryResolve();
})
.catch((error) => {
if (!settled) {
settled = true;
reject(error);
}
try {
tx.abort();
} catch {
// no-op
}
});
});
}
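The `listSessions` and `listEvents` methods above share a simple offset-as-cursor pagination scheme: the cursor is the next offset stringified, invalid cursors fall back to `0`, and invalid limits fall back to `DEFAULT_LIST_LIMIT`. A minimal standalone sketch of that scheme (the `page` helper and sample data are hypothetical; the driver inlines this logic):

```typescript
// Standalone sketch of the offset-cursor pagination used by the driver.
const DEFAULT_LIMIT = 100;

interface Page<T> {
  items: T[];
  nextCursor?: string;
}

function page<T>(rows: T[], cursor?: string, limit?: number): Page<T> {
  // Invalid or missing cursors fall back to offset 0.
  const parsed = cursor ? Number.parseInt(cursor, 10) : 0;
  const offset = Number.isFinite(parsed) && parsed >= 0 ? parsed : 0;
  // Invalid or missing limits fall back to the default page size.
  const size =
    Number.isFinite(limit) && (limit ?? 0) >= 1 ? Math.floor(limit as number) : DEFAULT_LIMIT;
  const items = rows.slice(offset, offset + size);
  const nextOffset = offset + items.length;
  return {
    items,
    // The cursor for the next page is simply the next offset, stringified.
    nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined,
  };
}

// Walk all pages of a 5-row dataset with a page size of 2.
const rows = ["a", "b", "c", "d", "e"];
const collected: string[] = [];
let cursor: string | undefined;
do {
  const p = page(rows, cursor, 2);
  collected.push(...p.items);
  cursor = p.nextCursor;
} while (cursor);
console.log(collected.join(",")); // a,b,c,d,e
```

Because the cursor is just an offset, results can shift if rows are inserted between pages; the driver accepts that trade-off in exchange for not needing a keyset index.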


@ -0,0 +1,3 @@
<svg viewBox="0 0 24 24" fill="currentColor" xmlns="http://www.w3.org/2000/svg">
<path d="M16.5088 16.8447c.1475-.5068.0908-.9707-.1553-1.3154-.2246-.3164-.6045-.499-1.0615-.5205l-8.6592-.1123a.1559.1559 0 0 1-.1333-.0713c-.0283-.042-.0351-.0986-.021-.1553.0278-.084.1123-.1484.2036-.1562l8.7359-.1123c1.0351-.0489 2.1601-.8868 2.5537-1.9136l.499-1.3013c.0215-.0561.0293-.1128.0147-.168-.5625-2.5463-2.835-4.4453-5.5499-4.4453-2.5039 0-4.6284 1.6177-5.3876 3.8614-.4927-.3658-1.1187-.5625-1.794-.499-1.2026.119-2.1665 1.083-2.2861 2.2856-.0283.31-.0069.6128.0635.894C1.5683 13.171 0 14.7754 0 16.752c0 .1748.0142.3515.0352.5273.0141.083.0844.1475.1689.1475h15.9814c.0909 0 .1758-.0645.2032-.1553l.12-.4268zm2.7568-5.5634c-.0771 0-.1611 0-.2383.0112-.0566 0-.1054.0415-.127.0976l-.3378 1.1744c-.1475.5068-.0918.9707.1543 1.3164.2256.3164.6055.498 1.0625.5195l1.8437.1133c.0557 0 .1055.0263.1329.0703.0283.043.0351.1074.0214.1562-.0283.084-.1132.1485-.204.1553l-1.921.1123c-1.041.0488-2.1582.8867-2.5527 1.914l-.1406.3585c-.0283.0713.0215.1416.0986.1416h6.5977c.0771 0 .1474-.0489.169-.126.1122-.4082.1757-.837.1757-1.2803 0-2.6025-2.125-4.727-4.7344-4.727"/>
</svg>



@ -0,0 +1,3 @@
<svg viewBox="0 0 1711 1711" fill="currentColor" xmlns="http://www.w3.org/2000/svg">
<path d="M1036.26 1002.28H1274.13L1273.2 1021.37C1264.82 1131.69 1223.39 1219.67 1149.38 1283.44C1076.29 1346.75 978.54 1378.87 858.9 1378.87C728.09 1378.87 623.35 1334.18 547.47 1245.27C472.99 1157.29 434.82 1035.79 434.82 884.04V823.53C434.82 726.7 452.52 640.12 486.5 566.1C521.41 491.15 571.69 432.49 636.39 392.47C701.09 352.43 776.51 331.95 861.69 331.95C979.46 331.95 1075.82 364.07 1147.98 427.85C1220.6 491.15 1262.96 581.46 1274.13 695.52L1275.99 714.6H1037.65L1036.72 698.77C1032.07 639.66 1015.77 596.83 988.77 571.69C961.77 546.09 918.94 533.52 861.69 533.52C799.78 533.52 754.63 554.47 724.36 598.69C692.71 643.84 676.42 716.46 675.49 814.22V888.7C675.49 991.11 690.85 1066.53 721.11 1112.61C749.97 1156.83 795.12 1178.24 858.9 1178.24C917.09 1178.24 960.38 1165.67 987.85 1140.07C1014.84 1114.93 1031.14 1073.97 1035.33 1018.57L1036.26 1002.27V1002.28Z"/>
</svg>



@ -0,0 +1,3 @@
<svg viewBox="0 0 24 24" fill="currentColor" xmlns="http://www.w3.org/2000/svg">
<path d="M13.983 11.078h2.119a.186.186 0 00.186-.185V9.006a.186.186 0 00-.186-.186h-2.119a.185.185 0 00-.185.185v1.888c0 .102.083.185.185.185m-2.954-5.43h2.118a.186.186 0 00.186-.186V3.574a.186.186 0 00-.186-.185h-2.118a.185.185 0 00-.185.185v1.888c0 .102.082.185.185.185m0 2.716h2.118a.187.187 0 00.186-.186V6.29a.186.186 0 00-.186-.185h-2.118a.185.185 0 00-.185.185v1.887c0 .102.082.185.185.186m-2.93 0h2.12a.186.186 0 00.184-.186V6.29a.185.185 0 00-.185-.185H8.1a.185.185 0 00-.185.185v1.887c0 .102.083.185.185.186m-2.964 0h2.119a.186.186 0 00.185-.186V6.29a.185.185 0 00-.185-.185H5.136a.186.186 0 00-.186.185v1.887c0 .102.084.185.186.186m5.893 2.715h2.118a.186.186 0 00.186-.185V9.006a.186.186 0 00-.186-.186h-2.118a.185.185 0 00-.185.185v1.888c0 .102.082.185.185.185m-2.93 0h2.12a.185.185 0 00.184-.185V9.006a.185.185 0 00-.184-.186h-2.12a.185.185 0 00-.184.185v1.888c0 .102.083.185.185.185m-2.964 0h2.119a.185.185 0 00.185-.185V9.006a.185.185 0 00-.184-.186h-2.12a.186.186 0 00-.186.186v1.887c0 .102.084.185.186.185m-2.92 0h2.12a.185.185 0 00.184-.185V9.006a.185.185 0 00-.184-.186h-2.12a.185.185 0 00-.184.185v1.888c0 .102.082.185.185.185M23.763 9.89c-.065-.051-.672-.51-1.954-.51-.338.001-.676.03-1.01.087-.248-1.7-1.653-2.53-1.716-2.566l-.344-.199-.226.327c-.284.438-.49.922-.612 1.43-.23.97-.09 1.882.403 2.661-.595.332-1.55.413-1.744.42H.751a.751.751 0 00-.75.748 11.376 11.376 0 00.692 4.062c.545 1.428 1.355 2.48 2.41 3.124 1.18.723 3.1 1.137 5.275 1.137.983.003 1.963-.086 2.93-.266a12.248 12.248 0 003.823-1.389c.98-.567 1.86-1.288 2.61-2.136 1.252-1.418 1.998-2.997 2.553-4.4h.221c1.372 0 2.215-.549 2.68-1.009.309-.293.55-.65.707-1.046l.098-.288Z"/>
</svg>



@ -0,0 +1,3 @@
<svg viewBox="0 0 24 24" fill="currentColor" xmlns="http://www.w3.org/2000/svg">
<path d="M4.89 5.57 0 14.002l2.521 4.4h5.05l4.396-7.718 4.512 7.709 4.996.037L24 14.057l-4.857-8.452-5.073-.015-2.076 3.598L9.94 5.57Zm.837.729h3.787l1.845 3.252H7.572Zm9.189.021 3.803.012 4.228 7.355-3.736-.027zm-9.82.346L6.94 9.914l-4.209 7.389-1.892-3.3Zm9.187.014 4.297 7.343-1.892 3.282-4.3-7.344zm-6.713 3.6h3.79l-4.212 7.394H3.361Zm11.64 4.109 3.74.027-1.893 3.281-3.74-.027z"/>
</svg>



@ -5,8 +5,11 @@ import { Code, Server, GitBranch } from "lucide-react";
import { CopyButton } from "./ui/CopyButton";
const sdkCodeRaw = `import { SandboxAgent } from "sandbox-agent";
import { local } from "sandbox-agent/local";
const client = await SandboxAgent.start({
sandbox: local(),
});
await client.createSession("my-session", {
agent: "claude-code",
@ -32,13 +35,26 @@ function SdkCodeHighlighted() {
<span className="text-zinc-300"> </span>
<span className="text-green-400">"sandbox-agent"</span>
<span className="text-zinc-300">;</span>
{"\n"}
<span className="text-purple-400">import</span>
<span className="text-zinc-300">{" { "}</span>
<span className="text-white">local</span>
<span className="text-zinc-300">{" } "}</span>
<span className="text-purple-400">from</span>
<span className="text-zinc-300"> </span>
<span className="text-green-400">"sandbox-agent/local"</span>
<span className="text-zinc-300">;</span>
{"\n\n"}
<span className="text-purple-400">const</span>
<span className="text-zinc-300"> client = </span>
<span className="text-purple-400">await</span>
<span className="text-zinc-300"> SandboxAgent.</span>
<span className="text-blue-400">start</span>
<span className="text-zinc-300">{"({"}</span>
{"\n"}
<span className="text-zinc-300">{" sandbox: local(),"}</span>
{"\n"}
<span className="text-zinc-300">{"});"}</span>
{"\n\n"}
<span className="text-purple-400">await</span>
<span className="text-zinc-300"> client.</span>

pnpm-lock.yaml generated

@ -154,13 +154,13 @@ importers:
dockerode:
specifier: latest
version: 4.0.9
get-port:
specifier: latest
version: 7.1.0
sandbox-agent:
specifier: workspace:*
version: link:../../sdks/typescript
devDependencies:
'@types/dockerode':
specifier: latest
version: 4.0.1
'@types/node':
specifier: latest
version: 25.5.0
@ -345,9 +345,6 @@ importers:
'@sandbox-agent/example-shared':
specifier: workspace:*
version: link:../shared
'@sandbox-agent/persist-postgres':
specifier: workspace:*
version: link:../../sdks/persist-postgres
pg:
specifier: latest
version: 8.20.0
@ -373,13 +370,16 @@ importers:
'@sandbox-agent/example-shared':
specifier: workspace:*
version: link:../shared
'@sandbox-agent/persist-sqlite':
specifier: workspace:*
version: link:../../sdks/persist-sqlite
better-sqlite3:
specifier: ^11.0.0
version: 11.10.0
sandbox-agent:
specifier: workspace:*
version: link:../../sdks/typescript
devDependencies:
'@types/better-sqlite3':
specifier: ^7.0.0
version: 7.6.13
'@types/node':
specifier: latest
version: 25.5.0
@ -640,9 +640,6 @@ importers:
frontend/packages/inspector:
dependencies:
'@sandbox-agent/persist-indexeddb':
specifier: workspace:*
version: link:../../../sdks/persist-indexeddb
lucide-react:
specifier: ^0.469.0
version: 0.469.0(react@18.3.1)
@ -897,57 +894,30 @@ importers:
sdks/gigacode/platforms/win32-x64: {}
sdks/persist-indexeddb:
dependencies:
sandbox-agent:
specifier: workspace:*
version: link:../typescript
devDependencies:
'@types/node':
specifier: ^22.0.0
version: 22.19.7
fake-indexeddb:
specifier: ^6.2.4
version: 6.2.5
tsup:
specifier: ^8.0.0
version: 8.5.1(jiti@1.21.7)(postcss@8.5.6)(tsx@4.21.0)(typescript@5.9.3)(yaml@2.8.2)
typescript:
specifier: ^5.7.0
version: 5.9.3
vitest:
specifier: ^3.0.0
version: 3.2.4(@types/debug@4.1.12)(@types/node@22.19.7)(jiti@1.21.7)(tsx@4.21.0)(yaml@2.8.2)
sdks/persist-postgres:
dependencies:
pg:
specifier: ^8.16.3
version: 8.18.0
sandbox-agent:
specifier: workspace:*
version: link:../typescript
devDependencies:
'@types/node':
specifier: ^22.0.0
version: 22.19.7
'@types/pg':
specifier: ^8.15.6
version: 8.16.0
tsup:
specifier: ^8.0.0
version: 8.5.1(jiti@1.21.7)(postcss@8.5.6)(tsx@4.21.0)(typescript@5.9.3)(yaml@2.8.2)
typescript:
specifier: ^5.7.0
version: 5.9.3
vitest:
specifier: ^3.0.0
version: 3.2.4(@types/debug@4.1.12)(@types/node@22.19.7)(jiti@1.21.7)(tsx@4.21.0)(yaml@2.8.2)
sdks/persist-rivet:
dependencies:
sandbox-agent:
specifier: workspace:*
version: link:../typescript
devDependencies:
'@types/node':
specifier: ^22.0.0
@ -958,22 +928,9 @@ importers:
typescript:
specifier: ^5.7.0
version: 5.9.3
vitest:
specifier: ^3.0.0
version: 3.2.4(@types/debug@4.1.12)(@types/node@22.19.7)(jiti@1.21.7)(tsx@4.21.0)(yaml@2.8.2)
sdks/persist-sqlite:
dependencies:
better-sqlite3:
specifier: ^11.0.0
version: 11.10.0
sandbox-agent:
specifier: workspace:*
version: link:../typescript
devDependencies:
'@types/better-sqlite3':
specifier: ^7.0.0
version: 7.6.13
'@types/node':
specifier: ^22.0.0
version: 22.19.7
@ -983,9 +940,6 @@
typescript:
specifier: ^5.7.0
version: 5.9.3
vitest:
specifier: ^3.0.0
version: 3.2.4(@types/debug@4.1.12)(@types/node@22.19.7)(jiti@1.21.7)(tsx@4.21.0)(yaml@2.8.2)
sdks/react:
dependencies:
@ -1025,12 +979,39 @@ importers:
specifier: workspace:*
version: link:../cli
devDependencies:
'@cloudflare/sandbox':
specifier: '>=0.1.0'
version: 0.7.17(@opencode-ai/sdk@1.2.24)
'@daytonaio/sdk':
specifier: '>=0.12.0'
version: 0.151.0(ws@8.19.0)
'@e2b/code-interpreter':
specifier: '>=1.0.0'
version: 2.3.3
'@types/dockerode':
specifier: ^4.0.0
version: 4.0.1
'@types/node': '@types/node':
specifier: ^22.0.0 specifier: ^22.0.0
version: 22.19.7 version: 22.19.7
'@types/ws': '@types/ws':
specifier: ^8.18.1 specifier: ^8.18.1
version: 8.18.1 version: 8.18.1
'@vercel/sandbox':
specifier: '>=0.1.0'
version: 1.8.1
computesdk:
specifier: '>=0.1.0'
version: 2.5.0
dockerode:
specifier: '>=4.0.0'
version: 4.0.9
get-port:
specifier: '>=7.0.0'
version: 7.1.0
modal:
specifier: '>=0.1.0'
version: 0.7.3
openapi-typescript:
specifier: ^6.7.0
version: 6.7.6
@ -3607,9 +3588,6 @@ packages:
'@types/node@25.5.0':
resolution: {integrity: sha512-jp2P3tQMSxWugkCUKLRPVUpGaL5MVFwF8RDuSRztfwgN1wmqJeMSbKlnEtQqU8UrhTmzEmZdu2I6v2dpp7XIxw==}
'@types/pg@8.16.0':
resolution: {integrity: sha512-RmhMd/wD+CF8Dfo+cVIy3RR5cl8CyfXQ0tGgW6XBL8L4LM/UTEbNXYRbLwU6w+CgrKBNbrQWt4FUtTfaU5jSYQ==}
'@types/pg@8.18.0':
resolution: {integrity: sha512-gT+oueVQkqnj6ajGJXblFR4iavIXWsGAFCk3dP4Kki5+a9R4NMt0JARdk6s8cUKcfUoqP5dAtDSLU8xYUTFV+Q==}
@ -5823,9 +5801,6 @@ packages:
pg-cloudflare@1.3.0:
resolution: {integrity: sha512-6lswVVSztmHiRtD6I8hw4qP/nDm1EJbKMRhf3HCYaqud7frGysPv7FYJ5noZQdhQtN2xJnimfMtvQq21pdbzyQ==}
pg-connection-string@2.11.0:
resolution: {integrity: sha512-kecgoJwhOpxYU21rZjULrmrBJ698U2RxXofKVzOn5UDj61BPj/qMb7diYUR1nLScCDbrztQFl1TaQZT0t1EtzQ==}
pg-connection-string@2.12.0:
resolution: {integrity: sha512-U7qg+bpswf3Cs5xLzRqbXbQl85ng0mfSV/J0nnA31MCLgvEaAo7CIhmeyrmJpOr7o+zm0rXK+hNnT5l9RHkCkQ==}
@ -5833,11 +5808,6 @@
resolution: {integrity: sha512-WCtabS6t3c8SkpDBUlb1kjOs7l66xsGdKpIPZsg4wR+B3+u9UAum2odSsF9tnvxg80h4ZxLWMy4pRjOsFIqQpw==}
engines: {node: '>=4.0.0'}
pg-pool@3.11.0:
resolution: {integrity: sha512-MJYfvHwtGp870aeusDh+hg9apvOe2zmpZJpyt+BMtzUWlVqbhFmMK6bOBXLBUPd7iRtIF9fZplDc7KrPN3PN7w==}
peerDependencies:
pg: '>=8.0'
pg-pool@3.13.0:
resolution: {integrity: sha512-gB+R+Xud1gLFuRD/QgOIgGOBE2KCQPaPwkzBBGC9oG69pHTkhQeIuejVIk3/cnDyX39av2AxomQiyPT13WKHQA==}
peerDependencies:
@ -5853,15 +5823,6 @@
resolution: {integrity: sha512-qTAAlrEsl8s4OiEQY69wDvcMIdQN6wdz5ojQiOy6YRMuynxenON0O5oCpJI6lshc6scgAY8qvJ2On/p+CXY0GA==}
engines: {node: '>=4'}
pg@8.18.0:
resolution: {integrity: sha512-xqrUDL1b9MbkydY/s+VZ6v+xiMUmOUk7SS9d/1kpyQxoJ6U9AO1oIJyUWVZojbfe5Cc/oluutcgFG4L9RDP1iQ==}
engines: {node: '>= 16.0.0'}
peerDependencies:
pg-native: '>=3.0.1'
peerDependenciesMeta:
pg-native:
optional: true
pg@8.20.0:
resolution: {integrity: sha512-ldhMxz2r8fl/6QkXnBD3CR9/xg694oT6DZQ2s6c/RI28OjtSOpxnPrUCGOBJ46RCUxcWdx3p6kw/xnDHjKvaRA==}
engines: {node: '>= 16.0.0'}
@ -10190,12 +10151,6 @@
dependencies:
undici-types: 7.18.2
'@types/pg@8.16.0':
dependencies:
'@types/node': 24.10.9
pg-protocol: 1.11.0
pg-types: 2.2.0
'@types/pg@8.18.0':
dependencies:
'@types/node': 24.10.9
@ -11273,7 +11228,7 @@
glob: 11.1.0
openapi-fetch: 0.14.1
platform: 1.3.6
tar: 7.5.7
earcut@2.2.4: {}
@ -12783,16 +12738,10 @@ snapshots:
pg-cloudflare@1.3.0:
optional: true
pg-connection-string@2.11.0: {}
pg-connection-string@2.12.0: {}
pg-int8@1.0.1: {}
pg-pool@3.11.0(pg@8.18.0):
dependencies:
pg: 8.18.0
pg-pool@3.13.0(pg@8.20.0):
dependencies:
pg: 8.20.0
@ -12809,16 +12758,6 @@
postgres-date: 1.0.7
postgres-interval: 1.2.0
pg@8.18.0:
dependencies:
pg-connection-string: 2.11.0
pg-pool: 3.11.0(pg@8.18.0)
pg-protocol: 1.11.0
pg-types: 2.2.0
pgpass: 1.0.5
optionalDependencies:
pg-cloudflare: 1.3.0
pg@8.20.0:
dependencies:
pg-connection-string: 2.12.0


@ -0,0 +1,5 @@
# @sandbox-agent/persist-indexeddb
> **Deprecated:** This package has been deprecated and removed.
Copy the driver source into your project. See the [reference implementation](https://github.com/rivet-dev/sandbox-agent/tree/main/frontend/packages/inspector/src/persist-indexeddb.ts) and the [session persistence docs](https://sandboxagent.dev/session-persistence) for guidance.


@ -1,7 +1,7 @@
{
"name": "@sandbox-agent/persist-indexeddb",
"version": "0.3.2",
"description": "IndexedDB persistence driver for the Sandbox Agent TypeScript SDK (DEPRECATED)",
"license": "Apache-2.0",
"repository": {
"type": "git",
@ -16,23 +16,16 @@
"import": "./dist/index.js"
}
},
"dependencies": {
"sandbox-agent": "workspace:*"
},
"files": [
"dist"
],
"scripts": {
"build": "tsup",
"typecheck": "tsc --noEmit"
"test": "vitest run",
"test:watch": "vitest"
},
"devDependencies": {
"@types/node": "^22.0.0",
"fake-indexeddb": "^6.2.4",
"tsup": "^8.0.0",
"typescript": "^5.7.0"
"vitest": "^3.0.0"
}
}


@ -1,311 +1,5 @@
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_DB_NAME = "sandbox-agent-session-store";
const DEFAULT_DB_VERSION = 2;
const SESSIONS_STORE = "sessions";
throw new Error(
"@sandbox-agent/persist-indexeddb has been deprecated and removed. " +
"Copy the reference implementation from frontend/packages/inspector/src/persist-indexeddb.ts into your project instead. " +
"See https://github.com/rivet-dev/sandbox-agent/tree/main/frontend/packages/inspector/src/persist-indexeddb.ts",
);
const EVENTS_STORE = "events";
const EVENTS_BY_SESSION_INDEX = "by_session_index";
const DEFAULT_LIST_LIMIT = 100;
export interface IndexedDbSessionPersistDriverOptions {
databaseName?: string;
databaseVersion?: number;
indexedDb?: IDBFactory;
}
export class IndexedDbSessionPersistDriver implements SessionPersistDriver {
private readonly indexedDb: IDBFactory;
private readonly dbName: string;
private readonly dbVersion: number;
private readonly dbPromise: Promise<IDBDatabase>;
constructor(options: IndexedDbSessionPersistDriverOptions = {}) {
const indexedDb = options.indexedDb ?? globalThis.indexedDB;
if (!indexedDb) {
throw new Error("IndexedDB is not available in this runtime.");
}
this.indexedDb = indexedDb;
this.dbName = options.databaseName ?? DEFAULT_DB_NAME;
this.dbVersion = options.databaseVersion ?? DEFAULT_DB_VERSION;
this.dbPromise = this.openDatabase();
}
async getSession(id: string): Promise<SessionRecord | null> {
const db = await this.dbPromise;
const row = await requestToPromise<IDBValidKey | SessionRow | undefined>(db.transaction(SESSIONS_STORE, "readonly").objectStore(SESSIONS_STORE).get(id));
if (!row || typeof row !== "object") {
return null;
}
return decodeSessionRow(row as SessionRow);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const db = await this.dbPromise;
const rows = await getAllRows<SessionRow>(db, SESSIONS_STORE);
rows.sort((a, b) => {
if (a.createdAt !== b.createdAt) {
return a.createdAt - b.createdAt;
}
return a.id.localeCompare(b.id);
});
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const slice = rows.slice(offset, offset + limit).map(decodeSessionRow);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
const db = await this.dbPromise;
await transactionPromise(db, [SESSIONS_STORE], "readwrite", (tx) => {
tx.objectStore(SESSIONS_STORE).put(encodeSessionRow(session));
});
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const db = await this.dbPromise;
const rows = (await getAllRows<EventRow>(db, EVENTS_STORE)).filter((row) => row.sessionId === request.sessionId).sort(compareEventRowsByOrder);
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const slice = rows.slice(offset, offset + limit).map(decodeEventRow);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined,
};
}
async insertEvent(event: SessionEvent): Promise<void> {
const db = await this.dbPromise;
await transactionPromise(db, [EVENTS_STORE], "readwrite", (tx) => {
tx.objectStore(EVENTS_STORE).put(encodeEventRow(event));
});
}
async close(): Promise<void> {
const db = await this.dbPromise;
db.close();
}
private openDatabase(): Promise<IDBDatabase> {
return new Promise((resolve, reject) => {
const request = this.indexedDb.open(this.dbName, this.dbVersion);
request.onupgradeneeded = () => {
const db = request.result;
if (!db.objectStoreNames.contains(SESSIONS_STORE)) {
db.createObjectStore(SESSIONS_STORE, { keyPath: "id" });
}
if (!db.objectStoreNames.contains(EVENTS_STORE)) {
const events = db.createObjectStore(EVENTS_STORE, { keyPath: "id" });
events.createIndex(EVENTS_BY_SESSION_INDEX, ["sessionId", "eventIndex", "id"], {
unique: false,
});
} else {
const tx = request.transaction;
if (!tx) {
return;
}
const events = tx.objectStore(EVENTS_STORE);
if (!events.indexNames.contains(EVENTS_BY_SESSION_INDEX)) {
events.createIndex(EVENTS_BY_SESSION_INDEX, ["sessionId", "eventIndex", "id"], {
unique: false,
});
}
}
};
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error ?? new Error("Unable to open IndexedDB"));
});
}
}
type SessionRow = {
id: string;
agent: string;
agentSessionId: string;
lastConnectionId: string;
createdAt: number;
destroyedAt?: number;
sessionInit?: SessionRecord["sessionInit"];
};
type EventRow = {
id: number | string;
eventIndex?: number;
sessionId: string;
createdAt: number;
connectionId: string;
sender: "client" | "agent";
payload: unknown;
};
function encodeSessionRow(session: SessionRecord): SessionRow {
return {
id: session.id,
agent: session.agent,
agentSessionId: session.agentSessionId,
lastConnectionId: session.lastConnectionId,
createdAt: session.createdAt,
destroyedAt: session.destroyedAt,
sessionInit: session.sessionInit,
};
}
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agentSessionId,
lastConnectionId: row.lastConnectionId,
createdAt: row.createdAt,
destroyedAt: row.destroyedAt,
sessionInit: row.sessionInit,
};
}
function encodeEventRow(event: SessionEvent): EventRow {
return {
id: event.id,
eventIndex: event.eventIndex,
sessionId: event.sessionId,
createdAt: event.createdAt,
connectionId: event.connectionId,
sender: event.sender,
payload: event.payload,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: String(row.id),
eventIndex: parseEventIndex(row.eventIndex, row.id),
sessionId: row.sessionId,
createdAt: row.createdAt,
connectionId: row.connectionId,
sender: row.sender,
payload: row.payload as SessionEvent["payload"],
};
}
async function getAllRows<T>(db: IDBDatabase, storeName: string): Promise<T[]> {
return await transactionPromise<T[]>(db, [storeName], "readonly", async (tx) => {
const request = tx.objectStore(storeName).getAll();
return (await requestToPromise(request)) as T[];
});
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
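The list methods above page with a plain offset cursor: the cursor string is just the stringified offset of the first row on the next page. A standalone sketch of the same scheme (`page` is an illustrative helper, not SDK API):

```typescript
// Offset-cursor pagination as used by listSessions/listEvents.
const DEFAULT_LIST_LIMIT = 100;

function parseCursor(cursor: string | undefined): number {
  if (!cursor) return 0;
  const parsed = Number.parseInt(cursor, 10);
  // Malformed or negative cursors fall back to the first page.
  return Number.isFinite(parsed) && parsed >= 0 ? parsed : 0;
}

function normalizeLimit(limit: number | undefined): number {
  if (!Number.isFinite(limit) || (limit ?? 0) < 1) return DEFAULT_LIST_LIMIT;
  return Math.floor(limit as number);
}

function page<T>(rows: T[], cursor?: string, limit?: number): { items: T[]; nextCursor?: string } {
  const offset = parseCursor(cursor);
  const take = normalizeLimit(limit);
  const items = rows.slice(offset, offset + take);
  const nextOffset = offset + items.length;
  // nextCursor is omitted on the final page.
  return { items, nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined };
}

const rows = ["a", "b", "c"];
const p1 = page(rows, undefined, 2);
const p2 = page(rows, p1.nextCursor, 2);
console.log(p1.items, p1.nextCursor, p2.items, p2.nextCursor); // [ 'a', 'b' ] 2 [ 'c' ] undefined
```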
function compareEventRowsByOrder(a: EventRow, b: EventRow): number {
const indexA = parseEventIndex(a.eventIndex, a.id);
const indexB = parseEventIndex(b.eventIndex, b.id);
if (indexA !== indexB) {
return indexA - indexB;
}
return String(a.id).localeCompare(String(b.id));
}
function parseEventIndex(value: number | undefined, fallback: number | string): number {
if (typeof value === "number" && Number.isFinite(value)) {
return Math.max(0, Math.floor(value));
}
const parsed = Number.parseInt(String(fallback), 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
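`parseEventIndex` lets rows written before the `eventIndex` field existed still sort correctly, by falling back to a numeric `id`. A self-contained sketch of the resulting ordering:

```typescript
// Sort by eventIndex, fall back to parsing a numeric id for legacy rows,
// tiebreak on the id string (mirrors compareEventRowsByOrder above).
type Row = { id: number | string; eventIndex?: number };

function parseEventIndex(value: number | undefined, fallback: number | string): number {
  if (typeof value === "number" && Number.isFinite(value)) {
    return Math.max(0, Math.floor(value));
  }
  const parsed = Number.parseInt(String(fallback), 10);
  return Number.isFinite(parsed) && parsed >= 0 ? parsed : 0;
}

function compare(a: Row, b: Row): number {
  const ia = parseEventIndex(a.eventIndex, a.id);
  const ib = parseEventIndex(b.eventIndex, b.id);
  return ia !== ib ? ia - ib : String(a.id).localeCompare(String(b.id));
}

// Legacy rows with numeric ids and no eventIndex interleave correctly
// with new rows carrying an explicit eventIndex.
const rows: Row[] = [{ id: "evt-9", eventIndex: 3 }, { id: 1 }, { id: 2 }];
rows.sort(compare);
console.log(rows.map((r) => r.id)); // [ 1, 2, 'evt-9' ]
```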
function requestToPromise<T>(request: IDBRequest<T>): Promise<T> {
return new Promise((resolve, reject) => {
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error ?? new Error("IndexedDB request failed"));
});
}
function transactionPromise<T>(db: IDBDatabase, stores: string[], mode: IDBTransactionMode, run: (tx: IDBTransaction) => T | Promise<T>): Promise<T> {
return new Promise((resolve, reject) => {
const tx = db.transaction(stores, mode);
let settled = false;
let resultValue: T | undefined;
let runCompleted = false;
let txCompleted = false;
function tryResolve() {
if (settled || !runCompleted || !txCompleted) {
return;
}
settled = true;
resolve(resultValue as T);
}
tx.oncomplete = () => {
txCompleted = true;
tryResolve();
};
tx.onerror = () => {
if (settled) {
return;
}
settled = true;
reject(tx.error ?? new Error("IndexedDB transaction failed"));
};
tx.onabort = () => {
if (settled) {
return;
}
settled = true;
reject(tx.error ?? new Error("IndexedDB transaction aborted"));
};
Promise.resolve(run(tx))
.then((value) => {
resultValue = value;
runCompleted = true;
tryResolve();
})
.catch((error) => {
if (!settled) {
settled = true;
reject(error);
}
try {
tx.abort();
} catch {
// no-op
}
});
});
}
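`transactionPromise` above deliberately holds its result until both the user callback and the transaction's `oncomplete` have fired, so callers never observe data before IndexedDB has committed. The same two-gate settlement pattern in isolation, with illustrative names:

```typescript
// Resolve only after BOTH the callback and the completion signal have fired.
function settleWhenBothDone<T>(run: () => Promise<T>, completion: Promise<void>): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    let settled = false;
    let runDone = false;
    let txDone = false;
    let result: T | undefined;
    const tryResolve = () => {
      if (!settled && runDone && txDone) {
        settled = true;
        resolve(result as T);
      }
    };
    completion.then(
      () => {
        txDone = true;
        tryResolve();
      },
      (error) => {
        // First failure wins; later signals are ignored once settled.
        if (!settled) {
          settled = true;
          reject(error);
        }
      },
    );
    run().then(
      (value) => {
        result = value;
        runDone = true;
        tryResolve();
      },
      (error) => {
        if (!settled) {
          settled = true;
          reject(error);
        }
      },
    );
  });
}

// The callback's value is held back until the completion signal also fires.
const commit = new Promise<void>((resolve) => setTimeout(resolve, 10));
settleWhenBothDone(async () => 42, commit).then((value) => console.log(value)); // 42
```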

@@ -1,96 +0,0 @@
import "fake-indexeddb/auto";
import { describe, it, expect } from "vitest";
import { IndexedDbSessionPersistDriver } from "../src/index.ts";
function uniqueDbName(prefix: string): string {
return `${prefix}-${Date.now().toString(36)}-${Math.random().toString(36).slice(2, 10)}`;
}
describe("IndexedDbSessionPersistDriver", () => {
it("stores and pages sessions and events", async () => {
const dbName = uniqueDbName("indexeddb-driver");
const driver = new IndexedDbSessionPersistDriver({ databaseName: dbName });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
await driver.updateSession({
id: "s-2",
agent: "mock",
agentSessionId: "a-2",
lastConnectionId: "c-2",
createdAt: 200,
destroyedAt: 300,
});
await driver.insertEvent({
id: "evt-1",
eventIndex: 1,
sessionId: "s-1",
createdAt: 1,
connectionId: "c-1",
sender: "client",
payload: { jsonrpc: "2.0", method: "session/prompt", params: { sessionId: "a-1" } },
});
await driver.insertEvent({
id: "evt-2",
eventIndex: 2,
sessionId: "s-1",
createdAt: 2,
connectionId: "c-1",
sender: "agent",
payload: { jsonrpc: "2.0", method: "session/update", params: { sessionId: "a-1" } },
});
const loaded = await driver.getSession("s-2");
expect(loaded?.destroyedAt).toBe(300);
const page1 = await driver.listSessions({ limit: 1 });
expect(page1.items).toHaveLength(1);
expect(page1.items[0]?.id).toBe("s-1");
expect(page1.nextCursor).toBeTruthy();
const page2 = await driver.listSessions({ cursor: page1.nextCursor, limit: 1 });
expect(page2.items).toHaveLength(1);
expect(page2.items[0]?.id).toBe("s-2");
expect(page2.nextCursor).toBeUndefined();
const eventsPage = await driver.listEvents({ sessionId: "s-1", limit: 10 });
expect(eventsPage.items).toHaveLength(2);
expect(eventsPage.items[0]?.id).toBe("evt-1");
expect(eventsPage.items[0]?.eventIndex).toBe(1);
expect(eventsPage.items[1]?.id).toBe("evt-2");
expect(eventsPage.items[1]?.eventIndex).toBe(2);
await driver.close();
});
it("persists across driver instances for same database", async () => {
const dbName = uniqueDbName("indexeddb-reopen");
{
const driver = new IndexedDbSessionPersistDriver({ databaseName: dbName });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
await driver.close();
}
{
const driver = new IndexedDbSessionPersistDriver({ databaseName: dbName });
const session = await driver.getSession("s-1");
expect(session?.id).toBe("s-1");
await driver.close();
}
});
});

@@ -1,129 +0,0 @@
import "fake-indexeddb/auto";
import { describe, it, expect, beforeAll, afterAll } from "vitest";
import { existsSync, mkdtempSync, rmSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { tmpdir } from "node:os";
import { SandboxAgent } from "sandbox-agent";
import { spawnSandboxAgent, type SandboxAgentSpawnHandle } from "../../typescript/src/spawn.ts";
import { prepareMockAgentDataHome } from "../../typescript/tests/helpers/mock-agent.ts";
import { IndexedDbSessionPersistDriver } from "../src/index.ts";
const __dirname = dirname(fileURLToPath(import.meta.url));
function findBinary(): string | null {
if (process.env.SANDBOX_AGENT_BIN) {
return process.env.SANDBOX_AGENT_BIN;
}
const cargoPaths = [resolve(__dirname, "../../../target/debug/sandbox-agent"), resolve(__dirname, "../../../target/release/sandbox-agent")];
for (const p of cargoPaths) {
if (existsSync(p)) {
return p;
}
}
return null;
}
function uniqueDbName(prefix: string): string {
return `${prefix}-${Date.now().toString(36)}-${Math.random().toString(36).slice(2, 10)}`;
}
const BINARY_PATH = findBinary();
if (!BINARY_PATH) {
throw new Error("sandbox-agent binary not found. Build it (cargo build -p sandbox-agent) or set SANDBOX_AGENT_BIN.");
}
if (!process.env.SANDBOX_AGENT_BIN) {
process.env.SANDBOX_AGENT_BIN = BINARY_PATH;
}
describe("IndexedDB persistence end-to-end", () => {
let handle: SandboxAgentSpawnHandle;
let baseUrl: string;
let token: string;
let dataHome: string;
beforeAll(async () => {
dataHome = mkdtempSync(join(tmpdir(), "indexeddb-integration-"));
prepareMockAgentDataHome(dataHome);
handle = await spawnSandboxAgent({
enabled: true,
log: "silent",
timeoutMs: 30000,
env: {
XDG_DATA_HOME: dataHome,
HOME: dataHome,
USERPROFILE: dataHome,
APPDATA: join(dataHome, "AppData", "Roaming"),
LOCALAPPDATA: join(dataHome, "AppData", "Local"),
},
});
baseUrl = handle.baseUrl;
token = handle.token;
});
afterAll(async () => {
await handle.dispose();
rmSync(dataHome, { recursive: true, force: true });
});
it("restores sessions/events across sdk instances", async () => {
const dbName = uniqueDbName("sandbox-agent-browser-e2e");
const persist1 = new IndexedDbSessionPersistDriver({ databaseName: dbName });
const sdk1 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist1,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const created = await sdk1.createSession({ agent: "mock" });
await created.prompt([{ type: "text", text: "indexeddb-first" }]);
const firstConnectionId = created.lastConnectionId;
await sdk1.dispose();
await persist1.close();
const persist2 = new IndexedDbSessionPersistDriver({ databaseName: dbName });
const sdk2 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist2,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const restored = await sdk2.resumeSession(created.id);
expect(restored.lastConnectionId).not.toBe(firstConnectionId);
await restored.prompt([{ type: "text", text: "indexeddb-second" }]);
const sessions = await sdk2.listSessions({ limit: 20 });
expect(sessions.items.some((entry) => entry.id === created.id)).toBe(true);
const events = await sdk2.getEvents({ sessionId: created.id, limit: 1000 });
expect(events.items.length).toBeGreaterThan(0);
const replayInjected = events.items.find((event) => {
if (event.sender !== "client") {
return false;
}
const payload = event.payload as Record<string, unknown>;
const method = payload.method;
const params = payload.params as Record<string, unknown> | undefined;
const prompt = Array.isArray(params?.prompt) ? params?.prompt : [];
const firstBlock = prompt[0] as Record<string, unknown> | undefined;
return method === "session/prompt" && typeof firstBlock?.text === "string" && firstBlock.text.includes("Previous session history is replayed below");
});
expect(replayInjected).toBeTruthy();
await sdk2.dispose();
await persist2.close();
});
});

@@ -0,0 +1,5 @@
# @sandbox-agent/persist-postgres

> **Deprecated:** This package has been deprecated and removed.

Install `pg` directly and copy the driver source into your project. See the [full example](https://github.com/rivet-dev/sandbox-agent/tree/main/examples/persist-postgres) and the [session persistence docs](https://sandboxagent.dev/session-persistence) for guidance.
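A minimal sketch of the dependency stanza for the copy-the-driver approach, using the versions the removed package pinned (adjust to your project):

```json
{
  "dependencies": {
    "pg": "^8.16.3"
  },
  "devDependencies": {
    "@types/pg": "^8.15.6"
  }
}
```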

@@ -1,7 +1,7 @@
 {
   "name": "@sandbox-agent/persist-postgres",
   "version": "0.3.2",
-  "description": "PostgreSQL persistence driver for the Sandbox Agent TypeScript SDK",
+  "description": "PostgreSQL persistence driver for the Sandbox Agent TypeScript SDK (DEPRECATED)",
   "license": "Apache-2.0",
   "repository": {
     "type": "git",
@@ -16,24 +16,16 @@
       "import": "./dist/index.js"
     }
   },
-  "dependencies": {
-    "pg": "^8.16.3",
-    "sandbox-agent": "workspace:*"
-  },
   "files": [
     "dist"
   ],
   "scripts": {
     "build": "tsup",
-    "typecheck": "tsc --noEmit",
-    "test": "vitest run",
-    "test:watch": "vitest"
+    "typecheck": "tsc --noEmit"
   },
   "devDependencies": {
     "@types/node": "^22.0.0",
-    "@types/pg": "^8.15.6",
     "tsup": "^8.0.0",
-    "typescript": "^5.7.0",
-    "vitest": "^3.0.0"
+    "typescript": "^5.7.0"
   }
 }

@@ -1,306 +1,5 @@
throw new Error(
  "@sandbox-agent/persist-postgres has been deprecated and removed. " +
    "Copy the reference implementation from examples/persist-postgres into your project instead. " +
    "See https://github.com/rivet-dev/sandbox-agent/tree/main/examples/persist-postgres",
);

import { Pool, type PoolConfig } from "pg";
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";

const DEFAULT_LIST_LIMIT = 100;
export interface PostgresSessionPersistDriverOptions {
connectionString?: string;
pool?: Pool;
poolConfig?: PoolConfig;
schema?: string;
}
export class PostgresSessionPersistDriver implements SessionPersistDriver {
private readonly pool: Pool;
private readonly ownsPool: boolean;
private readonly schema: string;
private readonly initialized: Promise<void>;
constructor(options: PostgresSessionPersistDriverOptions = {}) {
this.schema = normalizeSchema(options.schema ?? "public");
if (options.pool) {
this.pool = options.pool;
this.ownsPool = false;
} else {
this.pool = new Pool({
connectionString: options.connectionString,
...options.poolConfig,
});
this.ownsPool = true;
}
this.initialized = this.initialize();
}
async getSession(id: string): Promise<SessionRecord | null> {
await this.ready();
const result = await this.pool.query<SessionRow>(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, session_init_json
FROM ${this.table("sessions")}
WHERE id = $1`,
[id],
);
if (result.rows.length === 0) {
return null;
}
return decodeSessionRow(result.rows[0]);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
await this.ready();
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rowsResult = await this.pool.query<SessionRow>(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, session_init_json
FROM ${this.table("sessions")}
ORDER BY created_at ASC, id ASC
LIMIT $1 OFFSET $2`,
[limit, offset],
);
const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("sessions")}`);
const total = parseInteger(countResult.rows[0]?.count ?? "0");
const nextOffset = offset + rowsResult.rows.length;
return {
items: rowsResult.rows.map(decodeSessionRow),
nextCursor: nextOffset < total ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
await this.ready();
await this.pool.query(
`INSERT INTO ${this.table("sessions")} (
id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, session_init_json
) VALUES ($1, $2, $3, $4, $5, $6, $7)
ON CONFLICT(id) DO UPDATE SET
agent = EXCLUDED.agent,
agent_session_id = EXCLUDED.agent_session_id,
last_connection_id = EXCLUDED.last_connection_id,
created_at = EXCLUDED.created_at,
destroyed_at = EXCLUDED.destroyed_at,
session_init_json = EXCLUDED.session_init_json`,
[
session.id,
session.agent,
session.agentSessionId,
session.lastConnectionId,
session.createdAt,
session.destroyedAt ?? null,
session.sessionInit ?? null,
],
);
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
await this.ready();
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rowsResult = await this.pool.query<EventRow>(
`SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json
FROM ${this.table("events")}
WHERE session_id = $1
ORDER BY event_index ASC, id ASC
LIMIT $2 OFFSET $3`,
[request.sessionId, limit, offset],
);
const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("events")} WHERE session_id = $1`, [
request.sessionId,
]);
const total = parseInteger(countResult.rows[0]?.count ?? "0");
const nextOffset = offset + rowsResult.rows.length;
return {
items: rowsResult.rows.map(decodeEventRow),
nextCursor: nextOffset < total ? String(nextOffset) : undefined,
};
}
async insertEvent(event: SessionEvent): Promise<void> {
await this.ready();
await this.pool.query(
`INSERT INTO ${this.table("events")} (
id, event_index, session_id, created_at, connection_id, sender, payload_json
) VALUES ($1, $2, $3, $4, $5, $6, $7)
ON CONFLICT(id) DO UPDATE SET
event_index = EXCLUDED.event_index,
session_id = EXCLUDED.session_id,
created_at = EXCLUDED.created_at,
connection_id = EXCLUDED.connection_id,
sender = EXCLUDED.sender,
payload_json = EXCLUDED.payload_json`,
[event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, event.payload],
);
}
async close(): Promise<void> {
if (!this.ownsPool) {
return;
}
await this.pool.end();
}
private async ready(): Promise<void> {
await this.initialized;
}
private table(name: "sessions" | "events"): string {
return `"${this.schema}"."${name}"`;
}
private async initialize(): Promise<void> {
await this.pool.query(`CREATE SCHEMA IF NOT EXISTS "${this.schema}"`);
await this.pool.query(`
CREATE TABLE IF NOT EXISTS ${this.table("sessions")} (
id TEXT PRIMARY KEY,
agent TEXT NOT NULL,
agent_session_id TEXT NOT NULL,
last_connection_id TEXT NOT NULL,
created_at BIGINT NOT NULL,
destroyed_at BIGINT,
session_init_json JSONB
)
`);
await this.pool.query(`
CREATE TABLE IF NOT EXISTS ${this.table("events")} (
id TEXT PRIMARY KEY,
event_index BIGINT NOT NULL,
session_id TEXT NOT NULL,
created_at BIGINT NOT NULL,
connection_id TEXT NOT NULL,
sender TEXT NOT NULL,
payload_json JSONB NOT NULL
)
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ALTER COLUMN id TYPE TEXT USING id::TEXT
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ADD COLUMN IF NOT EXISTS event_index BIGINT
`);
await this.pool.query(`
WITH ranked AS (
SELECT id, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC) AS ranked_index
FROM ${this.table("events")}
)
UPDATE ${this.table("events")} AS current_events
SET event_index = ranked.ranked_index
FROM ranked
WHERE current_events.id = ranked.id
AND current_events.event_index IS NULL
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ALTER COLUMN event_index SET NOT NULL
`);
await this.pool.query(`
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON ${this.table("events")}(session_id, event_index, id)
`);
}
}
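The `initialize()` migration above backfills `event_index` for pre-existing rows with a per-session `ROW_NUMBER()` ordered by `(created_at, id)`. The same assignment simulated in memory (types here are illustrative):

```typescript
// In-memory simulation of the ROW_NUMBER() backfill: rows missing an index
// receive their 1-based position within the session's (created_at, id) order.
type Ev = { id: string; sessionId: string; createdAt: number; eventIndex: number | null };

function backfill(events: Ev[]): void {
  const bySession = new Map<string, Ev[]>();
  for (const e of events) {
    const list = bySession.get(e.sessionId) ?? [];
    list.push(e);
    bySession.set(e.sessionId, list);
  }
  for (const list of bySession.values()) {
    list.sort((a, b) => (a.createdAt !== b.createdAt ? a.createdAt - b.createdAt : a.id.localeCompare(b.id)));
    list.forEach((e, i) => {
      // Only NULL rows are updated, matching the WHERE clause in the SQL.
      if (e.eventIndex === null) e.eventIndex = i + 1; // ROW_NUMBER() is 1-based
    });
  }
}

const evs: Ev[] = [
  { id: "b", sessionId: "s", createdAt: 2, eventIndex: null },
  { id: "a", sessionId: "s", createdAt: 1, eventIndex: null },
];
backfill(evs);
console.log(evs.map((e) => `${e.id}:${e.eventIndex}`)); // [ 'b:2', 'a:1' ]
```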
type SessionRow = {
id: string;
agent: string;
agent_session_id: string;
last_connection_id: string;
created_at: string | number;
destroyed_at: string | number | null;
session_init_json: unknown | null;
};
type EventRow = {
id: string | number;
event_index: string | number;
session_id: string;
created_at: string | number;
connection_id: string;
sender: string;
payload_json: unknown;
};
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agent_session_id,
lastConnectionId: row.last_connection_id,
createdAt: parseInteger(row.created_at),
destroyedAt: row.destroyed_at === null ? undefined : parseInteger(row.destroyed_at),
sessionInit: row.session_init_json ? (row.session_init_json as SessionRecord["sessionInit"]) : undefined,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: String(row.id),
eventIndex: parseInteger(row.event_index),
sessionId: row.session_id,
createdAt: parseInteger(row.created_at),
connectionId: row.connection_id,
sender: parseSender(row.sender),
payload: row.payload_json as SessionEvent["payload"],
};
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function parseInteger(value: string | number): number {
const parsed = typeof value === "number" ? value : Number.parseInt(value, 10);
if (!Number.isFinite(parsed)) {
throw new Error(`Invalid integer value returned by postgres: ${String(value)}`);
}
return parsed;
}
function parseSender(value: string): SessionEvent["sender"] {
if (value === "agent" || value === "client") {
return value;
}
throw new Error(`Invalid sender value returned by postgres: ${value}`);
}
function normalizeSchema(schema: string): string {
if (!/^[A-Za-z_][A-Za-z0-9_]*$/.test(schema)) {
throw new Error(`Invalid schema name '${schema}'. Use letters, numbers, and underscores only.`);
}
return schema;
}
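`normalizeSchema` exists because the schema name is interpolated into SQL identifiers by `table()` rather than bound as a query parameter, so the identifier allowlist is what keeps injection out. The check in isolation:

```typescript
// Schema names are interpolated into SQL, so allowlist identifiers:
// letters, digits, underscores, not starting with a digit.
function normalizeSchema(schema: string): string {
  if (!/^[A-Za-z_][A-Za-z0-9_]*$/.test(schema)) {
    throw new Error(`Invalid schema name '${schema}'. Use letters, numbers, and underscores only.`);
  }
  return schema;
}

console.log(normalizeSchema("sandbox_agent")); // sandbox_agent
try {
  normalizeSchema('public"; DROP TABLE sessions; --');
} catch {
  console.log("rejected injection attempt");
}
```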

@@ -1,245 +0,0 @@
import { afterAll, afterEach, beforeAll, beforeEach, describe, expect, it } from "vitest";
import { execFileSync } from "node:child_process";
import { existsSync, mkdtempSync, rmSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { tmpdir } from "node:os";
import { randomUUID } from "node:crypto";
import { Client } from "pg";
import { SandboxAgent } from "sandbox-agent";
import { spawnSandboxAgent, type SandboxAgentSpawnHandle } from "../../typescript/src/spawn.ts";
import { prepareMockAgentDataHome } from "../../typescript/tests/helpers/mock-agent.ts";
import { PostgresSessionPersistDriver } from "../src/index.ts";
const __dirname = dirname(fileURLToPath(import.meta.url));
function findBinary(): string | null {
if (process.env.SANDBOX_AGENT_BIN) {
return process.env.SANDBOX_AGENT_BIN;
}
const cargoPaths = [resolve(__dirname, "../../../target/debug/sandbox-agent"), resolve(__dirname, "../../../target/release/sandbox-agent")];
for (const p of cargoPaths) {
if (existsSync(p)) {
return p;
}
}
return null;
}
const BINARY_PATH = findBinary();
if (!BINARY_PATH) {
throw new Error("sandbox-agent binary not found. Build it (cargo build -p sandbox-agent) or set SANDBOX_AGENT_BIN.");
}
if (!process.env.SANDBOX_AGENT_BIN) {
process.env.SANDBOX_AGENT_BIN = BINARY_PATH;
}
interface PostgresContainer {
containerId: string;
connectionString: string;
}
describe("Postgres persistence driver", () => {
let handle: SandboxAgentSpawnHandle;
let baseUrl: string;
let token: string;
let dataHome: string;
let postgres: PostgresContainer | null = null;
beforeAll(async () => {
dataHome = mkdtempSync(join(tmpdir(), "postgres-integration-"));
prepareMockAgentDataHome(dataHome);
handle = await spawnSandboxAgent({
enabled: true,
log: "silent",
timeoutMs: 30000,
env: {
XDG_DATA_HOME: dataHome,
HOME: dataHome,
USERPROFILE: dataHome,
APPDATA: join(dataHome, "AppData", "Roaming"),
LOCALAPPDATA: join(dataHome, "AppData", "Local"),
},
});
baseUrl = handle.baseUrl;
token = handle.token;
});
beforeEach(async () => {
postgres = await startPostgresContainer();
});
afterEach(() => {
if (postgres) {
stopPostgresContainer(postgres.containerId);
postgres = null;
}
});
afterAll(async () => {
await handle.dispose();
rmSync(dataHome, { recursive: true, force: true });
});
it("persists session/event history across SDK instances and supports replay restore", async () => {
const connectionString = requirePostgres(postgres).connectionString;
const persist1 = new PostgresSessionPersistDriver({
connectionString,
});
const sdk1 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist1,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const created = await sdk1.createSession({ agent: "mock" });
await created.prompt([{ type: "text", text: "postgres-first" }]);
const firstConnectionId = created.lastConnectionId;
await sdk1.dispose();
await persist1.close();
const persist2 = new PostgresSessionPersistDriver({
connectionString,
});
const sdk2 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist2,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const restored = await sdk2.resumeSession(created.id);
expect(restored.lastConnectionId).not.toBe(firstConnectionId);
await restored.prompt([{ type: "text", text: "postgres-second" }]);
const sessions = await sdk2.listSessions({ limit: 20 });
expect(sessions.items.some((entry) => entry.id === created.id)).toBe(true);
const events = await sdk2.getEvents({ sessionId: created.id, limit: 1000 });
expect(events.items.length).toBeGreaterThan(0);
expect(events.items.every((event) => typeof event.id === "string")).toBe(true);
expect(events.items.every((event) => Number.isInteger(event.eventIndex))).toBe(true);
for (let i = 1; i < events.items.length; i += 1) {
expect(events.items[i]!.eventIndex).toBeGreaterThanOrEqual(events.items[i - 1]!.eventIndex);
}
const replayInjected = events.items.find((event) => {
if (event.sender !== "client") {
return false;
}
const payload = event.payload as Record<string, unknown>;
const method = payload.method;
const params = payload.params as Record<string, unknown> | undefined;
const prompt = Array.isArray(params?.prompt) ? params?.prompt : [];
const firstBlock = prompt[0] as Record<string, unknown> | undefined;
return method === "session/prompt" && typeof firstBlock?.text === "string" && firstBlock.text.includes("Previous session history is replayed below");
});
expect(replayInjected).toBeTruthy();
await sdk2.dispose();
await persist2.close();
});
});
async function startPostgresContainer(): Promise<PostgresContainer> {
const name = `sandbox-agent-postgres-${randomUUID()}`;
const containerId = runDockerCommand([
"run",
"-d",
"--rm",
"--name",
name,
"-e",
"POSTGRES_USER=postgres",
"-e",
"POSTGRES_PASSWORD=postgres",
"-e",
"POSTGRES_DB=sandboxagent",
"-p",
"127.0.0.1::5432",
"postgres:16-alpine",
]);
const portOutput = runDockerCommand(["port", containerId, "5432/tcp"]);
const port = parsePort(portOutput);
const connectionString = `postgres://postgres:postgres@127.0.0.1:${port}/sandboxagent`;
await waitForPostgres(connectionString);
return {
containerId,
connectionString,
};
}
function stopPostgresContainer(containerId: string): void {
try {
runDockerCommand(["rm", "-f", containerId]);
} catch {
// Container may already be gone when test teardown runs.
}
}
function runDockerCommand(args: string[]): string {
return execFileSync("docker", args, {
encoding: "utf8",
stdio: ["ignore", "pipe", "pipe"],
}).trim();
}
function parsePort(output: string): string {
const firstLine = output.split("\n")[0]?.trim() ?? "";
const match = firstLine.match(/:(\d+)$/);
if (!match) {
throw new Error(`Failed to parse docker port output: '${output}'`);
}
return match[1];
}
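`parsePort` pulls the host port out of `docker port` output, which may print one mapping line per address family; only the first line is used. The parsing in isolation:

```typescript
// Extract the trailing port number from the first line of `docker port` output.
function parsePort(output: string): string {
  const firstLine = output.split("\n")[0]?.trim() ?? "";
  const match = firstLine.match(/:(\d+)$/);
  if (!match) {
    throw new Error(`Failed to parse docker port output: '${output}'`);
  }
  return match[1]!;
}

// Typical output has an IPv4 line followed by an IPv6 line.
console.log(parsePort("127.0.0.1:55001\n[::]:55001")); // 55001
```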
async function waitForPostgres(connectionString: string): Promise<void> {
const timeoutMs = 30000;
const deadline = Date.now() + timeoutMs;
let lastError: unknown;
while (Date.now() < deadline) {
const client = new Client({ connectionString });
try {
await client.connect();
await client.query("SELECT 1");
await client.end();
return;
} catch (error) {
lastError = error;
try {
await client.end();
} catch {
// Ignore cleanup failures while retrying.
}
await delay(250);
}
}
throw new Error(`Postgres container did not become ready: ${String(lastError)}`);
}
function delay(ms: number): Promise<void> {
return new Promise((resolvePromise) => setTimeout(resolvePromise, ms));
}
function requirePostgres(container: PostgresContainer | null): PostgresContainer {
if (!container) {
throw new Error("Postgres container was not initialized for this test.");
}
return container;
}

@@ -0,0 +1,5 @@
# @sandbox-agent/persist-rivet

> **Deprecated:** This package has been deprecated and removed.

Copy the driver source into your project. See the [multiplayer docs](https://github.com/rivet-dev/sandbox-agent/tree/main/docs/multiplayer.mdx) and the [session persistence docs](https://sandboxagent.dev/session-persistence) for guidance.

@@ -1,7 +1,7 @@
 {
   "name": "@sandbox-agent/persist-rivet",
   "version": "0.3.2",
-  "description": "Rivet Actor persistence driver for the Sandbox Agent TypeScript SDK",
+  "description": "Rivet Actor persistence driver for the Sandbox Agent TypeScript SDK (DEPRECATED)",
   "license": "Apache-2.0",
   "repository": {
     "type": "git",
@@ -16,30 +16,16 @@
       "import": "./dist/index.js"
     }
   },
-  "dependencies": {
-    "sandbox-agent": "workspace:*"
-  },
-  "peerDependencies": {
-    "rivetkit": ">=0.5.0"
-  },
-  "peerDependenciesMeta": {
-    "rivetkit": {
-      "optional": true
-    }
-  },
   "files": [
     "dist"
   ],
   "scripts": {
     "build": "tsup",
-    "typecheck": "tsc --noEmit",
-    "test": "vitest run",
-    "test:watch": "vitest"
+    "typecheck": "tsc --noEmit"
   },
   "devDependencies": {
     "@types/node": "^22.0.0",
     "tsup": "^8.0.0",
-    "typescript": "^5.7.0",
-    "vitest": "^3.0.0"
+    "typescript": "^5.7.0"
   }
 }

@@ -1,168 +1,5 @@
throw new Error(
  "@sandbox-agent/persist-rivet has been deprecated and removed. " +
    "Copy the reference implementation from docs/multiplayer.mdx into your project instead. " +
    "See https://github.com/rivet-dev/sandbox-agent/tree/main/docs/multiplayer.mdx",
);

import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";

/** Structural type compatible with rivetkit's ActorContext without importing it. */
export interface ActorContextLike {
state: Record<string, unknown>;
}
export interface RivetPersistData {
sessions: Record<string, SessionRecord>;
events: Record<string, SessionEvent[]>;
}
export type RivetPersistState = {
_sandboxAgentPersist: RivetPersistData;
};
export interface RivetSessionPersistDriverOptions {
/** Maximum number of sessions to retain. Oldest are evicted first. Default: 1024. */
maxSessions?: number;
/** Maximum events per session. Oldest are trimmed first. Default: 500. */
maxEventsPerSession?: number;
/** Key on `c.state` where persist data is stored. Default: `"_sandboxAgentPersist"`. */
stateKey?: string;
}
const DEFAULT_MAX_SESSIONS = 1024;
const DEFAULT_MAX_EVENTS_PER_SESSION = 500;
const DEFAULT_LIST_LIMIT = 100;
const DEFAULT_STATE_KEY = "_sandboxAgentPersist";
export class RivetSessionPersistDriver implements SessionPersistDriver {
private readonly maxSessions: number;
private readonly maxEventsPerSession: number;
private readonly stateKey: string;
private readonly ctx: ActorContextLike;
constructor(ctx: ActorContextLike, options: RivetSessionPersistDriverOptions = {}) {
this.ctx = ctx;
this.maxSessions = normalizeCap(options.maxSessions, DEFAULT_MAX_SESSIONS);
this.maxEventsPerSession = normalizeCap(options.maxEventsPerSession, DEFAULT_MAX_EVENTS_PER_SESSION);
this.stateKey = options.stateKey ?? DEFAULT_STATE_KEY;
// Auto-initialize if absent; preserve existing data on actor wake.
if (!this.ctx.state[this.stateKey]) {
this.ctx.state[this.stateKey] = { sessions: {}, events: {} } satisfies RivetPersistData;
}
}
private get data(): RivetPersistData {
return this.ctx.state[this.stateKey] as RivetPersistData;
}
async getSession(id: string): Promise<SessionRecord | null> {
const session = this.data.sessions[id];
return session ? cloneSessionRecord(session) : null;
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const sorted = Object.values(this.data.sessions).sort((a, b) => {
if (a.createdAt !== b.createdAt) {
return a.createdAt - b.createdAt;
}
return a.id.localeCompare(b.id);
});
const page = paginate(sorted, request);
return {
items: page.items.map(cloneSessionRecord),
nextCursor: page.nextCursor,
};
}
async updateSession(session: SessionRecord): Promise<void> {
this.data.sessions[session.id] = { ...session };
if (!this.data.events[session.id]) {
this.data.events[session.id] = [];
}
const ids = Object.keys(this.data.sessions);
if (ids.length <= this.maxSessions) {
return;
}
const overflow = ids.length - this.maxSessions;
const removable = Object.values(this.data.sessions)
.sort((a, b) => {
if (a.createdAt !== b.createdAt) {
return a.createdAt - b.createdAt;
}
return a.id.localeCompare(b.id);
})
.slice(0, overflow)
.map((s) => s.id);
for (const sessionId of removable) {
delete this.data.sessions[sessionId];
delete this.data.events[sessionId];
}
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const all = [...(this.data.events[request.sessionId] ?? [])].sort((a, b) => {
if (a.eventIndex !== b.eventIndex) {
return a.eventIndex - b.eventIndex;
}
return a.id.localeCompare(b.id);
});
const page = paginate(all, request);
return {
items: page.items.map(cloneSessionEvent),
nextCursor: page.nextCursor,
};
}
async insertEvent(event: SessionEvent): Promise<void> {
const events = this.data.events[event.sessionId] ?? [];
events.push(cloneSessionEvent(event));
if (events.length > this.maxEventsPerSession) {
events.splice(0, events.length - this.maxEventsPerSession);
}
this.data.events[event.sessionId] = events;
}
}
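Eviction in `updateSession` above is oldest-first by `createdAt`, with the session `id` as a deterministic tiebreaker. The selection logic can be isolated as a small standalone sketch (this mirrors the driver's sort-and-slice, it is not the exported class method):

```typescript
interface SessionLike {
  id: string;
  createdAt: number;
}

// Pick the ids to evict once the session count exceeds the cap:
// oldest createdAt first, id as a tiebreaker for equal timestamps.
function idsToEvict(sessions: SessionLike[], maxSessions: number): string[] {
  const overflow = sessions.length - maxSessions;
  if (overflow <= 0) {
    return [];
  }
  return [...sessions]
    .sort((a, b) => (a.createdAt !== b.createdAt ? a.createdAt - b.createdAt : a.id.localeCompare(b.id)))
    .slice(0, overflow)
    .map((s) => s.id);
}
```

Sorting before slicing keeps eviction order stable even when `Object.values` returns sessions in insertion order rather than age order.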
function cloneSessionRecord(session: SessionRecord): SessionRecord {
return {
...session,
sessionInit: session.sessionInit ? (JSON.parse(JSON.stringify(session.sessionInit)) as SessionRecord["sessionInit"]) : undefined,
};
}
function cloneSessionEvent(event: SessionEvent): SessionEvent {
return {
...event,
payload: JSON.parse(JSON.stringify(event.payload)) as SessionEvent["payload"],
};
}
function normalizeCap(value: number | undefined, fallback: number): number {
if (!Number.isFinite(value) || (value ?? 0) < 1) {
return fallback;
}
return Math.floor(value as number);
}
function paginate<T>(items: T[], request: ListPageRequest): ListPage<T> {
const offset = parseCursor(request.cursor);
const limit = normalizeCap(request.limit, DEFAULT_LIST_LIMIT);
const slice = items.slice(offset, offset + limit);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < items.length ? String(nextOffset) : undefined,
};
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
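The cursor scheme used by `paginate`/`parseCursor` above is a plain numeric offset serialized as a string. A self-contained sketch of the same paging logic (simplified from the driver's helpers, not the exported API):

```typescript
interface ListPage<T> {
  items: T[];
  nextCursor?: string;
}

// Cursor is a stringified offset; malformed cursors fall back to 0.
function parseCursor(cursor: string | undefined): number {
  if (!cursor) {
    return 0;
  }
  const parsed = Number.parseInt(cursor, 10);
  return Number.isFinite(parsed) && parsed >= 0 ? parsed : 0;
}

function paginate<T>(items: T[], limit: number, cursor?: string): ListPage<T> {
  const offset = parseCursor(cursor);
  const slice = items.slice(offset, offset + limit);
  const nextOffset = offset + slice.length;
  // Only emit a cursor when more items remain past this page.
  return {
    items: slice,
    nextCursor: nextOffset < items.length ? String(nextOffset) : undefined,
  };
}
```

Falling back to offset 0 on a malformed cursor, rather than throwing, matches the driver's behavior.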


@@ -1,236 +0,0 @@
import { describe, it, expect } from "vitest";
import { RivetSessionPersistDriver } from "../src/index.ts";
import type { RivetPersistData } from "../src/index.ts";
function makeCtx() {
return { state: {} as Record<string, unknown> };
}
describe("RivetSessionPersistDriver", () => {
it("auto-initializes state on construction", () => {
const ctx = makeCtx();
new RivetSessionPersistDriver(ctx);
const data = ctx.state._sandboxAgentPersist as RivetPersistData;
expect(data).toBeDefined();
expect(data.sessions).toEqual({});
expect(data.events).toEqual({});
});
it("preserves existing state on construction (actor wake)", async () => {
const ctx = makeCtx();
const driver1 = new RivetSessionPersistDriver(ctx);
await driver1.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
// Simulate actor wake: new driver instance, same state object
const driver2 = new RivetSessionPersistDriver(ctx);
const session = await driver2.getSession("s-1");
expect(session?.id).toBe("s-1");
expect(session?.createdAt).toBe(100);
});
it("stores and retrieves sessions", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
await driver.updateSession({
id: "s-2",
agent: "mock",
agentSessionId: "a-2",
lastConnectionId: "c-2",
createdAt: 200,
destroyedAt: 300,
});
const loaded = await driver.getSession("s-2");
expect(loaded?.destroyedAt).toBe(300);
const missing = await driver.getSession("s-nonexistent");
expect(missing).toBeNull();
});
it("pages sessions sorted by createdAt", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
await driver.updateSession({
id: "s-2",
agent: "mock",
agentSessionId: "a-2",
lastConnectionId: "c-2",
createdAt: 200,
});
const page1 = await driver.listSessions({ limit: 1 });
expect(page1.items).toHaveLength(1);
expect(page1.items[0]?.id).toBe("s-1");
expect(page1.nextCursor).toBeTruthy();
const page2 = await driver.listSessions({ cursor: page1.nextCursor, limit: 1 });
expect(page2.items).toHaveLength(1);
expect(page2.items[0]?.id).toBe("s-2");
expect(page2.nextCursor).toBeUndefined();
});
it("stores and pages events", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
await driver.insertEvent({
id: "evt-1",
eventIndex: 1,
sessionId: "s-1",
createdAt: 1,
connectionId: "c-1",
sender: "client",
payload: { jsonrpc: "2.0", method: "session/prompt", params: { sessionId: "a-1" } },
});
await driver.insertEvent({
id: "evt-2",
eventIndex: 2,
sessionId: "s-1",
createdAt: 2,
connectionId: "c-1",
sender: "agent",
payload: { jsonrpc: "2.0", method: "session/update", params: { sessionId: "a-1" } },
});
const eventsPage = await driver.listEvents({ sessionId: "s-1", limit: 10 });
expect(eventsPage.items).toHaveLength(2);
expect(eventsPage.items[0]?.id).toBe("evt-1");
expect(eventsPage.items[0]?.eventIndex).toBe(1);
expect(eventsPage.items[1]?.id).toBe("evt-2");
expect(eventsPage.items[1]?.eventIndex).toBe(2);
});
it("evicts oldest sessions when maxSessions exceeded", async () => {
const driver = new RivetSessionPersistDriver(makeCtx(), { maxSessions: 2 });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
await driver.updateSession({
id: "s-2",
agent: "mock",
agentSessionId: "a-2",
lastConnectionId: "c-2",
createdAt: 200,
});
// Adding a third session should evict the oldest (s-1)
await driver.updateSession({
id: "s-3",
agent: "mock",
agentSessionId: "a-3",
lastConnectionId: "c-3",
createdAt: 300,
});
expect(await driver.getSession("s-1")).toBeNull();
expect(await driver.getSession("s-2")).not.toBeNull();
expect(await driver.getSession("s-3")).not.toBeNull();
});
it("trims oldest events when maxEventsPerSession exceeded", async () => {
const driver = new RivetSessionPersistDriver(makeCtx(), { maxEventsPerSession: 2 });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
for (let i = 1; i <= 3; i++) {
await driver.insertEvent({
id: `evt-${i}`,
eventIndex: i,
sessionId: "s-1",
createdAt: i,
connectionId: "c-1",
sender: "client",
payload: { jsonrpc: "2.0", method: "session/prompt", params: { sessionId: "a-1" } },
});
}
const page = await driver.listEvents({ sessionId: "s-1" });
expect(page.items).toHaveLength(2);
// Oldest event (evt-1) should be trimmed
expect(page.items[0]?.id).toBe("evt-2");
expect(page.items[1]?.id).toBe("evt-3");
});
it("clones data to prevent external mutation", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
const s1 = await driver.getSession("s-1");
const s2 = await driver.getSession("s-1");
expect(s1).toEqual(s2);
expect(s1).not.toBe(s2); // Different object references
});
it("supports custom stateKey", async () => {
const ctx = makeCtx();
const driver = new RivetSessionPersistDriver(ctx, { stateKey: "myPersist" });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
expect((ctx.state.myPersist as RivetPersistData).sessions["s-1"]).toBeDefined();
expect(ctx.state._sandboxAgentPersist).toBeUndefined();
});
it("returns empty results for unknown session events", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
const page = await driver.listEvents({ sessionId: "nonexistent" });
expect(page.items).toHaveLength(0);
expect(page.nextCursor).toBeUndefined();
});
});


@@ -0,0 +1,5 @@
# @sandbox-agent/persist-sqlite
> **Deprecated:** This package has been deprecated and removed.
Install `better-sqlite3` directly and copy the driver source into your project. See the [full example](https://github.com/rivet-dev/sandbox-agent/tree/main/examples/persist-sqlite) and the [session persistence docs](https://sandboxagent.dev/session-persistence) for guidance.


@ -1,7 +1,7 @@
{ {
"name": "@sandbox-agent/persist-sqlite", "name": "@sandbox-agent/persist-sqlite",
"version": "0.3.2", "version": "0.3.2",
"description": "SQLite persistence driver for the Sandbox Agent TypeScript SDK", "description": "SQLite persistence driver for the Sandbox Agent TypeScript SDK (DEPRECATED)",
"license": "Apache-2.0", "license": "Apache-2.0",
"repository": { "repository": {
"type": "git", "type": "git",
@ -16,24 +16,17 @@
"import": "./dist/index.js" "import": "./dist/index.js"
} }
}, },
"dependencies": { "dependencies": {},
"better-sqlite3": "^11.0.0",
"sandbox-agent": "workspace:*"
},
"files": [ "files": [
"dist" "dist"
], ],
"scripts": { "scripts": {
"build": "tsup", "build": "tsup",
"typecheck": "tsc --noEmit", "typecheck": "tsc --noEmit"
"test": "vitest run",
"test:watch": "vitest"
}, },
"devDependencies": { "devDependencies": {
"@types/better-sqlite3": "^7.0.0",
"@types/node": "^22.0.0", "@types/node": "^22.0.0",
"tsup": "^8.0.0", "tsup": "^8.0.0",
"typescript": "^5.7.0", "typescript": "^5.7.0"
"vitest": "^3.0.0"
} }
} }


@@ -1,284 +1,5 @@
throw new Error(
  "@sandbox-agent/persist-sqlite has been deprecated and removed. " +
  "Copy the reference implementation from examples/persist-sqlite into your project instead. " +
  "See https://github.com/rivet-dev/sandbox-agent/tree/main/examples/persist-sqlite",
);

import Database from "better-sqlite3";
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";

const DEFAULT_LIST_LIMIT = 100;
export interface SQLiteSessionPersistDriverOptions {
filename?: string;
}
export class SQLiteSessionPersistDriver implements SessionPersistDriver {
private readonly db: Database.Database;
constructor(options: SQLiteSessionPersistDriverOptions = {}) {
this.db = new Database(options.filename ?? ":memory:");
this.initialize();
}
async getSession(id: string): Promise<SessionRecord | null> {
const row = this.db
.prepare(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, session_init_json
FROM sessions WHERE id = ?`,
)
.get(id) as SessionRow | undefined;
if (!row) {
return null;
}
return decodeSessionRow(row);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rows = this.db
.prepare(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, session_init_json
FROM sessions
ORDER BY created_at ASC, id ASC
LIMIT ? OFFSET ?`,
)
.all(limit, offset) as SessionRow[];
const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM sessions`).get() as { count: number };
const nextOffset = offset + rows.length;
return {
items: rows.map(decodeSessionRow),
nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
this.db
.prepare(
`INSERT INTO sessions (
id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, session_init_json
) VALUES (?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(id) DO UPDATE SET
agent = excluded.agent,
agent_session_id = excluded.agent_session_id,
last_connection_id = excluded.last_connection_id,
created_at = excluded.created_at,
destroyed_at = excluded.destroyed_at,
session_init_json = excluded.session_init_json`,
)
.run(
session.id,
session.agent,
session.agentSessionId,
session.lastConnectionId,
session.createdAt,
session.destroyedAt ?? null,
session.sessionInit ? JSON.stringify(session.sessionInit) : null,
);
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rows = this.db
.prepare(
`SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json
FROM events
WHERE session_id = ?
ORDER BY event_index ASC, id ASC
LIMIT ? OFFSET ?`,
)
.all(request.sessionId, limit, offset) as EventRow[];
const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM events WHERE session_id = ?`).get(request.sessionId) as { count: number };
const nextOffset = offset + rows.length;
return {
items: rows.map(decodeEventRow),
nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined,
};
}
async insertEvent(event: SessionEvent): Promise<void> {
this.db
.prepare(
`INSERT INTO events (
id, event_index, session_id, created_at, connection_id, sender, payload_json
) VALUES (?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(id) DO UPDATE SET
event_index = excluded.event_index,
session_id = excluded.session_id,
created_at = excluded.created_at,
connection_id = excluded.connection_id,
sender = excluded.sender,
payload_json = excluded.payload_json`,
)
.run(event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, JSON.stringify(event.payload));
}
close(): void {
this.db.close();
}
private initialize(): void {
this.db.exec(`
CREATE TABLE IF NOT EXISTS sessions (
id TEXT PRIMARY KEY,
agent TEXT NOT NULL,
agent_session_id TEXT NOT NULL,
last_connection_id TEXT NOT NULL,
created_at INTEGER NOT NULL,
destroyed_at INTEGER,
session_init_json TEXT
)
`);
this.ensureEventsTable();
}
private ensureEventsTable(): void {
const tableInfo = this.db.prepare(`PRAGMA table_info(events)`).all() as TableInfoRow[];
if (tableInfo.length === 0) {
this.createEventsTable();
return;
}
const idColumn = tableInfo.find((column) => column.name === "id");
const hasEventIndex = tableInfo.some((column) => column.name === "event_index");
const idType = (idColumn?.type ?? "").trim().toUpperCase();
const idIsText = idType === "TEXT";
if (!idIsText || !hasEventIndex) {
this.rebuildEventsTable(hasEventIndex);
}
this.db.exec(`
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON events(session_id, event_index, id)
`);
}
private createEventsTable(): void {
this.db.exec(`
CREATE TABLE IF NOT EXISTS events (
id TEXT PRIMARY KEY,
event_index INTEGER NOT NULL,
session_id TEXT NOT NULL,
created_at INTEGER NOT NULL,
connection_id TEXT NOT NULL,
sender TEXT NOT NULL,
payload_json TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON events(session_id, event_index, id)
`);
}
private rebuildEventsTable(hasEventIndex: boolean): void {
this.db.exec(`
ALTER TABLE events RENAME TO events_legacy;
`);
this.createEventsTable();
if (hasEventIndex) {
this.db.exec(`
INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json)
SELECT
CAST(id AS TEXT),
COALESCE(event_index, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC)),
session_id,
created_at,
connection_id,
sender,
payload_json
FROM events_legacy
`);
} else {
this.db.exec(`
INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json)
SELECT
CAST(id AS TEXT),
ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC),
session_id,
created_at,
connection_id,
sender,
payload_json
FROM events_legacy
`);
}
this.db.exec(`DROP TABLE events_legacy`);
}
}
type SessionRow = {
id: string;
agent: string;
agent_session_id: string;
last_connection_id: string;
created_at: number;
destroyed_at: number | null;
session_init_json: string | null;
};
type EventRow = {
id: string;
event_index: number;
session_id: string;
created_at: number;
connection_id: string;
sender: "client" | "agent";
payload_json: string;
};
type TableInfoRow = {
name: string;
type: string;
};
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agent_session_id,
lastConnectionId: row.last_connection_id,
createdAt: row.created_at,
destroyedAt: row.destroyed_at ?? undefined,
sessionInit: row.session_init_json ? (JSON.parse(row.session_init_json) as SessionRecord["sessionInit"]) : undefined,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: row.id,
eventIndex: row.event_index,
sessionId: row.session_id,
createdAt: row.created_at,
connectionId: row.connection_id,
sender: row.sender,
payload: JSON.parse(row.payload_json),
};
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
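The `ensureEventsTable` migration above rebuilds the events table when the legacy `id` column is not declared `TEXT` or the `event_index` column is missing. That decision can be isolated as a pure predicate over `PRAGMA table_info` rows (a sketch of the private check, without touching a real database):

```typescript
interface TableInfoRow {
  name: string;
  type: string;
}

// True when the legacy events table needs a rebuild: either the id
// column is not declared TEXT or the event_index column is absent.
function needsEventsRebuild(tableInfo: TableInfoRow[]): boolean {
  const idColumn = tableInfo.find((column) => column.name === "id");
  const hasEventIndex = tableInfo.some((column) => column.name === "event_index");
  const idIsText = (idColumn?.type ?? "").trim().toUpperCase() === "TEXT";
  return !idIsText || !hasEventIndex;
}
```

Normalizing the declared type with `trim().toUpperCase()` matters because SQLite preserves column types as written (`text`, ` TEXT `, etc.) in `table_info` output.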


@@ -1,131 +0,0 @@
import { describe, it, expect, beforeAll, afterAll } from "vitest";
import { existsSync, mkdtempSync, rmSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { tmpdir } from "node:os";
import { SandboxAgent } from "sandbox-agent";
import { spawnSandboxAgent, type SandboxAgentSpawnHandle } from "../../typescript/src/spawn.ts";
import { prepareMockAgentDataHome } from "../../typescript/tests/helpers/mock-agent.ts";
import { SQLiteSessionPersistDriver } from "../src/index.ts";
const __dirname = dirname(fileURLToPath(import.meta.url));
function findBinary(): string | null {
if (process.env.SANDBOX_AGENT_BIN) {
return process.env.SANDBOX_AGENT_BIN;
}
const cargoPaths = [resolve(__dirname, "../../../target/debug/sandbox-agent"), resolve(__dirname, "../../../target/release/sandbox-agent")];
for (const p of cargoPaths) {
if (existsSync(p)) {
return p;
}
}
return null;
}
const BINARY_PATH = findBinary();
if (!BINARY_PATH) {
throw new Error("sandbox-agent binary not found. Build it (cargo build -p sandbox-agent) or set SANDBOX_AGENT_BIN.");
}
if (!process.env.SANDBOX_AGENT_BIN) {
process.env.SANDBOX_AGENT_BIN = BINARY_PATH;
}
describe("SQLite persistence driver", () => {
let handle: SandboxAgentSpawnHandle;
let baseUrl: string;
let token: string;
let dataHome: string;
beforeAll(async () => {
dataHome = mkdtempSync(join(tmpdir(), "sqlite-integration-"));
prepareMockAgentDataHome(dataHome);
handle = await spawnSandboxAgent({
enabled: true,
log: "silent",
timeoutMs: 30000,
env: {
XDG_DATA_HOME: dataHome,
HOME: dataHome,
USERPROFILE: dataHome,
APPDATA: join(dataHome, "AppData", "Roaming"),
LOCALAPPDATA: join(dataHome, "AppData", "Local"),
},
});
baseUrl = handle.baseUrl;
token = handle.token;
});
afterAll(async () => {
await handle.dispose();
rmSync(dataHome, { recursive: true, force: true });
});
it("persists session/event history across SDK instances and supports replay restore", async () => {
const tempDir = mkdtempSync(join(tmpdir(), "sqlite-persist-"));
const dbPath = join(tempDir, "session-store.db");
const persist1 = new SQLiteSessionPersistDriver({ filename: dbPath });
const sdk1 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist1,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const created = await sdk1.createSession({ agent: "mock" });
await created.prompt([{ type: "text", text: "sqlite-first" }]);
const firstConnectionId = created.lastConnectionId;
await sdk1.dispose();
persist1.close();
const persist2 = new SQLiteSessionPersistDriver({ filename: dbPath });
const sdk2 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist2,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const restored = await sdk2.resumeSession(created.id);
expect(restored.lastConnectionId).not.toBe(firstConnectionId);
await restored.prompt([{ type: "text", text: "sqlite-second" }]);
const sessions = await sdk2.listSessions({ limit: 20 });
expect(sessions.items.some((entry) => entry.id === created.id)).toBe(true);
const events = await sdk2.getEvents({ sessionId: created.id, limit: 1000 });
expect(events.items.length).toBeGreaterThan(0);
expect(events.items.every((event) => typeof event.id === "string")).toBe(true);
expect(events.items.every((event) => Number.isInteger(event.eventIndex))).toBe(true);
for (let i = 1; i < events.items.length; i += 1) {
expect(events.items[i]!.eventIndex).toBeGreaterThanOrEqual(events.items[i - 1]!.eventIndex);
}
const replayInjected = events.items.find((event) => {
if (event.sender !== "client") {
return false;
}
const payload = event.payload as Record<string, unknown>;
const method = payload.method;
const params = payload.params as Record<string, unknown> | undefined;
const prompt = Array.isArray(params?.prompt) ? params?.prompt : [];
const firstBlock = prompt[0] as Record<string, unknown> | undefined;
return method === "session/prompt" && typeof firstBlock?.text === "string" && firstBlock.text.includes("Previous session history is replayed below");
});
expect(replayInjected).toBeTruthy();
await sdk2.dispose();
persist2.close();
rmSync(tempDir, { recursive: true, force: true });
});
});


@@ -14,6 +14,74 @@
     ".": {
       "types": "./dist/index.d.ts",
       "import": "./dist/index.js"
+    },
+    "./local": {
+      "types": "./dist/providers/local.d.ts",
+      "import": "./dist/providers/local.js"
+    },
+    "./e2b": {
+      "types": "./dist/providers/e2b.d.ts",
+      "import": "./dist/providers/e2b.js"
+    },
+    "./daytona": {
+      "types": "./dist/providers/daytona.d.ts",
+      "import": "./dist/providers/daytona.js"
+    },
+    "./docker": {
+      "types": "./dist/providers/docker.d.ts",
+      "import": "./dist/providers/docker.js"
+    },
+    "./vercel": {
+      "types": "./dist/providers/vercel.d.ts",
+      "import": "./dist/providers/vercel.js"
+    },
+    "./cloudflare": {
+      "types": "./dist/providers/cloudflare.d.ts",
+      "import": "./dist/providers/cloudflare.js"
+    },
+    "./modal": {
+      "types": "./dist/providers/modal.d.ts",
+      "import": "./dist/providers/modal.js"
+    },
+    "./computesdk": {
+      "types": "./dist/providers/computesdk.d.ts",
+      "import": "./dist/providers/computesdk.js"
     }
   },
+  "peerDependencies": {
+    "@cloudflare/sandbox": ">=0.1.0",
+    "@daytonaio/sdk": ">=0.12.0",
+    "@e2b/code-interpreter": ">=1.0.0",
+    "@vercel/sandbox": ">=0.1.0",
+    "dockerode": ">=4.0.0",
+    "get-port": ">=7.0.0",
+    "modal": ">=0.1.0",
+    "computesdk": ">=0.1.0"
+  },
+  "peerDependenciesMeta": {
+    "@cloudflare/sandbox": {
+      "optional": true
+    },
+    "@daytonaio/sdk": {
+      "optional": true
+    },
+    "@e2b/code-interpreter": {
+      "optional": true
+    },
+    "@vercel/sandbox": {
+      "optional": true
+    },
+    "dockerode": {
+      "optional": true
+    },
+    "get-port": {
+      "optional": true
+    },
+    "modal": {
+      "optional": true
+    },
+    "computesdk": {
+      "optional": true
+    }
+  },
   "dependencies": {
@@ -33,8 +101,17 @@
     "test:watch": "vitest"
   },
   "devDependencies": {
+    "@cloudflare/sandbox": ">=0.1.0",
+    "@daytonaio/sdk": ">=0.12.0",
+    "@e2b/code-interpreter": ">=1.0.0",
+    "@types/dockerode": "^4.0.0",
     "@types/node": "^22.0.0",
     "@types/ws": "^8.18.1",
+    "@vercel/sandbox": ">=0.1.0",
+    "dockerode": ">=4.0.0",
+    "get-port": ">=7.0.0",
+    "modal": ">=0.1.0",
+    "computesdk": ">=0.1.0",
     "openapi-typescript": "^6.7.0",
     "tsup": "^8.0.0",
     "typescript": "^5.7.0",
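The `exports` map above follows a uniform pattern: each provider subpath resolves to `./dist/providers/<name>.js` with a matching `.d.ts`. The mapping can be sketched as a function for illustration (Node's resolver does this from `package.json`; the function below is not part of the SDK):

```typescript
// Mirror of the package.json "exports" table: "./e2b" -> dist/providers/e2b.js, etc.
const PROVIDERS = ["local", "e2b", "daytona", "docker", "vercel", "cloudflare", "modal", "computesdk"] as const;

function resolveSubpath(specifier: string): { types: string; import: string } | null {
  if (specifier === ".") {
    return { types: "./dist/index.d.ts", import: "./dist/index.js" };
  }
  const name = specifier.replace(/^\.\//, "");
  if (!(PROVIDERS as readonly string[]).includes(name)) {
    return null; // unlisted subpaths are not importable under "exports"
  }
  return {
    types: `./dist/providers/${name}.d.ts`,
    import: `./dist/providers/${name}.js`,
  };
}
```

Returning `null` for unlisted subpaths reflects how conditional `exports` seal the package: consumers can only import the root and the eight declared provider entry points.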


@@ -22,7 +22,7 @@ import {
   type SetSessionModeResponse,
   type SetSessionModeRequest,
 } from "acp-http-client";
-import type { SandboxAgentSpawnHandle, SandboxAgentSpawnOptions } from "./spawn.ts";
+import type { SandboxProvider } from "./providers/types.ts";
 import {
   type AcpServerListResponse,
   type AgentInfo,
@@ -89,6 +89,7 @@ const HEALTH_WAIT_MIN_DELAY_MS = 500;
 const HEALTH_WAIT_MAX_DELAY_MS = 15_000;
 const HEALTH_WAIT_LOG_AFTER_MS = 5_000;
 const HEALTH_WAIT_LOG_EVERY_MS = 10_000;
+const HEALTH_WAIT_ENSURE_SERVER_AFTER_FAILURES = 3;

 export interface SandboxAgentHealthWaitOptions {
   timeoutMs?: number;
@@ -101,6 +102,8 @@ interface SandboxAgentConnectCommonOptions {
   replayMaxChars?: number;
   signal?: AbortSignal;
   token?: string;
+  skipHealthCheck?: boolean;
+  /** @deprecated Use skipHealthCheck instead. */
   waitForHealth?: boolean | SandboxAgentHealthWaitOptions;
 }
@ -115,17 +118,24 @@ export type SandboxAgentConnectOptions =
}); });
export interface SandboxAgentStartOptions { export interface SandboxAgentStartOptions {
sandbox: SandboxProvider;
sandboxId?: string;
skipHealthCheck?: boolean;
fetch?: typeof fetch; fetch?: typeof fetch;
headers?: HeadersInit; headers?: HeadersInit;
persist?: SessionPersistDriver; persist?: SessionPersistDriver;
replayMaxEvents?: number; replayMaxEvents?: number;
replayMaxChars?: number; replayMaxChars?: number;
spawn?: SandboxAgentSpawnOptions | boolean; signal?: AbortSignal;
token?: string;
} }
export interface SessionCreateRequest { export interface SessionCreateRequest {
id?: string; id?: string;
agent: string; agent: string;
/** Shorthand for `sessionInit.cwd`. Ignored when `sessionInit` is provided. */
cwd?: string;
/** Full session init. When omitted, built from `cwd` (or default) with empty `mcpServers`. */
sessionInit?: Omit<NewSessionRequest, "_meta">; sessionInit?: Omit<NewSessionRequest, "_meta">;
model?: string; model?: string;
mode?: string; mode?: string;
@@ -135,6 +145,9 @@ export interface SessionCreateRequest {
 export interface SessionResumeOrCreateRequest {
   id: string;
   agent: string;
+  /** Shorthand for `sessionInit.cwd`. Ignored when `sessionInit` is provided. */
+  cwd?: string;
+  /** Full session init. When omitted, built from `cwd` (or default) with empty `mcpServers`. */
   sessionInit?: Omit<NewSessionRequest, "_meta">;
   model?: string;
   mode?: string;
@@ -824,12 +837,14 @@ export class SandboxAgent {
   private readonly defaultHeaders?: HeadersInit;
   private readonly healthWait: NormalizedHealthWaitOptions;
   private readonly healthWaitAbortController = new AbortController();
+  private sandboxProvider?: SandboxProvider;
+  private sandboxProviderId?: string;
+  private sandboxProviderRawId?: string;

   private readonly persist: SessionPersistDriver;
   private readonly replayMaxEvents: number;
   private readonly replayMaxChars: number;
-  private spawnHandle?: SandboxAgentSpawnHandle;

   private healthPromise?: Promise<void>;
   private healthError?: Error;
   private disposed = false;
@@ -857,7 +872,7 @@
     }
     this.fetcher = resolvedFetch;
     this.defaultHeaders = options.headers;
-    this.healthWait = normalizeHealthWaitOptions(options.waitForHealth, options.signal);
+    this.healthWait = normalizeHealthWaitOptions(options.skipHealthCheck, options.waitForHealth, options.signal);
     this.persist = options.persist ?? new InMemorySessionPersistDriver();
     this.replayMaxEvents = normalizePositiveInt(options.replayMaxEvents, DEFAULT_REPLAY_MAX_EVENTS);
@@ -870,29 +885,79 @@ export class SandboxAgent {
     return new SandboxAgent(options);
   }

-  static async start(options: SandboxAgentStartOptions = {}): Promise<SandboxAgent> {
-    const spawnOptions = normalizeSpawnOptions(options.spawn, true);
-    if (!spawnOptions.enabled) {
-      throw new Error("SandboxAgent.start requires spawn to be enabled.");
-    }
-    const { spawnSandboxAgent } = await import("./spawn.js");
-    const resolvedFetch = options.fetch ?? globalThis.fetch?.bind(globalThis);
-    const handle = await spawnSandboxAgent(spawnOptions, resolvedFetch);
-    const client = new SandboxAgent({
-      baseUrl: handle.baseUrl,
-      token: handle.token,
-      fetch: options.fetch,
-      headers: options.headers,
-      waitForHealth: false,
-      persist: options.persist,
-      replayMaxEvents: options.replayMaxEvents,
-      replayMaxChars: options.replayMaxChars,
-    });
-    client.spawnHandle = handle;
-    return client;
-  }
+  static async start(options: SandboxAgentStartOptions): Promise<SandboxAgent> {
+    const provider = options.sandbox;
+    if (!provider.getUrl && !provider.getFetch) {
+      throw new Error(`Sandbox provider '${provider.name}' must implement getUrl() or getFetch().`);
+    }
+    const existingSandbox = options.sandboxId ? parseSandboxProviderId(options.sandboxId) : null;
+    if (existingSandbox && existingSandbox.provider !== provider.name) {
+      throw new Error(
+        `SandboxAgent.start received sandboxId '${options.sandboxId}' for provider '${existingSandbox.provider}', but the configured provider is '${provider.name}'.`,
+      );
+    }
+    const rawSandboxId = existingSandbox?.rawId ?? (await provider.create());
+    const prefixedSandboxId = `${provider.name}/${rawSandboxId}`;
+    const createdSandbox = !existingSandbox;
+    if (existingSandbox) {
+      await provider.ensureServer?.(rawSandboxId);
+    }
+    try {
+      const fetcher = await resolveProviderFetch(provider, rawSandboxId);
+      const baseUrl = provider.getUrl ? await provider.getUrl(rawSandboxId) : undefined;
+      const providerFetch = options.fetch ?? fetcher;
+      const commonConnectOptions = {
+        headers: options.headers,
+        persist: options.persist,
+        replayMaxEvents: options.replayMaxEvents,
+        replayMaxChars: options.replayMaxChars,
+        signal: options.signal,
+        skipHealthCheck: options.skipHealthCheck,
+        token: options.token ?? (await resolveProviderToken(provider, rawSandboxId)),
+      };
+      const client = providerFetch
+        ? new SandboxAgent({
+            ...commonConnectOptions,
+            baseUrl,
+            fetch: providerFetch,
+          })
+        : new SandboxAgent({
+            ...commonConnectOptions,
+            baseUrl: requireSandboxBaseUrl(baseUrl, provider.name),
+          });
+      client.sandboxProvider = provider;
+      client.sandboxProviderId = prefixedSandboxId;
+      client.sandboxProviderRawId = rawSandboxId;
+      return client;
+    } catch (error) {
+      if (createdSandbox) {
+        try {
+          await provider.destroy(rawSandboxId);
+        } catch {
+          // Best-effort cleanup if connect fails after provisioning.
+        }
+      }
+      throw error;
+    }
+  }
+
+  get sandboxId(): string | undefined {
+    return this.sandboxProviderId;
+  }
+
+  get sandbox(): SandboxProvider | undefined {
+    return this.sandboxProvider;
+  }
+
+  get inspectorUrl(): string {
+    return `${this.baseUrl.replace(/\/+$/, "")}/ui/`;
+  }

   async dispose(): Promise<void> {
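`start()` builds sandbox ids as `` `${provider.name}/${rawSandboxId}` `` and round-trips them through `parseSandboxProviderId`, whose body is not shown in this diff. A plausible sketch of that helper (an assumption: it splits on the first `/` only, so a provider's raw id may itself contain slashes):

```typescript
// Assumed shape of parseSandboxProviderId: split "provider/rawId" on the
// first "/" so raw ids containing further slashes survive the round trip.
function parseSandboxProviderId(sandboxId: string): { provider: string; rawId: string } | null {
  const separator = sandboxId.indexOf("/");
  if (separator <= 0 || separator === sandboxId.length - 1) {
    return null; // no separator, or empty provider/raw id
  }
  return {
    provider: sandboxId.slice(0, separator),
    rawId: sandboxId.slice(separator + 1),
  };
}
```

A `null` result corresponds to the `existingSandbox` branch being skipped in `start()`, i.e. the id is treated as absent and a fresh sandbox is provisioned via `provider.create()`.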
@ -922,10 +987,23 @@ export class SandboxAgent {
await connection.close(); await connection.close();
}), }),
); );
}
async destroySandbox(): Promise<void> {
const provider = this.sandboxProvider;
const rawSandboxId = this.sandboxProviderRawId;
try {
if (provider && rawSandboxId) {
await provider.destroy(rawSandboxId);
} else {
throw new Error("SandboxAgent is not attached to a provisioned sandbox.");
}
} finally {
await this.dispose();
this.sandboxProvider = undefined;
this.sandboxProviderId = undefined;
this.sandboxProviderRawId = undefined;
}
}
@@ -956,7 +1034,7 @@ export class SandboxAgent {
const localSessionId = request.id?.trim() || randomId();
const live = await this.getLiveConnection(request.agent.trim());
const sessionInit = normalizeSessionInit(request.sessionInit, request.cwd);
const response = await live.createRemoteSession(localSessionId, sessionInit);
@@ -966,6 +1044,7 @@ export class SandboxAgent {
agentSessionId: response.sessionId,
lastConnectionId: live.connectionId,
createdAt: nowMs(),
sandboxId: this.sandboxProviderId,
sessionInit,
configOptions: cloneConfigOptions(response.configOptions),
modes: cloneModes(response.modes),
@@ -1692,7 +1771,7 @@ export class SandboxAgent {
};
try {
await this.persist.insertEvent(localSessionId, event);
break;
} catch (error) {
if (!isSessionEventIndexConflict(error) || attempt === MAX_EVENT_INDEX_INSERT_RETRIES - 1) {
@@ -2040,6 +2119,7 @@ export class SandboxAgent {
let delayMs = HEALTH_WAIT_MIN_DELAY_MS;
let nextLogAt = startedAt + HEALTH_WAIT_LOG_AFTER_MS;
let lastError: unknown;
let consecutiveFailures = 0;
while (!this.disposed && (deadline === undefined || Date.now() < deadline)) {
throwIfAborted(signal);
@@ -2050,11 +2130,22 @@ export class SandboxAgent {
return;
}
lastError = new Error(`Unexpected health response: ${JSON.stringify(health)}`);
consecutiveFailures++;
} catch (error) {
if (isAbortError(error)) {
throw error;
}
lastError = error;
consecutiveFailures++;
}
if (consecutiveFailures >= HEALTH_WAIT_ENSURE_SERVER_AFTER_FAILURES && this.sandboxProvider?.ensureServer && this.sandboxProviderRawId) {
try {
await this.sandboxProvider.ensureServer(this.sandboxProviderRawId);
} catch {
// Best-effort; the next health check will determine if it worked.
}
consecutiveFailures = 0;
}
const now = Date.now();
@@ -2255,17 +2346,17 @@ function toAgentQuery(options: AgentQueryOptions | undefined): Record<string, Qu
};
}
function normalizeSessionInit(value: Omit<NewSessionRequest, "_meta"> | undefined, cwdShorthand?: string): Omit<NewSessionRequest, "_meta"> {
if (!value) {
return {
cwd: cwdShorthand ?? defaultCwd(),
mcpServers: [],
};
}
return {
...value,
cwd: value.cwd ?? cwdShorthand ?? defaultCwd(),
mcpServers: value.mcpServers ?? [],
};
}
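This change introduces a request-level `cwd` shorthand for session creation. The precedence is: an explicit `sessionInit.cwd` wins, then the request's `cwd` shorthand, then the SDK default. A minimal sketch of just that precedence rule (`DEFAULT_CWD` is a hypothetical stand-in for the SDK's `defaultCwd()` helper):

```typescript
// Hypothetical stand-in for the SDK's defaultCwd() helper.
const DEFAULT_CWD = "/workspace";

// Mirrors the precedence in normalizeSessionInit above:
// explicit sessionInit.cwd > request.cwd shorthand > default.
function resolveCwd(sessionInitCwd: string | undefined, cwdShorthand: string | undefined): string {
  return sessionInitCwd ?? cwdShorthand ?? DEFAULT_CWD;
}
```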
@@ -2405,16 +2496,23 @@ function normalizePositiveInt(value: number | undefined, fallback: number): numb
return Math.floor(value as number);
}
function normalizeHealthWaitOptions(
skipHealthCheck: boolean | undefined,
waitForHealth: boolean | SandboxAgentHealthWaitOptions | undefined,
signal: AbortSignal | undefined,
): NormalizedHealthWaitOptions {
if (skipHealthCheck === true || waitForHealth === false) {
return { enabled: false };
}
if (waitForHealth === true || waitForHealth === undefined) {
return { enabled: true, signal };
}
const timeoutMs =
typeof waitForHealth.timeoutMs === "number" && Number.isFinite(waitForHealth.timeoutMs) && waitForHealth.timeoutMs > 0
? Math.floor(waitForHealth.timeoutMs)
: undefined;
return {
enabled: true,
@@ -2423,24 +2521,47 @@ function normalizeHealthWaitOptio
};
}
function parseSandboxProviderId(sandboxId: string): { provider: string; rawId: string } {
const slashIndex = sandboxId.indexOf("/");
if (slashIndex < 1 || slashIndex === sandboxId.length - 1) {
throw new Error(`Sandbox IDs must be prefixed as "{provider}/{id}". Received '${sandboxId}'.`);
}
return {
provider: sandboxId.slice(0, slashIndex),
rawId: sandboxId.slice(slashIndex + 1),
};
}
function requireSandboxBaseUrl(baseUrl: string | undefined, providerName: string): string {
if (!baseUrl) {
throw new Error(`Sandbox provider '${providerName}' did not return a base URL.`);
}
return baseUrl;
}
async function resolveProviderFetch(provider: SandboxProvider, rawSandboxId: string): Promise<typeof globalThis.fetch | undefined> {
if (provider.getFetch) {
return await provider.getFetch(rawSandboxId);
}
return undefined;
}
async function resolveProviderToken(provider: SandboxProvider, rawSandboxId: string): Promise<string | undefined> {
const maybeGetToken = (
provider as SandboxProvider & {
getToken?: (sandboxId: string) => string | undefined | Promise<string | undefined>;
}
).getToken;
if (typeof maybeGetToken !== "function") {
return undefined;
}
const token = await maybeGetToken.call(provider, rawSandboxId);
return typeof token === "string" && token ? token : undefined;
}
async function readProblem(response: Response): Promise<ProblemDetails | undefined> {
try {
const text = await response.clone().text();
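The health-wait loop in this file now calls the provider's optional `ensureServer()` hook after a run of consecutive failures, then resets the counter so subsequent probes decide whether recovery worked. A standalone sketch of that recover-after-N-failures pattern (the `check` and `recover` callbacks are hypothetical stand-ins for the SDK's health probe and `ensureServer`):

```typescript
type HealthCheck = () => Promise<boolean>;
type Recover = () => Promise<void>;

// Probe until healthy; after `failureThreshold` consecutive failures, invoke
// the recovery hook (best-effort) and reset the counter so the next probes
// decide whether recovery worked.
async function waitForHealthy(
  check: HealthCheck,
  recover: Recover,
  failureThreshold: number,
  maxAttempts: number,
): Promise<{ healthy: boolean; recoveries: number }> {
  let consecutiveFailures = 0;
  let recoveries = 0;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    if (await check()) {
      return { healthy: true, recoveries };
    }
    consecutiveFailures++;
    if (consecutiveFailures >= failureThreshold) {
      try {
        await recover();
        recoveries++;
      } catch {
        // Best-effort; the next check determines whether recovery worked.
      }
      consecutiveFailures = 0;
    }
  }
  return { healthy: false, recoveries };
}
```

The real implementation also interleaves backoff delays and a deadline; this sketch keeps only the counter-and-reset logic.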


@@ -38,6 +38,7 @@ export type {
export type { InspectorUrlOptions } from "./inspector.ts";
export { InMemorySessionPersistDriver } from "./types.ts";
export type { SandboxProvider } from "./providers/types.ts";
export type {
AcpEnvelope,


@@ -0,0 +1,79 @@
import type { SandboxProvider } from "./types.ts";
const DEFAULT_AGENT_PORT = 3000;
export interface CloudflareSandboxClient {
create?(options?: Record<string, unknown>): Promise<{ id?: string; sandboxId?: string }>;
connect?(
sandboxId: string,
options?: Record<string, unknown>,
): Promise<{
close?(): Promise<void>;
stop?(): Promise<void>;
containerFetch(input: RequestInfo | URL, init?: RequestInit, port?: number): Promise<Response>;
}>;
}
export interface CloudflareProviderOptions {
sdk: CloudflareSandboxClient;
create?: Record<string, unknown> | (() => Record<string, unknown> | Promise<Record<string, unknown>>);
agentPort?: number;
}
async function resolveCreateOptions(value: CloudflareProviderOptions["create"]): Promise<Record<string, unknown>> {
if (!value) {
return {};
}
if (typeof value === "function") {
return await value();
}
return value;
}
export function cloudflare(options: CloudflareProviderOptions): SandboxProvider {
const agentPort = options.agentPort ?? DEFAULT_AGENT_PORT;
const sdk = options.sdk;
return {
name: "cloudflare",
async create(): Promise<string> {
if (typeof sdk.create !== "function") {
throw new Error('sandbox provider "cloudflare" requires an sdk with a `create()` method.');
}
const sandbox = await sdk.create(await resolveCreateOptions(options.create));
const sandboxId = sandbox.sandboxId ?? sandbox.id;
if (!sandboxId) {
throw new Error("cloudflare sandbox did not return an id");
}
return sandboxId;
},
async destroy(sandboxId: string): Promise<void> {
if (typeof sdk.connect !== "function") {
throw new Error('sandbox provider "cloudflare" requires an sdk with a `connect()` method.');
}
const sandbox = await sdk.connect(sandboxId);
if (typeof sandbox.close === "function") {
await sandbox.close();
return;
}
if (typeof sandbox.stop === "function") {
await sandbox.stop();
}
},
async getFetch(sandboxId: string): Promise<typeof globalThis.fetch> {
if (typeof sdk.connect !== "function") {
throw new Error('sandbox provider "cloudflare" requires an sdk with a `connect()` method.');
}
const sandbox = await sdk.connect(sandboxId);
return async (input, init) =>
sandbox.containerFetch(
input,
{
...(init ?? {}),
signal: undefined,
},
agentPort,
);
},
};
}
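The Cloudflare provider above implements the same `SandboxProvider` contract as the other built-ins: `create`/`destroy` plus optional `getUrl`, `getFetch`, and `ensureServer`. A minimal in-memory provider sketch illustrating that contract (the interface subset and all behavior here are illustrative, not the SDK's actual types; real providers provision remote sandboxes):

```typescript
// Illustrative subset of the provider contract used by the built-ins.
interface SandboxProviderSketch {
  name: string;
  create(): Promise<string>;
  destroy(sandboxId: string): Promise<void>;
  getUrl?(sandboxId: string): Promise<string>;
  ensureServer?(sandboxId: string): Promise<void>;
}

// Tracks "sandboxes" in a Map and fabricates local URLs for demonstration.
function inMemoryProvider(): SandboxProviderSketch & { live: Map<string, boolean> } {
  const live = new Map<string, boolean>();
  let nextId = 0;
  return {
    name: "in-memory",
    live,
    async create() {
      const id = `sbx-${nextId++}`;
      live.set(id, true); // the agent server starts alongside the sandbox
      return id;
    },
    async destroy(sandboxId) {
      live.delete(sandboxId);
    },
    async getUrl(sandboxId) {
      if (!live.has(sandboxId)) throw new Error(`unknown sandbox: ${sandboxId}`);
      return `http://127.0.0.1:3000/${sandboxId}`;
    },
    async ensureServer(sandboxId) {
      if (!live.has(sandboxId)) throw new Error(`unknown sandbox: ${sandboxId}`);
      live.set(sandboxId, true); // idempotent (re)start of the agent server
    },
  };
}
```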


@@ -0,0 +1,60 @@
import { compute } from "computesdk";
import type { SandboxProvider } from "./types.ts";
import { DEFAULT_AGENTS, SANDBOX_AGENT_INSTALL_SCRIPT } from "./shared.ts";
const DEFAULT_AGENT_PORT = 3000;
export interface ComputeSdkProviderOptions {
create?: {
envs?: Record<string, string>;
};
agentPort?: number;
}
export function computesdk(options: ComputeSdkProviderOptions = {}): SandboxProvider {
const agentPort = options.agentPort ?? DEFAULT_AGENT_PORT;
return {
name: "computesdk",
async create(): Promise<string> {
const envs = options.create?.envs;
const sandbox = await compute.sandbox.create({
envs: envs && Object.keys(envs).length > 0 ? envs : undefined,
});
const run = async (cmd: string, runOptions?: { background?: boolean }) => {
const result = await sandbox.runCommand(cmd, runOptions);
if (typeof result?.exitCode === "number" && result.exitCode !== 0) {
throw new Error(`computesdk command failed: ${cmd} (exit ${result.exitCode})\n${result.stderr || ""}`);
}
return result;
};
await run(`curl -fsSL ${SANDBOX_AGENT_INSTALL_SCRIPT} | sh`);
for (const agent of DEFAULT_AGENTS) {
await run(`sandbox-agent install-agent ${agent}`);
}
await run(`sandbox-agent server --no-token --host 0.0.0.0 --port ${agentPort}`, {
background: true,
});
return sandbox.sandboxId;
},
async destroy(sandboxId: string): Promise<void> {
const sandbox = await compute.sandbox.getById(sandboxId);
if (sandbox) await sandbox.destroy();
},
async getUrl(sandboxId: string): Promise<string> {
const sandbox = await compute.sandbox.getById(sandboxId);
if (!sandbox) throw new Error(`computesdk sandbox not found: ${sandboxId}`);
return sandbox.getUrl({ port: agentPort });
},
async ensureServer(sandboxId: string): Promise<void> {
const sandbox = await compute.sandbox.getById(sandboxId);
if (!sandbox) throw new Error(`computesdk sandbox not found: ${sandboxId}`);
await sandbox.runCommand(`sandbox-agent server --no-token --host 0.0.0.0 --port ${agentPort}`, {
background: true,
});
},
};
}
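The computesdk provider boots the agent with a three-step sequence: install `sandbox-agent` via the install script, install each default agent, then start the server on a fixed port. A hedged sketch of a command builder for that sequence (`buildBootstrapCommands` is a hypothetical helper; the command strings mirror the ones above):

```typescript
// Hypothetical helper mirroring the bootstrap sequence in the computesdk
// provider: install sandbox-agent, install agents, then start the server.
function buildBootstrapCommands(installScriptUrl: string, agents: string[], agentPort: number): string[] {
  return [
    `curl -fsSL ${installScriptUrl} | sh`,
    ...agents.map((agent) => `sandbox-agent install-agent ${agent}`),
    `sandbox-agent server --no-token --host 0.0.0.0 --port ${agentPort}`,
  ];
}
```

In the provider itself only the final server command runs in the background, since it must outlive the `create()` call.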


@@ -0,0 +1,67 @@
import { Daytona } from "@daytonaio/sdk";
import type { SandboxProvider } from "./types.ts";
import { DEFAULT_SANDBOX_AGENT_IMAGE, buildServerStartCommand } from "./shared.ts";
const DEFAULT_AGENT_PORT = 3000;
const DEFAULT_PREVIEW_TTL_SECONDS = 4 * 60 * 60;
type DaytonaCreateParams = NonNullable<Parameters<Daytona["create"]>[0]>;
type DaytonaCreateOverrides = Partial<DaytonaCreateParams>;
export interface DaytonaProviderOptions {
create?: DaytonaCreateOverrides | (() => DaytonaCreateOverrides | Promise<DaytonaCreateOverrides>);
image?: string;
agentPort?: number;
previewTtlSeconds?: number;
deleteTimeoutSeconds?: number;
}
async function resolveCreateOptions(value: DaytonaProviderOptions["create"]): Promise<DaytonaCreateOverrides | undefined> {
if (!value) return undefined;
if (typeof value === "function") return await value();
return value;
}
export function daytona(options: DaytonaProviderOptions = {}): SandboxProvider {
const agentPort = options.agentPort ?? DEFAULT_AGENT_PORT;
const image = options.image ?? DEFAULT_SANDBOX_AGENT_IMAGE;
const previewTtlSeconds = options.previewTtlSeconds ?? DEFAULT_PREVIEW_TTL_SECONDS;
const client = new Daytona();
return {
name: "daytona",
async create(): Promise<string> {
const createOpts = await resolveCreateOptions(options.create);
const sandbox = await client.create({
image,
autoStopInterval: 0,
...createOpts,
} as DaytonaCreateParams);
await sandbox.process.executeCommand(buildServerStartCommand(agentPort));
return sandbox.id;
},
async destroy(sandboxId: string): Promise<void> {
const sandbox = await client.get(sandboxId);
if (!sandbox) {
return;
}
await sandbox.delete(options.deleteTimeoutSeconds);
},
async getUrl(sandboxId: string): Promise<string> {
const sandbox = await client.get(sandboxId);
if (!sandbox) {
throw new Error(`daytona sandbox not found: ${sandboxId}`);
}
const preview = await sandbox.getSignedPreviewUrl(agentPort, previewTtlSeconds);
return typeof preview === "string" ? preview : preview.url;
},
async ensureServer(sandboxId: string): Promise<void> {
const sandbox = await client.get(sandboxId);
if (!sandbox) {
throw new Error(`daytona sandbox not found: ${sandboxId}`);
}
await sandbox.process.executeCommand(buildServerStartCommand(agentPort));
},
};
}


@@ -0,0 +1,85 @@
import Docker from "dockerode";
import getPort from "get-port";
import type { SandboxProvider } from "./types.ts";
import { DEFAULT_SANDBOX_AGENT_IMAGE } from "./shared.ts";
const DEFAULT_HOST = "127.0.0.1";
const DEFAULT_AGENT_PORT = 3000;
export interface DockerProviderOptions {
image?: string;
host?: string;
agentPort?: number;
env?: string[] | (() => string[] | Promise<string[]>);
binds?: string[] | (() => string[] | Promise<string[]>);
createContainerOptions?: Record<string, unknown>;
}
async function resolveValue<T>(value: T | (() => T | Promise<T>) | undefined, fallback: T): Promise<T> {
if (value === undefined) {
return fallback;
}
if (typeof value === "function") {
return await (value as () => T | Promise<T>)();
}
return value;
}
function extractMappedPort(
inspect: { NetworkSettings?: { Ports?: Record<string, Array<{ HostPort?: string }> | null | undefined> } },
containerPort: number,
): number {
const hostPort = inspect.NetworkSettings?.Ports?.[`${containerPort}/tcp`]?.[0]?.HostPort;
if (!hostPort) {
throw new Error(`docker sandbox-agent port ${containerPort} is not published`);
}
return Number(hostPort);
}
export function docker(options: DockerProviderOptions = {}): SandboxProvider {
const image = options.image ?? DEFAULT_SANDBOX_AGENT_IMAGE;
const host = options.host ?? DEFAULT_HOST;
const agentPort = options.agentPort ?? DEFAULT_AGENT_PORT;
const client = new Docker({ socketPath: "/var/run/docker.sock" });
return {
name: "docker",
async create(): Promise<string> {
const hostPort = await getPort();
const env = await resolveValue(options.env, []);
const binds = await resolveValue(options.binds, []);
const container = await client.createContainer({
Image: image,
Cmd: ["server", "--no-token", "--host", "0.0.0.0", "--port", String(agentPort)],
Env: env,
ExposedPorts: { [`${agentPort}/tcp`]: {} },
HostConfig: {
AutoRemove: true,
Binds: binds,
PortBindings: {
[`${agentPort}/tcp`]: [{ HostPort: String(hostPort) }],
},
},
...(options.createContainerOptions ?? {}),
});
await container.start();
return container.id;
},
async destroy(sandboxId: string): Promise<void> {
const container = client.getContainer(sandboxId);
try {
await container.stop({ t: 5 });
} catch {}
try {
await container.remove({ force: true });
} catch {}
},
async getUrl(sandboxId: string): Promise<string> {
const container = client.getContainer(sandboxId);
const hostPort = extractMappedPort(await container.inspect(), agentPort);
return `http://${host}:${hostPort}`;
},
};
}
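The Docker provider's URL lookup hinges on `extractMappedPort`, which reads the first published `HostPort` for `<port>/tcp` from a `container.inspect()` payload. Since the function depends only on the payload shape, here is a standalone copy for illustration:

```typescript
// Standalone copy of the Docker provider's port lookup: reads the first
// published HostPort for `<containerPort>/tcp` from an inspect payload.
function extractMappedPort(
  inspect: { NetworkSettings?: { Ports?: Record<string, Array<{ HostPort?: string }> | null | undefined> } },
  containerPort: number,
): number {
  const hostPort = inspect.NetworkSettings?.Ports?.[`${containerPort}/tcp`]?.[0]?.HostPort;
  if (!hostPort) {
    throw new Error(`docker sandbox-agent port ${containerPort} is not published`);
  }
  return Number(hostPort);
}
```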

(Some files were not shown because too many files have changed in this diff.)