Fix SDK typecheck errors and update persist drivers for insertEvent signature

- Fix insertEvent call in client.ts to pass sessionId as first argument
- Update Daytona provider create options to use Partial type (image has default)
- Update StrictUniqueSessionPersistDriver in tests to match new insertEvent signature
- Sync persist packages, openapi spec, and docs with upstream changes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Nathan Flurry 2026-03-15 13:17:10 -07:00
parent 6a42f06342
commit 441083ea2a
33 changed files with 1051 additions and 2121 deletions


@@ -22,36 +22,15 @@ icon: "rocket"
</Tabs>
</Step>
<Step title="Set environment variables">
Each coding agent requires an API key to connect to its LLM provider.
```bash
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
```
<AccordionGroup>
<Accordion title="Extracting API keys from current machine">
Use `sandbox-agent credentials extract-env --export` to extract your existing API keys (Anthropic, OpenAI, etc.) from local Claude Code or Codex config files.
</Accordion>
<Accordion title="Testing without API keys">
Use the `mock` agent for SDK and integration testing without provider credentials.
</Accordion>
<Accordion title="Multi-tenant and per-user billing">
For per-tenant token tracking, budget enforcement, or usage-based billing, see [LLM Credentials](/llm-credentials) for gateway options like OpenRouter, LiteLLM, and Portkey.
</Accordion>
</AccordionGroup>
</Step>
<Step title="Start the sandbox">
`SandboxAgent.start()` provisions a sandbox, starts a lightweight [Sandbox Agent server](/architecture) inside it, and connects your SDK client. Pass your LLM API keys so the agent can reach its provider.
<CodeGroup>
```typescript Local
import { SandboxAgent } from "sandbox-agent";
import { local } from "sandbox-agent/local";
// Runs on your machine. Inherits process.env automatically.
const sdk = await SandboxAgent.start({
sandbox: local(),
});
@@ -62,7 +41,15 @@ icon: "rocket"
import { e2b } from "sandbox-agent/e2b";
const sdk = await SandboxAgent.start({
sandbox: e2b({
create: {
// Pass whichever keys your agent needs
envs: {
ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY,
OPENAI_API_KEY: process.env.OPENAI_API_KEY,
},
},
}),
});
```
@@ -71,7 +58,14 @@ icon: "rocket"
import { daytona } from "sandbox-agent/daytona";
const sdk = await SandboxAgent.start({
sandbox: daytona({
create: {
envVars: {
ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY,
OPENAI_API_KEY: process.env.OPENAI_API_KEY,
},
},
}),
});
```
@@ -80,7 +74,15 @@ icon: "rocket"
import { vercel } from "sandbox-agent/vercel";
const sdk = await SandboxAgent.start({
sandbox: vercel({
create: {
runtime: "node24",
env: {
ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY,
OPENAI_API_KEY: process.env.OPENAI_API_KEY,
},
},
}),
});
```
@@ -100,13 +102,16 @@ icon: "rocket"
// Good for testing. Not security-hardened like cloud sandboxes.
const sdk = await SandboxAgent.start({
sandbox: docker({
env: [
`ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY}`,
`OPENAI_API_KEY=${process.env.OPENAI_API_KEY}`,
],
}),
});
```
</CodeGroup>
Each provider handles provisioning, server installation, and networking. Install the provider's peer dependency (e.g. `@e2b/code-interpreter`, `dockerode`) in your project. See the [Deploy](/deploy/local) guides for full setup details. For multi-tenant billing, per-user keys, and gateway options, see [LLM Credentials](/llm-credentials).
<AccordionGroup>
<Accordion title="Implementing a custom provider">
@@ -212,10 +217,14 @@ icon: "rocket"
```typescript
import { SandboxAgent } from "sandbox-agent";
import { e2b } from "sandbox-agent/e2b";
const sdk = await SandboxAgent.start({
sandbox: e2b({
create: {
envs: { ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY },
},
}),
});
try {


@@ -23,12 +23,6 @@ The TypeScript SDK is centered on `sandbox-agent` and its `SandboxAgent` class.
</Tab>
</Tabs>
## Optional persistence drivers
```bash
npm install @sandbox-agent/persist-indexeddb@0.3.x @sandbox-agent/persist-sqlite@0.3.x @sandbox-agent/persist-postgres@0.3.x
```
## Optional React components
```bash
@@ -68,15 +62,12 @@ const sdk = await SandboxAgent.connect({
controller.abort();
```
With persistence (see [Persisting Sessions](/session-persistence) for driver options):
```ts
import { SandboxAgent } from "sandbox-agent";
import { SandboxAgent, InMemorySessionPersistDriver } from "sandbox-agent";
const persist = new InMemorySessionPersistDriver();
const sdk = await SandboxAgent.connect({
baseUrl: "http://127.0.0.1:2468",


@@ -15,9 +15,9 @@ Each driver stores:
## Persistence drivers
### In-memory (built-in)
Best for local dev and ephemeral workloads. No extra dependencies required.
```ts
import { InMemorySessionPersistDriver, SandboxAgent } from "sandbox-agent";
@@ -33,91 +33,17 @@ const sdk = await SandboxAgent.connect({
});
```
### Rivet
Recommended for sandbox orchestration with actor state.
```bash
npm install @sandbox-agent/persist-rivet@0.3.x
```
```ts
import { actor } from "rivetkit";
import { SandboxAgent } from "sandbox-agent";
import { RivetSessionPersistDriver, type RivetPersistState } from "@sandbox-agent/persist-rivet";
type PersistedState = RivetPersistState & {
sandboxId: string;
baseUrl: string;
};
export default actor({
createState: async () => {
return {
sandboxId: "sbx_123",
baseUrl: "http://127.0.0.1:2468",
} satisfies Partial<PersistedState>;
},
createVars: async (c) => {
const persist = new RivetSessionPersistDriver(c);
const sdk = await SandboxAgent.connect({
baseUrl: c.state.baseUrl,
persist,
});
const session = await sdk.resumeOrCreateSession({ id: "default", agent: "codex" });
const unsubscribe = session.onEvent((event) => {
c.broadcast("session.event", event);
});
return { sdk, session, unsubscribe };
},
actions: {
sendMessage: async (c, message: string) => {
await c.vars.session.prompt([{ type: "text", text: message }]);
},
},
onSleep: async (c) => {
c.vars.unsubscribe?.();
await c.vars.sdk.dispose();
},
});
```
### IndexedDB
Best for browser apps that should survive reloads.
```bash
npm install @sandbox-agent/persist-indexeddb@0.3.x
```
```ts
import { SandboxAgent } from "sandbox-agent";
import { IndexedDbSessionPersistDriver } from "@sandbox-agent/persist-indexeddb";
const persist = new IndexedDbSessionPersistDriver({
databaseName: "sandbox-agent-session-store",
});
const sdk = await SandboxAgent.connect({
baseUrl: "http://127.0.0.1:2468",
persist,
});
```
### SQLite
Best for local/server Node apps that need durable storage without a DB server.
```bash
npm install @sandbox-agent/persist-sqlite@0.3.x
npm install better-sqlite3
```
```ts
import { SandboxAgent } from "sandbox-agent";
import { SQLiteSessionPersistDriver } from "./persist.ts";
const persist = new SQLiteSessionPersistDriver({
filename: "./sandbox-agent.db",
@@ -129,17 +55,19 @@ const sdk = await SandboxAgent.connect({
});
```
See the [full SQLite example](https://github.com/nichochar/sandbox-agent/tree/main/examples/persist-sqlite) for the complete driver implementation you can copy into your project.
### Postgres
Use when you already run Postgres and want shared relational storage.
```bash
npm install @sandbox-agent/persist-postgres@0.3.x
npm install pg
```
```ts
import { SandboxAgent } from "sandbox-agent";
import { PostgresSessionPersistDriver } from "./persist.ts";
const persist = new PostgresSessionPersistDriver({
connectionString: process.env.DATABASE_URL,
@@ -152,6 +80,16 @@ const sdk = await SandboxAgent.connect({
});
```
See the [full Postgres example](https://github.com/nichochar/sandbox-agent/tree/main/examples/persist-postgres) for the complete driver implementation you can copy into your project.
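Whichever driver you use, `listEvents` returns one page at a time and an opaque `nextCursor`. A small helper can drain every page for a session; the types below are simplified local stand-ins for the `sandbox-agent` types, shown only to illustrate the loop:

```typescript
// Simplified stand-ins for the sandbox-agent ListPage / SessionEvent shapes.
type ListPage<T> = { items: T[]; nextCursor?: string };
type EventLike = { id: string; sessionId: string };
type EventLister = {
  listEvents(req: { sessionId: string; cursor?: string; limit?: number }): Promise<ListPage<EventLike>>;
};

// Follow nextCursor until the driver reports no further pages.
async function collectAllEvents(driver: EventLister, sessionId: string): Promise<EventLike[]> {
  const events: EventLike[] = [];
  let cursor: string | undefined;
  do {
    const page = await driver.listEvents({ sessionId, cursor, limit: 50 });
    events.push(...page.items);
    cursor = page.nextCursor;
  } while (cursor);
  return events;
}
```

The same loop works against any `SessionPersistDriver`, since the cursor is opaque to callers.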
### IndexedDB (browser)
Best for browser apps that should survive reloads. See the [Inspector source](https://github.com/nichochar/sandbox-agent/tree/main/frontend/packages/inspector/src/persist-indexeddb.ts) for a complete IndexedDB driver you can copy into your project.
### Rivet
Recommended for sandbox orchestration with actor state. See [Multiplayer](/multiplayer) for a full Rivet actor example with inline persistence.
### Custom driver
Implement `SessionPersistDriver` for custom backends.
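A minimal custom driver can be sketched over a `Map`. The types here are simplified local stand-ins (the real `SessionPersistDriver` interface also includes paginated `listSessions`/`listEvents` and richer record fields; see the SQLite example for a complete implementation). Note that `insertEvent` takes the session id as its first argument:

```typescript
// Simplified stand-ins for the sandbox-agent record types.
type SessionRecord = { id: string; agent: string; createdAt: number };
type SessionEvent = { id: string; sessionId: string; eventIndex: number; payload: unknown };

class MapSessionPersistDriver {
  private sessions = new Map<string, SessionRecord>();
  private events = new Map<string, SessionEvent[]>();

  async getSession(id: string): Promise<SessionRecord | undefined> {
    return this.sessions.get(id);
  }

  // Upsert, mirroring the ON CONFLICT behavior of the SQL drivers.
  async updateSession(session: SessionRecord): Promise<void> {
    this.sessions.set(session.id, session);
  }

  // insertEvent receives the session id as its first argument.
  async insertEvent(sessionId: string, event: SessionEvent): Promise<void> {
    const list = this.events.get(sessionId) ?? [];
    list.push(event);
    list.sort((a, b) => a.eventIndex - b.eventIndex);
    this.events.set(sessionId, list);
  }

  // Simplified read path; the real interface paginates with a cursor.
  async eventsFor(sessionId: string): Promise<SessionEvent[]> {
    return [...(this.events.get(sessionId) ?? [])];
  }
}
```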


@@ -8,7 +8,6 @@
},
"dependencies": {
"@sandbox-agent/example-shared": "workspace:*",
"@sandbox-agent/persist-postgres": "workspace:*",
"pg": "latest",
"sandbox-agent": "workspace:*"
},


@@ -3,7 +3,7 @@ import { randomUUID } from "node:crypto";
import { Client } from "pg";
import { setTimeout as delay } from "node:timers/promises";
import { SandboxAgent } from "sandbox-agent";
import { PostgresSessionPersistDriver } from "./persist.ts";
import { startDockerSandbox } from "@sandbox-agent/example-shared/docker";
import { detectAgent } from "@sandbox-agent/example-shared";


@@ -0,0 +1,316 @@
import { Pool, type PoolConfig } from "pg";
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_LIST_LIMIT = 100;
export interface PostgresSessionPersistDriverOptions {
connectionString?: string;
pool?: Pool;
poolConfig?: PoolConfig;
schema?: string;
}
export class PostgresSessionPersistDriver implements SessionPersistDriver {
private readonly pool: Pool;
private readonly ownsPool: boolean;
private readonly schema: string;
private readonly initialized: Promise<void>;
constructor(options: PostgresSessionPersistDriverOptions = {}) {
this.schema = normalizeSchema(options.schema ?? "public");
if (options.pool) {
this.pool = options.pool;
this.ownsPool = false;
} else {
this.pool = new Pool({
connectionString: options.connectionString,
...options.poolConfig,
});
this.ownsPool = true;
}
this.initialized = this.initialize();
}
async getSession(id: string): Promise<SessionRecord | undefined> {
await this.ready();
const result = await this.pool.query<SessionRow>(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
FROM ${this.table("sessions")}
WHERE id = $1`,
[id],
);
if (result.rows.length === 0) {
return undefined;
}
return decodeSessionRow(result.rows[0]);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
await this.ready();
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rowsResult = await this.pool.query<SessionRow>(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
FROM ${this.table("sessions")}
ORDER BY created_at ASC, id ASC
LIMIT $1 OFFSET $2`,
[limit, offset],
);
const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("sessions")}`);
const total = parseInteger(countResult.rows[0]?.count ?? "0");
const nextOffset = offset + rowsResult.rows.length;
return {
items: rowsResult.rows.map(decodeSessionRow),
nextCursor: nextOffset < total ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
await this.ready();
await this.pool.query(
`INSERT INTO ${this.table("sessions")} (
id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
ON CONFLICT(id) DO UPDATE SET
agent = EXCLUDED.agent,
agent_session_id = EXCLUDED.agent_session_id,
last_connection_id = EXCLUDED.last_connection_id,
created_at = EXCLUDED.created_at,
destroyed_at = EXCLUDED.destroyed_at,
sandbox_id = EXCLUDED.sandbox_id,
session_init_json = EXCLUDED.session_init_json`,
[
session.id,
session.agent,
session.agentSessionId,
session.lastConnectionId,
session.createdAt,
session.destroyedAt ?? null,
session.sandboxId ?? null,
session.sessionInit ?? null,
],
);
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
await this.ready();
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rowsResult = await this.pool.query<EventRow>(
`SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json
FROM ${this.table("events")}
WHERE session_id = $1
ORDER BY event_index ASC, id ASC
LIMIT $2 OFFSET $3`,
[request.sessionId, limit, offset],
);
const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("events")} WHERE session_id = $1`, [
request.sessionId,
]);
const total = parseInteger(countResult.rows[0]?.count ?? "0");
const nextOffset = offset + rowsResult.rows.length;
return {
items: rowsResult.rows.map(decodeEventRow),
nextCursor: nextOffset < total ? String(nextOffset) : undefined,
};
}
async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
await this.ready();
await this.pool.query(
`INSERT INTO ${this.table("events")} (
id, event_index, session_id, created_at, connection_id, sender, payload_json
) VALUES ($1, $2, $3, $4, $5, $6, $7)
ON CONFLICT(id) DO UPDATE SET
event_index = EXCLUDED.event_index,
session_id = EXCLUDED.session_id,
created_at = EXCLUDED.created_at,
connection_id = EXCLUDED.connection_id,
sender = EXCLUDED.sender,
payload_json = EXCLUDED.payload_json`,
[event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, event.payload],
);
}
async close(): Promise<void> {
if (!this.ownsPool) {
return;
}
await this.pool.end();
}
private async ready(): Promise<void> {
await this.initialized;
}
private table(name: "sessions" | "events"): string {
return `"${this.schema}"."${name}"`;
}
private async initialize(): Promise<void> {
await this.pool.query(`CREATE SCHEMA IF NOT EXISTS "${this.schema}"`);
await this.pool.query(`
CREATE TABLE IF NOT EXISTS ${this.table("sessions")} (
id TEXT PRIMARY KEY,
agent TEXT NOT NULL,
agent_session_id TEXT NOT NULL,
last_connection_id TEXT NOT NULL,
created_at BIGINT NOT NULL,
destroyed_at BIGINT,
sandbox_id TEXT,
session_init_json JSONB
)
`);
await this.pool.query(`
ALTER TABLE ${this.table("sessions")}
ADD COLUMN IF NOT EXISTS sandbox_id TEXT
`);
await this.pool.query(`
CREATE TABLE IF NOT EXISTS ${this.table("events")} (
id TEXT PRIMARY KEY,
event_index BIGINT NOT NULL,
session_id TEXT NOT NULL,
created_at BIGINT NOT NULL,
connection_id TEXT NOT NULL,
sender TEXT NOT NULL,
payload_json JSONB NOT NULL
)
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ALTER COLUMN id TYPE TEXT USING id::TEXT
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ADD COLUMN IF NOT EXISTS event_index BIGINT
`);
await this.pool.query(`
WITH ranked AS (
SELECT id, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC) AS ranked_index
FROM ${this.table("events")}
)
UPDATE ${this.table("events")} AS current_events
SET event_index = ranked.ranked_index
FROM ranked
WHERE current_events.id = ranked.id
AND current_events.event_index IS NULL
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ALTER COLUMN event_index SET NOT NULL
`);
await this.pool.query(`
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON ${this.table("events")}(session_id, event_index, id)
`);
}
}
type SessionRow = {
id: string;
agent: string;
agent_session_id: string;
last_connection_id: string;
created_at: string | number;
destroyed_at: string | number | null;
sandbox_id: string | null;
session_init_json: unknown | null;
};
type EventRow = {
id: string | number;
event_index: string | number;
session_id: string;
created_at: string | number;
connection_id: string;
sender: string;
payload_json: unknown;
};
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agent_session_id,
lastConnectionId: row.last_connection_id,
createdAt: parseInteger(row.created_at),
destroyedAt: row.destroyed_at === null ? undefined : parseInteger(row.destroyed_at),
sandboxId: row.sandbox_id ?? undefined,
sessionInit: row.session_init_json ? (row.session_init_json as SessionRecord["sessionInit"]) : undefined,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: String(row.id),
eventIndex: parseInteger(row.event_index),
sessionId: row.session_id,
createdAt: parseInteger(row.created_at),
connectionId: row.connection_id,
sender: parseSender(row.sender),
payload: row.payload_json as SessionEvent["payload"],
};
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function parseInteger(value: string | number): number {
const parsed = typeof value === "number" ? value : Number.parseInt(value, 10);
if (!Number.isFinite(parsed)) {
throw new Error(`Invalid integer value returned by postgres: ${String(value)}`);
}
return parsed;
}
function parseSender(value: string): SessionEvent["sender"] {
if (value === "agent" || value === "client") {
return value;
}
throw new Error(`Invalid sender value returned by postgres: ${value}`);
}
function normalizeSchema(schema: string): string {
if (!/^[A-Za-z_][A-Za-z0-9_]*$/.test(schema)) {
throw new Error(`Invalid schema name '${schema}'. Use letters, numbers, and underscores only.`);
}
return schema;
}


@@ -8,10 +8,11 @@
},
"dependencies": {
"@sandbox-agent/example-shared": "workspace:*",
"@sandbox-agent/persist-sqlite": "workspace:*",
"better-sqlite3": "^11.0.0",
"sandbox-agent": "workspace:*"
},
"devDependencies": {
"@types/better-sqlite3": "^7.0.0",
"@types/node": "latest",
"tsx": "latest",
"typescript": "latest"


@@ -1,5 +1,5 @@
import { SandboxAgent } from "sandbox-agent";
import { SQLiteSessionPersistDriver } from "./persist.ts";
import { startDockerSandbox } from "@sandbox-agent/example-shared/docker";
import { detectAgent } from "@sandbox-agent/example-shared";


@@ -0,0 +1,294 @@
import Database from "better-sqlite3";
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_LIST_LIMIT = 100;
export interface SQLiteSessionPersistDriverOptions {
filename?: string;
}
export class SQLiteSessionPersistDriver implements SessionPersistDriver {
private readonly db: Database.Database;
constructor(options: SQLiteSessionPersistDriverOptions = {}) {
this.db = new Database(options.filename ?? ":memory:");
this.initialize();
}
async getSession(id: string): Promise<SessionRecord | undefined> {
const row = this.db
.prepare(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
FROM sessions WHERE id = ?`,
)
.get(id) as SessionRow | undefined;
if (!row) {
return undefined;
}
return decodeSessionRow(row);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rows = this.db
.prepare(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
FROM sessions
ORDER BY created_at ASC, id ASC
LIMIT ? OFFSET ?`,
)
.all(limit, offset) as SessionRow[];
const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM sessions`).get() as { count: number };
const nextOffset = offset + rows.length;
return {
items: rows.map(decodeSessionRow),
nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
this.db
.prepare(
`INSERT INTO sessions (
id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(id) DO UPDATE SET
agent = excluded.agent,
agent_session_id = excluded.agent_session_id,
last_connection_id = excluded.last_connection_id,
created_at = excluded.created_at,
destroyed_at = excluded.destroyed_at,
sandbox_id = excluded.sandbox_id,
session_init_json = excluded.session_init_json`,
)
.run(
session.id,
session.agent,
session.agentSessionId,
session.lastConnectionId,
session.createdAt,
session.destroyedAt ?? null,
session.sandboxId ?? null,
session.sessionInit ? JSON.stringify(session.sessionInit) : null,
);
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rows = this.db
.prepare(
`SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json
FROM events
WHERE session_id = ?
ORDER BY event_index ASC, id ASC
LIMIT ? OFFSET ?`,
)
.all(request.sessionId, limit, offset) as EventRow[];
const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM events WHERE session_id = ?`).get(request.sessionId) as { count: number };
const nextOffset = offset + rows.length;
return {
items: rows.map(decodeEventRow),
nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined,
};
}
async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
this.db
.prepare(
`INSERT INTO events (
id, event_index, session_id, created_at, connection_id, sender, payload_json
) VALUES (?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(id) DO UPDATE SET
event_index = excluded.event_index,
session_id = excluded.session_id,
created_at = excluded.created_at,
connection_id = excluded.connection_id,
sender = excluded.sender,
payload_json = excluded.payload_json`,
)
.run(event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, JSON.stringify(event.payload));
}
close(): void {
this.db.close();
}
private initialize(): void {
this.db.exec(`
CREATE TABLE IF NOT EXISTS sessions (
id TEXT PRIMARY KEY,
agent TEXT NOT NULL,
agent_session_id TEXT NOT NULL,
last_connection_id TEXT NOT NULL,
created_at INTEGER NOT NULL,
destroyed_at INTEGER,
sandbox_id TEXT,
session_init_json TEXT
)
`);
const sessionColumns = this.db.prepare(`PRAGMA table_info(sessions)`).all() as TableInfoRow[];
if (!sessionColumns.some((column) => column.name === "sandbox_id")) {
this.db.exec(`ALTER TABLE sessions ADD COLUMN sandbox_id TEXT`);
}
this.ensureEventsTable();
}
private ensureEventsTable(): void {
const tableInfo = this.db.prepare(`PRAGMA table_info(events)`).all() as TableInfoRow[];
if (tableInfo.length === 0) {
this.createEventsTable();
return;
}
const idColumn = tableInfo.find((column) => column.name === "id");
const hasEventIndex = tableInfo.some((column) => column.name === "event_index");
const idType = (idColumn?.type ?? "").trim().toUpperCase();
const idIsText = idType === "TEXT";
if (!idIsText || !hasEventIndex) {
this.rebuildEventsTable(hasEventIndex);
}
this.db.exec(`
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON events(session_id, event_index, id)
`);
}
private createEventsTable(): void {
this.db.exec(`
CREATE TABLE IF NOT EXISTS events (
id TEXT PRIMARY KEY,
event_index INTEGER NOT NULL,
session_id TEXT NOT NULL,
created_at INTEGER NOT NULL,
connection_id TEXT NOT NULL,
sender TEXT NOT NULL,
payload_json TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON events(session_id, event_index, id)
`);
}
private rebuildEventsTable(hasEventIndex: boolean): void {
this.db.exec(`
ALTER TABLE events RENAME TO events_legacy;
`);
this.createEventsTable();
if (hasEventIndex) {
this.db.exec(`
INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json)
SELECT
CAST(id AS TEXT),
COALESCE(event_index, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC)),
session_id,
created_at,
connection_id,
sender,
payload_json
FROM events_legacy
`);
} else {
this.db.exec(`
INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json)
SELECT
CAST(id AS TEXT),
ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC),
session_id,
created_at,
connection_id,
sender,
payload_json
FROM events_legacy
`);
}
this.db.exec(`DROP TABLE events_legacy`);
}
}
type SessionRow = {
id: string;
agent: string;
agent_session_id: string;
last_connection_id: string;
created_at: number;
destroyed_at: number | null;
sandbox_id: string | null;
session_init_json: string | null;
};
type EventRow = {
id: string;
event_index: number;
session_id: string;
created_at: number;
connection_id: string;
sender: "client" | "agent";
payload_json: string;
};
type TableInfoRow = {
name: string;
type: string;
};
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agent_session_id,
lastConnectionId: row.last_connection_id,
createdAt: row.created_at,
destroyedAt: row.destroyed_at ?? undefined,
sandboxId: row.sandbox_id ?? undefined,
sessionInit: row.session_init_json ? (JSON.parse(row.session_init_json) as SessionRecord["sessionInit"]) : undefined,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: row.id,
eventIndex: row.event_index,
sessionId: row.session_id,
createdAt: row.created_at,
connectionId: row.connection_id,
sender: row.sender,
payload: JSON.parse(row.payload_json),
};
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}


@@ -6,10 +6,10 @@
"type": "module",
"scripts": {
"dev": "vite",
"build": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/react build && vite build",
"preview": "vite preview",
"typecheck": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/react build && tsc --noEmit",
"test": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/react build && vitest run"
},
"devDependencies": {
"@sandbox-agent/react": "workspace:*",
@@ -23,7 +23,6 @@
"vitest": "^3.0.0"
},
"dependencies": {
"@sandbox-agent/persist-indexeddb": "workspace:*",
"lucide-react": "^0.469.0",
"react": "^18.3.1",
"react-dom": "^18.3.1"


@@ -24,7 +24,7 @@ type ConfigOption = {
};
type AgentModeInfo = { id: string; name: string; description: string };
type AgentModelInfo = { id: string; name?: string };
import { IndexedDbSessionPersistDriver } from "./persist-indexeddb";
import ChatPanel from "./components/chat/ChatPanel";
import ConnectScreen from "./components/ConnectScreen";
import DebugPanel, { type DebugTab } from "./components/debug/DebugPanel";


@@ -0,0 +1,314 @@
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_DB_NAME = "sandbox-agent-session-store";
const DEFAULT_DB_VERSION = 2;
const SESSIONS_STORE = "sessions";
const EVENTS_STORE = "events";
const EVENTS_BY_SESSION_INDEX = "by_session_index";
const DEFAULT_LIST_LIMIT = 100;
export interface IndexedDbSessionPersistDriverOptions {
databaseName?: string;
databaseVersion?: number;
indexedDb?: IDBFactory;
}
export class IndexedDbSessionPersistDriver implements SessionPersistDriver {
private readonly indexedDb: IDBFactory;
private readonly dbName: string;
private readonly dbVersion: number;
private readonly dbPromise: Promise<IDBDatabase>;
constructor(options: IndexedDbSessionPersistDriverOptions = {}) {
const indexedDb = options.indexedDb ?? globalThis.indexedDB;
if (!indexedDb) {
throw new Error("IndexedDB is not available in this runtime.");
}
this.indexedDb = indexedDb;
this.dbName = options.databaseName ?? DEFAULT_DB_NAME;
this.dbVersion = options.databaseVersion ?? DEFAULT_DB_VERSION;
this.dbPromise = this.openDatabase();
}
async getSession(id: string): Promise<SessionRecord | undefined> {
const db = await this.dbPromise;
const row = await requestToPromise<IDBValidKey | SessionRow | undefined>(db.transaction(SESSIONS_STORE, "readonly").objectStore(SESSIONS_STORE).get(id));
if (!row || typeof row !== "object") {
return undefined;
}
return decodeSessionRow(row as SessionRow);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const db = await this.dbPromise;
const rows = await getAllRows<SessionRow>(db, SESSIONS_STORE);
rows.sort((a, b) => {
if (a.createdAt !== b.createdAt) {
return a.createdAt - b.createdAt;
}
return a.id.localeCompare(b.id);
});
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const slice = rows.slice(offset, offset + limit).map(decodeSessionRow);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
const db = await this.dbPromise;
await transactionPromise(db, [SESSIONS_STORE], "readwrite", (tx) => {
tx.objectStore(SESSIONS_STORE).put(encodeSessionRow(session));
});
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const db = await this.dbPromise;
const rows = (await getAllRows<EventRow>(db, EVENTS_STORE)).filter((row) => row.sessionId === request.sessionId).sort(compareEventRowsByOrder);
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const slice = rows.slice(offset, offset + limit).map(decodeEventRow);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined,
};
}
async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
const db = await this.dbPromise;
await transactionPromise(db, [EVENTS_STORE], "readwrite", (tx) => {
tx.objectStore(EVENTS_STORE).put(encodeEventRow(event));
});
}
async close(): Promise<void> {
const db = await this.dbPromise;
db.close();
}
private openDatabase(): Promise<IDBDatabase> {
return new Promise((resolve, reject) => {
const request = this.indexedDb.open(this.dbName, this.dbVersion);
request.onupgradeneeded = () => {
const db = request.result;
if (!db.objectStoreNames.contains(SESSIONS_STORE)) {
db.createObjectStore(SESSIONS_STORE, { keyPath: "id" });
}
if (!db.objectStoreNames.contains(EVENTS_STORE)) {
const events = db.createObjectStore(EVENTS_STORE, { keyPath: "id" });
events.createIndex(EVENTS_BY_SESSION_INDEX, ["sessionId", "eventIndex", "id"], {
unique: false,
});
} else {
const tx = request.transaction;
if (!tx) {
return;
}
const events = tx.objectStore(EVENTS_STORE);
if (!events.indexNames.contains(EVENTS_BY_SESSION_INDEX)) {
events.createIndex(EVENTS_BY_SESSION_INDEX, ["sessionId", "eventIndex", "id"], {
unique: false,
});
}
}
};
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error ?? new Error("Unable to open IndexedDB"));
});
}
}
type SessionRow = {
id: string;
agent: string;
agentSessionId: string;
lastConnectionId: string;
createdAt: number;
destroyedAt?: number;
sandboxId?: string;
sessionInit?: SessionRecord["sessionInit"];
};
type EventRow = {
id: number | string;
eventIndex?: number;
sessionId: string;
createdAt: number;
connectionId: string;
sender: "client" | "agent";
payload: unknown;
};
function encodeSessionRow(session: SessionRecord): SessionRow {
return {
id: session.id,
agent: session.agent,
agentSessionId: session.agentSessionId,
lastConnectionId: session.lastConnectionId,
createdAt: session.createdAt,
destroyedAt: session.destroyedAt,
sandboxId: session.sandboxId,
sessionInit: session.sessionInit,
};
}
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agentSessionId,
lastConnectionId: row.lastConnectionId,
createdAt: row.createdAt,
destroyedAt: row.destroyedAt,
sandboxId: row.sandboxId,
sessionInit: row.sessionInit,
};
}
function encodeEventRow(event: SessionEvent): EventRow {
return {
id: event.id,
eventIndex: event.eventIndex,
sessionId: event.sessionId,
createdAt: event.createdAt,
connectionId: event.connectionId,
sender: event.sender,
payload: event.payload,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: String(row.id),
eventIndex: parseEventIndex(row.eventIndex, row.id),
sessionId: row.sessionId,
createdAt: row.createdAt,
connectionId: row.connectionId,
sender: row.sender,
payload: row.payload as SessionEvent["payload"],
};
}
async function getAllRows<T>(db: IDBDatabase, storeName: string): Promise<T[]> {
return await transactionPromise<T[]>(db, [storeName], "readonly", async (tx) => {
const request = tx.objectStore(storeName).getAll();
return (await requestToPromise(request)) as T[];
});
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function compareEventRowsByOrder(a: EventRow, b: EventRow): number {
const indexA = parseEventIndex(a.eventIndex, a.id);
const indexB = parseEventIndex(b.eventIndex, b.id);
if (indexA !== indexB) {
return indexA - indexB;
}
return String(a.id).localeCompare(String(b.id));
}
function parseEventIndex(value: number | undefined, fallback: number | string): number {
if (typeof value === "number" && Number.isFinite(value)) {
return Math.max(0, Math.floor(value));
}
const parsed = Number.parseInt(String(fallback), 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function requestToPromise<T>(request: IDBRequest<T>): Promise<T> {
return new Promise((resolve, reject) => {
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error ?? new Error("IndexedDB request failed"));
});
}
function transactionPromise<T>(db: IDBDatabase, stores: string[], mode: IDBTransactionMode, run: (tx: IDBTransaction) => T | Promise<T>): Promise<T> {
return new Promise((resolve, reject) => {
const tx = db.transaction(stores, mode);
let settled = false;
let resultValue: T | undefined;
let runCompleted = false;
let txCompleted = false;
function tryResolve() {
if (settled || !runCompleted || !txCompleted) {
return;
}
settled = true;
resolve(resultValue as T);
}
tx.oncomplete = () => {
txCompleted = true;
tryResolve();
};
tx.onerror = () => {
if (settled) {
return;
}
settled = true;
reject(tx.error ?? new Error("IndexedDB transaction failed"));
};
tx.onabort = () => {
if (settled) {
return;
}
settled = true;
reject(tx.error ?? new Error("IndexedDB transaction aborted"));
};
Promise.resolve(run(tx))
.then((value) => {
resultValue = value;
runCompleted = true;
tryResolve();
})
.catch((error) => {
if (!settled) {
settled = true;
reject(error);
}
try {
tx.abort();
} catch {
// no-op
}
});
});
}


@@ -0,0 +1,5 @@
# @sandbox-agent/persist-indexeddb
> **Deprecated:** This package has been deprecated and removed. The implementation now lives as a copy-paste reference in [`examples/persist-indexeddb`](../../examples/persist-indexeddb).
Copy the driver source directly into your project. See the [session persistence docs](https://sandboxagent.dev/session-persistence) for guidance.
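If you only need the shape of the driver surface before copying the full IndexedDB source, it can be sketched with an in-memory stand-in. The field names below mirror the removed driver source; the `listEvents(sessionId)` signature is simplified for illustration (the SDK's real driver interface takes a request object with cursor paging), so treat this as a sketch of the contract, not the SDK API verbatim.

```typescript
// Minimal record shapes, copied from the reference driver's row types.
type SessionRecord = {
  id: string;
  agent: string;
  agentSessionId: string;
  lastConnectionId: string;
  createdAt: number;
  destroyedAt?: number;
};
type SessionEvent = {
  id: string;
  eventIndex: number;
  sessionId: string;
  createdAt: number;
  connectionId: string;
  sender: "client" | "agent";
  payload: unknown;
};

// In-memory stand-in for the persist driver surface you re-implement
// against your own storage after copying the reference source.
class InMemorySessionPersistDriver {
  private sessions = new Map<string, SessionRecord>();
  private events: SessionEvent[] = [];

  async getSession(id: string): Promise<SessionRecord | undefined> {
    return this.sessions.get(id);
  }

  async updateSession(session: SessionRecord): Promise<void> {
    // Upsert semantics, like the IndexedDB driver's put().
    this.sessions.set(session.id, session);
  }

  async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
    // The new signature passes sessionId separately; the event carries it too,
    // so this implementation (like the reference drivers) ignores the argument.
    this.events.push(event);
  }

  async listEvents(sessionId: string): Promise<SessionEvent[]> {
    // Same ordering as the reference drivers: eventIndex, then id as tiebreak.
    return this.events
      .filter((e) => e.sessionId === sessionId)
      .sort((a, b) => a.eventIndex - b.eventIndex || a.id.localeCompare(b.id));
  }

  async close(): Promise<void> {}
}
```

Swapping the `Map` and array for your own store (KV, SQL, IndexedDB) while keeping these semantics is the whole migration.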


@@ -16,23 +16,16 @@
"import": "./dist/index.js"
}
},
"dependencies": {
"sandbox-agent": "workspace:*"
},
"files": [
"dist"
],
"scripts": {
"build": "tsup",
"typecheck": "tsc --noEmit",
"test": "vitest run",
"test:watch": "vitest"
"typecheck": "tsc --noEmit"
},
"devDependencies": {
"@types/node": "^22.0.0",
"fake-indexeddb": "^6.2.4",
"tsup": "^8.0.0",
"typescript": "^5.7.0",
"vitest": "^3.0.0"
"typescript": "^5.7.0"
}
}


@@ -1,314 +1,5 @@
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_DB_NAME = "sandbox-agent-session-store";
const DEFAULT_DB_VERSION = 2;
const SESSIONS_STORE = "sessions";
const EVENTS_STORE = "events";
const EVENTS_BY_SESSION_INDEX = "by_session_index";
const DEFAULT_LIST_LIMIT = 100;
export interface IndexedDbSessionPersistDriverOptions {
databaseName?: string;
databaseVersion?: number;
indexedDb?: IDBFactory;
}
export class IndexedDbSessionPersistDriver implements SessionPersistDriver {
private readonly indexedDb: IDBFactory;
private readonly dbName: string;
private readonly dbVersion: number;
private readonly dbPromise: Promise<IDBDatabase>;
constructor(options: IndexedDbSessionPersistDriverOptions = {}) {
const indexedDb = options.indexedDb ?? globalThis.indexedDB;
if (!indexedDb) {
throw new Error("IndexedDB is not available in this runtime.");
}
this.indexedDb = indexedDb;
this.dbName = options.databaseName ?? DEFAULT_DB_NAME;
this.dbVersion = options.databaseVersion ?? DEFAULT_DB_VERSION;
this.dbPromise = this.openDatabase();
}
async getSession(id: string): Promise<SessionRecord | undefined> {
const db = await this.dbPromise;
const row = await requestToPromise<IDBValidKey | SessionRow | undefined>(db.transaction(SESSIONS_STORE, "readonly").objectStore(SESSIONS_STORE).get(id));
if (!row || typeof row !== "object") {
return undefined;
}
return decodeSessionRow(row as SessionRow);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const db = await this.dbPromise;
const rows = await getAllRows<SessionRow>(db, SESSIONS_STORE);
rows.sort((a, b) => {
if (a.createdAt !== b.createdAt) {
return a.createdAt - b.createdAt;
}
return a.id.localeCompare(b.id);
});
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const slice = rows.slice(offset, offset + limit).map(decodeSessionRow);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
const db = await this.dbPromise;
await transactionPromise(db, [SESSIONS_STORE], "readwrite", (tx) => {
tx.objectStore(SESSIONS_STORE).put(encodeSessionRow(session));
});
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const db = await this.dbPromise;
const rows = (await getAllRows<EventRow>(db, EVENTS_STORE)).filter((row) => row.sessionId === request.sessionId).sort(compareEventRowsByOrder);
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const slice = rows.slice(offset, offset + limit).map(decodeEventRow);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined,
};
}
async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
const db = await this.dbPromise;
await transactionPromise(db, [EVENTS_STORE], "readwrite", (tx) => {
tx.objectStore(EVENTS_STORE).put(encodeEventRow(event));
});
}
async close(): Promise<void> {
const db = await this.dbPromise;
db.close();
}
private openDatabase(): Promise<IDBDatabase> {
return new Promise((resolve, reject) => {
const request = this.indexedDb.open(this.dbName, this.dbVersion);
request.onupgradeneeded = () => {
const db = request.result;
if (!db.objectStoreNames.contains(SESSIONS_STORE)) {
db.createObjectStore(SESSIONS_STORE, { keyPath: "id" });
}
if (!db.objectStoreNames.contains(EVENTS_STORE)) {
const events = db.createObjectStore(EVENTS_STORE, { keyPath: "id" });
events.createIndex(EVENTS_BY_SESSION_INDEX, ["sessionId", "eventIndex", "id"], {
unique: false,
});
} else {
const tx = request.transaction;
if (!tx) {
return;
}
const events = tx.objectStore(EVENTS_STORE);
if (!events.indexNames.contains(EVENTS_BY_SESSION_INDEX)) {
events.createIndex(EVENTS_BY_SESSION_INDEX, ["sessionId", "eventIndex", "id"], {
unique: false,
});
}
}
};
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error ?? new Error("Unable to open IndexedDB"));
});
}
}
type SessionRow = {
id: string;
agent: string;
agentSessionId: string;
lastConnectionId: string;
createdAt: number;
destroyedAt?: number;
sandboxId?: string;
sessionInit?: SessionRecord["sessionInit"];
};
type EventRow = {
id: number | string;
eventIndex?: number;
sessionId: string;
createdAt: number;
connectionId: string;
sender: "client" | "agent";
payload: unknown;
};
function encodeSessionRow(session: SessionRecord): SessionRow {
return {
id: session.id,
agent: session.agent,
agentSessionId: session.agentSessionId,
lastConnectionId: session.lastConnectionId,
createdAt: session.createdAt,
destroyedAt: session.destroyedAt,
sandboxId: session.sandboxId,
sessionInit: session.sessionInit,
};
}
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agentSessionId,
lastConnectionId: row.lastConnectionId,
createdAt: row.createdAt,
destroyedAt: row.destroyedAt,
sandboxId: row.sandboxId,
sessionInit: row.sessionInit,
};
}
function encodeEventRow(event: SessionEvent): EventRow {
return {
id: event.id,
eventIndex: event.eventIndex,
sessionId: event.sessionId,
createdAt: event.createdAt,
connectionId: event.connectionId,
sender: event.sender,
payload: event.payload,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: String(row.id),
eventIndex: parseEventIndex(row.eventIndex, row.id),
sessionId: row.sessionId,
createdAt: row.createdAt,
connectionId: row.connectionId,
sender: row.sender,
payload: row.payload as SessionEvent["payload"],
};
}
async function getAllRows<T>(db: IDBDatabase, storeName: string): Promise<T[]> {
return await transactionPromise<T[]>(db, [storeName], "readonly", async (tx) => {
const request = tx.objectStore(storeName).getAll();
return (await requestToPromise(request)) as T[];
});
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function compareEventRowsByOrder(a: EventRow, b: EventRow): number {
const indexA = parseEventIndex(a.eventIndex, a.id);
const indexB = parseEventIndex(b.eventIndex, b.id);
if (indexA !== indexB) {
return indexA - indexB;
}
return String(a.id).localeCompare(String(b.id));
}
function parseEventIndex(value: number | undefined, fallback: number | string): number {
if (typeof value === "number" && Number.isFinite(value)) {
return Math.max(0, Math.floor(value));
}
const parsed = Number.parseInt(String(fallback), 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function requestToPromise<T>(request: IDBRequest<T>): Promise<T> {
return new Promise((resolve, reject) => {
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error ?? new Error("IndexedDB request failed"));
});
}
function transactionPromise<T>(db: IDBDatabase, stores: string[], mode: IDBTransactionMode, run: (tx: IDBTransaction) => T | Promise<T>): Promise<T> {
return new Promise((resolve, reject) => {
const tx = db.transaction(stores, mode);
let settled = false;
let resultValue: T | undefined;
let runCompleted = false;
let txCompleted = false;
function tryResolve() {
if (settled || !runCompleted || !txCompleted) {
return;
}
settled = true;
resolve(resultValue as T);
}
tx.oncomplete = () => {
txCompleted = true;
tryResolve();
};
tx.onerror = () => {
if (settled) {
return;
}
settled = true;
reject(tx.error ?? new Error("IndexedDB transaction failed"));
};
tx.onabort = () => {
if (settled) {
return;
}
settled = true;
reject(tx.error ?? new Error("IndexedDB transaction aborted"));
};
Promise.resolve(run(tx))
.then((value) => {
resultValue = value;
runCompleted = true;
tryResolve();
})
.catch((error) => {
if (!settled) {
settled = true;
reject(error);
}
try {
tx.abort();
} catch {
// no-op
}
});
});
}
throw new Error(
"@sandbox-agent/persist-indexeddb has been deprecated and removed. " +
"Copy the reference implementation into your project instead. " +
"See https://github.com/nichochar/sandbox-agent/tree/main/examples/persist-indexeddb",
);


@@ -1,96 +0,0 @@
import "fake-indexeddb/auto";
import { describe, it, expect } from "vitest";
import { IndexedDbSessionPersistDriver } from "../src/index.ts";
function uniqueDbName(prefix: string): string {
return `${prefix}-${Date.now().toString(36)}-${Math.random().toString(36).slice(2, 10)}`;
}
describe("IndexedDbSessionPersistDriver", () => {
it("stores and pages sessions and events", async () => {
const dbName = uniqueDbName("indexeddb-driver");
const driver = new IndexedDbSessionPersistDriver({ databaseName: dbName });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
await driver.updateSession({
id: "s-2",
agent: "mock",
agentSessionId: "a-2",
lastConnectionId: "c-2",
createdAt: 200,
destroyedAt: 300,
});
await driver.insertEvent("s-1", {
id: "evt-1",
eventIndex: 1,
sessionId: "s-1",
createdAt: 1,
connectionId: "c-1",
sender: "client",
payload: { jsonrpc: "2.0", method: "session/prompt", params: { sessionId: "a-1" } },
});
await driver.insertEvent("s-1", {
id: "evt-2",
eventIndex: 2,
sessionId: "s-1",
createdAt: 2,
connectionId: "c-1",
sender: "agent",
payload: { jsonrpc: "2.0", method: "session/update", params: { sessionId: "a-1" } },
});
const loaded = await driver.getSession("s-2");
expect(loaded?.destroyedAt).toBe(300);
const page1 = await driver.listSessions({ limit: 1 });
expect(page1.items).toHaveLength(1);
expect(page1.items[0]?.id).toBe("s-1");
expect(page1.nextCursor).toBeTruthy();
const page2 = await driver.listSessions({ cursor: page1.nextCursor, limit: 1 });
expect(page2.items).toHaveLength(1);
expect(page2.items[0]?.id).toBe("s-2");
expect(page2.nextCursor).toBeUndefined();
const eventsPage = await driver.listEvents({ sessionId: "s-1", limit: 10 });
expect(eventsPage.items).toHaveLength(2);
expect(eventsPage.items[0]?.id).toBe("evt-1");
expect(eventsPage.items[0]?.eventIndex).toBe(1);
expect(eventsPage.items[1]?.id).toBe("evt-2");
expect(eventsPage.items[1]?.eventIndex).toBe(2);
await driver.close();
});
it("persists across driver instances for same database", async () => {
const dbName = uniqueDbName("indexeddb-reopen");
{
const driver = new IndexedDbSessionPersistDriver({ databaseName: dbName });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
await driver.close();
}
{
const driver = new IndexedDbSessionPersistDriver({ databaseName: dbName });
const session = await driver.getSession("s-1");
expect(session?.id).toBe("s-1");
await driver.close();
}
});
});


@@ -1,129 +0,0 @@
import "fake-indexeddb/auto";
import { describe, it, expect, beforeAll, afterAll } from "vitest";
import { existsSync, mkdtempSync, rmSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { tmpdir } from "node:os";
import { SandboxAgent } from "sandbox-agent";
import { spawnSandboxAgent, type SandboxAgentSpawnHandle } from "../../typescript/src/spawn.ts";
import { prepareMockAgentDataHome } from "../../typescript/tests/helpers/mock-agent.ts";
import { IndexedDbSessionPersistDriver } from "../src/index.ts";
const __dirname = dirname(fileURLToPath(import.meta.url));
function findBinary(): string | null {
if (process.env.SANDBOX_AGENT_BIN) {
return process.env.SANDBOX_AGENT_BIN;
}
const cargoPaths = [resolve(__dirname, "../../../target/debug/sandbox-agent"), resolve(__dirname, "../../../target/release/sandbox-agent")];
for (const p of cargoPaths) {
if (existsSync(p)) {
return p;
}
}
return null;
}
function uniqueDbName(prefix: string): string {
return `${prefix}-${Date.now().toString(36)}-${Math.random().toString(36).slice(2, 10)}`;
}
const BINARY_PATH = findBinary();
if (!BINARY_PATH) {
throw new Error("sandbox-agent binary not found. Build it (cargo build -p sandbox-agent) or set SANDBOX_AGENT_BIN.");
}
if (!process.env.SANDBOX_AGENT_BIN) {
process.env.SANDBOX_AGENT_BIN = BINARY_PATH;
}
describe("IndexedDB persistence end-to-end", () => {
let handle: SandboxAgentSpawnHandle;
let baseUrl: string;
let token: string;
let dataHome: string;
beforeAll(async () => {
dataHome = mkdtempSync(join(tmpdir(), "indexeddb-integration-"));
prepareMockAgentDataHome(dataHome);
handle = await spawnSandboxAgent({
enabled: true,
log: "silent",
timeoutMs: 30000,
env: {
XDG_DATA_HOME: dataHome,
HOME: dataHome,
USERPROFILE: dataHome,
APPDATA: join(dataHome, "AppData", "Roaming"),
LOCALAPPDATA: join(dataHome, "AppData", "Local"),
},
});
baseUrl = handle.baseUrl;
token = handle.token;
});
afterAll(async () => {
await handle.dispose();
rmSync(dataHome, { recursive: true, force: true });
});
it("restores sessions/events across sdk instances", async () => {
const dbName = uniqueDbName("sandbox-agent-browser-e2e");
const persist1 = new IndexedDbSessionPersistDriver({ databaseName: dbName });
const sdk1 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist1,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const created = await sdk1.createSession({ agent: "mock" });
await created.prompt([{ type: "text", text: "indexeddb-first" }]);
const firstConnectionId = created.lastConnectionId;
await sdk1.dispose();
await persist1.close();
const persist2 = new IndexedDbSessionPersistDriver({ databaseName: dbName });
const sdk2 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist2,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const restored = await sdk2.resumeSession(created.id);
expect(restored.lastConnectionId).not.toBe(firstConnectionId);
await restored.prompt([{ type: "text", text: "indexeddb-second" }]);
const sessions = await sdk2.listSessions({ limit: 20 });
expect(sessions.items.some((entry) => entry.id === created.id)).toBe(true);
const events = await sdk2.getEvents({ sessionId: created.id, limit: 1000 });
expect(events.items.length).toBeGreaterThan(0);
const replayInjected = events.items.find((event) => {
if (event.sender !== "client") {
return false;
}
const payload = event.payload as Record<string, unknown>;
const method = payload.method;
const params = payload.params as Record<string, unknown> | undefined;
const prompt = Array.isArray(params?.prompt) ? params?.prompt : [];
const firstBlock = prompt[0] as Record<string, unknown> | undefined;
return method === "session/prompt" && typeof firstBlock?.text === "string" && firstBlock.text.includes("Previous session history is replayed below");
});
expect(replayInjected).toBeTruthy();
await sdk2.dispose();
await persist2.close();
});
});


@@ -0,0 +1,5 @@
# @sandbox-agent/persist-postgres
> **Deprecated:** This package has been deprecated and removed. The implementation now lives as a copy-paste reference in [`examples/persist-postgres`](../../examples/persist-postgres).
Install `pg` directly and copy the driver source into your project. See the [full example](https://github.com/nichochar/sandbox-agent/tree/main/examples/persist-postgres).
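When adapting the copied driver to another database, the piece to preserve is the offset-encoded cursor contract shared by both reference drivers (`parseCursor` and `normalizeLimit` in the source above). A standalone sketch of that contract, with the page assembled from an already-ordered array the way the drivers assemble it from ordered rows:

```typescript
// Offset-encoded cursors, as implemented by the reference persist drivers.
const DEFAULT_LIST_LIMIT = 100;

function parseCursor(cursor: string | undefined): number {
  // Missing or malformed cursors fall back to the first page.
  if (!cursor) return 0;
  const parsed = Number.parseInt(cursor, 10);
  return Number.isFinite(parsed) && parsed >= 0 ? parsed : 0;
}

function normalizeLimit(limit: number | undefined): number {
  // Non-finite or sub-1 limits fall back to the default page size.
  if (!Number.isFinite(limit) || (limit ?? 0) < 1) return DEFAULT_LIST_LIMIT;
  return Math.floor(limit as number);
}

// Slice [offset, offset + limit) and emit the next offset as the cursor
// only when more rows remain.
function listPage<T>(rows: T[], cursor?: string, limit?: number): { items: T[]; nextCursor?: string } {
  const offset = parseCursor(cursor);
  const max = normalizeLimit(limit);
  const items = rows.slice(offset, offset + max);
  const nextOffset = offset + items.length;
  return { items, nextCursor: nextOffset < rows.length ? String(nextOffset) : undefined };
}
```

In the Postgres driver the slice becomes `LIMIT $1 OFFSET $2` plus a `COUNT(*)`, but the cursor encoding and the `nextCursor` rule are identical.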


@@ -1,7 +1,7 @@
{
"name": "@sandbox-agent/persist-postgres",
"version": "0.3.2",
"description": "PostgreSQL persistence driver for the Sandbox Agent TypeScript SDK",
"description": "PostgreSQL persistence driver for the Sandbox Agent TypeScript SDK (DEPRECATED)",
"license": "Apache-2.0",
"repository": {
"type": "git",
@@ -16,24 +16,16 @@
"import": "./dist/index.js"
}
},
"dependencies": {
"pg": "^8.16.3",
"sandbox-agent": "workspace:*"
},
"files": [
"dist"
],
"scripts": {
"build": "tsup",
"typecheck": "tsc --noEmit",
"test": "vitest run",
"test:watch": "vitest"
"typecheck": "tsc --noEmit"
},
"devDependencies": {
"@types/node": "^22.0.0",
"@types/pg": "^8.15.6",
"tsup": "^8.0.0",
"typescript": "^5.7.0",
"vitest": "^3.0.0"
"typescript": "^5.7.0"
}
}


@@ -1,316 +1,5 @@
import { Pool, type PoolConfig } from "pg";
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_LIST_LIMIT = 100;
export interface PostgresSessionPersistDriverOptions {
connectionString?: string;
pool?: Pool;
poolConfig?: PoolConfig;
schema?: string;
}
export class PostgresSessionPersistDriver implements SessionPersistDriver {
private readonly pool: Pool;
private readonly ownsPool: boolean;
private readonly schema: string;
private readonly initialized: Promise<void>;
constructor(options: PostgresSessionPersistDriverOptions = {}) {
this.schema = normalizeSchema(options.schema ?? "public");
if (options.pool) {
this.pool = options.pool;
this.ownsPool = false;
} else {
this.pool = new Pool({
connectionString: options.connectionString,
...options.poolConfig,
});
this.ownsPool = true;
}
this.initialized = this.initialize();
}
async getSession(id: string): Promise<SessionRecord | undefined> {
await this.ready();
const result = await this.pool.query<SessionRow>(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
FROM ${this.table("sessions")}
WHERE id = $1`,
[id],
);
if (result.rows.length === 0) {
return undefined;
}
return decodeSessionRow(result.rows[0]);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
await this.ready();
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rowsResult = await this.pool.query<SessionRow>(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
FROM ${this.table("sessions")}
ORDER BY created_at ASC, id ASC
LIMIT $1 OFFSET $2`,
[limit, offset],
);
const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("sessions")}`);
const total = parseInteger(countResult.rows[0]?.count ?? "0");
const nextOffset = offset + rowsResult.rows.length;
return {
items: rowsResult.rows.map(decodeSessionRow),
nextCursor: nextOffset < total ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
await this.ready();
await this.pool.query(
`INSERT INTO ${this.table("sessions")} (
id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
ON CONFLICT(id) DO UPDATE SET
agent = EXCLUDED.agent,
agent_session_id = EXCLUDED.agent_session_id,
last_connection_id = EXCLUDED.last_connection_id,
created_at = EXCLUDED.created_at,
destroyed_at = EXCLUDED.destroyed_at,
sandbox_id = EXCLUDED.sandbox_id,
session_init_json = EXCLUDED.session_init_json`,
[
session.id,
session.agent,
session.agentSessionId,
session.lastConnectionId,
session.createdAt,
session.destroyedAt ?? null,
session.sandboxId ?? null,
session.sessionInit ?? null,
],
);
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
await this.ready();
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rowsResult = await this.pool.query<EventRow>(
`SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json
FROM ${this.table("events")}
WHERE session_id = $1
ORDER BY event_index ASC, id ASC
LIMIT $2 OFFSET $3`,
[request.sessionId, limit, offset],
);
const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("events")} WHERE session_id = $1`, [
request.sessionId,
]);
const total = parseInteger(countResult.rows[0]?.count ?? "0");
const nextOffset = offset + rowsResult.rows.length;
return {
items: rowsResult.rows.map(decodeEventRow),
nextCursor: nextOffset < total ? String(nextOffset) : undefined,
};
}
async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
await this.ready();
await this.pool.query(
`INSERT INTO ${this.table("events")} (
id, event_index, session_id, created_at, connection_id, sender, payload_json
) VALUES ($1, $2, $3, $4, $5, $6, $7)
ON CONFLICT(id) DO UPDATE SET
event_index = EXCLUDED.event_index,
session_id = EXCLUDED.session_id,
created_at = EXCLUDED.created_at,
connection_id = EXCLUDED.connection_id,
sender = EXCLUDED.sender,
payload_json = EXCLUDED.payload_json`,
[event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, event.payload],
);
}
async close(): Promise<void> {
if (!this.ownsPool) {
return;
}
await this.pool.end();
}
private async ready(): Promise<void> {
await this.initialized;
}
private table(name: "sessions" | "events"): string {
return `"${this.schema}"."${name}"`;
}
private async initialize(): Promise<void> {
await this.pool.query(`CREATE SCHEMA IF NOT EXISTS "${this.schema}"`);
await this.pool.query(`
CREATE TABLE IF NOT EXISTS ${this.table("sessions")} (
id TEXT PRIMARY KEY,
agent TEXT NOT NULL,
agent_session_id TEXT NOT NULL,
last_connection_id TEXT NOT NULL,
created_at BIGINT NOT NULL,
destroyed_at BIGINT,
sandbox_id TEXT,
session_init_json JSONB
)
`);
await this.pool.query(`
ALTER TABLE ${this.table("sessions")}
ADD COLUMN IF NOT EXISTS sandbox_id TEXT
`);
await this.pool.query(`
CREATE TABLE IF NOT EXISTS ${this.table("events")} (
id TEXT PRIMARY KEY,
event_index BIGINT NOT NULL,
session_id TEXT NOT NULL,
created_at BIGINT NOT NULL,
connection_id TEXT NOT NULL,
sender TEXT NOT NULL,
payload_json JSONB NOT NULL
)
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ALTER COLUMN id TYPE TEXT USING id::TEXT
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ADD COLUMN IF NOT EXISTS event_index BIGINT
`);
await this.pool.query(`
WITH ranked AS (
SELECT id, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC) AS ranked_index
FROM ${this.table("events")}
)
UPDATE ${this.table("events")} AS current_events
SET event_index = ranked.ranked_index
FROM ranked
WHERE current_events.id = ranked.id
AND current_events.event_index IS NULL
`);
await this.pool.query(`
ALTER TABLE ${this.table("events")}
ALTER COLUMN event_index SET NOT NULL
`);
await this.pool.query(`
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON ${this.table("events")}(session_id, event_index, id)
`);
}
}
type SessionRow = {
id: string;
agent: string;
agent_session_id: string;
last_connection_id: string;
created_at: string | number;
destroyed_at: string | number | null;
sandbox_id: string | null;
session_init_json: unknown | null;
};
type EventRow = {
id: string | number;
event_index: string | number;
session_id: string;
created_at: string | number;
connection_id: string;
sender: string;
payload_json: unknown;
};
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agent_session_id,
lastConnectionId: row.last_connection_id,
createdAt: parseInteger(row.created_at),
destroyedAt: row.destroyed_at === null ? undefined : parseInteger(row.destroyed_at),
sandboxId: row.sandbox_id ?? undefined,
sessionInit: row.session_init_json ? (row.session_init_json as SessionRecord["sessionInit"]) : undefined,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: String(row.id),
eventIndex: parseInteger(row.event_index),
sessionId: row.session_id,
createdAt: parseInteger(row.created_at),
connectionId: row.connection_id,
sender: parseSender(row.sender),
payload: row.payload_json as SessionEvent["payload"],
};
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
function parseInteger(value: string | number): number {
const parsed = typeof value === "number" ? value : Number.parseInt(value, 10);
if (!Number.isFinite(parsed)) {
throw new Error(`Invalid integer value returned by postgres: ${String(value)}`);
}
return parsed;
}
function parseSender(value: string): SessionEvent["sender"] {
if (value === "agent" || value === "client") {
return value;
}
throw new Error(`Invalid sender value returned by postgres: ${value}`);
}
function normalizeSchema(schema: string): string {
if (!/^[A-Za-z_][A-Za-z0-9_]*$/.test(schema)) {
throw new Error(`Invalid schema name '${schema}'. Use letters, numbers, and underscores only.`);
}
return schema;
}
throw new Error(
"@sandbox-agent/persist-postgres has been deprecated and removed. " +
"Copy the reference implementation from examples/persist-postgres into your project instead. " +
"See https://github.com/nichochar/sandbox-agent/tree/main/examples/persist-postgres",
);


@@ -1,245 +0,0 @@
import { afterAll, afterEach, beforeAll, beforeEach, describe, expect, it } from "vitest";
import { execFileSync } from "node:child_process";
import { existsSync, mkdtempSync, rmSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { tmpdir } from "node:os";
import { randomUUID } from "node:crypto";
import { Client } from "pg";
import { SandboxAgent } from "sandbox-agent";
import { spawnSandboxAgent, type SandboxAgentSpawnHandle } from "../../typescript/src/spawn.ts";
import { prepareMockAgentDataHome } from "../../typescript/tests/helpers/mock-agent.ts";
import { PostgresSessionPersistDriver } from "../src/index.ts";
const __dirname = dirname(fileURLToPath(import.meta.url));
function findBinary(): string | null {
if (process.env.SANDBOX_AGENT_BIN) {
return process.env.SANDBOX_AGENT_BIN;
}
const cargoPaths = [resolve(__dirname, "../../../target/debug/sandbox-agent"), resolve(__dirname, "../../../target/release/sandbox-agent")];
for (const p of cargoPaths) {
if (existsSync(p)) {
return p;
}
}
return null;
}
const BINARY_PATH = findBinary();
if (!BINARY_PATH) {
throw new Error("sandbox-agent binary not found. Build it (cargo build -p sandbox-agent) or set SANDBOX_AGENT_BIN.");
}
if (!process.env.SANDBOX_AGENT_BIN) {
process.env.SANDBOX_AGENT_BIN = BINARY_PATH;
}
interface PostgresContainer {
containerId: string;
connectionString: string;
}
describe("Postgres persistence driver", () => {
let handle: SandboxAgentSpawnHandle;
let baseUrl: string;
let token: string;
let dataHome: string;
let postgres: PostgresContainer | null = null;
beforeAll(async () => {
dataHome = mkdtempSync(join(tmpdir(), "postgres-integration-"));
prepareMockAgentDataHome(dataHome);
handle = await spawnSandboxAgent({
enabled: true,
log: "silent",
timeoutMs: 30000,
env: {
XDG_DATA_HOME: dataHome,
HOME: dataHome,
USERPROFILE: dataHome,
APPDATA: join(dataHome, "AppData", "Roaming"),
LOCALAPPDATA: join(dataHome, "AppData", "Local"),
},
});
baseUrl = handle.baseUrl;
token = handle.token;
});
beforeEach(async () => {
postgres = await startPostgresContainer();
});
afterEach(() => {
if (postgres) {
stopPostgresContainer(postgres.containerId);
postgres = null;
}
});
afterAll(async () => {
await handle.dispose();
rmSync(dataHome, { recursive: true, force: true });
});
it("persists session/event history across SDK instances and supports replay restore", async () => {
const connectionString = requirePostgres(postgres).connectionString;
const persist1 = new PostgresSessionPersistDriver({
connectionString,
});
const sdk1 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist1,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const created = await sdk1.createSession({ agent: "mock" });
await created.prompt([{ type: "text", text: "postgres-first" }]);
const firstConnectionId = created.lastConnectionId;
await sdk1.dispose();
await persist1.close();
const persist2 = new PostgresSessionPersistDriver({
connectionString,
});
const sdk2 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist2,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const restored = await sdk2.resumeSession(created.id);
expect(restored.lastConnectionId).not.toBe(firstConnectionId);
await restored.prompt([{ type: "text", text: "postgres-second" }]);
const sessions = await sdk2.listSessions({ limit: 20 });
expect(sessions.items.some((entry) => entry.id === created.id)).toBe(true);
const events = await sdk2.getEvents({ sessionId: created.id, limit: 1000 });
expect(events.items.length).toBeGreaterThan(0);
expect(events.items.every((event) => typeof event.id === "string")).toBe(true);
expect(events.items.every((event) => Number.isInteger(event.eventIndex))).toBe(true);
for (let i = 1; i < events.items.length; i += 1) {
expect(events.items[i]!.eventIndex).toBeGreaterThanOrEqual(events.items[i - 1]!.eventIndex);
}
const replayInjected = events.items.find((event) => {
if (event.sender !== "client") {
return false;
}
const payload = event.payload as Record<string, unknown>;
const method = payload.method;
const params = payload.params as Record<string, unknown> | undefined;
const prompt = Array.isArray(params?.prompt) ? params?.prompt : [];
const firstBlock = prompt[0] as Record<string, unknown> | undefined;
return method === "session/prompt" && typeof firstBlock?.text === "string" && firstBlock.text.includes("Previous session history is replayed below");
});
expect(replayInjected).toBeTruthy();
await sdk2.dispose();
await persist2.close();
});
});
async function startPostgresContainer(): Promise<PostgresContainer> {
const name = `sandbox-agent-postgres-${randomUUID()}`;
const containerId = runDockerCommand([
"run",
"-d",
"--rm",
"--name",
name,
"-e",
"POSTGRES_USER=postgres",
"-e",
"POSTGRES_PASSWORD=postgres",
"-e",
"POSTGRES_DB=sandboxagent",
"-p",
"127.0.0.1::5432",
"postgres:16-alpine",
]);
const portOutput = runDockerCommand(["port", containerId, "5432/tcp"]);
const port = parsePort(portOutput);
const connectionString = `postgres://postgres:postgres@127.0.0.1:${port}/sandboxagent`;
await waitForPostgres(connectionString);
return {
containerId,
connectionString,
};
}
function stopPostgresContainer(containerId: string): void {
try {
runDockerCommand(["rm", "-f", containerId]);
} catch {
// Container may already be gone when test teardown runs.
}
}
function runDockerCommand(args: string[]): string {
return execFileSync("docker", args, {
encoding: "utf8",
stdio: ["ignore", "pipe", "pipe"],
}).trim();
}
function parsePort(output: string): string {
const firstLine = output.split("\n")[0]?.trim() ?? "";
const match = firstLine.match(/:(\d+)$/);
if (!match) {
throw new Error(`Failed to parse docker port output: '${output}'`);
}
return match[1];
}
async function waitForPostgres(connectionString: string): Promise<void> {
const timeoutMs = 30000;
const deadline = Date.now() + timeoutMs;
let lastError: unknown;
while (Date.now() < deadline) {
const client = new Client({ connectionString });
try {
await client.connect();
await client.query("SELECT 1");
await client.end();
return;
} catch (error) {
lastError = error;
try {
await client.end();
} catch {
// Ignore cleanup failures while retrying.
}
await delay(250);
}
}
throw new Error(`Postgres container did not become ready: ${String(lastError)}`);
}
function delay(ms: number): Promise<void> {
return new Promise((resolvePromise) => setTimeout(resolvePromise, ms));
}
function requirePostgres(container: PostgresContainer | null): PostgresContainer {
if (!container) {
throw new Error("Postgres container was not initialized for this test.");
}
return container;
}


@@ -0,0 +1,5 @@
# @sandbox-agent/persist-rivet
> **Deprecated:** This package has been deprecated and removed.

Copy the driver source directly into your project. See the [session persistence docs](https://sandboxagent.dev/session-persistence) for guidance.
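When copying a driver, the contract to satisfy is the SDK's `SessionPersistDriver` interface, including the `insertEvent(sessionId, event)` signature. A minimal in-memory sketch of that shape — the type definitions below are abbreviated stand-ins for the SDK's real types, not its actual exports:

```typescript
// Abbreviated stand-ins for the sandbox-agent SDK types (field lists trimmed).
interface SessionRecord { id: string; agent: string; createdAt: number; }
interface SessionEvent { id: string; eventIndex: number; sessionId: string; payload: unknown; }
interface ListPage<T> { items: T[]; nextCursor?: string; }

// Minimal in-memory persist driver sketch. Note that insertEvent takes the
// session id as its first argument (the signature this commit migrates to).
class MemorySessionPersistDriver {
  private sessions = new Map<string, SessionRecord>();
  private events = new Map<string, SessionEvent[]>();

  async getSession(id: string): Promise<SessionRecord | undefined> {
    return this.sessions.get(id);
  }

  async updateSession(session: SessionRecord): Promise<void> {
    this.sessions.set(session.id, { ...session });
  }

  async listSessions(): Promise<ListPage<SessionRecord>> {
    // Sort oldest-first, matching the ordering the removed drivers used.
    const items = [...this.sessions.values()].sort((a, b) => a.createdAt - b.createdAt);
    return { items };
  }

  async insertEvent(sessionId: string, event: SessionEvent): Promise<void> {
    const list = this.events.get(sessionId) ?? [];
    list.push({ ...event });
    this.events.set(sessionId, list);
  }

  async listEvents(request: { sessionId: string }): Promise<ListPage<SessionEvent>> {
    const items = [...(this.events.get(request.sessionId) ?? [])].sort(
      (a, b) => a.eventIndex - b.eventIndex,
    );
    return { items };
  }
}
```

A driver like this is enough for local experiments; production drivers additionally need pagination cursors and eviction caps like the removed implementations below.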


@@ -16,30 +16,16 @@
       "import": "./dist/index.js"
     }
   },
-  "dependencies": {
-    "sandbox-agent": "workspace:*"
-  },
-  "peerDependencies": {
-    "rivetkit": ">=0.5.0"
-  },
-  "peerDependenciesMeta": {
-    "rivetkit": {
-      "optional": true
-    }
-  },
   "files": [
     "dist"
   ],
   "scripts": {
     "build": "tsup",
-    "typecheck": "tsc --noEmit",
-    "test": "vitest run",
-    "test:watch": "vitest"
+    "typecheck": "tsc --noEmit"
   },
   "devDependencies": {
     "@types/node": "^22.0.0",
     "tsup": "^8.0.0",
-    "typescript": "^5.7.0",
-    "vitest": "^3.0.0"
+    "typescript": "^5.7.0"
   }
 }


@@ -1,168 +1,5 @@
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
/** Structural type compatible with rivetkit's ActorContext without importing it. */
export interface ActorContextLike {
state: Record<string, unknown>;
}
export interface RivetPersistData {
sessions: Record<string, SessionRecord>;
events: Record<string, SessionEvent[]>;
}
export type RivetPersistState = {
_sandboxAgentPersist: RivetPersistData;
};
export interface RivetSessionPersistDriverOptions {
/** Maximum number of sessions to retain. Oldest are evicted first. Default: 1024. */
maxSessions?: number;
/** Maximum events per session. Oldest are trimmed first. Default: 500. */
maxEventsPerSession?: number;
/** Key on `c.state` where persist data is stored. Default: `"_sandboxAgentPersist"`. */
stateKey?: string;
}
const DEFAULT_MAX_SESSIONS = 1024;
const DEFAULT_MAX_EVENTS_PER_SESSION = 500;
const DEFAULT_LIST_LIMIT = 100;
const DEFAULT_STATE_KEY = "_sandboxAgentPersist";
export class RivetSessionPersistDriver implements SessionPersistDriver {
private readonly maxSessions: number;
private readonly maxEventsPerSession: number;
private readonly stateKey: string;
private readonly ctx: ActorContextLike;
constructor(ctx: ActorContextLike, options: RivetSessionPersistDriverOptions = {}) {
this.ctx = ctx;
this.maxSessions = normalizeCap(options.maxSessions, DEFAULT_MAX_SESSIONS);
this.maxEventsPerSession = normalizeCap(options.maxEventsPerSession, DEFAULT_MAX_EVENTS_PER_SESSION);
this.stateKey = options.stateKey ?? DEFAULT_STATE_KEY;
// Auto-initialize if absent; preserve existing data on actor wake.
if (!this.ctx.state[this.stateKey]) {
this.ctx.state[this.stateKey] = { sessions: {}, events: {} } satisfies RivetPersistData;
}
}
private get data(): RivetPersistData {
return this.ctx.state[this.stateKey] as RivetPersistData;
}
async getSession(id: string): Promise<SessionRecord | undefined> {
const session = this.data.sessions[id];
return session ? cloneSessionRecord(session) : undefined;
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const sorted = Object.values(this.data.sessions).sort((a, b) => {
if (a.createdAt !== b.createdAt) {
return a.createdAt - b.createdAt;
}
return a.id.localeCompare(b.id);
});
const page = paginate(sorted, request);
return {
items: page.items.map(cloneSessionRecord),
nextCursor: page.nextCursor,
};
}
async updateSession(session: SessionRecord): Promise<void> {
this.data.sessions[session.id] = { ...session };
if (!this.data.events[session.id]) {
this.data.events[session.id] = [];
}
const ids = Object.keys(this.data.sessions);
if (ids.length <= this.maxSessions) {
return;
}
const overflow = ids.length - this.maxSessions;
const removable = Object.values(this.data.sessions)
.sort((a, b) => {
if (a.createdAt !== b.createdAt) {
return a.createdAt - b.createdAt;
}
return a.id.localeCompare(b.id);
})
.slice(0, overflow)
.map((s) => s.id);
for (const sessionId of removable) {
delete this.data.sessions[sessionId];
delete this.data.events[sessionId];
}
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const all = [...(this.data.events[request.sessionId] ?? [])].sort((a, b) => {
if (a.eventIndex !== b.eventIndex) {
return a.eventIndex - b.eventIndex;
}
return a.id.localeCompare(b.id);
});
const page = paginate(all, request);
return {
items: page.items.map(cloneSessionEvent),
nextCursor: page.nextCursor,
};
}
async insertEvent(sessionId: string, event: SessionEvent): Promise<void> {
const events = this.data.events[sessionId] ?? [];
events.push(cloneSessionEvent(event));
if (events.length > this.maxEventsPerSession) {
events.splice(0, events.length - this.maxEventsPerSession);
}
this.data.events[sessionId] = events;
}
}
function cloneSessionRecord(session: SessionRecord): SessionRecord {
return {
...session,
sessionInit: session.sessionInit ? (JSON.parse(JSON.stringify(session.sessionInit)) as SessionRecord["sessionInit"]) : undefined,
};
}
function cloneSessionEvent(event: SessionEvent): SessionEvent {
return {
...event,
payload: JSON.parse(JSON.stringify(event.payload)) as SessionEvent["payload"],
};
}
function normalizeCap(value: number | undefined, fallback: number): number {
if (!Number.isFinite(value) || (value ?? 0) < 1) {
return fallback;
}
return Math.floor(value as number);
}
function paginate<T>(items: T[], request: ListPageRequest): ListPage<T> {
const offset = parseCursor(request.cursor);
const limit = normalizeCap(request.limit, DEFAULT_LIST_LIMIT);
const slice = items.slice(offset, offset + limit);
const nextOffset = offset + slice.length;
return {
items: slice,
nextCursor: nextOffset < items.length ? String(nextOffset) : undefined,
};
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
throw new Error(
"@sandbox-agent/persist-rivet has been deprecated and removed. " +
"Copy the reference implementation into your project instead. " +
"See https://github.com/nichochar/sandbox-agent/tree/main/examples/persist-rivet",
);


@@ -1,236 +0,0 @@
import { describe, it, expect } from "vitest";
import { RivetSessionPersistDriver } from "../src/index.ts";
import type { RivetPersistData } from "../src/index.ts";
function makeCtx() {
return { state: {} as Record<string, unknown> };
}
describe("RivetSessionPersistDriver", () => {
it("auto-initializes state on construction", () => {
const ctx = makeCtx();
new RivetSessionPersistDriver(ctx);
const data = ctx.state._sandboxAgentPersist as RivetPersistData;
expect(data).toBeDefined();
expect(data.sessions).toEqual({});
expect(data.events).toEqual({});
});
it("preserves existing state on construction (actor wake)", async () => {
const ctx = makeCtx();
const driver1 = new RivetSessionPersistDriver(ctx);
await driver1.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
// Simulate actor wake: new driver instance, same state object
const driver2 = new RivetSessionPersistDriver(ctx);
const session = await driver2.getSession("s-1");
expect(session?.id).toBe("s-1");
expect(session?.createdAt).toBe(100);
});
it("stores and retrieves sessions", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
await driver.updateSession({
id: "s-2",
agent: "mock",
agentSessionId: "a-2",
lastConnectionId: "c-2",
createdAt: 200,
destroyedAt: 300,
});
const loaded = await driver.getSession("s-2");
expect(loaded?.destroyedAt).toBe(300);
const missing = await driver.getSession("s-nonexistent");
expect(missing).toBeUndefined();
});
it("pages sessions sorted by createdAt", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
await driver.updateSession({
id: "s-2",
agent: "mock",
agentSessionId: "a-2",
lastConnectionId: "c-2",
createdAt: 200,
});
const page1 = await driver.listSessions({ limit: 1 });
expect(page1.items).toHaveLength(1);
expect(page1.items[0]?.id).toBe("s-1");
expect(page1.nextCursor).toBeTruthy();
const page2 = await driver.listSessions({ cursor: page1.nextCursor, limit: 1 });
expect(page2.items).toHaveLength(1);
expect(page2.items[0]?.id).toBe("s-2");
expect(page2.nextCursor).toBeUndefined();
});
it("stores and pages events", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
await driver.insertEvent("s-1", {
id: "evt-1",
eventIndex: 1,
sessionId: "s-1",
createdAt: 1,
connectionId: "c-1",
sender: "client",
payload: { jsonrpc: "2.0", method: "session/prompt", params: { sessionId: "a-1" } },
});
await driver.insertEvent("s-1", {
id: "evt-2",
eventIndex: 2,
sessionId: "s-1",
createdAt: 2,
connectionId: "c-1",
sender: "agent",
payload: { jsonrpc: "2.0", method: "session/update", params: { sessionId: "a-1" } },
});
const eventsPage = await driver.listEvents({ sessionId: "s-1", limit: 10 });
expect(eventsPage.items).toHaveLength(2);
expect(eventsPage.items[0]?.id).toBe("evt-1");
expect(eventsPage.items[0]?.eventIndex).toBe(1);
expect(eventsPage.items[1]?.id).toBe("evt-2");
expect(eventsPage.items[1]?.eventIndex).toBe(2);
});
it("evicts oldest sessions when maxSessions exceeded", async () => {
const driver = new RivetSessionPersistDriver(makeCtx(), { maxSessions: 2 });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 100,
});
await driver.updateSession({
id: "s-2",
agent: "mock",
agentSessionId: "a-2",
lastConnectionId: "c-2",
createdAt: 200,
});
// Adding a third session should evict the oldest (s-1)
await driver.updateSession({
id: "s-3",
agent: "mock",
agentSessionId: "a-3",
lastConnectionId: "c-3",
createdAt: 300,
});
expect(await driver.getSession("s-1")).toBeUndefined();
expect(await driver.getSession("s-2")).toBeDefined();
expect(await driver.getSession("s-3")).toBeDefined();
});
it("trims oldest events when maxEventsPerSession exceeded", async () => {
const driver = new RivetSessionPersistDriver(makeCtx(), { maxEventsPerSession: 2 });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
for (let i = 1; i <= 3; i++) {
await driver.insertEvent("s-1", {
id: `evt-${i}`,
eventIndex: i,
sessionId: "s-1",
createdAt: i,
connectionId: "c-1",
sender: "client",
payload: { jsonrpc: "2.0", method: "session/prompt", params: { sessionId: "a-1" } },
});
}
const page = await driver.listEvents({ sessionId: "s-1" });
expect(page.items).toHaveLength(2);
// Oldest event (evt-1) should be trimmed
expect(page.items[0]?.id).toBe("evt-2");
expect(page.items[1]?.id).toBe("evt-3");
});
it("clones data to prevent external mutation", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
const s1 = await driver.getSession("s-1");
const s2 = await driver.getSession("s-1");
expect(s1).toEqual(s2);
expect(s1).not.toBe(s2); // Different object references
});
it("supports custom stateKey", async () => {
const ctx = makeCtx();
const driver = new RivetSessionPersistDriver(ctx, { stateKey: "myPersist" });
await driver.updateSession({
id: "s-1",
agent: "mock",
agentSessionId: "a-1",
lastConnectionId: "c-1",
createdAt: 1,
});
expect((ctx.state.myPersist as RivetPersistData).sessions["s-1"]).toBeDefined();
expect(ctx.state._sandboxAgentPersist).toBeUndefined();
});
it("returns empty results for unknown session events", async () => {
const driver = new RivetSessionPersistDriver(makeCtx());
const page = await driver.listEvents({ sessionId: "nonexistent" });
expect(page.items).toHaveLength(0);
expect(page.nextCursor).toBeUndefined();
});
});


@@ -0,0 +1,5 @@
# @sandbox-agent/persist-sqlite
> **Deprecated:** This package has been deprecated and removed. The implementation now lives as a copy-paste reference in [`examples/persist-sqlite`](../../examples/persist-sqlite).

Install `better-sqlite3` directly and copy the driver source into your project. See the [full example](https://github.com/nichochar/sandbox-agent/tree/main/examples/persist-sqlite).
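A copied driver plugs into the SDK the same way the published package did: the host app constructs it and passes it as the `persist` connect option. A hypothetical sketch of that wiring shape — the `ConnectOptions` type, URL, and token below are illustrative stand-ins, not the SDK's actual exports:

```typescript
// Local stand-in mirroring the connect options the integration tests use;
// the real type comes from the sandbox-agent package.
interface ConnectOptions {
  baseUrl: string;
  token: string;
  persist: unknown;
  replayMaxEvents?: number;
  replayMaxChars?: number;
}

// A copied SQLite driver would be constructed with a filename (":memory:"
// keeps everything in-process) and handed over as `persist`.
function buildConnectOptions(persist: unknown): ConnectOptions {
  return {
    baseUrl: "http://127.0.0.1:8080", // assumed local server address
    token: "dev-token",               // assumed auth token
    persist,
    replayMaxEvents: 40,
    replayMaxChars: 16000,
  };
}
```

The replay caps bound how much history is re-injected when a session is resumed from the persisted event log.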


@@ -16,24 +16,17 @@
       "import": "./dist/index.js"
     }
   },
-  "dependencies": {
-    "better-sqlite3": "^11.0.0",
-    "sandbox-agent": "workspace:*"
-  },
+  "dependencies": {},
   "files": [
     "dist"
   ],
   "scripts": {
     "build": "tsup",
-    "typecheck": "tsc --noEmit",
-    "test": "vitest run",
-    "test:watch": "vitest"
+    "typecheck": "tsc --noEmit"
   },
   "devDependencies": {
-    "@types/better-sqlite3": "^7.0.0",
     "@types/node": "^22.0.0",
     "tsup": "^8.0.0",
-    "typescript": "^5.7.0",
-    "vitest": "^3.0.0"
+    "typescript": "^5.7.0"
   }
 }


@@ -1,294 +1,5 @@
import Database from "better-sqlite3";
import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent";
const DEFAULT_LIST_LIMIT = 100;
export interface SQLiteSessionPersistDriverOptions {
filename?: string;
}
export class SQLiteSessionPersistDriver implements SessionPersistDriver {
private readonly db: Database.Database;
constructor(options: SQLiteSessionPersistDriverOptions = {}) {
this.db = new Database(options.filename ?? ":memory:");
this.initialize();
}
async getSession(id: string): Promise<SessionRecord | undefined> {
const row = this.db
.prepare(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
FROM sessions WHERE id = ?`,
)
.get(id) as SessionRow | undefined;
if (!row) {
return undefined;
}
return decodeSessionRow(row);
}
async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> {
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rows = this.db
.prepare(
`SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
FROM sessions
ORDER BY created_at ASC, id ASC
LIMIT ? OFFSET ?`,
)
.all(limit, offset) as SessionRow[];
const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM sessions`).get() as { count: number };
const nextOffset = offset + rows.length;
return {
items: rows.map(decodeSessionRow),
nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined,
};
}
async updateSession(session: SessionRecord): Promise<void> {
this.db
.prepare(
`INSERT INTO sessions (
id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json
) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(id) DO UPDATE SET
agent = excluded.agent,
agent_session_id = excluded.agent_session_id,
last_connection_id = excluded.last_connection_id,
created_at = excluded.created_at,
destroyed_at = excluded.destroyed_at,
sandbox_id = excluded.sandbox_id,
session_init_json = excluded.session_init_json`,
)
.run(
session.id,
session.agent,
session.agentSessionId,
session.lastConnectionId,
session.createdAt,
session.destroyedAt ?? null,
session.sandboxId ?? null,
session.sessionInit ? JSON.stringify(session.sessionInit) : null,
);
}
async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> {
const offset = parseCursor(request.cursor);
const limit = normalizeLimit(request.limit);
const rows = this.db
.prepare(
`SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json
FROM events
WHERE session_id = ?
ORDER BY event_index ASC, id ASC
LIMIT ? OFFSET ?`,
)
.all(request.sessionId, limit, offset) as EventRow[];
const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM events WHERE session_id = ?`).get(request.sessionId) as { count: number };
const nextOffset = offset + rows.length;
return {
items: rows.map(decodeEventRow),
nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined,
};
}
async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> {
this.db
.prepare(
`INSERT INTO events (
id, event_index, session_id, created_at, connection_id, sender, payload_json
) VALUES (?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(id) DO UPDATE SET
event_index = excluded.event_index,
session_id = excluded.session_id,
created_at = excluded.created_at,
connection_id = excluded.connection_id,
sender = excluded.sender,
payload_json = excluded.payload_json`,
)
.run(event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, JSON.stringify(event.payload));
}
close(): void {
this.db.close();
}
private initialize(): void {
this.db.exec(`
CREATE TABLE IF NOT EXISTS sessions (
id TEXT PRIMARY KEY,
agent TEXT NOT NULL,
agent_session_id TEXT NOT NULL,
last_connection_id TEXT NOT NULL,
created_at INTEGER NOT NULL,
destroyed_at INTEGER,
sandbox_id TEXT,
session_init_json TEXT
)
`);
const sessionColumns = this.db.prepare(`PRAGMA table_info(sessions)`).all() as TableInfoRow[];
if (!sessionColumns.some((column) => column.name === "sandbox_id")) {
this.db.exec(`ALTER TABLE sessions ADD COLUMN sandbox_id TEXT`);
}
this.ensureEventsTable();
}
private ensureEventsTable(): void {
const tableInfo = this.db.prepare(`PRAGMA table_info(events)`).all() as TableInfoRow[];
if (tableInfo.length === 0) {
this.createEventsTable();
return;
}
const idColumn = tableInfo.find((column) => column.name === "id");
const hasEventIndex = tableInfo.some((column) => column.name === "event_index");
const idType = (idColumn?.type ?? "").trim().toUpperCase();
const idIsText = idType === "TEXT";
if (!idIsText || !hasEventIndex) {
this.rebuildEventsTable(hasEventIndex);
}
this.db.exec(`
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON events(session_id, event_index, id)
`);
}
private createEventsTable(): void {
this.db.exec(`
CREATE TABLE IF NOT EXISTS events (
id TEXT PRIMARY KEY,
event_index INTEGER NOT NULL,
session_id TEXT NOT NULL,
created_at INTEGER NOT NULL,
connection_id TEXT NOT NULL,
sender TEXT NOT NULL,
payload_json TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_events_session_order
ON events(session_id, event_index, id)
`);
}
private rebuildEventsTable(hasEventIndex: boolean): void {
this.db.exec(`
ALTER TABLE events RENAME TO events_legacy;
`);
this.createEventsTable();
if (hasEventIndex) {
this.db.exec(`
INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json)
SELECT
CAST(id AS TEXT),
COALESCE(event_index, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC)),
session_id,
created_at,
connection_id,
sender,
payload_json
FROM events_legacy
`);
} else {
this.db.exec(`
INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json)
SELECT
CAST(id AS TEXT),
ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC),
session_id,
created_at,
connection_id,
sender,
payload_json
FROM events_legacy
`);
}
this.db.exec(`DROP TABLE events_legacy`);
}
}
type SessionRow = {
id: string;
agent: string;
agent_session_id: string;
last_connection_id: string;
created_at: number;
destroyed_at: number | null;
sandbox_id: string | null;
session_init_json: string | null;
};
type EventRow = {
id: string;
event_index: number;
session_id: string;
created_at: number;
connection_id: string;
sender: "client" | "agent";
payload_json: string;
};
type TableInfoRow = {
name: string;
type: string;
};
function decodeSessionRow(row: SessionRow): SessionRecord {
return {
id: row.id,
agent: row.agent,
agentSessionId: row.agent_session_id,
lastConnectionId: row.last_connection_id,
createdAt: row.created_at,
destroyedAt: row.destroyed_at ?? undefined,
sandboxId: row.sandbox_id ?? undefined,
sessionInit: row.session_init_json ? (JSON.parse(row.session_init_json) as SessionRecord["sessionInit"]) : undefined,
};
}
function decodeEventRow(row: EventRow): SessionEvent {
return {
id: row.id,
eventIndex: row.event_index,
sessionId: row.session_id,
createdAt: row.created_at,
connectionId: row.connection_id,
sender: row.sender,
payload: JSON.parse(row.payload_json),
};
}
function normalizeLimit(limit: number | undefined): number {
if (!Number.isFinite(limit) || (limit ?? 0) < 1) {
return DEFAULT_LIST_LIMIT;
}
return Math.floor(limit as number);
}
function parseCursor(cursor: string | undefined): number {
if (!cursor) {
return 0;
}
const parsed = Number.parseInt(cursor, 10);
if (!Number.isFinite(parsed) || parsed < 0) {
return 0;
}
return parsed;
}
throw new Error(
"@sandbox-agent/persist-sqlite has been deprecated and removed. " +
"Copy the reference implementation from examples/persist-sqlite into your project instead. " +
"See https://github.com/nichochar/sandbox-agent/tree/main/examples/persist-sqlite",
);


@@ -1,131 +0,0 @@
import { describe, it, expect, beforeAll, afterAll } from "vitest";
import { existsSync, mkdtempSync, rmSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { tmpdir } from "node:os";
import { SandboxAgent } from "sandbox-agent";
import { spawnSandboxAgent, type SandboxAgentSpawnHandle } from "../../typescript/src/spawn.ts";
import { prepareMockAgentDataHome } from "../../typescript/tests/helpers/mock-agent.ts";
import { SQLiteSessionPersistDriver } from "../src/index.ts";
const __dirname = dirname(fileURLToPath(import.meta.url));
function findBinary(): string | null {
if (process.env.SANDBOX_AGENT_BIN) {
return process.env.SANDBOX_AGENT_BIN;
}
const cargoPaths = [resolve(__dirname, "../../../target/debug/sandbox-agent"), resolve(__dirname, "../../../target/release/sandbox-agent")];
for (const p of cargoPaths) {
if (existsSync(p)) {
return p;
}
}
return null;
}
const BINARY_PATH = findBinary();
if (!BINARY_PATH) {
throw new Error("sandbox-agent binary not found. Build it (cargo build -p sandbox-agent) or set SANDBOX_AGENT_BIN.");
}
if (!process.env.SANDBOX_AGENT_BIN) {
process.env.SANDBOX_AGENT_BIN = BINARY_PATH;
}
describe("SQLite persistence driver", () => {
let handle: SandboxAgentSpawnHandle;
let baseUrl: string;
let token: string;
let dataHome: string;
beforeAll(async () => {
dataHome = mkdtempSync(join(tmpdir(), "sqlite-integration-"));
prepareMockAgentDataHome(dataHome);
handle = await spawnSandboxAgent({
enabled: true,
log: "silent",
timeoutMs: 30000,
env: {
XDG_DATA_HOME: dataHome,
HOME: dataHome,
USERPROFILE: dataHome,
APPDATA: join(dataHome, "AppData", "Roaming"),
LOCALAPPDATA: join(dataHome, "AppData", "Local"),
},
});
baseUrl = handle.baseUrl;
token = handle.token;
});
afterAll(async () => {
await handle.dispose();
rmSync(dataHome, { recursive: true, force: true });
});
it("persists session/event history across SDK instances and supports replay restore", async () => {
const tempDir = mkdtempSync(join(tmpdir(), "sqlite-persist-"));
const dbPath = join(tempDir, "session-store.db");
const persist1 = new SQLiteSessionPersistDriver({ filename: dbPath });
const sdk1 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist1,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const created = await sdk1.createSession({ agent: "mock" });
await created.prompt([{ type: "text", text: "sqlite-first" }]);
const firstConnectionId = created.lastConnectionId;
await sdk1.dispose();
persist1.close();
const persist2 = new SQLiteSessionPersistDriver({ filename: dbPath });
const sdk2 = await SandboxAgent.connect({
baseUrl,
token,
persist: persist2,
replayMaxEvents: 40,
replayMaxChars: 16000,
});
const restored = await sdk2.resumeSession(created.id);
expect(restored.lastConnectionId).not.toBe(firstConnectionId);
await restored.prompt([{ type: "text", text: "sqlite-second" }]);
const sessions = await sdk2.listSessions({ limit: 20 });
expect(sessions.items.some((entry) => entry.id === created.id)).toBe(true);
const events = await sdk2.getEvents({ sessionId: created.id, limit: 1000 });
expect(events.items.length).toBeGreaterThan(0);
expect(events.items.every((event) => typeof event.id === "string")).toBe(true);
expect(events.items.every((event) => Number.isInteger(event.eventIndex))).toBe(true);
for (let i = 1; i < events.items.length; i += 1) {
expect(events.items[i]!.eventIndex).toBeGreaterThanOrEqual(events.items[i - 1]!.eventIndex);
}
const replayInjected = events.items.find((event) => {
if (event.sender !== "client") {
return false;
}
const payload = event.payload as Record<string, unknown>;
const method = payload.method;
const params = payload.params as Record<string, unknown> | undefined;
const prompt = Array.isArray(params?.prompt) ? params?.prompt : [];
const firstBlock = prompt[0] as Record<string, unknown> | undefined;
return method === "session/prompt" && typeof firstBlock?.text === "string" && firstBlock.text.includes("Previous session history is replayed below");
});
expect(replayInjected).toBeTruthy();
await sdk2.dispose();
persist2.close();
rmSync(tempDir, { recursive: true, force: true });
});
});


@@ -1,8 +0,0 @@
import { defineConfig } from "vitest/config";
export default defineConfig({
test: {
include: ["tests/**/*.test.ts"],
testTimeout: 60000,
},
});


@@ -1770,7 +1770,7 @@ export class SandboxAgent {
       };
       try {
-        await this.persist.insertEvent(event);
+        await this.persist.insertEvent(localSessionId, event);
         break;
       } catch (error) {
         if (!isSessionEventIndexConflict(error) || attempt === MAX_EVENT_INDEX_INSERT_RETRIES - 1) {
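Custom drivers migrating to this signature sit behind a retry loop that treats event-index collisions as retryable. A simplified, self-contained sketch of that pattern — the string-based conflict check and retry cap below are stand-ins for the SDK's `isSessionEventIndexConflict` and `MAX_EVENT_INDEX_INSERT_RETRIES`, not the real implementations:

```typescript
// Minimal shape of a persist driver for this sketch.
interface PersistLike {
  insertEvent(sessionId: string, event: { id: string; eventIndex: number }): Promise<void>;
}

const MAX_RETRIES = 3; // stand-in for MAX_EVENT_INDEX_INSERT_RETRIES

// Re-derive the event on each attempt (so a fresh eventIndex can be picked)
// and retry only on unique-index conflicts; rethrow everything else.
async function insertWithRetry(
  persist: PersistLike,
  sessionId: string,
  makeEvent: (attempt: number) => { id: string; eventIndex: number },
): Promise<number> {
  for (let attempt = 0; attempt < MAX_RETRIES; attempt += 1) {
    try {
      await persist.insertEvent(sessionId, makeEvent(attempt));
      return attempt; // number of conflicts absorbed before success
    } catch (error) {
      const message = error instanceof Error ? error.message : String(error);
      // Stand-in conflict check: treat unique-constraint failures as retryable.
      if (!message.includes("UNIQUE constraint failed") || attempt === MAX_RETRIES - 1) {
        throw error;
      }
    }
  }
  throw new Error("unreachable");
}
```

Passing `sessionId` separately lets the driver key its uniqueness check without trusting the event body to carry a consistent `sessionId` field.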


@@ -7,15 +7,17 @@ const DEFAULT_PREVIEW_TTL_SECONDS = 4 * 60 * 60;
 type DaytonaCreateParams = NonNullable<Parameters<Daytona["create"]>[0]>;
+type DaytonaCreateOverrides = Partial<DaytonaCreateParams>;

 export interface DaytonaProviderOptions {
-  create?: DaytonaCreateParams | (() => DaytonaCreateParams | Promise<DaytonaCreateParams>);
+  create?: DaytonaCreateOverrides | (() => DaytonaCreateOverrides | Promise<DaytonaCreateOverrides>);
   image?: string;
   agentPort?: number;
   previewTtlSeconds?: number;
   deleteTimeoutSeconds?: number;
 }

-async function resolveCreateOptions(value: DaytonaProviderOptions["create"]): Promise<DaytonaCreateParams | undefined> {
+async function resolveCreateOptions(value: DaytonaProviderOptions["create"]): Promise<DaytonaCreateOverrides | undefined> {
   if (!value) return undefined;
   if (typeof value === "function") return await value();
   return value;
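With `Partial`, callers can supply only the create fields they want to override while `image` keeps its provider default. A self-contained sketch of that merge semantics — `CreateParams` and the default image name here are assumptions for illustration, not Daytona's real types:

```typescript
// Local stand-in for Daytona's create params; the real type is derived from
// the SDK via Parameters<Daytona["create"]>[0].
interface CreateParams {
  image: string;
  language?: string;
  envVars?: Record<string, string>;
}

const DEFAULT_IMAGE = "sandbox-agent-default"; // assumed default image name

// With Partial<CreateParams>, an override object may omit `image` entirely
// and the provider falls back to its default; explicit values still win.
function mergeCreateOptions(overrides?: Partial<CreateParams>): CreateParams {
  return { image: DEFAULT_IMAGE, ...overrides };
}
```

Spreading the overrides after the defaults is what lets a caller pass `{ language: "typescript" }` without repeating the image name.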


@@ -70,19 +70,19 @@ class StrictUniqueSessionPersistDriver implements SessionPersistDriver {
     return this.events.listEvents(request);
   }

-  async insertEvent(event: SessionEvent): Promise<void> {
+  async insertEvent(sessionId: string, event: SessionEvent): Promise<void> {
     await sleep(5);
-    const indexes = this.eventIndexesBySession.get(event.sessionId) ?? new Set<number>();
+    const indexes = this.eventIndexesBySession.get(sessionId) ?? new Set<number>();
     if (indexes.has(event.eventIndex)) {
       throw new Error("UNIQUE constraint failed: sandbox_agent_events.session_id, sandbox_agent_events.event_index");
     }
     indexes.add(event.eventIndex);
-    this.eventIndexesBySession.set(event.sessionId, indexes);
+    this.eventIndexesBySession.set(sessionId, indexes);
     await sleep(5);
-    await this.events.insertEvent(event);
+    await this.events.insertEvent(sessionId, event);
   }
 }