feat: refresh docs and agent schema

This commit is contained in:
Nathan Flurry 2026-01-25 03:04:12 -08:00
parent a49ea094f3
commit 0fbf6272b1
39 changed files with 3127 additions and 1806 deletions


@ -18,6 +18,34 @@ Universal schema guidance:
- When changing the HTTP API, update the TypeScript SDK and CLI together.
- Do not make breaking changes to API endpoints.
### CLI ⇄ HTTP endpoint map (keep in sync)
- `sandbox-agent agents list` → `GET /v1/agents`
- `sandbox-agent agents install` → `POST /v1/agents/{agent}/install`
- `sandbox-agent agents modes` → `GET /v1/agents/{agent}/modes`
- `sandbox-agent sessions create` → `POST /v1/sessions/{sessionId}`
- `sandbox-agent sessions send-message` → `POST /v1/sessions/{sessionId}/messages`
- `sandbox-agent sessions events` / `get-messages` → `GET /v1/sessions/{sessionId}/events`
- `sandbox-agent sessions events-sse` → `GET /v1/sessions/{sessionId}/events/sse`
- `sandbox-agent sessions reply-question` → `POST /v1/sessions/{sessionId}/questions/{questionId}/reply`
- `sandbox-agent sessions reject-question` → `POST /v1/sessions/{sessionId}/questions/{questionId}/reject`
- `sandbox-agent sessions reply-permission` → `POST /v1/sessions/{sessionId}/permissions/{permissionId}/reply`
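A hypothetical way to keep the two columns from drifting apart is to encode the map as data and assert invariants in a test. All names below are illustrative, not part of the codebase:

```typescript
// Hypothetical helper: the CLI ⇄ HTTP map as data, so a test can fail
// when a command or endpoint changes on one side only.
const cliEndpointMap: Record<string, string> = {
  "agents list": "GET /v1/agents",
  "agents install": "POST /v1/agents/{agent}/install",
  "agents modes": "GET /v1/agents/{agent}/modes",
  "sessions create": "POST /v1/sessions/{sessionId}",
  "sessions send-message": "POST /v1/sessions/{sessionId}/messages",
  "sessions events": "GET /v1/sessions/{sessionId}/events",
  "sessions events-sse": "GET /v1/sessions/{sessionId}/events/sse",
  "sessions reply-question": "POST /v1/sessions/{sessionId}/questions/{questionId}/reply",
  "sessions reject-question": "POST /v1/sessions/{sessionId}/questions/{questionId}/reject",
  "sessions reply-permission": "POST /v1/sessions/{sessionId}/permissions/{permissionId}/reply",
};

// Every endpoint must use a known verb and live under /v1.
for (const [cmd, endpoint] of Object.entries(cliEndpointMap)) {
  const [verb, path] = endpoint.split(" ");
  if (!["GET", "POST"].includes(verb)) throw new Error(`bad verb for ${cmd}`);
  if (!path.startsWith("/v1/")) throw new Error(`unversioned path for ${cmd}`);
}
```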
### Default port references (update when CLI default changes)
- `frontend/packages/web/src/App.tsx`
- `README.md`
- `docs/cli.mdx`
- `docs/frontend.mdx`
- `docs/index.mdx`
- `docs/quickstart.mdx`
- `docs/typescript-sdk.mdx`
- `docs/deployments/cloudflare-sandboxes.mdx`
- `docs/deployments/daytona.mdx`
- `docs/deployments/docker.mdx`
- `docs/deployments/e2b.mdx`
- `docs/deployments/vercel-sandboxes.mdx`
## Git Commits
- Do not include any co-authors in commit messages (no `Co-Authored-By` lines)


@ -1,94 +1,15 @@
# Sandbox Agent SDK
Run inside sandboxes to provide universal agent support
Universal API for running Claude Code, Codex, OpenCode, and Amp inside sandboxes.
- **Any coding agent**: One universal API for interacting with every agent, with full feature coverage
- **Server Mode**: Run as an HTTP server from any sandbox provider, or use the TypeScript & Python SDKs
- **Universal session schema**: One shared schema for storing agent transcripts
- **Supports your sandbox provider**: Daytona, E2B, Vercel Sandboxes, [add your own](TODO)
- **Lightweight, portable Rust binary**: Install anywhere with one curl command
- **Compatible with Vercel AI SDK**: TODO
Documentation lives in `docs/` (Mintlify). Start with:
## Quickstart
- `docs/index.mdx` for the overview
- `docs/quickstart.mdx` to run the daemon
- `docs/http-api.mdx` and `docs/cli.mdx` for API references
Start with the SDK:
Quickstart (local dev):
```bash
sandbox-agent --token "$SANDBOX_TOKEN" --host 127.0.0.1 --port 8787
```
TODO
```
To run this in server mode, install with:
```
TODO
```
And run with:
```
TODO
```
See the example for your provider of choice:
- TODO
- [Add your own](TODO)
## Security
TODO: Tokens
TODO: Using a gateway
TODO: BYO tokens with extractor
## Demo Frontend
TODO: Screenshot
This project provides a demo frontend for testing the connection. Run it with:
```
TODO
```
## Agent Compatibility Matrix
TODO
## Reference
### TypeScript SDK
TODO
### HTTP API
TODO
### CLI
TODO
## FAQ
TODO
- Why not use PTY? This is the recommended option for XXXX
- Why not use <feature that already exists on sandbox API>?
- Does it support <platform>?
- Can I use this with my personal OpenAI & Claude tokens?
## Project Scope
This project aims to solve 3 problems with agents:
- **Universal Coding Agent API**: Claude Code, Codex, Amp, and OpenCode have all put a lot of work into their agent scaffolds. Each has its own pros and cons, and they need to be easy to swap between.
- **Agents In Sandboxes**: There are many complications with running agents inside of sandbox providers. This lets you run a simple curl command to spawn an HTTP server for using any agent from within the sandbox.
- **Agent Transcript**: Maintaining agent transcripts is difficult since the agent manages its own sessions. This provides a simpler way to read and retrieve agent transcripts in your system.
Features out of scope:
- **Storage of sessions on disk**: Sessions are already stored on disk by the respective coding agents. It's assumed that the consumer is streaming data from this machine to external storage, such as Postgres, ClickHouse, or Rivet.
- **Direct LLM wrappers**: Use the [Vercel AI SDK](https://ai-sdk.dev/docs/introduction) if you want to implement your own agent from scratch.
- **Git Repo Management**: Just use git commands or the features provided by your sandbox provider of choice.
- **Sandbox Provider API**: Sandbox providers have many nuanced differences in their APIs, so it does not make sense for us to provide a custom layer. Instead, we provide skills that let you integrate this project with sandbox providers.


@ -0,0 +1,26 @@
---
title: "Agent Compatibility"
description: "Supported agents, install methods, and streaming formats."
---
## Compatibility matrix
| Agent | Provider | Binary | Install method | Session ID | Streaming format |
|-------|----------|--------|----------------|------------|------------------|
| Claude Code | Anthropic | `claude` | curl raw binary from GCS | `session_id` | JSONL via stdout |
| Codex | OpenAI | `codex` | curl tarball from GitHub releases | `thread_id` | JSONL via stdout |
| OpenCode | Multi-provider | `opencode` | curl tarball from GitHub releases | `session_id` | SSE or JSONL |
| Amp | Sourcegraph | `amp` | curl raw binary from GCS | `session_id` | JSONL via stdout |
## Agent modes
- **OpenCode**: discovered via the server API.
- **Claude Code / Codex / Amp**: hardcoded modes (typically `build`, `plan`, or `custom`).
## Capability notes
- **Questions / permissions**: OpenCode natively supports these workflows. Claude plan approval is normalized into a question event.
- **Streaming**: all agents stream events; OpenCode uses SSE while others use JSONL.
- **Files and images**: normalized via `UniversalMessagePart` with `File` and `Image` parts.
See [Universal API](/universal-api) for feature coverage details.
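The matrix above can also be sketched as data. The TypeScript shape below is an assumption for illustration, not the project's real schema:

```typescript
// Illustrative encoding of the compatibility matrix (assumed shape).
type Streaming = "jsonl" | "sse";

const agents: Record<string, { sessionIdField: string; streaming: Streaming[] }> = {
  claude: { sessionIdField: "session_id", streaming: ["jsonl"] },
  codex: { sessionIdField: "thread_id", streaming: ["jsonl"] },
  opencode: { sessionIdField: "session_id", streaming: ["sse", "jsonl"] },
  amp: { sessionIdField: "session_id", streaming: ["jsonl"] },
};

// Only OpenCode speaks SSE; only Codex calls its session a thread.
const sseAgents = Object.entries(agents)
  .filter(([, a]) => a.streaming.includes("sse"))
  .map(([name]) => name);
```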

docs/architecture.mdx Normal file

@ -0,0 +1,45 @@
---
title: "Architecture"
description: "How the daemon, schemas, and agents fit together."
---
Sandbox Agent SDK is built around a single daemon that runs inside the sandbox and exposes a universal HTTP API. Clients use the API (or the TypeScript SDK / CLI) to create sessions, send messages, and stream events.
## Components
- **Daemon**: Rust HTTP server that manages agent processes and streaming.
- **Universal schema**: Shared input/output types for messages and events.
- **SDKs & CLI**: Convenience wrappers around the HTTP API.
## Session model
- **Session ID**: Client-provided primary session identifier.
- **Agent session ID**: Underlying ID from the agent (thread/session). This is surfaced in events but is not the primary key.
## Event streaming
- Events are stored in memory per session and assigned a monotonically increasing `id`.
- `/events` returns a slice of events by offset/limit.
- `/events/sse` streams new events from the same offset semantics.
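The offset/limit semantics can be sketched with an in-memory mock. The buffer and helper below are illustrative; only the `offset` (last-seen event id, exclusive) and `limit` behavior mirror the API:

```typescript
// Mock of the /events slice semantics: return events with id > offset,
// capped at limit, plus a hasMore flag.
type Ev = { id: number };

function getEvents(buf: Ev[], offset: number, limit: number) {
  const events = buf.filter((e) => e.id > offset).slice(0, limit);
  const hasMore = buf.some((e) => e.id > (events.at(-1)?.id ?? offset));
  return { events, hasMore };
}

const buf: Ev[] = [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }];

// Page through with limit 2, advancing offset to the last-seen id.
let offset = 0;
const seen: number[] = [];
for (;;) {
  const { events, hasMore } = getEvents(buf, offset, 2);
  seen.push(...events.map((e) => e.id));
  if (events.length === 0 || !hasMore) break;
  offset = events.at(-1)!.id;
}
```

Because ids are monotonically increasing, resuming from the last-seen id never drops or duplicates events.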
## Agent integration strategies
### Subprocess per session
Claude Code, Codex, and Amp run as subprocesses. The daemon reads JSONL output from stdout and converts each event into a UniversalEvent.
### Shared server (OpenCode)
OpenCode runs as a shared server. The daemon connects via HTTP and SSE, then converts OpenCode events to UniversalEvents.
## Human-in-the-loop
Questions and permission prompts are normalized into the universal schema:
- Question events surface as `questionAsked` with selectable options.
- Permission events surface as `permissionAsked` with `reply: once | always | reject`.
- Claude plan approval is normalized into a question event (approve/reject).
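A client that consumes these events typically dispatches on the event type. The handler below is a hypothetical sketch; the event and reply field names mirror the docs, but the wiring is illustrative:

```typescript
// Hypothetical dispatcher for normalized human-in-the-loop events.
type HitlEvent =
  | { type: "questionAsked"; questionId: string; options: string[] }
  | { type: "permissionAsked"; permissionId: string };

type Reply =
  | { kind: "question"; id: string; answers: string[][] }
  | { kind: "permission"; id: string; reply: "once" | "always" | "reject" };

function autoReply(ev: HitlEvent): Reply {
  switch (ev.type) {
    case "questionAsked":
      // Pick the first option; a real client would prompt the user.
      return { kind: "question", id: ev.questionId, answers: [[ev.options[0]]] };
    case "permissionAsked":
      // Grant a single use; "always" or "reject" are the other choices.
      return { kind: "permission", id: ev.permissionId, reply: "once" };
  }
}
```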
## Authentication
The daemon uses a **global token** configured at startup. All HTTP and CLI operations reuse the same token and are validated against the `Authorization` header (`Bearer` or `Token`) or `x-sandbox-token`.
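A client can present the token in any of the three accepted forms. The helper below is an illustrative sketch, not part of the SDK:

```typescript
// Sketch: build auth headers the daemon accepts (Bearer or Token in the
// Authorization header, or the x-sandbox-token header).
function authHeaders(token: string, style: "bearer" | "token" | "header"): Record<string, string> {
  switch (style) {
    case "bearer":
      return { Authorization: `Bearer ${token}` };
    case "token":
      return { Authorization: `Token ${token}` };
    case "header":
      return { "x-sandbox-token": token };
  }
}
```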

docs/cli.mdx Normal file

@ -0,0 +1,111 @@
---
title: "CLI"
description: "CLI reference and server flags."
---
The `sandbox-agent` CLI mirrors the HTTP API so you can script everything without writing client code.
## Server flags
```bash
sandbox-agent --token "$SANDBOX_TOKEN" --host 127.0.0.1 --port 8787
```
- `--token`: global token for all requests.
- `--no-token`: disable auth (local dev only).
- `--host`, `--port`: bind address.
- `--cors-allow-origin`, `--cors-allow-method`, `--cors-allow-header`, `--cors-allow-credentials`: configure CORS.
## Agent commands
<details>
<summary><strong>agents list</strong></summary>
```bash
sandbox-agent agents list --endpoint http://127.0.0.1:8787
```
</details>
<details>
<summary><strong>agents install</strong></summary>
```bash
sandbox-agent agents install claude --reinstall --endpoint http://127.0.0.1:8787
```
</details>
<details>
<summary><strong>agents modes</strong></summary>
```bash
sandbox-agent agents modes claude --endpoint http://127.0.0.1:8787
```
</details>
## Session commands
<details>
<summary><strong>sessions create</strong></summary>
```bash
sandbox-agent sessions create my-session \
--agent claude \
--agent-mode build \
--permission-mode default \
--endpoint http://127.0.0.1:8787
```
</details>
<details>
<summary><strong>sessions send-message</strong></summary>
```bash
sandbox-agent sessions send-message my-session \
--message "Summarize the repository" \
--endpoint http://127.0.0.1:8787
```
</details>
<details>
<summary><strong>sessions events</strong></summary>
```bash
sandbox-agent sessions events my-session --offset 0 --limit 50 --endpoint http://127.0.0.1:8787
```
</details>
<details>
<summary><strong>sessions events-sse</strong></summary>
```bash
sandbox-agent sessions events-sse my-session --offset 0 --endpoint http://127.0.0.1:8787
```
</details>
<details>
<summary><strong>sessions reply-question</strong></summary>
```bash
sandbox-agent sessions reply-question my-session QUESTION_ID \
--answers "yes" \
--endpoint http://127.0.0.1:8787
```
</details>
<details>
<summary><strong>sessions reject-question</strong></summary>
```bash
sandbox-agent sessions reject-question my-session QUESTION_ID --endpoint http://127.0.0.1:8787
```
</details>
<details>
<summary><strong>sessions reply-permission</strong></summary>
```bash
sandbox-agent sessions reply-permission my-session PERMISSION_ID \
--reply once \
--endpoint http://127.0.0.1:8787
```
</details>


@ -0,0 +1,21 @@
---
title: "Cloudflare Sandboxes"
description: "Deploy the daemon in Cloudflare Sandboxes."
---
## Steps
1. Create a Cloudflare Sandbox with a Linux runtime.
2. Install the agent binaries and the sandbox-agent daemon.
3. Start the daemon and expose the HTTP port.
```bash
export SANDBOX_TOKEN="..."
cargo run -p sandbox-agent -- \
--token "$SANDBOX_TOKEN" \
--host 0.0.0.0 \
--port 8787
```
4. Connect your client to the sandbox endpoint.


@ -0,0 +1,21 @@
---
title: "Daytona"
description: "Run the daemon in a Daytona workspace."
---
## Steps
1. Create a Daytona workspace with Rust and curl available.
2. Install or build the sandbox-agent binary.
3. Start the daemon and expose port `8787` (or your preferred port).
```bash
export SANDBOX_TOKEN="..."
cargo run -p sandbox-agent -- \
--token "$SANDBOX_TOKEN" \
--host 0.0.0.0 \
--port 8787
```
4. Use your Daytona port forwarding to reach the daemon from your client.


@ -0,0 +1,27 @@
---
title: "Docker (dev)"
description: "Build and run the daemon in a Docker container."
---
## Build the binary
Use the release Dockerfile to build a static binary:
```bash
docker build -f docker/release/linux-x86_64.Dockerfile -t sandbox-agent-build .
docker run --rm -v "$PWD/artifacts:/artifacts" sandbox-agent-build
```
The binary will be written to `./artifacts/sandbox-agent-x86_64-unknown-linux-musl`.
## Run the daemon
```bash
docker run --rm -p 8787:8787 \
-v "$PWD/artifacts:/artifacts" \
debian:bookworm-slim \
/artifacts/sandbox-agent-x86_64-unknown-linux-musl --token "$SANDBOX_TOKEN" --host 0.0.0.0 --port 8787
```
You can now access the API at `http://localhost:8787`.

docs/deployments/e2b.mdx Normal file

@ -0,0 +1,25 @@
---
title: "E2B"
description: "Deploy the daemon inside an E2B sandbox."
---
## Steps
1. Start an E2B sandbox with network access.
2. Install the agent binaries you need (Claude, Codex, OpenCode, Amp).
3. Run the daemon and expose its port.
Example startup script:
```bash
export SANDBOX_TOKEN="..."
# Install sandbox-agent binary (or build from source)
# TODO: replace with release download once published
cargo run -p sandbox-agent -- \
--token "$SANDBOX_TOKEN" \
--host 0.0.0.0 \
--port 8787
```
4. Configure your client to connect to the sandbox endpoint.


@ -0,0 +1,21 @@
---
title: "Vercel Sandboxes"
description: "Run the daemon inside Vercel Sandboxes."
---
## Steps
1. Provision a Vercel Sandbox with network access and storage.
2. Install the agent binaries you need.
3. Run the daemon and expose the port.
```bash
export SANDBOX_TOKEN="..."
cargo run -p sandbox-agent -- \
--token "$SANDBOX_TOKEN" \
--host 0.0.0.0 \
--port 8787
```
4. Configure your client to use the sandbox URL.

docs/docs.json Normal file

@ -0,0 +1,82 @@
{
"$schema": "https://mintlify.com/docs.json",
"theme": "dark",
"name": "Sandbox Agent SDK",
"colors": {
"primary": "#ff4f00",
"light": "#ff6a2a",
"dark": "#cc3f00",
"background": "#000000",
"card": "#1c1c1e",
"border": "#2c2c2e",
"inputBackground": "#2c2c2e",
"inputBorder": "#3a3a3c",
"text": "#ffffff",
"muted": "#8e8e93",
"success": "#30d158",
"warning": "#ff4f00",
"danger": "#ff3b30",
"purple": "#bf5af2"
},
"favicon": "/favicon.svg",
"logo": {
"light": "/logo/light.svg",
"dark": "/logo/dark.svg"
},
"navigation": {
"tabs": [
{
"tab": "Guides",
"groups": [
{
"group": "Getting started",
"pages": [
"index",
"quickstart",
"architecture",
"agent-compatibility",
"universal-api"
]
},
{
"group": "Operations",
"pages": [
"frontend"
]
}
]
},
{
"tab": "Reference",
"groups": [
{
"group": "Interfaces",
"pages": [
"cli",
"http-api",
"typescript-sdk"
]
}
]
},
{
"tab": "Deployments",
"groups": [
{
"group": "Examples",
"pages": [
"deployments/docker",
"deployments/e2b",
"deployments/daytona",
"deployments/vercel-sandboxes",
"deployments/cloudflare-sandboxes"
]
}
]
}
]
},
"styles": [
"/theme.css"
]
}

docs/favicon.svg Normal file

@ -0,0 +1,19 @@
<svg width="24" height="24" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M9.06145 23.1079C5.26816 22.3769 -3.39077 20.6274 1.4173 5.06384C9.6344 6.09939 16.9728 14.0644 9.06145 23.1079Z" fill="url(#paint0_linear_17557_2021)"/>
<path d="M8.91928 23.0939C5.27642 21.2223 0.78371 4.20891 17.0071 0C20.7569 7.19341 19.6212 16.5452 8.91928 23.0939Z" fill="url(#paint1_linear_17557_2021)"/>
<path d="M8.91388 23.0788C8.73534 19.8817 10.1585 9.08525 23.5699 13.1107C23.1812 20.1229 18.984 26.4182 8.91388 23.0788Z" fill="url(#paint2_linear_17557_2021)"/>
<defs>
<linearGradient id="paint0_linear_17557_2021" x1="3.77557" y1="5.91571" x2="5.23185" y2="21.5589" gradientUnits="userSpaceOnUse">
<stop stop-color="#18E299"/>
<stop offset="1" stop-color="#15803D"/>
</linearGradient>
<linearGradient id="paint1_linear_17557_2021" x1="12.1711" y1="-0.718425" x2="10.1897" y2="22.9832" gradientUnits="userSpaceOnUse">
<stop stop-color="#16A34A"/>
<stop offset="1" stop-color="#4ADE80"/>
</linearGradient>
<linearGradient id="paint2_linear_17557_2021" x1="23.1327" y1="15.353" x2="9.33841" y2="18.5196" gradientUnits="userSpaceOnUse">
<stop stop-color="#4ADE80"/>
<stop offset="1" stop-color="#0D9373"/>
</linearGradient>
</defs>
</svg>


docs/frontend.mdx Normal file

@ -0,0 +1,20 @@
---
title: "Frontend Demo"
description: "Run the Vite + React UI for testing the daemon."
---
The demo frontend lives at `frontend/packages/web`.
## Run locally
```bash
pnpm install
pnpm --filter @sandbox-agent/web dev
```
The UI expects:
- Endpoint (e.g. `http://127.0.0.1:8787`)
- Optional token
If you see CORS errors, enable CORS on the daemon with `--cors-allow-origin` and related flags.

docs/http-api.mdx Normal file

@ -0,0 +1,157 @@
---
title: "HTTP API"
description: "Endpoint reference for the sandbox agent daemon."
---
All endpoints are under `/v1`. Authentication uses the daemon-level token via `Authorization: Bearer <token>` or `x-sandbox-token`.
## Sessions
<details>
<summary><strong>POST /v1/sessions/{sessionId}</strong> - Create session</summary>
Request:
```json
{
"agent": "claude",
"agentMode": "build",
"permissionMode": "default",
"model": "claude-3-5-sonnet",
"variant": "high",
"agentVersion": "latest"
}
```
Response:
```json
{
"healthy": true,
"agentSessionId": "..."
}
```
</details>
<details>
<summary><strong>POST /v1/sessions/{sessionId}/messages</strong> - Send message</summary>
Request:
```json
{
"message": "Describe the repository."
}
```
</details>
<details>
<summary><strong>GET /v1/sessions/{sessionId}/events</strong> - Fetch events</summary>
Query params:
- `offset`: last-seen event id (exclusive)
- `limit`: max number of events
Response:
```json
{
"events": [
{
"id": 1,
"timestamp": "2026-01-25T10:00:00Z",
"sessionId": "my-session",
"agent": "claude",
"agentSessionId": "...",
"data": { "message": { "role": "assistant", "parts": [{ "type": "text", "text": "..." }] } }
}
],
"hasMore": false
}
```
</details>
<details>
<summary><strong>GET /v1/sessions/{sessionId}/events/sse</strong> - Stream events (SSE)</summary>
Query params:
- `offset`: last-seen event id (exclusive)
SSE payloads are `UniversalEvent` JSON.
</details>
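For the SSE endpoint above, each `data:` line carries one UniversalEvent as JSON. A minimal parser sketch (illustrative, not the SDK's implementation):

```typescript
// Parse an SSE chunk into event objects: keep only `data:` lines and
// JSON-decode their payloads. Other SSE fields are ignored for brevity.
function parseSse(chunk: string): unknown[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data:"))
    .map((line) => JSON.parse(line.slice(5).trim()));
}

const sample =
  'data: {"id":1,"sessionId":"my-session"}\n\n' +
  'data: {"id":2,"sessionId":"my-session"}\n\n';
```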
<details>
<summary><strong>POST /v1/sessions/{sessionId}/questions/{questionId}/reply</strong></summary>
Request:
```json
{ "answers": [["Option A"], ["Option B", "Option C"]] }
```
</details>
<details>
<summary><strong>POST /v1/sessions/{sessionId}/questions/{questionId}/reject</strong></summary>
Request:
```json
{}
```
</details>
<details>
<summary><strong>POST /v1/sessions/{sessionId}/permissions/{permissionId}/reply</strong></summary>
Request:
```json
{ "reply": "once" }
```
</details>
## Agents
<details>
<summary><strong>GET /v1/agents</strong> - List agents</summary>
Response:
```json
{
"agents": [
{ "id": "claude", "installed": true, "version": "...", "path": "/usr/local/bin/claude" }
]
}
```
</details>
<details>
<summary><strong>POST /v1/agents/{agentId}/install</strong> - Install agent</summary>
Request:
```json
{ "reinstall": false }
```
</details>
<details>
<summary><strong>GET /v1/agents/{agentId}/modes</strong> - List modes</summary>
Response:
```json
{
"modes": [
{ "id": "build", "name": "Build", "description": "Default coding mode" }
]
}
```
</details>
## Error handling
All errors use RFC 7807 Problem Details and stable `type` strings (e.g. `urn:sandbox-agent:error:session_not_found`).
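A client can branch on the stable `type` string rather than on status codes or messages. The interface below is a sketch of the standard RFC 7807 fields; the URN is the example from this page:

```typescript
// Sketch of narrowing an RFC 7807 problem-details body by its stable type.
interface ProblemDetails {
  type: string;
  title?: string;
  status?: number;
  detail?: string;
}

function isSessionNotFound(body: ProblemDetails): boolean {
  return body.type === "urn:sandbox-agent:error:session_not_found";
}
```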

docs/index.mdx Normal file

@ -0,0 +1,68 @@
---
title: "Overview"
description: "Universal API for running Claude Code, Codex, OpenCode, and Amp inside sandboxes."
---
Sandbox Agent SDK is a universal API and daemon for running coding agents inside sandboxes. It standardizes agent sessions, events, and human-in-the-loop workflows across Claude Code, Codex, OpenCode, and Amp.
## At a glance
- Universal HTTP API and TypeScript SDK
- Runs inside sandboxes with a lightweight Rust daemon
- Streams events in a shared UniversalEvent schema
- Supports questions and permission workflows
- Designed for multi-provider sandbox environments
## Quickstart
Run the daemon locally:
```bash
sandbox-agent --token "$SANDBOX_TOKEN" --host 127.0.0.1 --port 8787
```
Send a message:
```bash
curl -X POST "http://127.0.0.1:8787/v1/sessions/my-session" \
-H "Authorization: Bearer $SANDBOX_TOKEN" \
-H "Content-Type: application/json" \
-d '{"agent":"claude"}'
curl -X POST "http://127.0.0.1:8787/v1/sessions/my-session/messages" \
-H "Authorization: Bearer $SANDBOX_TOKEN" \
-H "Content-Type: application/json" \
-d '{"message":"Explain the repo structure."}'
```
See the full quickstart in [Quickstart](/quickstart).
## What this project solves
- **Universal Coding Agent API**: standardize tool calls, messages, and events across agents.
- **Agents in sandboxes**: run a single HTTP daemon inside any sandbox provider.
- **Agent transcripts**: stream or persist a universal event log in your own storage.
## Project scope
**In scope**
- Agent session orchestration inside a sandbox
- Streaming events in a universal schema
- Human-in-the-loop questions and permissions
- TypeScript SDK and CLI wrappers
**Out of scope**
- Persistent storage of sessions on disk
- Building custom LLM agents (use Vercel AI SDK for that)
- Sandbox provider APIs (use provider SDKs or custom glue)
- Git repo management
## Next steps
- Read the [Architecture](/architecture) overview
- Review [Agent compatibility](/agent-compatibility)
- See the [HTTP API](/http-api) and [CLI](/cli)
- Run the [Frontend demo](/frontend)
- Use the [TypeScript SDK](/typescript-sdk)

docs/logo/dark.svg Normal file

File diff suppressed because one or more lines are too long


docs/logo/light.svg Normal file

File diff suppressed because one or more lines are too long


docs/quickstart.mdx Normal file

@ -0,0 +1,75 @@
---
title: "Quickstart"
description: "Start the daemon and send your first message."
---
## 1. Run the daemon
Use the installed binary, or `cargo run` in development.
```bash
sandbox-agent --token "$SANDBOX_TOKEN" --host 127.0.0.1 --port 8787
```
If you want to run without auth (local dev only):
```bash
sandbox-agent --no-token --host 127.0.0.1 --port 8787
```
### CORS (frontend usage)
If you are calling the daemon from a browser, enable CORS explicitly:
```bash
sandbox-agent \
--token "$SANDBOX_TOKEN" \
--cors-allow-origin "http://localhost:5173" \
--cors-allow-method "GET" \
--cors-allow-method "POST" \
--cors-allow-header "Authorization" \
--cors-allow-header "Content-Type" \
--cors-allow-credentials
```
## 2. Create a session
```bash
curl -X POST "http://127.0.0.1:8787/v1/sessions/my-session" \
-H "Authorization: Bearer $SANDBOX_TOKEN" \
-H "Content-Type: application/json" \
-d '{"agent":"claude","agentMode":"build","permissionMode":"default"}'
```
## 3. Send a message
```bash
curl -X POST "http://127.0.0.1:8787/v1/sessions/my-session/messages" \
-H "Authorization: Bearer $SANDBOX_TOKEN" \
-H "Content-Type: application/json" \
-d '{"message":"Summarize the repository and suggest next steps."}'
```
## 4. Read events
```bash
curl "http://127.0.0.1:8787/v1/sessions/my-session/events?offset=0&limit=50" \
-H "Authorization: Bearer $SANDBOX_TOKEN"
```
For streaming output, use SSE:
```bash
curl "http://127.0.0.1:8787/v1/sessions/my-session/events/sse?offset=0" \
-H "Authorization: Bearer $SANDBOX_TOKEN"
```
## 5. CLI shortcuts
The CLI mirrors the HTTP API:
```bash
sandbox-agent sessions create my-session --agent claude --endpoint http://127.0.0.1:8787 --token "$SANDBOX_TOKEN"
sandbox-agent sessions send-message my-session --message "Hello" --endpoint http://127.0.0.1:8787 --token "$SANDBOX_TOKEN"
```

docs/theme.css Normal file

@ -0,0 +1,72 @@
:root {
color-scheme: dark;
--sa-primary: #ff4f00;
--sa-bg: #000000;
--sa-card: #1c1c1e;
--sa-border: #2c2c2e;
--sa-input-bg: #2c2c2e;
--sa-input-border: #3a3a3c;
--sa-text: #ffffff;
--sa-muted: #8e8e93;
--sa-success: #30d158;
--sa-warning: #ff4f00;
--sa-danger: #ff3b30;
--sa-purple: #bf5af2;
}
html,
body {
background-color: var(--sa-bg);
color: var(--sa-text);
}
a {
color: var(--sa-primary);
}
a:hover {
color: var(--sa-warning);
}
hr {
border-color: var(--sa-border);
}
input,
textarea,
select {
background-color: var(--sa-input-bg);
border: 1px solid var(--sa-input-border);
color: var(--sa-text);
}
code,
pre {
background-color: var(--sa-card);
border: 1px solid var(--sa-border);
color: var(--sa-text);
}
.card,
.mintlify-card,
.docs-card {
background-color: var(--sa-card);
border: 1px solid var(--sa-border);
}
.muted,
.text-muted {
color: var(--sa-muted);
}
.alert-success {
border-color: var(--sa-success);
}
.alert-warning {
border-color: var(--sa-warning);
}
.alert-danger {
border-color: var(--sa-danger);
}

docs/typescript-sdk.mdx Normal file

@ -0,0 +1,100 @@
---
title: "TypeScript SDK"
description: "Generated types and a thin fetch-based client."
---
The TypeScript SDK is generated from the OpenAPI spec produced by the Rust server.
## Generate types
```bash
pnpm --filter @sandbox-agent/typescript-sdk generate
```
This runs:
- `cargo run -p sandbox-agent-openapi-gen` to emit OpenAPI JSON
- `openapi-typescript` to generate types
## Usage
```ts
import { SandboxDaemonClient } from "@sandbox-agent/typescript-sdk";
const client = new SandboxDaemonClient({
baseUrl: "http://127.0.0.1:8787",
token: process.env.SANDBOX_TOKEN,
});
await client.createSession("my-session", { agent: "claude" });
await client.postMessage("my-session", { message: "Hello" });
const events = await client.getEvents("my-session", { offset: 0, limit: 50 });
```
## Endpoint mapping
<details>
<summary><strong>client.listAgents()</strong></summary>
Maps to `GET /v1/agents`.
</details>
<details>
<summary><strong>client.installAgent(agentId, body)</strong></summary>
Maps to `POST /v1/agents/{agentId}/install`.
</details>
<details>
<summary><strong>client.getAgentModes(agentId)</strong></summary>
Maps to `GET /v1/agents/{agentId}/modes`.
</details>
<details>
<summary><strong>client.createSession(sessionId, body)</strong></summary>
Maps to `POST /v1/sessions/{sessionId}`.
</details>
<details>
<summary><strong>client.postMessage(sessionId, body)</strong></summary>
Maps to `POST /v1/sessions/{sessionId}/messages`.
</details>
<details>
<summary><strong>client.getEvents(sessionId, params)</strong></summary>
Maps to `GET /v1/sessions/{sessionId}/events`.
</details>
<details>
<summary><strong>client.getEventsSse(sessionId, params)</strong></summary>
Maps to `GET /v1/sessions/{sessionId}/events/sse` (raw SSE response).
</details>
<details>
<summary><strong>client.streamEvents(sessionId, params)</strong></summary>
Helper that parses SSE into `UniversalEvent` objects.
</details>
<details>
<summary><strong>client.replyQuestion(sessionId, questionId, body)</strong></summary>
Maps to `POST /v1/sessions/{sessionId}/questions/{questionId}/reply`.
</details>
<details>
<summary><strong>client.rejectQuestion(sessionId, questionId)</strong></summary>
Maps to `POST /v1/sessions/{sessionId}/questions/{questionId}/reject`.
</details>
<details>
<summary><strong>client.replyPermission(sessionId, permissionId, body)</strong></summary>
Maps to `POST /v1/sessions/{sessionId}/permissions/{permissionId}/reply`.
</details>

docs/universal-api.mdx Normal file

@ -0,0 +1,30 @@
---
title: "Universal API"
description: "Feature checklist and normalization rules."
---
## Feature checklist
- [x] Session creation and lifecycle events
- [x] Message streaming (assistant and tool messages)
- [x] Tool call and tool result normalization
- [x] File and image parts
- [x] Human-in-the-loop questions
- [x] Permission prompts and replies
- [x] Plan approval normalization (Claude -> question)
- [x] Event streaming over SSE
- [ ] Persistent storage (out of scope)
## Normalization rules
- **Session ID** is always the client-provided ID.
- **Agent session ID** is surfaced in events but never replaces the primary session ID.
- **Tool calls** map to `UniversalMessagePart::ToolCall` and results to `ToolResult`.
- **File and image parts** map to `AttachmentSource` with `Path`, `Url`, or base64 `Data`.
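The `AttachmentSource` variants above can be sketched as a TypeScript union. The shape is an assumption mirroring the Rust enum names, not generated SDK code:

```typescript
// Illustrative union of the AttachmentSource variants: Path, Url, or
// inline base64 Data.
type AttachmentSource =
  | { kind: "path"; path: string }
  | { kind: "url"; url: string }
  | { kind: "data"; base64: string };

function describeSource(src: AttachmentSource): string {
  switch (src.kind) {
    case "path":
      return `file at ${src.path}`;
    case "url":
      return `remote ${src.url}`;
    case "data":
      // Decode to count the payload bytes.
      return `${Buffer.from(src.base64, "base64").length} bytes inline`;
  }
}
```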
## Agent mode vs permission mode
- **agentMode**: behavior or system prompt strategy (build/plan/custom).
- **permissionMode**: capability restrictions (default/plan/bypass).
These are separate concepts and must be configured independently.
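For example, a create-session body (mirroring the HTTP API page) can pair a planning agent mode with the default permission mode:

```typescript
// The two modes are set independently on session creation.
const createSessionBody = {
  agent: "claude",
  agentMode: "plan",         // behavior / system prompt strategy
  permissionMode: "default", // capability restrictions, not tied to agentMode
};
```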


@ -1,3 +1,4 @@
use std::collections::HashMap;
use std::io::Write;
use std::path::PathBuf;
@ -5,6 +6,10 @@ use clap::{Args, Parser, Subcommand};
use reqwest::blocking::Client as HttpClient;
use reqwest::Method;
use sandbox_agent_agent_management::agents::AgentManager;
use sandbox_agent_agent_management::credentials::{
extract_all_credentials, AuthType, CredentialExtractionOptions, ExtractedCredentials,
ProviderCredentials,
};
use sandbox_agent_core::router::{
AgentInstallRequest, AppState, AuthConfig, CreateSessionRequest, MessageRequest,
PermissionReply, PermissionReplyRequest, QuestionReplyRequest,
@ -26,35 +31,39 @@ struct Cli {
#[command(subcommand)]
command: Option<Command>,
#[arg(long, default_value = "127.0.0.1")]
#[arg(long, short = 'H', default_value = "127.0.0.1")]
host: String,
#[arg(long, default_value_t = 8787)]
#[arg(long, short = 'p', default_value_t = 2468)]
port: u16,
#[arg(long)]
#[arg(long, short = 't')]
token: Option<String>,
#[arg(long)]
#[arg(long, short = 'n')]
no_token: bool,
#[arg(long = "cors-allow-origin")]
#[arg(long = "cors-allow-origin", short = 'O')]
cors_allow_origin: Vec<String>,
#[arg(long = "cors-allow-method")]
#[arg(long = "cors-allow-method", short = 'M')]
cors_allow_method: Vec<String>,
#[arg(long = "cors-allow-header")]
#[arg(long = "cors-allow-header", short = 'A')]
cors_allow_header: Vec<String>,
#[arg(long = "cors-allow-credentials")]
#[arg(long = "cors-allow-credentials", short = 'C')]
cors_allow_credentials: bool,
}
#[derive(Subcommand, Debug)]
enum Command {
/// Manage installed agents and their modes.
Agents(AgentsArgs),
/// Create sessions and interact with session events.
Sessions(SessionsArgs),
/// Inspect locally discovered credentials.
Credentials(CredentialsArgs),
}
#[derive(Args, Debug)]
@ -69,42 +78,65 @@ struct SessionsArgs {
command: SessionsCommand,
}
#[derive(Args, Debug)]
struct CredentialsArgs {
#[command(subcommand)]
command: CredentialsCommand,
}
#[derive(Subcommand, Debug)]
enum AgentsCommand {
/// List all agents and install status.
List(ClientArgs),
/// Install or reinstall an agent.
Install(InstallAgentArgs),
/// Show available modes for an agent.
Modes(AgentModesArgs),
}
#[derive(Subcommand, Debug)]
enum CredentialsCommand {
/// Extract credentials using local discovery rules.
Extract(CredentialsExtractArgs),
}
#[derive(Subcommand, Debug)]
enum SessionsCommand {
/// Create a new session for an agent.
Create(CreateSessionArgs),
#[command(name = "send-message")]
/// Send a message to an existing session.
SendMessage(SessionMessageArgs),
#[command(name = "get-messages")]
/// Alias for events; returns session events.
GetMessages(SessionEventsArgs),
#[command(name = "events")]
/// Fetch session events with offset/limit.
Events(SessionEventsArgs),
#[command(name = "events-sse")]
/// Stream session events over SSE.
EventsSse(SessionEventsSseArgs),
#[command(name = "reply-question")]
/// Reply to a question event.
ReplyQuestion(QuestionReplyArgs),
#[command(name = "reject-question")]
/// Reject a question event.
RejectQuestion(QuestionRejectArgs),
#[command(name = "reply-permission")]
/// Reply to a permission request.
ReplyPermission(PermissionReplyArgs),
}
#[derive(Args, Debug, Clone)]
struct ClientArgs {
#[arg(long)]
#[arg(long, short = 'e')]
endpoint: Option<String>,
}
#[derive(Args, Debug)]
struct InstallAgentArgs {
agent: String,
#[arg(long)]
#[arg(long, short = 'r')]
reinstall: bool,
#[command(flatten)]
client: ClientArgs,
@ -120,17 +152,17 @@ struct AgentModesArgs {
#[derive(Args, Debug)]
struct CreateSessionArgs {
session_id: String,
#[arg(long)]
#[arg(long, short = 'a')]
agent: String,
#[arg(long)]
#[arg(long, short = 'g')]
agent_mode: Option<String>,
#[arg(long)]
#[arg(long, short = 'p')]
permission_mode: Option<String>,
#[arg(long)]
#[arg(long, short = 'm')]
model: Option<String>,
#[arg(long)]
#[arg(long, short = 'v')]
variant: Option<String>,
#[arg(long)]
#[arg(long, short = 'A')]
agent_version: Option<String>,
#[command(flatten)]
client: ClientArgs,
@ -139,7 +171,7 @@ struct CreateSessionArgs {
#[derive(Args, Debug)]
struct SessionMessageArgs {
session_id: String,
#[arg(long, short = 'm')]
message: String,
#[command(flatten)]
client: ClientArgs,
@ -148,9 +180,9 @@ struct SessionMessageArgs {
#[derive(Args, Debug)]
struct SessionEventsArgs {
session_id: String,
#[arg(long, short = 'o')]
offset: Option<u64>,
#[arg(long, short = 'l')]
limit: Option<u64>,
#[command(flatten)]
client: ClientArgs,
@ -159,7 +191,7 @@ struct SessionEventsArgs {
#[derive(Args, Debug)]
struct SessionEventsSseArgs {
session_id: String,
#[arg(long, short = 'o')]
offset: Option<u64>,
#[command(flatten)]
client: ClientArgs,
@ -169,7 +201,7 @@ struct SessionEventsSseArgs {
struct QuestionReplyArgs {
session_id: String,
question_id: String,
#[arg(long, short = 'a')]
answers: String,
#[command(flatten)]
client: ClientArgs,
@ -187,12 +219,26 @@ struct QuestionRejectArgs {
struct PermissionReplyArgs {
session_id: String,
permission_id: String,
#[arg(long, short = 'r')]
reply: PermissionReply,
#[command(flatten)]
client: ClientArgs,
}
#[derive(Args, Debug)]
struct CredentialsExtractArgs {
#[arg(long, short = 'a', value_enum)]
agent: Option<CredentialAgent>,
#[arg(long, short = 'p')]
provider: Option<String>,
#[arg(long, short = 'd')]
home_dir: Option<PathBuf>,
#[arg(long, short = 'n')]
no_oauth: bool,
#[arg(long, short = 'r')]
reveal: bool,
}
#[derive(Debug, Error)]
enum CliError {
#[error("missing --token or --no-token for server mode")]
@ -280,6 +326,7 @@ fn run_client(command: &Command, cli: &Cli) -> Result<(), CliError> {
match command {
Command::Agents(subcommand) => run_agents(&subcommand.command, cli),
Command::Sessions(subcommand) => run_sessions(&subcommand.command, cli),
Command::Credentials(subcommand) => run_credentials(&subcommand.command),
}
}
@ -380,6 +427,200 @@ fn run_sessions(command: &SessionsCommand, cli: &Cli) -> Result<(), CliError> {
}
}
fn run_credentials(command: &CredentialsCommand) -> Result<(), CliError> {
match command {
CredentialsCommand::Extract(args) => {
let mut options = CredentialExtractionOptions::new();
if let Some(home_dir) = args.home_dir.clone() {
options.home_dir = Some(home_dir);
}
if args.no_oauth {
options.include_oauth = false;
}
let credentials = extract_all_credentials(&options);
if let Some(agent) = args.agent.clone() {
let token = select_token_for_agent(&credentials, agent, args.provider.as_deref())?;
write_stdout_line(&token)?;
return Ok(());
}
if let Some(provider) = args.provider.as_deref() {
let token = select_token_for_provider(&credentials, provider)?;
write_stdout_line(&token)?;
return Ok(());
}
let output = credentials_to_output(credentials, args.reveal);
let pretty = serde_json::to_string_pretty(&output)?;
write_stdout_line(&pretty)?;
Ok(())
}
}
}
#[derive(Serialize)]
struct CredentialsOutput {
anthropic: Option<CredentialSummary>,
openai: Option<CredentialSummary>,
other: HashMap<String, CredentialSummary>,
}
#[derive(Serialize)]
struct CredentialSummary {
provider: String,
source: String,
auth_type: String,
api_key: String,
redacted: bool,
}
#[derive(clap::ValueEnum, Clone, Debug)]
enum CredentialAgent {
Claude,
Codex,
Opencode,
Amp,
}
fn credentials_to_output(credentials: ExtractedCredentials, reveal: bool) -> CredentialsOutput {
CredentialsOutput {
anthropic: credentials.anthropic.map(|cred| summarize_credential(&cred, reveal)),
openai: credentials.openai.map(|cred| summarize_credential(&cred, reveal)),
other: credentials
.other
.into_iter()
.map(|(key, cred)| (key, summarize_credential(&cred, reveal)))
.collect(),
}
}
fn summarize_credential(credential: &ProviderCredentials, reveal: bool) -> CredentialSummary {
let api_key = if reveal {
credential.api_key.clone()
} else {
redact_key(&credential.api_key)
};
CredentialSummary {
provider: credential.provider.clone(),
source: credential.source.clone(),
auth_type: match credential.auth_type {
AuthType::ApiKey => "api_key".to_string(),
AuthType::Oauth => "oauth".to_string(),
},
api_key,
redacted: !reveal,
}
}
fn redact_key(key: &str) -> String {
let trimmed = key.trim();
let len = trimmed.len();
if len <= 8 {
return "****".to_string();
}
let prefix = &trimmed[..4];
let suffix = &trimmed[len - 4..];
format!("{prefix}...{suffix}")
}
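The redaction rule above keeps only the first and last four characters of keys longer than eight characters. A self-contained sketch of the same rule (`demo_redact` is an illustrative local copy, not part of this crate):

```rust
/// Illustrative copy of the redaction rule above: keys of eight or
/// fewer characters are fully masked; longer keys keep the first and
/// last four characters.
fn demo_redact(key: &str) -> String {
    let trimmed = key.trim();
    let len = trimmed.len();
    if len <= 8 {
        return "****".to_string();
    }
    format!("{}...{}", &trimmed[..4], &trimmed[len - 4..])
}
```

The slices are byte-indexed, so this assumes ASCII keys, which holds for provider API keys.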
fn select_token_for_agent(
credentials: &ExtractedCredentials,
agent: CredentialAgent,
provider: Option<&str>,
) -> Result<String, CliError> {
match agent {
CredentialAgent::Claude | CredentialAgent::Amp => {
if let Some(provider) = provider {
if provider != "anthropic" {
return Err(CliError::Server(format!(
"agent {:?} only supports provider anthropic",
agent
)));
}
}
select_token_for_provider(credentials, "anthropic")
}
CredentialAgent::Codex => {
if let Some(provider) = provider {
if provider != "openai" {
return Err(CliError::Server(format!(
"agent {:?} only supports provider openai",
agent
)));
}
}
select_token_for_provider(credentials, "openai")
}
CredentialAgent::Opencode => {
if let Some(provider) = provider {
return select_token_for_provider(credentials, provider);
}
if let Some(openai) = credentials.openai.as_ref() {
return Ok(openai.api_key.clone());
}
if let Some(anthropic) = credentials.anthropic.as_ref() {
return Ok(anthropic.api_key.clone());
}
if credentials.other.len() == 1 {
if let Some((_, cred)) = credentials.other.iter().next() {
return Ok(cred.api_key.clone());
}
}
let available = available_providers(credentials);
if available.is_empty() {
Err(CliError::Server(
"no credentials found for opencode".to_string(),
))
} else {
Err(CliError::Server(format!(
"multiple providers available for opencode: {} (use --provider)",
available.join(", ")
)))
}
}
}
}
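For opencode, the fallback above resolves a token by precedence: explicit `--provider`, then openai, then anthropic, then a sole remaining provider, otherwise an error listing the candidates. A minimal sketch of that precedence (function and parameter names here are illustrative):

```rust
/// Illustrative sketch of the opencode token-selection precedence:
/// openai, then anthropic, then a single leftover provider.
fn pick_token(
    openai: Option<&str>,
    anthropic: Option<&str>,
    other: &[(&str, &str)],
) -> Result<String, String> {
    if let Some(key) = openai.or(anthropic) {
        return Ok(key.to_string());
    }
    match other {
        [(_, key)] => Ok((*key).to_string()),
        [] => Err("no credentials found for opencode".to_string()),
        _ => Err("multiple providers available (use --provider)".to_string()),
    }
}
```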
fn select_token_for_provider(
credentials: &ExtractedCredentials,
provider: &str,
) -> Result<String, CliError> {
if let Some(cred) = provider_credential(credentials, provider) {
Ok(cred.api_key.clone())
} else {
Err(CliError::Server(format!(
"no credentials found for provider {provider}"
)))
}
}
fn provider_credential<'a>(
credentials: &'a ExtractedCredentials,
provider: &str,
) -> Option<&'a ProviderCredentials> {
match provider {
"openai" => credentials.openai.as_ref(),
"anthropic" => credentials.anthropic.as_ref(),
_ => credentials.other.get(provider),
}
}
fn available_providers(credentials: &ExtractedCredentials) -> Vec<String> {
let mut providers = Vec::new();
if credentials.openai.is_some() {
providers.push("openai".to_string());
}
if credentials.anthropic.is_some() {
providers.push("anthropic".to_string());
}
for key in credentials.other.keys() {
providers.push(key.clone());
}
providers.sort();
providers.dedup();
providers
}
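`available_providers` returns an alphabetical, deduplicated list rather than insertion order, which keeps the error message above stable. A stdlib-only sketch of that ordering behavior (`demo_providers` is an illustrative local copy):

```rust
use std::collections::HashMap;

/// Illustrative copy of the provider enumeration above: known providers
/// first, then extras, then sort + dedup for stable output.
fn demo_providers(openai: bool, anthropic: bool, other: &HashMap<String, String>) -> Vec<String> {
    let mut providers = Vec::new();
    if openai {
        providers.push("openai".to_string());
    }
    if anthropic {
        providers.push("anthropic".to_string());
    }
    for key in other.keys() {
        providers.push(key.clone());
    }
    providers.sort();
    providers.dedup();
    providers
}
```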
fn build_cors_layer(cli: &Cli) -> Result<Option<CorsLayer>, CliError> {
let has_config = !cli.cors_allow_origin.is_empty()
|| !cli.cors_allow_method.is_empty()



@ -184,6 +184,9 @@ struct SessionState {
model: Option<String>,
variant: Option<String>,
agent_session_id: Option<String>,
ended: bool,
ended_exit_code: Option<i32>,
ended_message: Option<String>,
next_event_id: u64,
events: Vec<UniversalEvent>,
pending_questions: HashSet<String>,
@ -213,6 +216,9 @@ impl SessionState {
model: request.model.clone(),
variant: request.variant.clone(),
agent_session_id: None,
ended: false,
ended_exit_code: None,
ended_message: None,
next_event_id: 0,
events: Vec::new(),
pending_questions: HashSet::new(),
@ -290,6 +296,23 @@ impl SessionState {
fn take_permission(&mut self, permission_id: &str) -> bool {
self.pending_permissions.remove(permission_id)
}
fn mark_ended(&mut self, exit_code: Option<i32>, message: String) {
self.ended = true;
self.ended_exit_code = exit_code;
self.ended_message = Some(message);
}
fn ended_error(&self) -> Option<SandboxError> {
if !self.ended {
return None;
}
Some(SandboxError::AgentProcessExited {
agent: self.agent.as_str().to_string(),
exit_code: self.ended_exit_code,
stderr: self.ended_message.clone(),
})
}
}
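Once `mark_ended` runs, `ended_error` turns every later session operation into an `AgentProcessExited` error instead of silently queueing work against a dead agent. A minimal sketch of that guard pattern (the `DemoSession` type is illustrative, not part of this crate):

```rust
/// Illustrative sketch of the ended-session guard: after the agent
/// process exits, subsequent operations surface the recorded exit.
struct DemoSession {
    ended: bool,
    exit_code: Option<i32>,
}

impl DemoSession {
    fn guard(&self) -> Result<(), String> {
        if self.ended {
            return Err(format!("agent process exited (code {:?})", self.exit_code));
        }
        Ok(())
    }
}
```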
#[derive(Debug)]
@ -408,7 +431,7 @@ impl SessionManager {
session_id: String,
message: String,
) -> Result<(), SandboxError> {
let session_snapshot = self.session_snapshot(&session_id, false).await?;
if session_snapshot.agent == AgentId::Opencode {
self.ensure_opencode_stream(session_id.clone()).await?;
self.send_opencode_prompt(&session_snapshot, &message).await?;
@ -511,6 +534,9 @@ impl SessionManager {
let session = sessions.get_mut(session_id).ok_or_else(|| SandboxError::SessionNotFound {
session_id: session_id.to_string(),
})?;
if let Some(err) = session.ended_error() {
return Err(err);
}
if !session.take_question(question_id) {
return Err(SandboxError::InvalidRequest {
message: format!("unknown question id: {question_id}"),
@ -542,6 +568,9 @@ impl SessionManager {
let session = sessions.get_mut(session_id).ok_or_else(|| SandboxError::SessionNotFound {
session_id: session_id.to_string(),
})?;
if let Some(err) = session.ended_error() {
return Err(err);
}
if !session.take_question(question_id) {
return Err(SandboxError::InvalidRequest {
message: format!("unknown question id: {question_id}"),
@ -574,6 +603,9 @@ impl SessionManager {
let session = sessions.get_mut(session_id).ok_or_else(|| SandboxError::SessionNotFound {
session_id: session_id.to_string(),
})?;
if let Some(err) = session.ended_error() {
return Err(err);
}
if !session.take_permission(permission_id) {
return Err(SandboxError::InvalidRequest {
message: format!("unknown permission id: {permission_id}"),
@ -595,11 +627,20 @@ impl SessionManager {
Ok(())
}
async fn session_snapshot(
&self,
session_id: &str,
allow_ended: bool,
) -> Result<SessionSnapshot, SandboxError> {
let sessions = self.sessions.lock().await;
let session = sessions.get(session_id).ok_or_else(|| SandboxError::SessionNotFound {
session_id: session_id.to_string(),
})?;
if !allow_ended {
if let Some(err) = session.ended_error() {
return Err(err);
}
}
Ok(SessionSnapshot::from(session))
}
@ -641,26 +682,47 @@ impl SessionManager {
Ok(Ok(status)) if status.success() => {}
Ok(Ok(status)) => {
let message = format!("agent exited with status {:?}", status);
self.record_error(
&session_id,
message.clone(),
Some("process_exit".to_string()),
None,
)
.await;
self.mark_session_ended(&session_id, status.code(), &message)
.await;
}
Ok(Err(err)) => {
let message = format!("failed to wait for agent: {err}");
self.record_error(
&session_id,
message.clone(),
Some("process_wait_failed".to_string()),
None,
)
.await;
self.mark_session_ended(
&session_id,
None,
&message,
)
.await;
}
Err(err) => {
let message = format!("failed to join agent task: {err}");
self.record_error(
&session_id,
message.clone(),
Some("process_wait_failed".to_string()),
None,
)
.await;
self.mark_session_ended(
&session_id,
None,
&message,
)
.await;
}
}
}
@ -707,6 +769,16 @@ impl SessionManager {
.await;
}
async fn mark_session_ended(&self, session_id: &str, exit_code: Option<i32>, message: &str) {
let mut sessions = self.sessions.lock().await;
if let Some(session) = sessions.get_mut(session_id) {
if session.ended {
return;
}
session.mark_ended(exit_code, message.to_string());
}
}
async fn ensure_opencode_stream(self: &Arc<Self>, session_id: String) -> Result<(), SandboxError> {
let agent_session_id = {
let mut sessions = self.sessions.lock().await;
@ -744,6 +816,12 @@ impl SessionManager {
None,
)
.await;
self.mark_session_ended(
&session_id,
None,
"opencode server unavailable",
)
.await;
return;
}
};
@ -759,6 +837,12 @@ impl SessionManager {
None,
)
.await;
self.mark_session_ended(
&session_id,
None,
"opencode sse connection failed",
)
.await;
return;
}
};
@ -773,6 +857,12 @@ impl SessionManager {
None,
)
.await;
self.mark_session_ended(
&session_id,
None,
"opencode sse error",
)
.await;
return;
}
@ -789,6 +879,12 @@ impl SessionManager {
None,
)
.await;
self.mark_session_ended(
&session_id,
None,
"opencode sse stream error",
)
.await;
return;
}
};


@ -0,0 +1,155 @@
use crate::{
message_from_parts,
message_from_text,
text_only_from_parts,
ConversionError,
CrashInfo,
EventConversion,
UniversalEventData,
UniversalMessage,
UniversalMessageParsed,
UniversalMessagePart,
};
use crate::amp as schema;
use serde_json::{Map, Value};
pub fn event_to_universal(event: &schema::StreamJsonMessage) -> EventConversion {
let schema::StreamJsonMessage {
content,
error,
id,
tool_call,
type_,
} = event;
match type_ {
schema::StreamJsonMessageType::Message => {
let text = content.clone().unwrap_or_default();
let mut message = message_from_text("assistant", text);
if let UniversalMessage::Parsed(parsed) = &mut message {
parsed.id = id.clone();
}
EventConversion::new(UniversalEventData::Message { message })
}
schema::StreamJsonMessageType::ToolCall => {
let tool_call = tool_call.as_ref();
let part = if let Some(tool_call) = tool_call {
let schema::ToolCall { arguments, id, name } = tool_call;
let input = match arguments {
schema::ToolCallArguments::Variant0(text) => Value::String(text.clone()),
schema::ToolCallArguments::Variant1(map) => Value::Object(map.clone()),
};
UniversalMessagePart::ToolCall {
id: Some(id.clone()),
name: name.clone(),
input,
}
} else {
UniversalMessagePart::Unknown { raw: Value::Null }
};
let mut message = message_from_parts("assistant", vec![part]);
if let UniversalMessage::Parsed(parsed) = &mut message {
parsed.id = id.clone();
}
EventConversion::new(UniversalEventData::Message { message })
}
schema::StreamJsonMessageType::ToolResult => {
let output = content
.clone()
.map(Value::String)
.unwrap_or(Value::Null);
let part = UniversalMessagePart::ToolResult {
id: id.clone(),
name: None,
output,
is_error: None,
};
let message = message_from_parts("tool", vec![part]);
EventConversion::new(UniversalEventData::Message { message })
}
schema::StreamJsonMessageType::Error => {
let message = error.clone().unwrap_or_else(|| "amp error".to_string());
let crash = CrashInfo {
message,
kind: Some("amp".to_string()),
details: serde_json::to_value(event).ok(),
};
EventConversion::new(UniversalEventData::Error { error: crash })
}
schema::StreamJsonMessageType::Done => EventConversion::new(UniversalEventData::Unknown {
raw: serde_json::to_value(event).unwrap_or(Value::Null),
}),
}
}
pub fn universal_event_to_amp(event: &UniversalEventData) -> Result<schema::StreamJsonMessage, ConversionError> {
match event {
UniversalEventData::Message { message } => {
let parsed = match message {
UniversalMessage::Parsed(parsed) => parsed,
UniversalMessage::Unparsed { .. } => {
return Err(ConversionError::Unsupported("unparsed message"))
}
};
let content = text_only_from_parts(&parsed.parts)?;
Ok(schema::StreamJsonMessage {
content: Some(content),
error: None,
id: parsed.id.clone(),
tool_call: None,
type_: schema::StreamJsonMessageType::Message,
})
}
_ => Err(ConversionError::Unsupported("amp event")),
}
}
pub fn message_to_universal(message: &schema::Message) -> UniversalMessage {
let schema::Message {
role,
content,
tool_calls,
} = message;
let mut parts = vec![UniversalMessagePart::Text {
text: content.clone(),
}];
for call in tool_calls {
let schema::ToolCall { arguments, id, name } = call;
let input = match arguments {
schema::ToolCallArguments::Variant0(text) => Value::String(text.clone()),
schema::ToolCallArguments::Variant1(map) => Value::Object(map.clone()),
};
parts.push(UniversalMessagePart::ToolCall {
id: Some(id.clone()),
name: name.clone(),
input,
});
}
UniversalMessage::Parsed(UniversalMessageParsed {
role: role.to_string(),
id: None,
metadata: Map::new(),
parts,
})
}
pub fn universal_message_to_message(
message: &UniversalMessage,
) -> Result<schema::Message, ConversionError> {
let parsed = match message {
UniversalMessage::Parsed(parsed) => parsed,
UniversalMessage::Unparsed { .. } => {
return Err(ConversionError::Unsupported("unparsed message"))
}
};
let content = text_only_from_parts(&parsed.parts)?;
Ok(schema::Message {
role: match parsed.role.as_str() {
"user" => schema::MessageRole::User,
"assistant" => schema::MessageRole::Assistant,
"system" => schema::MessageRole::System,
_ => schema::MessageRole::User,
},
content,
tool_calls: vec![],
})
}


@ -0,0 +1,239 @@
use crate::{
message_from_parts,
message_from_text,
text_only_from_parts,
ConversionError,
EventConversion,
QuestionInfo,
QuestionOption,
QuestionRequest,
UniversalEventData,
UniversalMessage,
UniversalMessageParsed,
UniversalMessagePart,
};
use serde_json::{Map, Value};
pub fn event_to_universal_with_session(
event: &Value,
session_id: String,
) -> EventConversion {
let event_type = event.get("type").and_then(Value::as_str).unwrap_or("");
match event_type {
"assistant" => assistant_event_to_universal(event),
"tool_use" => tool_use_event_to_universal(event, session_id),
"tool_result" => tool_result_event_to_universal(event),
"result" => result_event_to_universal(event),
_ => EventConversion::new(UniversalEventData::Unknown { raw: event.clone() }),
}
}
pub fn universal_event_to_claude(event: &UniversalEventData) -> Result<Value, ConversionError> {
match event {
UniversalEventData::Message { message } => {
let parsed = match message {
UniversalMessage::Parsed(parsed) => parsed,
UniversalMessage::Unparsed { .. } => {
return Err(ConversionError::Unsupported("unparsed message"))
}
};
let text = text_only_from_parts(&parsed.parts)?;
Ok(Value::Object(Map::from_iter([
("type".to_string(), Value::String("assistant".to_string())),
(
"message".to_string(),
Value::Object(Map::from_iter([(
"content".to_string(),
Value::Array(vec![Value::Object(Map::from_iter([(
"type".to_string(),
Value::String("text".to_string()),
), (
"text".to_string(),
Value::String(text),
)]))]),
)])),
),
])))
}
_ => Err(ConversionError::Unsupported("claude event")),
}
}
pub fn prompt_to_universal(prompt: &str) -> UniversalMessage {
message_from_text("user", prompt.to_string())
}
pub fn universal_message_to_prompt(message: &UniversalMessage) -> Result<String, ConversionError> {
let parsed = match message {
UniversalMessage::Parsed(parsed) => parsed,
UniversalMessage::Unparsed { .. } => {
return Err(ConversionError::Unsupported("unparsed message"))
}
};
text_only_from_parts(&parsed.parts)
}
fn assistant_event_to_universal(event: &Value) -> EventConversion {
let content = event
.get("message")
.and_then(|msg| msg.get("content"))
.and_then(Value::as_array)
.cloned()
.unwrap_or_default();
let mut parts = Vec::new();
for block in content {
let block_type = block.get("type").and_then(Value::as_str).unwrap_or("");
match block_type {
"text" => {
if let Some(text) = block.get("text").and_then(Value::as_str) {
parts.push(UniversalMessagePart::Text {
text: text.to_string(),
});
}
}
"tool_use" => {
if let Some(name) = block.get("name").and_then(Value::as_str) {
let input = block.get("input").cloned().unwrap_or(Value::Null);
let id = block.get("id").and_then(Value::as_str).map(|s| s.to_string());
parts.push(UniversalMessagePart::ToolCall {
id,
name: name.to_string(),
input,
});
}
}
_ => parts.push(UniversalMessagePart::Unknown { raw: block }),
}
}
let message = UniversalMessage::Parsed(UniversalMessageParsed {
role: "assistant".to_string(),
id: None,
metadata: Map::new(),
parts,
});
EventConversion::new(UniversalEventData::Message { message })
}
fn tool_use_event_to_universal(event: &Value, session_id: String) -> EventConversion {
let tool_use = event.get("tool_use");
let name = tool_use
.and_then(|tool| tool.get("name"))
.and_then(Value::as_str)
.unwrap_or("");
let input = tool_use
.and_then(|tool| tool.get("input"))
.cloned()
.unwrap_or(Value::Null);
let id = tool_use
.and_then(|tool| tool.get("id"))
.and_then(Value::as_str)
.map(|s| s.to_string());
if name == "AskUserQuestion" {
if let Some(question) =
question_from_claude_input(&input, id.clone(), session_id.clone())
{
return EventConversion::new(UniversalEventData::QuestionAsked {
question_asked: question,
});
}
}
let message = message_from_parts(
"assistant",
vec![UniversalMessagePart::ToolCall {
id,
name: name.to_string(),
input,
}],
);
EventConversion::new(UniversalEventData::Message { message })
}
fn tool_result_event_to_universal(event: &Value) -> EventConversion {
let tool_result = event.get("tool_result");
let output = tool_result
.and_then(|tool| tool.get("content"))
.cloned()
.unwrap_or(Value::Null);
let is_error = tool_result
.and_then(|tool| tool.get("is_error"))
.and_then(Value::as_bool);
let id = tool_result
.and_then(|tool| tool.get("id"))
.and_then(Value::as_str)
.map(|s| s.to_string());
let message = message_from_parts(
"tool",
vec![UniversalMessagePart::ToolResult {
id,
name: None,
output,
is_error,
}],
);
EventConversion::new(UniversalEventData::Message { message })
}
fn result_event_to_universal(event: &Value) -> EventConversion {
let result_text = event
.get("result")
.and_then(Value::as_str)
.unwrap_or("")
.to_string();
let session_id = event
.get("session_id")
.and_then(Value::as_str)
.map(|s| s.to_string());
let message = message_from_text("assistant", result_text);
EventConversion::new(UniversalEventData::Message { message }).with_session(session_id)
}
fn question_from_claude_input(
input: &Value,
tool_id: Option<String>,
session_id: String,
) -> Option<QuestionRequest> {
let questions = input.get("questions").and_then(Value::as_array)?;
let mut parsed_questions = Vec::new();
for question in questions {
let question_text = question.get("question")?.as_str()?.to_string();
let header = question
.get("header")
.and_then(Value::as_str)
.map(|s| s.to_string());
let multi_select = question
.get("multiSelect")
.and_then(Value::as_bool);
let options = question
.get("options")
.and_then(Value::as_array)
.map(|options| {
options
.iter()
.filter_map(|option| {
let label = option.get("label")?.as_str()?.to_string();
let description = option
.get("description")
.and_then(Value::as_str)
.map(|s| s.to_string());
Some(QuestionOption { label, description })
})
.collect::<Vec<_>>()
})?;
parsed_questions.push(QuestionInfo {
question: question_text,
header,
options,
multi_select,
custom: None,
});
}
Some(QuestionRequest {
id: tool_id.unwrap_or_else(|| "claude-question".to_string()),
session_id,
questions: parsed_questions,
tool: None,
})
}


@ -0,0 +1,375 @@
use crate::{
extract_message_from_value,
text_only_from_parts,
AttachmentSource,
ConversionError,
CrashInfo,
EventConversion,
Started,
UniversalEventData,
UniversalMessage,
UniversalMessageParsed,
UniversalMessagePart,
};
use crate::codex as schema;
use serde_json::{Map, Value};
pub fn event_to_universal(event: &schema::ThreadEvent) -> EventConversion {
let schema::ThreadEvent {
error,
item,
thread_id,
type_,
} = event;
match type_ {
schema::ThreadEventType::ThreadCreated | schema::ThreadEventType::ThreadUpdated => {
let started = Started {
message: Some(type_.to_string()),
details: serde_json::to_value(event).ok(),
};
EventConversion::new(UniversalEventData::Started { started })
.with_session(thread_id.clone())
}
schema::ThreadEventType::ItemCreated | schema::ThreadEventType::ItemUpdated => {
if let Some(item) = item.as_ref() {
let message = thread_item_to_message(item);
EventConversion::new(UniversalEventData::Message { message })
.with_session(thread_id.clone())
} else {
EventConversion::new(UniversalEventData::Unknown {
raw: serde_json::to_value(event).unwrap_or(Value::Null),
})
}
}
schema::ThreadEventType::Error => {
let message = extract_message_from_value(&Value::Object(error.clone()))
.unwrap_or_else(|| "codex error".to_string());
let crash = CrashInfo {
message,
kind: Some("error".to_string()),
details: Some(Value::Object(error.clone())),
};
EventConversion::new(UniversalEventData::Error { error: crash })
.with_session(thread_id.clone())
}
}
}
pub fn universal_event_to_codex(event: &UniversalEventData) -> Result<schema::ThreadEvent, ConversionError> {
match event {
UniversalEventData::Message { message } => {
let parsed = match message {
UniversalMessage::Parsed(parsed) => parsed,
UniversalMessage::Unparsed { .. } => {
return Err(ConversionError::Unsupported("unparsed message"))
}
};
let id = parsed.id.clone().ok_or(ConversionError::MissingField("message.id"))?;
let content = text_only_from_parts(&parsed.parts)?;
let role = match parsed.role.as_str() {
"user" => Some(schema::ThreadItemRole::User),
"assistant" => Some(schema::ThreadItemRole::Assistant),
"system" => Some(schema::ThreadItemRole::System),
_ => None,
};
let item = schema::ThreadItem {
content: Some(schema::ThreadItemContent::Variant0(content)),
id,
role,
status: None,
type_: schema::ThreadItemType::Message,
};
Ok(schema::ThreadEvent {
error: Map::new(),
item: Some(item),
thread_id: None,
type_: schema::ThreadEventType::ItemCreated,
})
}
_ => Err(ConversionError::Unsupported("codex event")),
}
}
pub fn message_to_universal(message: &schema::Message) -> UniversalMessage {
let schema::Message { role, content } = message;
UniversalMessage::Parsed(UniversalMessageParsed {
role: role.to_string(),
id: None,
metadata: Map::new(),
parts: vec![UniversalMessagePart::Text {
text: content.clone(),
}],
})
}
pub fn universal_message_to_message(
message: &UniversalMessage,
) -> Result<schema::Message, ConversionError> {
let parsed = match message {
UniversalMessage::Parsed(parsed) => parsed,
UniversalMessage::Unparsed { .. } => {
return Err(ConversionError::Unsupported("unparsed message"))
}
};
let content = text_only_from_parts(&parsed.parts)?;
Ok(schema::Message {
role: match parsed.role.as_str() {
"user" => schema::MessageRole::User,
"assistant" => schema::MessageRole::Assistant,
"system" => schema::MessageRole::System,
_ => schema::MessageRole::User,
},
content,
})
}
pub fn inputs_to_universal_message(inputs: &[schema::Input], role: &str) -> UniversalMessage {
let parts = inputs.iter().map(input_to_universal_part).collect();
UniversalMessage::Parsed(UniversalMessageParsed {
role: role.to_string(),
id: None,
metadata: Map::new(),
parts,
})
}
pub fn input_to_universal_part(input: &schema::Input) -> UniversalMessagePart {
let schema::Input {
content,
mime_type,
path,
type_,
} = input;
let raw = serde_json::to_value(input).unwrap_or(Value::Null);
match type_ {
schema::InputType::Text => match content {
Some(content) => UniversalMessagePart::Text {
text: content.clone(),
},
None => UniversalMessagePart::Unknown { raw },
},
schema::InputType::File => {
let source = if let Some(path) = path {
AttachmentSource::Path { path: path.clone() }
} else if let Some(content) = content {
AttachmentSource::Data {
data: content.clone(),
encoding: None,
}
} else {
return UniversalMessagePart::Unknown { raw };
};
UniversalMessagePart::File {
source,
mime_type: mime_type.clone(),
filename: None,
raw: Some(raw),
}
}
schema::InputType::Image => {
let source = if let Some(path) = path {
AttachmentSource::Path { path: path.clone() }
} else if let Some(content) = content {
AttachmentSource::Data {
data: content.clone(),
encoding: None,
}
} else {
return UniversalMessagePart::Unknown { raw };
};
UniversalMessagePart::Image {
source,
mime_type: mime_type.clone(),
alt: None,
raw: Some(raw),
}
}
}
}
pub fn universal_message_to_inputs(
message: &UniversalMessage,
) -> Result<Vec<schema::Input>, ConversionError> {
let parsed = match message {
UniversalMessage::Parsed(parsed) => parsed,
UniversalMessage::Unparsed { .. } => {
return Err(ConversionError::Unsupported("unparsed message"))
}
};
universal_parts_to_inputs(&parsed.parts)
}
pub fn universal_parts_to_inputs(
parts: &[UniversalMessagePart],
) -> Result<Vec<schema::Input>, ConversionError> {
let mut inputs = Vec::new();
for part in parts {
match part {
UniversalMessagePart::Text { text } => inputs.push(schema::Input {
content: Some(text.clone()),
mime_type: None,
path: None,
type_: schema::InputType::Text,
}),
UniversalMessagePart::File {
source,
mime_type,
..
} => inputs.push(input_from_attachment(source, mime_type.as_ref(), schema::InputType::File)?),
UniversalMessagePart::Image {
source, mime_type, ..
} => inputs.push(input_from_attachment(
source,
mime_type.as_ref(),
schema::InputType::Image,
)?),
UniversalMessagePart::ToolCall { .. }
| UniversalMessagePart::ToolResult { .. }
| UniversalMessagePart::FunctionCall { .. }
| UniversalMessagePart::FunctionResult { .. }
| UniversalMessagePart::Error { .. }
| UniversalMessagePart::Unknown { .. } => {
return Err(ConversionError::Unsupported("unsupported part"))
}
}
}
if inputs.is_empty() {
return Err(ConversionError::MissingField("parts"));
}
Ok(inputs)
}
fn input_from_attachment(
source: &AttachmentSource,
mime_type: Option<&String>,
input_type: schema::InputType,
) -> Result<schema::Input, ConversionError> {
match source {
AttachmentSource::Path { path } => Ok(schema::Input {
content: None,
mime_type: mime_type.cloned(),
path: Some(path.clone()),
type_: input_type,
}),
AttachmentSource::Data { data, encoding } => {
if let Some(encoding) = encoding.as_deref() {
if encoding != "base64" {
return Err(ConversionError::Unsupported("codex data encoding"));
}
}
Ok(schema::Input {
content: Some(data.clone()),
mime_type: mime_type.cloned(),
path: None,
type_: input_type,
})
}
AttachmentSource::Url { .. } => Err(ConversionError::Unsupported("codex input url")),
}
}
fn thread_item_to_message(item: &schema::ThreadItem) -> UniversalMessage {
let schema::ThreadItem {
content,
id,
role,
status,
type_,
} = item;
let mut metadata = Map::new();
metadata.insert("itemType".to_string(), Value::String(type_.to_string()));
if let Some(status) = status {
metadata.insert("status".to_string(), Value::String(status.to_string()));
}
let role = role
.as_ref()
.map(|role| role.to_string())
.unwrap_or_else(|| "assistant".to_string());
let parts = match type_ {
schema::ThreadItemType::Message => message_parts_from_codex_content(content),
schema::ThreadItemType::FunctionCall => vec![function_call_part_from_codex(id, content)],
schema::ThreadItemType::FunctionResult => vec![function_result_part_from_codex(id, content)],
};
UniversalMessage::Parsed(UniversalMessageParsed {
role,
id: Some(id.clone()),
metadata,
parts,
})
}
fn message_parts_from_codex_content(
content: &Option<schema::ThreadItemContent>,
) -> Vec<UniversalMessagePart> {
match content {
Some(schema::ThreadItemContent::Variant0(text)) => {
vec![UniversalMessagePart::Text { text: text.clone() }]
}
Some(schema::ThreadItemContent::Variant1(raw)) => {
vec![UniversalMessagePart::Unknown {
raw: serde_json::to_value(raw).unwrap_or(Value::Null),
}]
}
None => Vec::new(),
}
}
fn function_call_part_from_codex(
item_id: &str,
content: &Option<schema::ThreadItemContent>,
) -> UniversalMessagePart {
let raw = thread_item_content_to_value(content);
let name = extract_object_field(&raw, "name");
let arguments = extract_object_value(&raw, "arguments").unwrap_or_else(|| raw.clone());
UniversalMessagePart::FunctionCall {
id: Some(item_id.to_string()),
name,
arguments,
raw: Some(raw),
}
}
fn function_result_part_from_codex(
item_id: &str,
content: &Option<schema::ThreadItemContent>,
) -> UniversalMessagePart {
let raw = thread_item_content_to_value(content);
let name = extract_object_field(&raw, "name");
let result = extract_object_value(&raw, "result")
.or_else(|| extract_object_value(&raw, "output"))
.or_else(|| extract_object_value(&raw, "content"))
.unwrap_or_else(|| raw.clone());
UniversalMessagePart::FunctionResult {
id: Some(item_id.to_string()),
name,
result,
is_error: None,
raw: Some(raw),
}
}
fn thread_item_content_to_value(content: &Option<schema::ThreadItemContent>) -> Value {
match content {
Some(schema::ThreadItemContent::Variant0(text)) => Value::String(text.clone()),
Some(schema::ThreadItemContent::Variant1(raw)) => {
Value::Array(raw.iter().cloned().map(Value::Object).collect())
}
None => Value::Null,
}
}
fn extract_object_field(raw: &Value, field: &str) -> Option<String> {
extract_object_value(raw, field)
.and_then(|value| value.as_str().map(|s| s.to_string()))
}
fn extract_object_value(raw: &Value, field: &str) -> Option<Value> {
match raw {
Value::Object(map) => map.get(field).cloned(),
Value::Array(values) => values
.first()
.and_then(|value| value.as_object())
.and_then(|map| map.get(field).cloned()),
_ => None,
}
}
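The `extract_object_field` / `extract_object_value` helpers tolerate Codex content that arrives either as a single JSON object or as an array of objects, in which case only the first element is consulted. A minimal Python sketch of the same lookup rule (names mirror the Rust helpers but are illustrative only):

```python
import json

def extract_object_value(raw, field):
    # Look the field up on an object, or on the first element
    # when the value is an array whose first element is an object.
    if isinstance(raw, dict):
        return raw.get(field)
    if isinstance(raw, list) and raw and isinstance(raw[0], dict):
        return raw[0].get(field)
    return None

obj = json.loads('{"name": "shell", "arguments": {"cmd": "ls"}}')
arr = json.loads('[{"name": "shell"}, {"name": "ignored"}]')
print(extract_object_value(obj, "name"))      # shell
print(extract_object_value(arr, "name"))      # shell
print(extract_object_value("plain", "name"))  # None
```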

@@ -0,0 +1,4 @@
pub mod amp;
pub mod claude;
pub mod codex;
pub mod opencode;

@@ -0,0 +1,958 @@
use crate::{
extract_message_from_value,
AttachmentSource,
ConversionError,
CrashInfo,
EventConversion,
PermissionRequest,
PermissionToolRef,
QuestionInfo,
QuestionOption,
QuestionRequest,
QuestionToolRef,
Started,
UniversalEventData,
UniversalMessage,
UniversalMessageParsed,
UniversalMessagePart,
};
use crate::opencode as schema;
use serde::{Deserialize, Serialize};
use serde_json::{Map, Value};
pub fn event_to_universal(event: &schema::Event) -> EventConversion {
match event {
schema::Event::MessageUpdated(updated) => {
let schema::EventMessageUpdated { properties, type_: _ } = updated;
let schema::EventMessageUpdatedProperties { info } = properties;
let (message, session_id) = message_from_opencode(info);
EventConversion::new(UniversalEventData::Message { message })
.with_session(session_id)
}
schema::Event::MessagePartUpdated(updated) => {
let schema::EventMessagePartUpdated { properties, type_: _ } = updated;
let schema::EventMessagePartUpdatedProperties { part, delta } = properties;
let (message, session_id) = part_to_message(part, delta.as_ref());
EventConversion::new(UniversalEventData::Message { message })
.with_session(session_id)
}
schema::Event::QuestionAsked(asked) => {
let schema::EventQuestionAsked { properties, type_: _ } = asked;
let question = question_request_from_opencode(properties);
let session_id = question.session_id.clone();
EventConversion::new(UniversalEventData::QuestionAsked { question_asked: question })
.with_session(Some(session_id))
}
schema::Event::PermissionAsked(asked) => {
let schema::EventPermissionAsked { properties, type_: _ } = asked;
let permission = permission_request_from_opencode(properties);
let session_id = permission.session_id.clone();
EventConversion::new(UniversalEventData::PermissionAsked { permission_asked: permission })
.with_session(Some(session_id))
}
schema::Event::SessionCreated(created) => {
let schema::EventSessionCreated { properties, type_: _ } = created;
let schema::EventSessionCreatedProperties { info } = properties;
let details = serde_json::to_value(info).ok();
let started = Started {
message: Some("session.created".to_string()),
details,
};
EventConversion::new(UniversalEventData::Started { started })
}
schema::Event::SessionError(error) => {
let schema::EventSessionError { properties, type_: _ } = error;
let schema::EventSessionErrorProperties {
error: _error,
session_id,
} = properties;
let message = extract_message_from_value(&serde_json::to_value(properties).unwrap_or(Value::Null))
.unwrap_or_else(|| "opencode session error".to_string());
let crash = CrashInfo {
message,
kind: Some("session.error".to_string()),
details: serde_json::to_value(properties).ok(),
};
EventConversion::new(UniversalEventData::Error { error: crash })
.with_session(session_id.clone())
}
_ => EventConversion::new(UniversalEventData::Unknown {
raw: serde_json::to_value(event).unwrap_or(Value::Null),
}),
}
}
pub fn universal_event_to_opencode(event: &UniversalEventData) -> Result<schema::Event, ConversionError> {
match event {
UniversalEventData::QuestionAsked { question_asked } => {
let properties = question_request_to_opencode(question_asked)?;
Ok(schema::Event::QuestionAsked(schema::EventQuestionAsked {
properties,
type_: "question.asked".to_string(),
}))
}
UniversalEventData::PermissionAsked { permission_asked } => {
let properties = permission_request_to_opencode(permission_asked)?;
Ok(schema::Event::PermissionAsked(schema::EventPermissionAsked {
properties,
type_: "permission.asked".to_string(),
}))
}
_ => Err(ConversionError::Unsupported("opencode event")),
}
}
pub fn universal_message_to_parts(
message: &UniversalMessage,
) -> Result<Vec<schema::TextPartInput>, ConversionError> {
let parsed = match message {
UniversalMessage::Parsed(parsed) => parsed,
UniversalMessage::Unparsed { .. } => {
return Err(ConversionError::Unsupported("unparsed message"))
}
};
let mut parts = Vec::new();
for part in &parsed.parts {
match part {
UniversalMessagePart::Text { text } => {
parts.push(text_part_input_from_text(text));
}
UniversalMessagePart::ToolCall { .. }
| UniversalMessagePart::ToolResult { .. }
| UniversalMessagePart::FunctionCall { .. }
| UniversalMessagePart::FunctionResult { .. }
| UniversalMessagePart::File { .. }
| UniversalMessagePart::Image { .. }
| UniversalMessagePart::Error { .. }
| UniversalMessagePart::Unknown { .. } => {
return Err(ConversionError::Unsupported("non-text part"))
}
}
}
if parts.is_empty() {
return Err(ConversionError::MissingField("parts"));
}
Ok(parts)
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(untagged)]
pub enum OpencodePartInput {
Text(schema::TextPartInput),
File(schema::FilePartInput),
}
pub fn universal_message_to_part_inputs(
message: &UniversalMessage,
) -> Result<Vec<OpencodePartInput>, ConversionError> {
let parsed = match message {
UniversalMessage::Parsed(parsed) => parsed,
UniversalMessage::Unparsed { .. } => {
return Err(ConversionError::Unsupported("unparsed message"))
}
};
universal_parts_to_part_inputs(&parsed.parts)
}
pub fn universal_parts_to_part_inputs(
parts: &[UniversalMessagePart],
) -> Result<Vec<OpencodePartInput>, ConversionError> {
let mut inputs = Vec::new();
for part in parts {
inputs.push(universal_part_to_opencode_input(part)?);
}
if inputs.is_empty() {
return Err(ConversionError::MissingField("parts"));
}
Ok(inputs)
}
pub fn universal_part_to_opencode_input(
part: &UniversalMessagePart,
) -> Result<OpencodePartInput, ConversionError> {
match part {
UniversalMessagePart::Text { text } => Ok(OpencodePartInput::Text(
text_part_input_from_text(text),
)),
UniversalMessagePart::File {
source,
mime_type,
filename,
..
} => Ok(OpencodePartInput::File(file_part_input_from_universal(
source,
mime_type.as_deref(),
filename.as_ref(),
)?)),
UniversalMessagePart::Image {
source, mime_type, ..
} => Ok(OpencodePartInput::File(file_part_input_from_universal(
source,
mime_type.as_deref(),
None,
)?)),
UniversalMessagePart::ToolCall { .. }
| UniversalMessagePart::ToolResult { .. }
| UniversalMessagePart::FunctionCall { .. }
| UniversalMessagePart::FunctionResult { .. }
| UniversalMessagePart::Error { .. }
| UniversalMessagePart::Unknown { .. } => {
Err(ConversionError::Unsupported("unsupported part"))
}
}
}
fn text_part_input_from_text(text: &str) -> schema::TextPartInput {
schema::TextPartInput {
id: None,
ignored: None,
metadata: Map::new(),
synthetic: None,
text: text.to_string(),
time: None,
type_: "text".to_string(),
}
}
pub fn text_part_input_to_universal(part: &schema::TextPartInput) -> UniversalMessage {
let schema::TextPartInput {
id,
ignored,
metadata,
synthetic,
text,
time,
type_,
} = part;
let mut metadata = metadata.clone();
if let Some(id) = id {
metadata.insert("partId".to_string(), Value::String(id.clone()));
}
if let Some(ignored) = ignored {
metadata.insert("ignored".to_string(), Value::Bool(*ignored));
}
if let Some(synthetic) = synthetic {
metadata.insert("synthetic".to_string(), Value::Bool(*synthetic));
}
if let Some(time) = time {
metadata.insert(
"time".to_string(),
serde_json::to_value(time).unwrap_or(Value::Null),
);
}
metadata.insert("type".to_string(), Value::String(type_.clone()));
UniversalMessage::Parsed(UniversalMessageParsed {
role: "user".to_string(),
id: None,
metadata,
parts: vec![UniversalMessagePart::Text { text: text.clone() }],
})
}
fn file_part_input_from_universal(
source: &AttachmentSource,
mime_type: Option<&str>,
filename: Option<&String>,
) -> Result<schema::FilePartInput, ConversionError> {
let mime = mime_type.ok_or(ConversionError::MissingField("mime_type"))?;
let url = attachment_source_to_opencode_url(source, mime)?;
Ok(schema::FilePartInput {
filename: filename.cloned(),
id: None,
mime: mime.to_string(),
source: None,
type_: "file".to_string(),
url,
})
}
fn attachment_source_to_opencode_url(
source: &AttachmentSource,
mime_type: &str,
) -> Result<String, ConversionError> {
match source {
AttachmentSource::Url { url } => Ok(url.clone()),
AttachmentSource::Path { path } => Ok(format!("file://{}", path)),
AttachmentSource::Data { data, encoding } => {
let encoding = encoding.as_deref().unwrap_or("base64");
if encoding != "base64" {
return Err(ConversionError::Unsupported("opencode data encoding"));
}
Ok(format!("data:{};base64,{}", mime_type, data))
}
}
}
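`attachment_source_to_opencode_url` passes URLs through unchanged, prefixes filesystem paths with `file://`, and inlines raw data as a base64 data URI (any other encoding is rejected as unsupported). A standalone Python sketch of those rules, using a hypothetical dict shape for the source:

```python
def attachment_source_to_url(source, mime_type):
    # Sketch of the URL rules above; "kind" tags are illustrative,
    # not the real AttachmentSource representation.
    kind = source["kind"]
    if kind == "url":
        return source["url"]
    if kind == "path":
        return "file://" + source["path"]
    if kind == "data":
        if source.get("encoding", "base64") != "base64":
            raise ValueError("unsupported opencode data encoding")
        return f"data:{mime_type};base64,{source['data']}"

print(attachment_source_to_url({"kind": "path", "path": "/tmp/a.png"}, "image/png"))
# file:///tmp/a.png
print(attachment_source_to_url({"kind": "data", "data": "AAAA"}, "image/png"))
# data:image/png;base64,AAAA
```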
fn message_from_opencode(message: &schema::Message) -> (UniversalMessage, Option<String>) {
match message {
schema::Message::UserMessage(user) => {
let schema::UserMessage {
agent,
id,
model,
role,
session_id,
summary,
system,
time,
tools,
variant,
} = user;
let mut metadata = Map::new();
metadata.insert("agent".to_string(), Value::String(agent.clone()));
metadata.insert(
"model".to_string(),
serde_json::to_value(model).unwrap_or(Value::Null),
);
metadata.insert(
"time".to_string(),
serde_json::to_value(time).unwrap_or(Value::Null),
);
metadata.insert(
"tools".to_string(),
serde_json::to_value(tools).unwrap_or(Value::Null),
);
if let Some(summary) = summary {
metadata.insert(
"summary".to_string(),
serde_json::to_value(summary).unwrap_or(Value::Null),
);
}
if let Some(system) = system {
metadata.insert("system".to_string(), Value::String(system.clone()));
}
if let Some(variant) = variant {
metadata.insert("variant".to_string(), Value::String(variant.clone()));
}
let parsed = UniversalMessageParsed {
role: role.clone(),
id: Some(id.clone()),
metadata,
parts: Vec::new(),
};
(
UniversalMessage::Parsed(parsed),
Some(session_id.clone()),
)
}
schema::Message::AssistantMessage(assistant) => {
let schema::AssistantMessage {
agent,
cost,
error,
finish,
id,
mode,
model_id,
parent_id,
path,
provider_id,
role,
session_id,
summary,
time,
tokens,
} = assistant;
let mut metadata = Map::new();
metadata.insert("agent".to_string(), Value::String(agent.clone()));
metadata.insert(
"cost".to_string(),
serde_json::to_value(cost).unwrap_or(Value::Null),
);
metadata.insert("mode".to_string(), Value::String(mode.clone()));
metadata.insert("modelId".to_string(), Value::String(model_id.clone()));
metadata.insert("providerId".to_string(), Value::String(provider_id.clone()));
metadata.insert("parentId".to_string(), Value::String(parent_id.clone()));
metadata.insert(
"path".to_string(),
serde_json::to_value(path).unwrap_or(Value::Null),
);
metadata.insert(
"tokens".to_string(),
serde_json::to_value(tokens).unwrap_or(Value::Null),
);
metadata.insert(
"time".to_string(),
serde_json::to_value(time).unwrap_or(Value::Null),
);
if let Some(error) = error {
metadata.insert(
"error".to_string(),
serde_json::to_value(error).unwrap_or(Value::Null),
);
}
if let Some(finish) = finish {
metadata.insert("finish".to_string(), Value::String(finish.clone()));
}
if let Some(summary) = summary {
metadata.insert(
"summary".to_string(),
serde_json::to_value(summary).unwrap_or(Value::Null),
);
}
let parsed = UniversalMessageParsed {
role: role.clone(),
id: Some(id.clone()),
metadata,
parts: Vec::new(),
};
(
UniversalMessage::Parsed(parsed),
Some(session_id.clone()),
)
}
}
}
fn part_to_message(part: &schema::Part, delta: Option<&String>) -> (UniversalMessage, Option<String>) {
match part {
schema::Part::Variant0(text_part) => {
let schema::TextPart {
id,
ignored,
message_id,
metadata,
session_id,
synthetic,
text,
time,
type_,
} = text_part;
let mut part_metadata = base_part_metadata(message_id, id, delta);
part_metadata.insert("type".to_string(), Value::String(type_.clone()));
if let Some(ignored) = ignored {
part_metadata.insert("ignored".to_string(), Value::Bool(*ignored));
}
if let Some(synthetic) = synthetic {
part_metadata.insert("synthetic".to_string(), Value::Bool(*synthetic));
}
if let Some(time) = time {
part_metadata.insert(
"time".to_string(),
serde_json::to_value(time).unwrap_or(Value::Null),
);
}
if !metadata.is_empty() {
part_metadata.insert(
"partMetadata".to_string(),
Value::Object(metadata.clone()),
);
}
let parsed = UniversalMessageParsed {
role: "assistant".to_string(),
id: Some(message_id.clone()),
metadata: part_metadata,
parts: vec![UniversalMessagePart::Text { text: text.clone() }],
};
(UniversalMessage::Parsed(parsed), Some(session_id.clone()))
}
schema::Part::Variant1 {
agent: _agent,
command: _command,
description: _description,
id,
message_id,
model: _model,
prompt: _prompt,
session_id,
type_: _type,
} => unknown_part_message(message_id, id, session_id, serde_json::to_value(part).unwrap_or(Value::Null), delta),
schema::Part::Variant2(reasoning_part) => {
let schema::ReasoningPart {
id,
message_id,
metadata: _metadata,
session_id,
text: _text,
time: _time,
type_: _type,
} = reasoning_part;
unknown_part_message(
message_id,
id,
session_id,
serde_json::to_value(reasoning_part).unwrap_or(Value::Null),
delta,
)
}
schema::Part::Variant3(file_part) => {
let schema::FilePart {
filename: _filename,
id,
message_id,
mime: _mime,
session_id,
source: _source,
type_: _type,
url: _url,
} = file_part;
let part_metadata = base_part_metadata(message_id, id, delta);
let part = file_part_to_universal_part(file_part);
let parsed = UniversalMessageParsed {
role: "assistant".to_string(),
id: Some(message_id.clone()),
metadata: part_metadata,
parts: vec![part],
};
(UniversalMessage::Parsed(parsed), Some(session_id.clone()))
}
schema::Part::Variant4(tool_part) => {
let schema::ToolPart {
call_id,
id,
message_id,
metadata,
session_id,
state,
tool,
type_,
} = tool_part;
let mut part_metadata = base_part_metadata(message_id, id, delta);
part_metadata.insert("type".to_string(), Value::String(type_.clone()));
part_metadata.insert("callId".to_string(), Value::String(call_id.clone()));
part_metadata.insert("tool".to_string(), Value::String(tool.clone()));
if !metadata.is_empty() {
part_metadata.insert(
"partMetadata".to_string(),
Value::Object(metadata.clone()),
);
}
let (parts, state_meta) = tool_state_to_parts(call_id, tool, state);
if let Some(state_meta) = state_meta {
part_metadata.insert("toolState".to_string(), state_meta);
}
let parsed = UniversalMessageParsed {
role: "assistant".to_string(),
id: Some(message_id.clone()),
metadata: part_metadata,
parts,
};
(UniversalMessage::Parsed(parsed), Some(session_id.clone()))
}
schema::Part::Variant5(step_start) => {
let schema::StepStartPart {
id,
message_id,
session_id,
snapshot: _snapshot,
type_: _type,
} = step_start;
unknown_part_message(
message_id,
id,
session_id,
serde_json::to_value(step_start).unwrap_or(Value::Null),
delta,
)
}
schema::Part::Variant6(step_finish) => {
let schema::StepFinishPart {
cost: _cost,
id,
message_id,
reason: _reason,
session_id,
snapshot: _snapshot,
tokens: _tokens,
type_: _type,
} = step_finish;
unknown_part_message(
message_id,
id,
session_id,
serde_json::to_value(step_finish).unwrap_or(Value::Null),
delta,
)
}
schema::Part::Variant7(snapshot_part) => {
let schema::SnapshotPart {
id,
message_id,
session_id,
snapshot: _snapshot,
type_: _type,
} = snapshot_part;
unknown_part_message(
message_id,
id,
session_id,
serde_json::to_value(snapshot_part).unwrap_or(Value::Null),
delta,
)
}
schema::Part::Variant8(patch_part) => {
let schema::PatchPart {
files: _files,
hash: _hash,
id,
message_id,
session_id,
type_: _type,
} = patch_part;
unknown_part_message(
message_id,
id,
session_id,
serde_json::to_value(patch_part).unwrap_or(Value::Null),
delta,
)
}
schema::Part::Variant9(agent_part) => {
let schema::AgentPart {
id,
message_id,
name: _name,
session_id,
source: _source,
type_: _type,
} = agent_part;
unknown_part_message(
message_id,
id,
session_id,
serde_json::to_value(agent_part).unwrap_or(Value::Null),
delta,
)
}
schema::Part::Variant10(retry_part) => {
let schema::RetryPart {
attempt: _attempt,
error: _error,
id,
message_id,
session_id,
time: _time,
type_: _type,
} = retry_part;
unknown_part_message(
message_id,
id,
session_id,
serde_json::to_value(retry_part).unwrap_or(Value::Null),
delta,
)
}
schema::Part::Variant11(compaction_part) => {
let schema::CompactionPart {
auto: _auto,
id,
message_id,
session_id,
type_: _type,
} = compaction_part;
unknown_part_message(
message_id,
id,
session_id,
serde_json::to_value(compaction_part).unwrap_or(Value::Null),
delta,
)
}
}
}
fn base_part_metadata(message_id: &str, part_id: &str, delta: Option<&String>) -> Map<String, Value> {
let mut metadata = Map::new();
metadata.insert("messageId".to_string(), Value::String(message_id.to_string()));
metadata.insert("partId".to_string(), Value::String(part_id.to_string()));
if let Some(delta) = delta {
metadata.insert("delta".to_string(), Value::String(delta.clone()));
}
metadata
}
fn unknown_part_message(
message_id: &str,
part_id: &str,
session_id: &str,
raw: Value,
delta: Option<&String>,
) -> (UniversalMessage, Option<String>) {
let metadata = base_part_metadata(message_id, part_id, delta);
let parsed = UniversalMessageParsed {
role: "assistant".to_string(),
id: Some(message_id.to_string()),
metadata,
parts: vec![UniversalMessagePart::Unknown { raw }],
};
(UniversalMessage::Parsed(parsed), Some(session_id.to_string()))
}
fn file_part_to_universal_part(file_part: &schema::FilePart) -> UniversalMessagePart {
let schema::FilePart {
filename,
id: _id,
message_id: _message_id,
mime,
session_id: _session_id,
source: _source,
type_: _type,
url,
} = file_part;
let raw = serde_json::to_value(file_part).unwrap_or(Value::Null);
let source = AttachmentSource::Url { url: url.clone() };
if mime.starts_with("image/") {
UniversalMessagePart::Image {
source,
mime_type: Some(mime.clone()),
alt: filename.clone(),
raw: Some(raw),
}
} else {
UniversalMessagePart::File {
source,
mime_type: Some(mime.clone()),
filename: filename.clone(),
raw: Some(raw),
}
}
}
fn tool_state_to_parts(
call_id: &str,
tool: &str,
state: &schema::ToolState,
) -> (Vec<UniversalMessagePart>, Option<Value>) {
match state {
schema::ToolState::Pending(state) => {
let schema::ToolStatePending { input, raw, status } = state;
let mut meta = Map::new();
meta.insert("status".to_string(), Value::String(status.clone()));
meta.insert("raw".to_string(), Value::String(raw.clone()));
meta.insert("input".to_string(), Value::Object(input.clone()));
(
vec![UniversalMessagePart::ToolCall {
id: Some(call_id.to_string()),
name: tool.to_string(),
input: Value::Object(input.clone()),
}],
Some(Value::Object(meta)),
)
}
schema::ToolState::Running(state) => {
let schema::ToolStateRunning {
input,
metadata,
status,
time,
title,
} = state;
let mut meta = Map::new();
meta.insert("status".to_string(), Value::String(status.clone()));
meta.insert("input".to_string(), Value::Object(input.clone()));
meta.insert("metadata".to_string(), Value::Object(metadata.clone()));
meta.insert(
"time".to_string(),
serde_json::to_value(time).unwrap_or(Value::Null),
);
if let Some(title) = title {
meta.insert("title".to_string(), Value::String(title.clone()));
}
(
vec![UniversalMessagePart::ToolCall {
id: Some(call_id.to_string()),
name: tool.to_string(),
input: Value::Object(input.clone()),
}],
Some(Value::Object(meta)),
)
}
schema::ToolState::Completed(state) => {
let schema::ToolStateCompleted {
attachments,
input,
metadata,
output,
status,
time,
title,
} = state;
let mut meta = Map::new();
meta.insert("status".to_string(), Value::String(status.clone()));
meta.insert("input".to_string(), Value::Object(input.clone()));
meta.insert("metadata".to_string(), Value::Object(metadata.clone()));
meta.insert(
"time".to_string(),
serde_json::to_value(time).unwrap_or(Value::Null),
);
meta.insert("title".to_string(), Value::String(title.clone()));
if !attachments.is_empty() {
meta.insert(
"attachments".to_string(),
serde_json::to_value(attachments).unwrap_or(Value::Null),
);
}
let mut parts = vec![UniversalMessagePart::ToolResult {
id: Some(call_id.to_string()),
name: Some(tool.to_string()),
output: Value::String(output.clone()),
is_error: Some(false),
}];
for attachment in attachments {
parts.push(file_part_to_universal_part(attachment));
}
(parts, Some(Value::Object(meta)))
}
schema::ToolState::Error(state) => {
let schema::ToolStateError {
error,
input,
metadata,
status,
time,
} = state;
let mut meta = Map::new();
meta.insert("status".to_string(), Value::String(status.clone()));
meta.insert("error".to_string(), Value::String(error.clone()));
meta.insert("input".to_string(), Value::Object(input.clone()));
meta.insert("metadata".to_string(), Value::Object(metadata.clone()));
meta.insert(
"time".to_string(),
serde_json::to_value(time).unwrap_or(Value::Null),
);
(
vec![UniversalMessagePart::ToolResult {
id: Some(call_id.to_string()),
name: Some(tool.to_string()),
output: Value::String(error.clone()),
is_error: Some(true),
}],
Some(Value::Object(meta)),
)
}
}
}
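The four tool states map onto just two universal part kinds: pending and running states emit a `ToolCall`, while completed and error states emit a `ToolResult` whose `is_error` flag distinguishes them. A condensed Python sketch of that mapping, ignoring the metadata and attachment handling (shapes are hypothetical):

```python
def tool_state_to_part(call_id, tool, state):
    # Condensed sketch of tool_state_to_parts above:
    # pending/running -> tool call, completed/error -> tool result.
    status = state["status"]
    if status in ("pending", "running"):
        return {"type": "tool_call", "id": call_id, "name": tool,
                "input": state["input"]}
    return {"type": "tool_result", "id": call_id, "name": tool,
            "output": state.get("output", state.get("error")),
            "is_error": status == "error"}

part = tool_state_to_part("c1", "bash", {"status": "error", "input": {}, "error": "boom"})
print(part["is_error"], part["output"])  # True boom
```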
fn question_request_from_opencode(request: &schema::QuestionRequest) -> QuestionRequest {
let schema::QuestionRequest {
id,
questions,
session_id,
tool,
} = request;
QuestionRequest {
id: id.clone().into(),
session_id: session_id.clone().into(),
questions: questions
.iter()
.map(|question| {
let schema::QuestionInfo {
custom,
header,
multiple,
options,
question,
} = question;
QuestionInfo {
question: question.clone(),
header: Some(header.clone()),
options: options
.iter()
.map(|opt| {
let schema::QuestionOption { description, label } = opt;
QuestionOption {
label: label.clone(),
description: Some(description.clone()),
}
})
.collect(),
multi_select: *multiple,
custom: *custom,
}
})
.collect(),
tool: tool.as_ref().map(|tool| {
let schema::QuestionRequestTool { message_id, call_id } = tool;
QuestionToolRef {
message_id: message_id.clone(),
call_id: call_id.clone(),
}
}),
}
}
fn permission_request_from_opencode(request: &schema::PermissionRequest) -> PermissionRequest {
let schema::PermissionRequest {
always,
id,
metadata,
patterns,
permission,
session_id,
tool,
} = request;
PermissionRequest {
id: id.clone().into(),
session_id: session_id.clone().into(),
permission: permission.clone(),
patterns: patterns.clone(),
metadata: metadata.clone(),
always: always.clone(),
tool: tool.as_ref().map(|tool| {
let schema::PermissionRequestTool { message_id, call_id } = tool;
PermissionToolRef {
message_id: message_id.clone(),
call_id: call_id.clone(),
}
}),
}
}
fn question_request_to_opencode(request: &QuestionRequest) -> Result<schema::QuestionRequest, ConversionError> {
let id = schema::QuestionRequestId::try_from(request.id.as_str())
.map_err(|err| ConversionError::InvalidValue(err.to_string()))?;
let session_id = schema::QuestionRequestSessionId::try_from(request.session_id.as_str())
.map_err(|err| ConversionError::InvalidValue(err.to_string()))?;
let questions = request
.questions
.iter()
.map(|question| schema::QuestionInfo {
question: question.question.clone(),
header: question
.header
.clone()
.unwrap_or_else(|| "Question".to_string()),
options: question
.options
.iter()
.map(|opt| schema::QuestionOption {
label: opt.label.clone(),
description: opt.description.clone().unwrap_or_default(),
})
.collect(),
multiple: question.multi_select,
custom: question.custom,
})
.collect();
Ok(schema::QuestionRequest {
id,
session_id,
questions,
tool: request.tool.as_ref().map(|tool| schema::QuestionRequestTool {
message_id: tool.message_id.clone(),
call_id: tool.call_id.clone(),
}),
})
}
fn permission_request_to_opencode(
request: &PermissionRequest,
) -> Result<schema::PermissionRequest, ConversionError> {
let id = schema::PermissionRequestId::try_from(request.id.as_str())
.map_err(|err| ConversionError::InvalidValue(err.to_string()))?;
let session_id = schema::PermissionRequestSessionId::try_from(request.session_id.as_str())
.map_err(|err| ConversionError::InvalidValue(err.to_string()))?;
Ok(schema::PermissionRequest {
id,
session_id,
permission: request.permission.clone(),
patterns: request.patterns.clone(),
metadata: request.metadata.clone(),
always: request.always.clone(),
tool: request.tool.as_ref().map(|tool| schema::PermissionRequestTool {
message_id: tool.message_id.clone(),
call_id: tool.call_id.clone(),
}),
})
}
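Note how `question_request_to_opencode` fills in defaults when converting back: a missing universal `header` becomes `"Question"` and a missing option `description` becomes the empty string, since both fields are required on the opencode side. A small Python sketch of that defaulting (dict shapes are illustrative, not the real schema types):

```python
def question_to_opencode(question):
    # Optional universal fields become required opencode fields,
    # mirroring the unwrap_or defaults in question_request_to_opencode.
    return {
        "question": question["question"],
        "header": question.get("header") or "Question",
        "options": [
            {"label": o["label"], "description": o.get("description") or ""}
            for o in question.get("options", [])
        ],
        "multiple": question["multi_select"],
        "custom": question["custom"],
    }

q = {"question": "Proceed?", "multi_select": False, "custom": False,
     "options": [{"label": "Yes"}]}
print(question_to_opencode(q)["header"])  # Question
```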

File diff suppressed because it is too large.

@@ -4,11 +4,11 @@
"type": "module",
"license": "Apache-2.0",
"scripts": {
"extract": "tsx src/index.ts",
"extract:opencode": "tsx src/index.ts --agent=opencode",
"extract:claude": "tsx src/index.ts --agent=claude",
"extract:codex": "tsx src/index.ts --agent=codex",
"extract:amp": "tsx src/index.ts --agent=amp"
"extract": "tsx ../../src/agents/index.ts",
"extract:opencode": "tsx ../../src/agents/index.ts --agent=opencode",
"extract:claude": "tsx ../../src/agents/index.ts --agent=claude",
"extract:codex": "tsx ../../src/agents/index.ts --agent=codex",
"extract:amp": "tsx ../../src/agents/index.ts --agent=amp"
},
"dependencies": {
"ts-json-schema-generator": "^2.4.0",

spec.md

@@ -468,25 +468,27 @@ build typescript examples of how to deploy this to the given providers:
these should each have a vitest unit test. cloudflare is trickier since it requires a more complex setup.
## readme docs
## docs
write a readme that doubles as docs for:
Docs live in the `docs/` folder (Mintlify). The root `README.md` should stay brief and link to the docs site or local docs.
Write docs that cover:
- architecture
- agent compatibility
- deployemnt guide (these should be links to working examples)
- deployment guide (link to working examples)
- docker (for dev)
- e2b
- daytona
- vercel sandboxes
- cloudflare sandboxes
- universal agent api feature checklist
- quesitons
- questions
- approve plan
- etc (ie you need to infer what features are required to imeplment and what is optional)
- etc (infer what features are required vs optional)
- cli
- http api
- running the example frontend
- typescript sdk
use the collapsible github sections for things like each api endpoint or each typescript sdk endpoint to collapse more info. this keeps the page readable.
Use collapsible sections for each API endpoint or TypeScript SDK endpoint to keep the page readable.
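For instance, a per-endpoint entry could use the standard `<details>` pattern (the endpoint is taken from the CLI ⇄ HTTP map in `CLAUDE.md`; the body text is placeholder):

```markdown
<details>
<summary><code>GET /v1/sessions/{sessionId}/events</code></summary>

Short description of the endpoint, its parameters, and an example
response go here.

</details>
```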

@@ -2,7 +2,14 @@ import { createHash } from "crypto";
import { existsSync, mkdirSync, readFileSync, writeFileSync, statSync } from "fs";
import { join } from "path";
const CACHE_DIR = join(import.meta.dirname, "..", ".cache");
const CACHE_DIR = join(
import.meta.dirname,
"..",
"..",
"resources",
"agent-schemas",
".cache"
);
const DEFAULT_TTL_MS = 24 * 60 * 60 * 1000; // 24 hours
interface CacheEntry<T> {

@@ -28,7 +28,7 @@ const TARGET_TYPES = [
];
function findTypesPath(): string | null {
const baseDir = join(import.meta.dirname, "..");
const baseDir = join(import.meta.dirname, "..", "..", "resources", "agent-schemas");
for (const relativePath of POSSIBLE_PATHS) {
const fullPath = join(baseDir, relativePath);
@@ -54,7 +54,7 @@ export async function extractClaudeSchema(): Promise<NormalizedSchema> {
const config: Config = {
path: typesPath,
tsconfig: join(import.meta.dirname, "..", "tsconfig.json"),
tsconfig: join(import.meta.dirname, "..", "..", "resources", "agent-schemas", "tsconfig.json"),
type: "*",
skipTypeCheck: true,
topRef: false,

@@ -24,7 +24,7 @@ const TARGET_TYPES = [
];
function findTypesPath(): string | null {
const baseDir = join(import.meta.dirname, "..");
const baseDir = join(import.meta.dirname, "..", "..", "resources", "agent-schemas");
for (const relativePath of POSSIBLE_PATHS) {
const fullPath = join(baseDir, relativePath);
@@ -50,7 +50,7 @@ export async function extractCodexSchema(): Promise<NormalizedSchema> {
const config: Config = {
path: typesPath,
tsconfig: join(import.meta.dirname, "..", "tsconfig.json"),
tsconfig: join(import.meta.dirname, "..", "..", "resources", "agent-schemas", "tsconfig.json"),
type: "*",
skipTypeCheck: true,
topRef: false,

@@ -6,7 +6,8 @@ import { extractCodexSchema } from "./codex.js";
import { extractAmpSchema } from "./amp.js";
import { validateSchema, type NormalizedSchema } from "./normalize.js";
const DIST_DIR = join(import.meta.dirname, "..", "dist");
const RESOURCE_DIR = join(import.meta.dirname, "..", "..", "resources", "agent-schemas");
const DIST_DIR = join(RESOURCE_DIR, "dist");
type AgentName = "opencode" | "claude" | "codex" | "amp";

@@ -22,8 +22,9 @@
## CLI
- [x] Implement clap CLI flags: `--token`, `--no-token`, `--host`, `--port`, CORS flags
- [x] Implement a CLI endpoint for every HTTP endpoint
- [ ] Update `CLAUDE.md` to keep CLI endpoints in sync with HTTP API changes
- [x] Update `CLAUDE.md` to keep CLI endpoints in sync with HTTP API changes
- [x] Prefix CLI API requests with `/v1`
- [x] Add CLI credentials extractor subcommand
## HTTP API Endpoints
- [x] POST `/agents/{}/install` with `reinstall` handling
@@ -96,3 +97,4 @@
- [x] implement release pipeline
- implement e2b example
- implement typescript "start locally" by pulling from server using version
- [x] Move agent schema sources to src/agents